Why did Democrats all start moving in lockstep to ban gas stoves, seemingly with no prior concern at all? And sure enough, a little digging reveals that this isn’t just disinterested science at work.
The company behind the study is called “Carbon-Free Buildings.” That company is a partner of the World Economic Forum and has a true-believer CEO who wants to rid the world of all carbon emissions (which is impossible and would lead to mass extinction). //
Now, it’s all starting to make sense. Is this moral panic over gas stoves really about children and asthma? Of course not. Rather, it is yet more nonsense from the WEF and its like-minded corporate underlings regarding climate change. It’s obvious, in my view, that the health risks currently being bandied about are just a convenient cover to force more people off energy sources that rely on fossil fuels (i.e., natural gas). //
But what’s scary here is how quickly they were able to mobilize. A study gets put out, Ocasio-Cortez and others run with it, and suddenly the federal government is trying to tell you what stove you can cook on. There’s a certain psychotic efficiency to the modern left, and it shouldn’t be brushed aside.
shayre utilizes your own storage, not cloud storage. You know where your files are and where you want them to be. shayre gets them where they need to go as soon as possible without any clicks. If you still want the cloud, shayre works with it too. ///
possible commercialized Syncthing fork?
FAA type certification was awarded June 11, 1963, and Helio gave the Twin Courier the designation H-500. Foreseeing military use, the U.S. Air Force assigned the designations U-5A and U-5B to the naturally aspirated and turbocharged versions, respectively. But despite the certification and preparation, only seven examples would ever be produced. //
The operational history of these seven aircraft is as unique as their appearance. While Helio publicly stated that all Twin Couriers were delivered to the CIA, they would go on to operate in clandestine operations under various entities of the U.S. military and government. Over their operational lives, some would be given USAF markings, while others would wear civilian paint schemes and civilian registration numbers. The N-numbers were registered to entities speculated to be shell companies for the CIA. //
Dr. Joe F. Leeker of the University of Texas at Dallas has compiled what might be the most comprehensive history of the Twin Courier. In it, he traces the progression of each airframe through its respective history, noting that they saw service in Nepal, Bolivia, Peru, and the U.S. before being transferred to—and disappearing in—India. From there, the trail goes cold, and no Twin Couriers are known to exist today.
We can speculate, however. Given the rugged, remote areas in which the aircraft were known to operate, demanding airstrips and conditions likely claimed more than one aircraft. //
Considering the clever engineering and intriguing history of the Twin Courier, it’s unfortunate that none survive today to be admired in person by future generations. And despite the type’s FAA certification, it’s unlikely more will ever be built.
Update: Banning Gas Stoves
Yesterday’s announcement that the unelected Consumer Product Safety Commission was considering a ban on gas stoves drew sharp criticism from Republicans and Democrats. The commission claims that while gas stoves are in 40 million households, they emit dangerous toxins that harm users. One of the most salient points against this was a recirculated document from the National Fire Protection Association claiming electric ranges (stoves) are significantly more dangerous than their gas counterparts. https://www.nfpa.org//-/media/Files/News-and-Research/Fire-statistics-and-reports/US-Fire-Problem/Fire-causes/oscooking.pdf The data says electric ranges are:
- 3.4 times more likely to cause a fire-related death.
- 2.6 times more likely to cause a fire.
- 4.8 times more likely to cause a fire-related injury.
UPDATE: Following the unprecedented backlash, the Consumer Product Safety Commission has scrapped its plans to ban gas stoves.
Wood gas vehicles: firewood in the fuel tank
During the Second World War, almost every motorised vehicle in continental Europe was converted to use firewood. Wood gas cars (also known as producer gas cars) are a not-so-elegant but surprisingly efficient and ecological alternative to their petrol (gasoline) cousins, whilst their range is comparable to that of electric cars. Rising fuel prices and global warming have caused renewed interest in this almost-forgotten technology: worldwide, dozens of handymen drive around in their home-made woodmobiles.
Wood gasification is a process whereby organic material is converted into a combustible gas under the influence of heat - the process reaches a temperature of 1,400 °C (2,550 °F). The first use of wood gasification dates back to the 1870s, when wood gas was used as a forerunner of natural gas for street lighting and cooking.
In the 1920s, German engineer Georges Imbert developed a wood gas generator for mobile use. The gases were cleaned and dried and then fed into the vehicle's combustion engine, which barely needs to be adapted. The Imbert generator was mass produced from 1931 on. At the end of the 1930s, about 9,000 wood gas vehicles were in use, almost exclusively in Europe.
Second World War
The technology became commonplace in many European countries during the Second World War, as a consequence of the rationing of fossil fuels. In Germany alone, around 500,000 producer gas vehicles were in operation by the end of the war. //
"Park an Italian sports car next to a wood gas car and the crowd gathers around the woodmobile. Nevertheless, wood gas cars are only for idealists and for times of crisis." //
Another problem of wood gas cars is that they are not particularly user-friendly, although this has improved compared to the technology used in the Second World War. See the second part of this pdf document (page 17 and further) for a description of what it was like to drive a wood gas car back then:
"...experience at the Wurlitzer organ could be a distinct advantage".
In moments of deep grief you’re faced with a decision: either cling to God and let him be your source of comfort, or run from him and wade through the grief on your own.
You can’t make it through the expatriate life without experiencing the touch of grief. Grief is temporarily or permanently losing something that you loved. Living a life of high mobility, constant goodbyes, and exposure to big and little traumas causes griefs to steadily stack up along the way. I’ve written a couple of books on this metaphor, which I call the Grief Tower. //
Our personal storylines tend to subconsciously ripple into an assumption that God responds the same way to our grief that we as humans do.
When people say, “Look at the bright side,” we think the right thing to do is to stay positive. We forget that God invites lament. When people say, “He works all things out for the good,” we forget that when it doesn’t feel good in the moment, God is still there to empathize, comfort, and acknowledge that this feels so hard. When people say, “You’re so strong for how you’re handling this,” we don’t remember that God doesn’t expect us to be strong. We forget that He is strong so we don’t have to be.
A variable-frequency transformer is a doubly fed electric machine resembling a vertical shaft hydroelectric generator with a three-phase wound rotor, connected by slip rings to one external power circuit. The stator is connected to the other. With no applied torque, the shaft rotates due to the difference in frequency between the networks connected to the rotor and stator. A direct-current torque motor is mounted on the same shaft; changing the direction of torque applied to the shaft changes the direction of power flow.
The variable-frequency transformer behaves as a continuously adjustable phase-shifting transformer, allowing control of the power flow between two networks. Unlike power-electronics solutions such as back-to-back HVDC, the variable-frequency transformer does not require harmonic filters or reactive power compensation. The main limitation of the concept is the current-carrying capacity of the slip rings for the rotor winding.
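Since the machine acts as an adjustable phase shifter, the power it steers between the two networks can be illustrated with the classic two-bus power-transfer equation, P = (V1*V2/X)*sin(delta), where delta is the phase angle imposed between the networks. A minimal sketch (the voltages and reactance are made-up illustrative numbers, not figures from the article):

```python
import math

def power_flow_mw(v1_kv: float, v2_kv: float, x_ohms: float, delta_deg: float) -> float:
    """Real power (MW) transferred between two networks coupled through
    reactance x_ohms when their voltage phasors differ by delta_deg degrees.

    Classic two-bus power-transfer equation: P = V1*V2/X * sin(delta).
    """
    delta = math.radians(delta_deg)
    return (v1_kv * v2_kv / x_ohms) * math.sin(delta)

# Illustrative numbers: two 345 kV networks, 100-ohm coupling reactance.
for angle in (0, 10, 20, 30):
    print(f"{angle:>2} deg -> {power_flow_mw(345, 345, 100, angle):7.1f} MW")
```

The sign of delta sets the direction of flow, which is the role the DC torque motor plays on the shaft: changing the applied torque shifts the angle, and with it the direction and magnitude of the transfer.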
I know very little about rocket launches, but one thing I thought I understood was that launches want to take off from as close to the equator as possible, which southwest England is not. Was there something special about the payload or this launch site? Or is this the launch equivalent of fighting with one arm tied behind your back? //
The optimal launch site for a given launch is at the same latitude as that launch's orbital inclination. Depending on available downrange space, a particular site can also launch to orbits with inclinations higher than its latitude, but never lower (without doglegs, which I'm going to ignore for the rest of this comment). So a low-latitude site is better in general, since it makes more orbits possible, but it's not the best site for all orbits.
This launch by Virgin Orbit is going to polar orbit, which is extremely high-latitude. So it's possible from basically any launch site, and in fact is slightly better from high-latitude sites. //
It depends on your target orbit.
If you're shooting for an equatorial orbit, launching from near the equator gives you a boost from the Earth's speed of rotation, and saves you from needing to build a wasteful dog-leg or plane-change manoeuvre into the flight plan. This saves fuel and therefore lets you fly a bigger spacecraft with the same launch vehicle.
If you're shooting for some types of polar orbit, then the Earth's rotation is just an annoying thing you need to cancel out, so launching from a high latitude is more efficient.
The point of Virgin Orbit's approach is that you can launch the airplane from any convenient place with the right infrastructure and fly to the best latitude for that day's mission before you light up the rocket. You also have good odds of being able to launch without the weather problems that can cause delays when you're constrained to one fixed site.
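The latitude effect described above comes from Earth's surface rotation: roughly 465 m/s eastward at the equator, falling off with the cosine of latitude. A quick sketch of how much free eastward speed each site contributes (site latitudes are approximate):

```python
import math

EQUATORIAL_SPEED_M_S = 465.1  # Earth's surface rotation speed at the equator

def rotation_boost(latitude_deg: float) -> float:
    """Eastward surface speed (m/s) contributed by Earth's rotation
    at the given latitude: v = v_equator * cos(latitude)."""
    return EQUATORIAL_SPEED_M_S * math.cos(math.radians(latitude_deg))

# Approximate latitudes for a few launch sites.
for name, lat in [("Kourou", 5.2), ("Cape Canaveral", 28.5), ("Newquay, Cornwall", 50.4)]:
    print(f"{name:>18}: {rotation_boost(lat):6.1f} m/s eastward")
```

For an equatorial orbit that boost is free delta-v; for a polar orbit it is sideways velocity the rocket must cancel, which is why a high-latitude site like Cornwall is a reasonable place to start a polar launch.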
Everything’s bigger in Texas.
The state’s bean counters have recorded a historic $33 billion surplus in tax revenue this year — about the same amount as the combined annual budgets of Connecticut ($20B), Delaware ($5B), and Vermont ($8B). //
Texas’ unprecedented surplus also comes at a time when other huge states such as New York and California are staring down budget deficits of $3B and $22B, respectively.
There are over 600 billionaires in the United States. Why does the IRS need an entire army of 87,000 agents to deal with just over 600 people? Why do they need 87,000 new agents in enforcement to deal with these 600-plus people? Why are these enforcement agents armed? Do they anticipate a firefight against these billionaires? //
According to Reason, the IRS attacked the poorest of Americans significantly more than they did anyone else:
https://reason.com/2023/01/06/in-2022-the-irs-went-after-the-very-poorest-taxpayers/
On Wednesday, Syracuse University’s Transactional Records Access Clearinghouse (TRAC) released data provided to it by the Internal Revenue Service (IRS) on audits performed by the agency in fiscal year 2022. Despite the infusion of new funding earmarked for the IRS via last year’s Inflation Reduction Act, the agency continued historic trends of hassling primarily low-income taxpayers, with relatively few millionaires and billionaires getting caught up in the audit sweep.
“The taxpayer class with unbelievably high audit rates—five and a half times virtually everyone else—were low-income wage-earners taking the earned income tax credit,” reported TRAC, noting that the poorest taxpayers are “easy marks in an era when IRS increasingly relies upon correspondence audits yet doesn’t have the resources to assist taxpayers or answer their questions.”
Just as Durham never delivered genuine justice for the biggest political scandal in modern American history, we should be skeptical that Musk will deliver digital free speech. Musk’s “free speech” marketing campaign has enticed marginalized and desperate right-wingers to get sucked back into Twitter. But it’s false advertising until Musk stops the arbitrary and personal targeting, shuts down the shadowbanning, and embraces freedom of speech and reach.
Until he does that, Musk is no better than the Twitter leaders he replaced. And for conservatives who were hopeful he would deliver a victory, he’s no better than Durham.
Rep. Andrew Ogles provided a list of McCarthy’s concessions to journalist Roger Simon, which is quoted in full below:
- “As has been reported, it will only take a single congressperson, acting in what is known as a Jeffersonian Motion, to move to remove the speaker if he or she goes back on their word or policy agenda.
- A ‘Church’-style committee will be convened to look into the weaponization of the FBI and other government organizations (presumably the CIA, the subject of the original Church Committee) against the American people.
- Term limits will be put up for a vote.
- Bills presented to Congress will be single subject, not omnibus with all the attendant earmarks, and there will be a 72-hour minimum period to read them.
- The Texas Border Plan will be put before Congress. From The Hill: ‘The four-pronged plan aims to ‘Complete Physical Border Infrastructure,’ ‘Fix Border Enforcement Policies,’ ‘Enforce our Laws in the Interior’ and ‘Target Cartels & Criminal Organizations.’’
- COVID mandates will be ended, as will all funding for them, including so-called emergency funding.
- Budget bills would stop the endless increases in the debt ceiling and hold the Senate accountable for the same.”
On Thursday evening, Kimberley Strassel published a helpful overview of the rules, arguing, “These changes will produce the first functioning House in years, even as they tie the hands of spenders.” Strassel is skeptical of the motion to vacate but correct that concessions secured by the HFC will shift Congress closer to functionality. One way to tell the HFC scored some real wins is to see how bitterly the GOP establishment opposes the deal. //
Those who fret that this veritable laundry list of demands will create chaos are correct. It probably will. McCarthy, thanks to the motion to vacate, will lead with the immediate threat of his ouster constantly looming. Government shutdowns will be on the table. Single-subject bills will have their drawbacks. But a dysfunctional House got us here, and there’s no functional way to leave dysfunction.
Creating your own BitWarden Service
VueScan is packed full of useful features, and the developers have outdone themselves with the sheer range of what all these options let you do.
On the flip side, though, the interface is cluttered with so many options that it can get confusing.
We love VueScan's premade color profiles. You’ll find a range of particular film profiles, such as Portra 400. This feature provides a filter that helps give your shots a realistic rendering of that film type.
Editing options are limited, though, so you will likely end up polishing images later in Photoshop or Lightroom.
Environment variables are named strings available to all applications. They are used to adapt each application's behavior to the environment it runs in: paths for files, language options, and so on. Check each application's manual to see which variables it uses. //
To see your currently defined variables, open up your terminal and type the env command.
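Beyond the shell, most languages expose the same environment programmatically. A minimal Python sketch (the variable name `MY_APP_MODE` is just an illustration, not a real convention):

```python
import os

# Read a variable, with a fallback if it isn't defined.
home = os.environ.get("HOME", "/tmp")

# Set a variable for this process and any child processes it spawns.
os.environ["MY_APP_MODE"] = "debug"

# List all currently defined variables, much like the `env` command does.
for name, value in sorted(os.environ.items()):
    print(f"{name}={value}")
```

Note that changes made this way affect only the running process and its children; they do not persist in the parent shell.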
The Great Wave off Kanagawa, created by Hokusai in 1831, is one of the world's most famous paintings.
But why are there more than 100 different versions of it in galleries all around the world?
Because it isn't actually a painting...
The Great Wave off Kanagawa comes from a series called Thirty-Six Views of Mount Fuji, created in 1831 by the master Katsushika Hokusai.
It is but one of thousands of beautiful different designs produced by the prolific Hokusai.
Here are four more from that 1831 series.
The Great Wave is a woodblock print in the Japanese ukiyo-e style.
The artist would create an ink drawing on paper, to be pasted onto a wooden block as a guide for the engraver. This engraving was then used to print multiple, coloured copies of the original design.
If this committee focuses solely – or even mostly – on malfeasance that negatively affects Republicans and conservatives, the rest of the country will not view the probes as credible. Most people are aware of the reality that the FBI and intelligence agencies have long had a problem with corruption. Throughout history, their misconduct has harmed Americans on both sides of the political divide – and is likely doing so even today.
The issues surrounding the FBI’s raid of Mar-a-Lago and the way it has approached the abortion issue certainly indicate that the Bureau is biased in favor of the left. This is likely the case with intelligence agencies as well. But the investigations must be geared toward rooting out all corruption regardless of political affiliation.
It is also worth noting that even if these investigations reveal smoking guns, it won’t matter if there is no accountability. Indeed, if heads don’t roll, what exactly is the purpose of bothering to investigate in the first place? This is the concern I – and several others – have expressed. It’s not enough to simply expose wrongdoing; the people engaging in these actions must be punished. Unfortunately, we know this is not going to happen. //
Quizzical
There are two major reasons why the January 6 committee lacked any credibility. That Democrats wanted the committee to exist and Republicans didn't is not one of them. //
If the price of a serious investigation into FBI abuses is that the committee must also look at some government agencies abusing power in ways that offends the left, that's fine. The important thing is to uncover the abuses that have happened and figure out how to prevent it from happening again. If Democrats want to argue that it's okay for the FBI to tell social media companies which Americans to ban for saying things that the FBI disapproves of, then make them put that on the record.
The famous Pantheon in Rome boasts the world's largest unreinforced concrete dome—an architectural marvel that has endured for millennia, thanks to the incredible durability of ancient Roman concrete. For decades, scientists have been trying to determine precisely what makes the material so durable. A new analysis of samples taken from the concrete walls of the Privernum archaeological site near Rome has yielded insights into those elusive manufacturing secrets. It seems the Romans employed "hot mixing" with quicklime, among other strategies, that gave the material self-healing functionality, according to a new paper published in the journal Science Advances. //
It was believed that the Romans combined water with lime to make a highly chemically reactive paste (slaking), but this wouldn't explain the lime clasts. Masic thought they might have used the even more reactive quicklime (possibly in combination with slaked lime), and his suspicion was borne out by the lab's analysis with chemical mapping and multi-scale imaging tools. The clasts were different forms of calcium carbonate, and spectroscopic analysis showed those clasts had formed at extremely high temperatures—aka hot mixing.
“The benefits of hot mixing are twofold,” Masic said. “First, when the overall concrete is heated to high temperatures, it allows chemistries that are not possible if you only used slaked lime, producing high-temperature-associated compounds that would not otherwise form. Second, this increased temperature significantly reduces curing and setting times since all the reactions are accelerated, allowing for much faster construction.”
It also seems to impart self-healing capabilities. Per Masic, when cracks begin to form in the concrete, they are more likely to move through the lime clasts. The clasts can then react with water, producing a solution saturated with calcium. That solution can either recrystallize as calcium carbonate to fill the cracks or react with the pozzolanic components to strengthen the composite material.
Masic et al. found evidence of calcite-filled cracks in other samples of Roman concrete, supporting their hypothesis. They also created concrete samples in the lab with a hot mixing process, using ancient and modern recipes, then deliberately cracked the samples and ran water through them. They found that the cracks in the samples made with hot-mixed quicklime healed completely within two weeks, while the cracks never healed in the samples without quicklime. //
mgsouth Seniorius Lurkius
DJ Farkus said:
So many questions... Did they pour the hot-mix, is it required to be poured hot? How high of temperatures are we talking here? I wonder how they heated batches on-site (or did they transport it for pouring)?
Thank you. Now I have a mental image of a wagon pulled by a brace of oxen, with a huge oak barrel slowly rotating in the back. Meanwhile, the drover is flicking a whip about, cursing the throng of people in the street, screaming he has a *!@#! load setting up and get out of the !#@!! way. (In Latin, of course.)
On November 4th, a class action lawsuit — Doe 1 v. GitHub Inc., N.D. Cal., No. 3:22-cv-06823, 11/3/22 — was filed in the US District Court in the Northern District in California, alleging against Microsoft and GitHub (a Microsoft subsidiary), inter alia: violation of the DMCA; breach of contract; tortious interference in a contractual relationship; unjust enrichment; unfair competition; violation of California Consumer Privacy Act; and negligence. Also sued were a confusing mishmash of for profit and non-profit related entities all using a variation of the name OpenAI (OpenAI, Inc., OpenAI, LLC, OpenAI Startup Fund GP I, L.L.C.; you get the picture). OpenAI received one billion dollars in funding from Microsoft although they seem “officially unrelated.” //
Plaintiffs allege that OpenAI and GitHub assembled and distributed a commercial product called Copilot to create generative code using publicly accessible code originally made available under various “open source”-style licenses, many of which include an attribution requirement. As GitHub states, “…[t]rained on billions of lines of code, GitHub Copilot turns natural language prompts into coding suggestions across dozens of languages.” The resulting product allegedly omitted any credit to the original creators. //
As a final note, the complaint alleges a violation under the Digital Millennium Copyright Act for removal of copyright notices, attribution, and license terms, but conspicuously does not allege copyright infringement. A material breach of a copyright license can give rise to an infringement claim, so this is an interesting move. While the plaintiffs’ attorney indicated that an infringement claim might be added later, I suspect that this was done to avoid a messy fair use dispute. The complaint includes a statement by GitHub asserting an expansive, almost global fair use assertion which is at odds with explicit relevant law in many countries and frankly at odds even with US law. Nonetheless, fair use as a defense is expensive and complicated to litigate, so perhaps they chose to focus on something that is beyond factual dispute, and still provides the same damages.
At its peak in the second century, the Roman Empire dominated nearly two million square miles of the world. As with most such grand achievements, it couldn’t have happened without the development of certain technologies. The long reach of the Eternal City was made possible in large part by the humble technology of the road — or at least it looks like a humble technology here in the twenty-first century. Roads existed before the Roman Empire, of course, but the Romans built them to new standards of length, capacity, and durability. How they did so is explained in the short video above. https://www.youtube.com/watch?v=z1aFWtBXHII
On a representative stretch of Roman-road-to be, says the narrator, a “wide area would be deforested.” Then “the topsoil would be removed until a solid base was found.” Atop that base, workers laid down curbs at the width determined by the road plan, then filled the gap between them with a foundation of large stones.
Atop the large stones went a layer of smaller stones mixed with fine aggregates, and finally the gravel, sand, and clay that made up the surface. All of this was accomplished with the old-fashioned power of man and animal, using tipper carts to pour out the materials and other tools to spread and compact them.