In 1961, Joachim Rudolph escaped from one of the world’s most brutal dictatorships. A few months later, he began tunnelling his way back in. Why?
This was October 1961, just two months after the Berlin Wall had gone up. It was built by the East German government to stop the flood of people leaving the communist dictatorship for a better life in the West. But what was so extraordinary about it was the speed with which it had been built.
Thousands of miles away in New York, a hotshot TV producer named Reuven Frank was thinking about how to tell the story of Berlin. He’d been there when the wall went up and wanted to explain what was going on beyond the headlines.
He was one of the most powerful figures at the US news network NBC. One morning he had an idea: What if he could find an escape story that was happening right now?
They could film it in real time, every twist and turn, not knowing how it would end. It could revolutionise TV news.
Frank took his idea to the NBC correspondent in Berlin, Piers Anderton, who loved it and began making enquiries.
It was not long before Anderton's search for tunnellers brought him to the charming engineering student Wolf Schroedter, who was trying to raise money for the diggers.
“We brought him to see the tunnel,” says Schroedter. “He was really impressed. He told us he wanted to film it. And that’s when we told him our conditions - if NBC wanted to film it, they would have to pay us.”
Anderton relayed all this to Frank, who agreed straight away. NBC would pay for tools and materials, “and in return we would have the right to film,” says Frank.
A few months later, NBC broadcast the film, despite an attempt by President Kennedy’s White House to block it, fearing a diplomatic incident with the Soviet Union.
It was described as without parallel in the history of television. The tunnellers heard that President Kennedy himself watched it and that he had been moved to tears.
Helena Merriman tells the extraordinary true story of a man who dug a tunnel right under the feet of Berlin Wall border guards to help friends, family and strangers escape in a BBC Radio 4 podcast.
The New York Times’ 1619 Project is the latest attempt from the left to retell history. But Allen Guelzo thinks the Times made some key errors. “The hope of many members of the Constitutional Convention, that slavery could be abolished, was linked to their conviction that the abolition of slavery was simply one more step that needed to be taken to free us from the inheritance of British colonialism and British imperialism,” Guelzo, a research scholar at Princeton University, says. “The 1619 Project tends to invert that.”
"The system of slavery is nefarious. It has the curse of heaven in any of the states where it is operating," said Gouverneur Morris at the Constitutional Convention.
“This handsome devil is Vasili Arkhipov. On this day 57 years ago, he saved the world. At the height of the Cuban missile crisis, Soviet sub B-59 was being pursued by the USS Randolph. In the panic, it dived too deep to communicate with Moscow (1/n) THREAD”
The Americans in pursuit dropped depth charges. Conditions on board the B-59 were oppressive. And in such hellish conditions, the crew determined war had begun - and prepared to fire a nuke. As I write in #TheIrrationalApe (2/n)
Due to unique circumstances, Vasili had the deciding vote. With fellow officers resolutely determined to die fighting, the fate of the world fell on this Russian's broad shoulders. Facing them, he vetoed their request to engage. A passionate row ensued, but he stood firm (3/n).
His reasoning was simple & elegant; as I explain in the book... (4/n)
We are only here today because a resolute Russian deployed admirable critical thinking, like his countryman Stanislav Petrov years later, as I've written before (5/n)
In the 1960s, Bob Taylor worked at the heart of the Pentagon in Washington DC. He was on the third floor, near the US defence secretary and the boss of the Advanced Research Projects Agency (Arpa).
Arpa had been founded early in 1958 but was quickly eclipsed by Nasa, leading Aviation Week magazine to dismiss it as "a dead cat hanging in the fruit closet".
Nevertheless, Arpa muddled on - and in 1966, Taylor and Arpa were about to plant the seed of something big.
Next to his office was the terminal room, a pokey little space where three remote-access terminals with three different keyboards sat side by side.
Each allowed Taylor to issue commands to a far-away mainframe computer.
One was based at Massachusetts Institute of Technology (MIT), more than 700km (450 miles) up the coast.
The other two were on the other side of the country - one at the University of California, the other connected to the Strategic Air Command mainframe in Santa Monica, called the AN/FSQ32XD1A, or Q32 for short.
Each of these massive computers required a different login procedure and programming language.
The next step was obvious, Taylor said. "We ought to find a way to connect all these different machines."
Taylor talked to Arpa's boss, Charles Herzfeld, about his goal.
"We already know how to do it," Taylor said, although it was not clear that anyone really did know how to connect together a nationwide network of mainframe computers.
"Great idea," said Herzfeld. "Get it going. You've got $1m more in your budget right now. Go."
The meeting had taken 20 minutes.
Taylor, Roberts and their fellow networking visionaries had something much more ambitious in mind - a network to which any computer could connect.
As Roberts put it at the time, "almost every conceivable item of computer hardware and software will be in the network".
The solution was proposed by another computing pioneer, physicist Wesley Clark.
Clark had been following the emergence of a new breed of computer.
The minicomputer was modest and inexpensive compared with the room-sized mainframes installed in universities across the United States.
Clark suggested installing a minicomputer at every site on this new network.
The local mainframe - the hulking Q-32, for example - would talk to the minicomputer sitting close beside it.
The minicomputer would then take responsibility for talking to all the other minicomputers on the network - and for the new-and-interesting problem of moving packets of data reliably around the network until they reached their destination.
All the minicomputers would run in the same way - and if you wrote a networking program for one of them, it would work on them all.
Adam Smith, the father of economics, would have been proud of the way Clark was taking advantage of specialisation and the division of labour - perhaps his defining idea.
The existing mainframes would keep on doing what they already did well.
The new minicomputers would be optimised to reliably handle the networking without breaking down.
Each local mainframe had to be programmed merely to talk to the little black box beside it - the local minicomputer.
If you could do that, you could talk to the entire network that stood behind it.
The little black boxes were actually large and battleship grey.
They were called Interface Message Processors (IMPs).
The IMPs were customised versions of Honeywell minicomputers, which were the size of refrigerators and weighed more than 400kg (63 stone) apiece.
They cost $80,000 each, more than $500,000 (£405,000) in today's money.
The network designers wanted message processors that would sit quietly, with minimal supervision, and just keep on working, come heat or cold, vibration or power surge, mildew, mice, or - most dangerous of all - curious graduate students with screwdrivers.
Military-grade Honeywell computers seemed like the ideal starting point, although their armour plating may have been overkill.
On 29 October 1969, two mainframe computers exchanged their first word through their companion IMPs.
It was, somewhat biblically: "Lo".
The operator had been trying to type: "Login" and the network had collapsed after two letters.
A stuttering start - but the Arpanet had been switched on.
Photo illustration by Lisa Larson-Walker. Photo by Bettmann/Getty Images.
Cover Story
Future Tense
The Lines of Code That Changed Everything
Apollo 11, the JPEG, the first pop-up ad, and 33 other bits of software that have transformed our world.
Oct 14, 2019, 8:00 PM
Back in 2009, Facebook launched a world-changing piece of code—the “like” button. “Like” was the brainchild of several programmers and designers, including Leah Pearlman and Justin Rosenstein. They’d hypothesized that Facebook users were often too busy to leave comments on their friends’ posts—but if there were a simple button to push, boom: It would unlock a ton of uplifting affirmations. “Friends could validate each other with that much more frequency and ease,” as Pearlman later said.
It worked—maybe a little too well. By making “like” a frictionless gesture, by 2012 we’d mashed it more than 1 trillion times, and it really did unlock a flood of validation. But it had unsettling side effects, too. We’d post a photo, then sit there refreshing the page anxiously, waiting for the “likes” to increase. We’d wonder why someone else was getting more likes. So we began amping up the voltage in our daily online behavior: trying to be funnier, more caustic, more glamorous, more extreme.
Code shapes our lives. As the venture capitalist Marc Andreessen has written, “software is eating the world,” though at this point it’s probably more accurate to say software is digesting it.
Culturally, code exists in a nether zone. We can feel its gnostic effects on our everyday reality, but we rarely see it, and it’s quite inscrutable to non-initiates. (The folks in Silicon Valley like it that way; it helps them self-mythologize as wizards.) We construct top-10 lists for movies, games, TV—pieces of work that shape our souls. But we don’t sit around compiling lists of the world’s most consequential bits of code, even though they arguably inform the zeitgeist just as much.
So Slate decided to do precisely that. To shed light on the software that has tilted the world on its axis, the editors polled computer scientists, software developers, historians, policymakers, and journalists. They were asked to pick: Which pieces of code had a huge influence? Which ones warped our lives? About 75 responded with all sorts of ideas, and Slate has selected 36. It’s not a comprehensive list—it couldn’t be, given the massive welter of influential code that’s been written. (One fave of mine that didn’t make the cut: “Quicksort”! Or maybe Ada Lovelace’s Bernoulli algorithm.) Like all lists, it’s meant to provoke thought—to help us ponder anew how code undergirds our lives and how decisions made by programmers ripple into the future.
There’s code you’ve probably heard of, like HTML. Other code is powerful (like Monte Carlo simulations, which are used to model probabilities) but totally foreign to civilians. Some contains deadly mistakes, like the flaw in the Boeing 737 Max. And some is flat-out creepy, like the tracking pixel that lets marketers know whether you’ve opened an email.
One clear trend illustrated here: The most consequential code often creates new behaviors by removing friction. When software makes it easier to do something, we do more of it. The 1988 code that first created “Internet Relay Chat” allowed the denizens of the early internet to text-chat with one another in real time. Now real-time text is everywhere, from eye-glazingly infinite workplace Slack confabs to the riot of trolling and countertrolling in a Twitch livestream.
It’s not always clear at first when some code will become epoch-defining. Oftentimes it starts off as a weird experiment, a trial balloon. Back in 1961, Spacewar!, the first virally popular video game, might have seemed a pretty frivolous way to use a cabinet-size computer that cost, at the time, $120,000. (That’s more than $1 million in 2019 dollars.) But it pioneered many of the concepts that helped computers go mainstream: representing data as icons, allowing users to manipulate those icons with handheld controllers.
Code’s effects can surprise everyone, including the coders. —Clive Thompson, author of Coders: The Making of a New Tribe and the Remaking of the World
Binary Punch Cards
Date: 1725
The first code
Binary programming long predates what we think of as computers. Basile Bouchon is believed to be the first person to punch holes into paper and use it to control a machine: In 1725, he invented a loom that wove its patterns based on the instructions provided in the perforated paper it was fed. A punched hole is the “one,” and the absence of a punched hole is the “zero.” As much as things have changed since then, the essential building block of code has not. —Elena Botella, Slate
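The hole-is-one, no-hole-is-zero idea can be sketched in a few lines of Python. This is only an illustration of the principle: the `O` and `.` symbols and the `read_card_row` function are invented for this sketch, not a historical card format.

```python
def read_card_row(row: str) -> list:
    """Read one row of a 'punched card': 'O' marks a punched hole (1),
    '.' marks unpunched paper (0)."""
    return [1 if ch == "O" else 0 for ch in row]

# A loom or machine would act on each 1/0 in turn.
print(read_card_row("O.O.O.."))  # → [1, 0, 1, 0, 1, 0, 0]
```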
The First Modern Code Executed
Date: 1948
Ushered in both the use of computer code and the computer models of nuclear devastation that shaped the Cold War arms race
The Electronic Numerical Integrator and Computer (ENIAC) was the first programmable electronic computer. Completed in 1945, it was configured for each new problem by wiring connections between its many components. When one task, such as an addition, finished, a pulse triggered the next. But a few years later, Klára Dán von Neumann and Los Alamos scientist Nicholas Metropolis wired ENIAC to run the first modern code ever executed on any computer: hundreds of numerical instructions executed from an addressable read-only memory (ENIAC’s function table switches). They simulated the explosion of several atomic bomb designs being evaluated at Los Alamos National Lab in New Mexico, using the Monte Carlo technique, by which a complex system is repeatedly simulated, step by virtual step, to map the probability distribution of possible outcomes. Von Neumann and Metropolis sent more than 20,000 cards back to the nuclear scientists at Los Alamos, tracing the progress of simulated neutrons through detonating warheads. The distant descendants of this code are still in use at Los Alamos today. —Thomas Haigh, co-author of ENIAC in Action: Making and Remaking the Modern Computer
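The Monte Carlo method described above - run many randomized trials and read the answer off the distribution of outcomes - is easiest to see in a textbook example far simpler than neutron transport: estimating π by sampling random points. This sketch is a generic illustration of the technique, not a reconstruction of the Los Alamos code.

```python
import random

def estimate_pi(samples, seed=42):
    """Monte Carlo estimate of pi: throw random points into the unit
    square and count the fraction that land inside the quarter circle.
    That fraction approaches pi/4 as the number of samples grows."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(100_000))  # close to 3.14159
```

The same pattern - simulate, tally, aggregate - scales from this toy up to the warhead simulations the ENIAC ran; only the per-step physics changes.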
Grace Hopper’s Compiler
Date: 1952
Made it possible for computers to process words
IF END OF DATA GO TO OPERATION 14 .
Grace Hopper was programming an early computer when she decided to make the whole thing easier by rooting it in human language. Hopper, who enlisted in the US Naval Reserve during World War II, knew that people like her superiors in the military struggled to understand binary code. If programming languages could be English-based, the work would be less prone to errors and more accessible to those who didn’t have a Ph.D. in mathematics.
Some scoffed at the idea, but by the early 1950s she had devised a compiler—a set of instructions that converts a more intelligible kind of code to the lower-level code directly processed by the machine. With that tool, she and her lab developed FLOW-MATIC, the first programming language to incorporate English words based on that process. —Molly Olmstead, Slate
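What a compiler does with a statement like the FLOW-MATIC line quoted above can be suggested with a toy translator: recognize an English-like pattern and emit a machine-level instruction. Everything here is invented for illustration - the `("JUMP_IF_EOF", n)` output format is not real FLOW-MATIC or UNIVAC code - but the translation step is the essence of Hopper's idea.

```python
import re

def compile_flowmatic_jump(line):
    """Toy 'compiler' pass: translate an English-like conditional jump
    (FLOW-MATIC style) into a hypothetical low-level instruction tuple.
    Returns None if the line doesn't match this one pattern."""
    m = re.match(r"IF END OF DATA GO TO OPERATION (\d+)", line.strip())
    if m:
        return ("JUMP_IF_EOF", int(m.group(1)))
    return None

print(compile_flowmatic_jump("IF END OF DATA GO TO OPERATION 14 ."))
# → ('JUMP_IF_EOF', 14)
```

A real compiler handles a whole grammar rather than one pattern, but the payoff is the same: the programmer writes readable English-like statements and the machine receives the numeric instructions it actually executes.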
Spacewar!
Date: 1961
The first distributed video game
/ this routine handles a non-colliding ship invisibly
/ in hyperspace
hp1, dap hp2
count i ma1, hp2
law hp3 / next step
dac i ml1
law 7
dac i mb1
random
scr 9s
sir 9s
xct hr1
add i mx1
dac i mx1
swap
add i my1
dac i my1
random
scr 9s
sir 9s
xct hr2
dac i mdy
dio i mdx
setup .hpt,3
lac ran
dac i mth
hp4, lac i mth
sma
sub (311040
spa
add (311040
dac i mth
count .hpt,hp4
xct hd2
dac i ma1
hp2, jmp .
Steve Russell via Bitsavers.org
In late 1961 a group of young MIT employees, students, and associates (many of them members of the Tech Model Railroad Club) gained late-night access to a recently donated DEC PDP-1 computer. The leading edge of nonmilitary computing, the PDP-1 sold for $120,000 (that would be a bit more than $1 million today), featured 18-bit word length, and used paper tape for program storage. Over the course of five months, these programmers created a game in which two players control spaceships—the needle and the wedge—that engage in a one-on-one space battle while avoiding the gravity well of a star at center screen.
Spacewar! spread quickly across the early “hacker” community. It was later distributed by DEC with each PDP-1, preloaded in the core memory and ready to demonstrate when installed. The program significantly influenced the small coding community of the 1960s and inspired generations of video game creators. It lives on in emulations and is demonstrated regularly at the Computer History Museum on the last operational PDP-1. Steve Russell, the lead coder, said at a 2018 Smithsonian panel, “It’s more than 50 years old. There are no outstanding user complaints. There are no crash reports. And support is still available.”
Unix co-creator Dennis Ritchie, for instance, used "dmac" (his middle name was MacAlistair); Stephen R. Bourne, creator of the Bourne shell command line interpreter, chose "bourne"; Eric Schmidt, an early developer of Unix software and now the executive chairman of Google parent company Alphabet, relied on "wendy!!!" (the name of his wife); and Stuart Feldman, author of Unix automation tool make and the first Fortran compiler, used "axolotl" (the name of a Mexican salamander).
Weakest of all was the password for Unix contributor Brian W. Kernighan: "/.,/.," representing a three-character string repeated twice using adjacent keys on a QWERTY keyboard. (None of the passwords included the quotation marks.)
I don't remember any of my early passwords, but they probably weren't much better.
Magnus • October 15, 2019 5:31 PM
"I would love to learn what Donald E. Knuth's passwords used to look like."
Knuth just closes his eyes and concentrates and the computer logs him in.
The computer needs a password to log in to Knuth.
In the mid-1800s, hundreds of Red River carts rolled down from the Winnipeg area in the summer and passed through Sherburne County on the way to St. Paul. Later on, the Winnipeg-St. Paul rail connection brought grain from the fields of Manitoba and Saskatchewan south to the mills of Minneapolis.
This historic trail was also the route of the great 1917 500-mile dogsled race sponsored by the St. Paul Winter Carnival and fictionalized in the movie Iron Will. Eleven teams started in Winnipeg on January 24, but only five finished at Como Park on February 3. The arduous race and bitterly cold and snowy weather took its toll.
Most of the participants were Canadian. Albert Campbell, the eventual winner, was a mixed-blood Cree trapper from Manitoba and had won the 150-mile Le Pas dog-team sweepstakes in 1916.
Mr Krenz, a sprightly 82-year-old, is in finer fettle than the country he once ran. The German Democratic Republic - East Germany - no longer exists. Thirty years after the tumultuous events of 1989 and the fall of the Berlin Wall, Mr Krenz has agreed to meet me.
For many years he was seen as the "young prince" - the successor-in-waiting to veteran East German leader Erich Honecker.
But by the time he replaced Honecker in October 1989, the ruling party was losing its grip on power.
A week before the Berlin Wall came down, Mr Krenz flew to Moscow for urgent talks with Soviet leader Mikhail Gorbachev.
"Gorbachev told me the people of the Soviet Union view East Germans as their brothers," he said.
"At the time I thought Gorbachev was sincere. That was my mistake."
Do you feel the Soviet Union betrayed you? I ask.
"Yes."
On 9 November 1989 the Berlin Wall fell. Crowds of ecstatic East Germans poured across the open border.
"It was the worst night of my life," Mr Krenz recalls. "I wouldn't want to experience that again. When politicians in the West say it was a celebration of the people, I understand that. But I shouldered all responsibility. At such an emotionally charged moment, if anyone had been killed that night, we could have been sucked into a military conflict between major powers."
Egon Krenz still takes an interest in politics. And still supports Moscow.
"After weak presidents like Gorbachev and Yeltsin, it is a great fortune for Russia that it has [President Vladimir] Putin."
He insists the Cold War never ended, but instead is "being fought now with different methods".
When we get out of the car in the centre of Berlin, a history teacher and his group of 10th graders come up to us. It's their lucky day.
"We're on a school trip from Hamburg to study the history of the GDR," the teacher tells Mr Krenz. "It's amazing to have you as a living witness. What was it like for you when the Wall fell?"
"It was no carnival," declares Mr Krenz. "It was a very dramatic night."
History of US Federal Licensing of Radio Operators
The Krio people of Sierra Leone are partly descended from former enslaved Africans who fought for the British in the American War of Independence, in exchange for promises of freedom.
After the American victory in 1783, they fled with the British to Nova Scotia, in what is now Canada, from where they were sent on to Africa and the British colony of Sierra Leone. The colony had been founded for freed slaves, even before the slave trade was abolished in 1807.
Others who make up Sierra Leone's Krio population include descendants of black Londoners and Maroons - escaped slaves who fought against the British in Jamaica - and those who were freed from slave-carrying ships along the Atlantic route, who were all sent to Sierra Leone's capital, Freetown.
An 18th Century Ethiopian crown will finally be returned home after being hidden in a Dutch flat for 21 years.
Ethiopian Sirak Asfaw, who fled to the Netherlands in the late 1970s, discovered the crown in the suitcase of a visitor and realised it was stolen.
The management consultant protected it until he felt it was safe to send it back.
"Finally it is the right time to bring back the crown to its owners - and the owners of the crown are all Ethiopians," he told the BBC.
The crown is thought to be one of just 20 in existence. It has depictions of Jesus Christ, God and the Holy Spirit, as well as Jesus' disciples, and was likely gifted to a church by the powerful warlord Welde Sellase hundreds of years ago.
It is currently being stored at a high security facility until it can be safely returned.
A pair of warships lost during a historic 1942 naval battle have completely disappeared from their resting places at the bottom of the Java Sea. Large portions of a third ship are also missing. An international investigation has been launched in hopes of solving this bizarre maritime mystery.
The Netherlands defense ministry has confirmed that two of its ships lost during the Battle of the Java Sea—the HNLMS de Ruyter and HNLMS Java—have vanished, while a third ship, the HNLMS Kortenaer, appears to be missing some of its parts. The wrecks were rediscovered back in 2002, but a new expedition to mark the 75th anniversary of the historic battle came up short. Sonar images showed imprints of where the wrecks used to be on the ocean floor—but no ships.
“An investigation has been launched to see what has happened to the wrecks, while the cabinet has been informed,” the defense ministry noted in a statement, adding that “the desecration of a war grave is a serious offense” - a hint that the wrecks had been illegally salvaged.
This practice is in contravention of laws set up to protect these historically sensitive sites. Around 2,200 people died when these ships went down, and the wrecks have been declared sacred war graves. “The people who died there should be left in peace,” said Theo Vleugels, director of the Dutch War Graves Foundation, in The Guardian.
Late last year, the Netherlands defense ministry confirmed that two of its ships lost during World War II had disappeared from the bottom of the Java Sea, likely the result of illegal salvaging. Now, a trio of Japanese shipwrecks off Borneo have likewise been torn apart for scrap, highlighting what appears to be a growing problem.
As reported in The Guardian, the three shipwrecks—the Kokusei Maru, Higane Maru, and Hiyori Maru—have been stripped to practically nothing. Collectively known as the Usukan Bay Wrecks (also known as the “Rice Bowl Wrecks” on account of their cargo), all three are within a kilometer of each other, and are prized by recreational divers for their near-pristine condition and rich aquatic life. The three cargo ships were torpedoed off the coast of Borneo in 1944 by US forces, and may still hold the remains of dozens of crewmen.
The incident bears a striking resemblance to the disappearance of two Dutch wrecks lost during the Battle of the Java Sea. In both cases, blame is being pointed directly at illegal salvaging operations. But in the case of the missing Japanese wrecks, there appears to be some complicity from a local university.
As for the claim that the operation was an effort to clean up toxic materials, that’s dubious at best. According to international law, naval shipwrecks remain the property of their nations (in this case, Japan). The looters—even if sanctioned by the university—had no legal business dismantling the ships and extracting the metal without authorization from Tokyo.
Our custom Bible-based curriculum coordinates perfectly with our Torchlighter videos. Student guides contain puzzles, crafts, discussion questions and more for kids ages 8-12. Leader guides feature four Scripture-focused lesson plans, as well as additional teacher resources. Download them today!
Looking for other Torchlighter resources? Check out 10 ways you can use Torchlighters in your church.
- Before independence:
"He was a very nice guy. At that stage, he was not too sure of himself. There were very strong people in Zanu who were not afraid to oppose him. He would never take a decision on his own" - Dumiso Dabengwa
- 1980-90:
"He did everything he could to improve the lives of his people. He wanted education for all. He wanted health for all. He introduced a leadership code limiting Zanu-PF cadres to 50 acres of land" - Wilf Mbanga
- 1990-2000:
"I worked very harmoniously with him and discussed issues. He would let me have my way or we would reach a compromise" - Dumiso Dabengwa
- 2000 - 2017:
"After 2000, he started flexing his muscles. He brought in people who he could influence. Several people were compromised - he held something over them" - Dumiso Dabengwa.
"He has become fabulously wealthy. He is not the person I knew. He changed the moment Sally died [in 1992], when he married a young gold-digger [Grace Mugabe]" - Wilf Mbanga
He allowed Ian Smith - the Rhodesian prime minister who had once declared that black people would not rule the country for 1,000 years, and who reportedly refused personally to let Mr Mugabe leave prison for the funeral of his then only son - to remain both an MP and on his farm.
Mugabe timeline
21 February 1924: Born
1964: Jailed after being convicted of sedition
1973: Becomes Zanu leader
1980: Becomes prime minister of Zimbabwe
1987: Becomes president under new constitution agreed under deal to end Matabeleland massacres
1992: Wife Sally dies
1996: Marries Grace Marufu
2000: Loses referendum, land invasions begin
2002: Wins presidential election amid widespread violence and fraud allegations
2005: Launches Operation Murambatsvina (Drive Out Rubbish), which forces 700,000 urban residents from their homes - seen as punishment for opposition supporters
2008: Comes second in election, violence leads his opponent Morgan Tsvangirai to withdraw from run-off
2009: Forms coalition government
2013: Resoundingly re-elected, Tsvangirai returns to opposition
2017: Forced to resign after army seizes power
6 September 2019: Dies in Singapore, which he visits for hospital treatment
Chris DeRose's 'Star Spangled Scandal' vividly recounts how the murder of Francis Scott Key's son left a lasting legal legacy.
Chris DeRose's new book, 'Star Spangled Scandal: Sex, Murder, and the Trial that Changed America,' vividly recounts the murder of Francis Scott Key's son, the subject of one of the 19th century's most sensational trials, and its lasting legal legacy.

This was hardly the first major violent confrontation between two major public figures. Most notably, just three years earlier, pro-slavery Rep. Preston Brooks beat abolitionist Sen. Charles Sumner over the head with his cane on the Senate floor in retribution for the latter’s “Crimes Against Kansas” speech, in which Sumner supposedly libeled Brooks’ uncle. Brooks probably would have killed Sumner had his cane not broken. Sectional disputes over slavery routinely ended in violence, and it became commonplace for members of Congress to carry firearms and knives into the Capitol.

But even in this age of political firestorm, what transpired between Daniel Sickles and Philip Barton Key was distinctly personal. As DeRose chronicles in his new and exciting book, the conflict between the two men was a tale as old as time. Weaving together the threads of their stimulating (and tragically intersecting) lives, DeRose inventively treats this narrative of adultery and murder as a kind of real-life play.
With hindsight, it is easy to assume that by 1944, the Third Reich was doomed. It could have all gone very wrong.
So Japan could never have crushed U.S. maritime forces in the Pacific and imposed terms on Washington. That doesn't mean it couldn't have won World War II. Sounds counterintuitive, doesn't it? But the weak sometimes win. As strategic sage Carl von Clausewitz recounts, history furnishes numerous instances when the weak got their way.
There are three basic ways to win wars, according to the great Carl. One, you can trounce the enemy's armed forces and dictate whatever terms you please. Short of that, two, you can exact a heavier price from the enemy than he's willing to pay to achieve his goals - dragging out the affair so that he pays heavy costs over time is one way to do that. And three, you can dishearten him, persuading him he's unlikely to fulfill his war aims.
“The first time Albert Einstein wrote down E=mc²”
The 1619 Project isn’t mostly about helping Americans understand the role of slavery in our history. It’s mostly about convincing Americans that ‘America’ and ‘slavery’ are synonyms.
The project’s central purpose is not simply to educate Americans about the history of labor accounting from plantation to data visualization, or an account of the history of brutal sugar cultivation, but to give a specific narrative about what America is.
The project’s summary makes the aim quite clear: “[The 1619 Project] aims to reframe the country’s history, understanding 1619 as our true founding, and placing the consequences of slavery and the contributions of black Americans at the very center of the story we tell ourselves about who we are.”
Considered this way, the 1619 Project looks very different. It isn’t mostly about helping Americans understand the role played by plantation agriculture in American history. It’s mostly about convincing Americans that “America” and “slavery” are essentially synonyms.
It’s mostly about trying to tell readers they should feel sort of, kind of, at least a little bit bad about being American, because, didn’t you hear? As several articles say explicitly, America, in its basic DNA, is not a liberal democracy, constitutional republic, or federation. It’s a slave society.
No matter that historians mostly consider the 1619 date a red herring. Enslaved people were working in English Bermuda in 1616. Spanish colonies and forts in today’s Florida, Georgia, and South Carolina had enslaved Africans throughout the mid-to-late 1500s: in fact, a slave rebellion in 1526 helped end the Spanish attempt at settling South Carolina.
The 1619 Project’s narratives seem to miss a significant part of the legacy of slavery.
Furthermore, a serious accounting for slavery has to wrestle with the experience of Native Americans and Hawaiian islanders, and especially the status of their ancestral lands and sovereign rights. More broadly, to wrestle adequately with the painful historical reality of American “labor freedom,” we have to be able to talk about less-than-free Asian migrant workers in California and Hawaii, as well as the indentured servitude of the Scots-Irish and subsequent Appalachian poverty.
Finally, it’s worth exploring the specialness of American slavery. The New York Times is an American publication, so it makes sense to explore the American experience. But a wider-angle lens can help us understand that experience.
The enslaved Africans of 1619 on whom The New York Times focuses crossed the Atlantic on the San Juan Bautista. If that name doesn’t sound English, that’s because it isn’t: the ship was Portuguese, en route to Spanish Mexico. Off the Mexican coast it was attacked and captured by English pirates masquerading as Dutch, who sold their enslaved human cargo at Jamestown.
But when we explain the role played by slavery, we have to recognize that slavery is no more “native” to the American experience than, well, anything. We stole the first slaves from the Portuguese. Slavery struggled to take off in much of the South because managing a plantation is technically demanding and complicated, and many Americans were not good at it. It was an influx of experienced human traffickers, slave-torturers, and large-scale agribusiness experts from Haiti and other Caribbean colonies in the 1700s that gave much of the Deep South enough “expertise” in the abuse of humanity to develop a thriving slave economy.

The history of slavery, in other words, is not one of some evil creativity unique to Americans. We emulated models of slavery pioneered elsewhere. We “improved” on them, of course; the American zeal for “efficiency” drove escalating brutality (although Anglo cotton plantations never sank to the nadir of inhumanity reached by the Francophone sugar plantations of Haiti and Louisiana).
This story of slavery as something somehow “foreign” to many Americans will strike many enthusiasts of the 1619 Project as a bit much. If Americans were so unhappy with slavery, why didn’t they abolish it?
My answer is simple: we did. At the risk of indulging in historical trivia, it must be noted that when Georgia was founded in 1732, slavery was banned there, making it the first place in the Western Hemisphere to ban slavery. But alas, the appeal of plantation wealth was too great, and by 1752 King George II (father of the George we rebelled against) had taken Georgia over as a royal colony and instituted slavery.
Thus, in 1775, there was no free soil anywhere in the Western Hemisphere. Slavery was a universal law.
But then something changed. Revolutionary agitation led to war in 1775, and by 1777 Vermont’s de facto secession from New York and New Hampshire had created the first modern polity in the Western Hemisphere to forbid the keeping of slaves. The war with Britain had barely begun, and Vermont was hardly secure. But in their opening salvo to a watching world, Vermonters made clear what they thought America was about: liberty for all mankind. In 1780, still amid the guns of war, Massachusetts’ constitution rendered enslavement legally unenforceable, and the judiciary soon abolished it.
Americans were early adopters of abolition. We were the first to establish formally abolitionist constitutions and states, the second to ban the trade in slaves, and middle-of-the-pack in achieving uniform abolition of slavery.
What Defines Us Isn’t Our Worst Moments
The American story is not one of a country defined by slavery, but of a country defined by the struggle to figure out what it means to live with liberty and self-government.