this post was submitted on 12 Apr 2026
18 points (95.0% liked)

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid.

Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] CinnasVerses@awful.systems 14 points 2 weeks ago (2 children)

Scott Alexander published a blog post about how it's unfair to call Viktor Orbán an autocrat, but:

I spent the first half of my writing career calling out biased left-wing experts, the flood swept all those people away, and now we’re ruled by germ-theory-denialists and Waffle-House-teleporters. Not a day goes by that I don’t want the old biased experts back. To paraphrase Cormac McCarthy, you never know what worse institutions your bad institutions have saved you from.

Dsquareddigest responds:

I believe the full quote is "to paraphrase Cormac McCarthy, you never know what worse institutions your bad institutions have saved you from, if you are being dumb on purpose"

It's in the dictionary next to Upton Sinclair's famous line that "it is hard to get a man to understand something when he is a massive dumbass"

[–] Architeuthis@awful.systems 14 points 2 weeks ago* (last edited 2 weeks ago) (6 children)

Unless he specifies that his problem was with ostensibly leftist academics being too dismissive of race science and incelist tropes specifically, this is worthless, just run-of-the-mill face-leopard schadenfreude.

Also, the second half (the what? where's the cut-off point?) of his career has been, if anything, more mask-off. And it's not like he stopped whining about woke after posting a half-hearted disapproval of Trump like three days before the election, after years of writing about how cool it would be if there was less regulation, especially for healthcare.

[–] Soyweiser@awful.systems 8 points 2 weeks ago (2 children)

I spent the first half of my writing career calling out biased left-wing experts,

He admits it!

[–] scruiser@awful.systems 7 points 2 weeks ago

I wouldn't give him credit for a full admission. He isn't acknowledging that "biased left-wing experts" means experts like psychologists with a basic understanding of psychometric validity and geneticists with the basic understanding that popular notions of race don't have a genetic basis and biological determinism is false.

[–] sansruse@awful.systems 12 points 2 weeks ago (16 children)

https://www.cnbc.com/2026/04/15/allbirds-bird-stock-shoes-ai.html

Struggling shoe retailer Allbirds makes bizarre pivot from shoes to AI, stock explodes more than 400%

I had such a hard time coming up with an original joke for this, until I realized the reason why is that Allbirds is stealing jokes from the dotcom bubble in the first place.

The company, valued around $4 billion at its peak, sold its intellectual property and other assets two weeks ago for $39 million. The stock surged over 400%, from under $3 a share up to $13. The shoe company had a market cap of about $21 million Tuesday.

Oh. So, bit of a misleading headline there, CNBC. This wasn't a real publicly traded company; it was a company on life support that got pivoted by a greedy founder looking to cash in. Cynical move or the delusions of a true believer? Does it matter?

Regardless, the stupidity is too much, the resemblance too striking. Good luck to Allbirds in the totally normal footwear-to-high-tech pivot that is happening in this totally normal economy.

[–] Architeuthis@awful.systems 11 points 3 weeks ago* (last edited 3 weeks ago) (6 children)

Microslop exec floats the idea that companies should be required to buy additional software licenses for each AI agent

"All of those embodied agents are seat opportunities," Jha said, envisioning organizations with more agents than humans — each effectively a user that must pay for a software license, or "seat" in industry lingo.

A company with 20 employees might buy 20 Microsoft 365 licenses today. If each employee gets five AI agents, and the workforce shrinks to 10 people, that could still mean 50 paid seats.

Also, it's apparently enough for an LLM endpoint to be paired with an email inbox to be considered an "embodied agent", words mean nothing.
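The quoted seat arithmetic is simple enough to sketch as a back-of-envelope toy (the function name is mine, and the article leaves ambiguous whether the ten humans' own licenses count toward the 50):

```python
def agent_seats(employees: int, agents_per_employee: int) -> int:
    """Paid seats demanded for AI agents alone, one license per agent."""
    return employees * agents_per_employee

# Today: 20 employees, 20 human seats, zero agent seats.
# The pitch: workforce shrinks to 10, five agents each.
print(agent_seats(10, 5))  # 50
```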

[–] Soyweiser@awful.systems 7 points 3 weeks ago (1 children)

Ah right, I need to get a 365 license for word, which comes with a free copilot agent, who needs a 365 license for its copy of word, which comes with a free copilot agent, who needs a ...

[–] istewart@awful.systems 9 points 3 weeks ago

Now that we've got the concept of recursive per-seat licensing established, allow me to invite you to contemplate the possibility of the "licensing macro"
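Taken literally, the recursion above never terminates: every license bundles an agent, and every agent needs its own license. A toy sketch (hypothetical names, assuming exactly one free agent per license) of where the seat count goes after k rounds:

```python
def seats_after(rounds: int, employees: int = 20) -> int:
    # Each license bundles one agent, and each agent needs its own license,
    # so every round adds one new license per license created the round before.
    # With a 1:1 license-to-agent ratio that is one fresh batch per round, forever.
    return employees * (rounds + 1)

print(seats_after(0))  # 20 seats, before the agents notice they need Word too
print(seats_after(3))  # 80 seats, growing without bound
```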

[–] gerikson@awful.systems 7 points 3 weeks ago

JFC at least wait until you have a de-facto monopoly before musing about extracting the rents! This is capitalism 101.

[–] gerikson@awful.systems 10 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Stop-AI terrorists: Eliezer Yudkowsky told us to bomb the datacenters.

Yudkowsky: no no no, I said we needed airstrikes to hit the datacenters

IRGC: I gotchu fam Cheap Drones Complicate the Gulf’s AI Boom

(edit reworded comment around link to attempt to make it funnier)

[–] Soyweiser@awful.systems 8 points 2 weeks ago* (last edited 2 weeks ago) (5 children)

What makes all this extra funny is Yud's life's work. Wants to ensure AI alignment and fix human rationality. Creates terrorists instead.

Reminds me a bit of his AI in the box experiments, which according to the stories always worked on his fans, but as soon as somebody skeptical did it, he stayed in the box.

[–] mirrorwitch@awful.systems 7 points 2 weeks ago

Critical support for comrade Yudkowsky for getting some nerds to finally engage in direct action and blow up some goddamn datacenters

[–] sansruse@awful.systems 10 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

I regret to inform you all that another technonce manifesto has hit our collective psyches. If you woke up with a headache today, this is probably why, courtesy of Alex Karp:

https://archive.is/N20zm

greatest hits:

  1. Public servants need not be our priests. Any business that compensated its employees in the way that the federal government compensates public servants would struggle to survive.

the "government is like a business and should be run like one" meme, for the dumbguys

  1. Our society has grown too eager to hasten, and is often gleeful at, the demise of its enemies. The vanquishing of an opponent is a moment to pause, not rejoice.

naked hypocrisy from the man who wants to erase a nebulously defined "leftism" from public life.

  1. No other country in the history of the world has advanced progressive values more than this one. The United States is far from perfect. But it is easy to forget how much more opportunity exists in this country for those who are not hereditary elites than in any other nation on the planet.

Sure, our society structurally requires an increasingly large fraction of the population to be economically precarious and eternally on the precipice of financial ruin and death, but it could be even worse! you should be grateful.

  1. We should applaud those who attempt to build where the market has failed to act. The culture almost snickers at Musk’s interest in grand narrative, as if billionaires ought to simply stay in their lane of enriching themselves . . . .

BE NICE TO ELON! sure, his ideas are vaporware bullshit that don't make sense, but he produced a lot of shareholder value and is definitely not just enriching himself. Another one for the dumbest people you know to seal clap over.

Every single bullet point here is sneerable, but I'll stop there and let other people have some fun.

[–] CinnasVerses@awful.systems 8 points 2 weeks ago

There is quite a contrast between the call for conscription (6.), the whining that civil servants have too much pay and respect (8.) and the praise for public life (9. We should show far more grace towards those who have subjected themselves to public life. 18. The ruthless exposure of the private lives of public figures drives far too much talent away from government service.) I think he means that earning a living wage for getting up every morning rain or shine and delivering an old man's bank statements is BAD, but if you accept a modest position as Chief Technology Officer or Cabinet Secretary nobody should be allowed to criticize you.

[–] CinnasVerses@awful.systems 8 points 2 weeks ago

The notorious socialists at the (checks notes) World Economic Forum rank social mobility in the USA as 27th in the world behind Sweden, Germany, Canada, and Japan (!) https://en.wikipedia.org/wiki/Socioeconomic_mobility_in_the_United_States So this seems like another demonstration that being very rich is like being kicked in the head by a horse or drinking a bottle of wine a day.

[–] sc_griffith@awful.systems 9 points 2 weeks ago (5 children)

i'm in the middle of freefalling down a research rabbit hole and ran across this person decrying curtis yarvin as a fake monarchist who doesn't understand what makes REAL monarchism good:

https://www.reddit.com/r/behindthebastards/comments/1iy4fto/moldbug_morons_and_monarchism_an_xpost_of_my/

someone in the replies asks the obvious question

Ok but what stops the monarch from being a tyrant

and their answer is that you can just kill the monarch

It's still One Person. A mortal, fleshy person. Their defence is that they're inoffensive, things are stable, nothing is directly their fault and people are bound by law and oath. But if they screw up badly enough that the things they're supposed to do don't happen? There's more of everyone else than One Person.

[–] gerikson@awful.systems 8 points 2 weeks ago (1 children)

Good link, thanks.

The commenter totally missed what a shock the executions of Charles I and Louis XVI were. The natural reaction to "if the king is bad just kill him" is for the king to more or less aggressively remove threats to their persons.

[–] lagrangeinterpolator@awful.systems 9 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

In basically every case in history where people decided to kill a bad king, there was a period of chaos and violence that followed it. The killing of Charles I happened during the English Civil War, and the killing of Louis XVI happened during the French Revolution. This has happened many times in Chinese history, with the fall of an imperial dynasty leading to several decades of civil war (most recently in the early 1900s). But I guess if you have a big clever brain with big clever thoughts, you don't need to look at history.

If the only way to get rid of a bad king is to kill him, he will do anything he can to defend his power, including using as much violence as necessary. (People generally do not like being killed.) Even if you successfully get rid of him, good luck establishing a proper government afterwards with all the violence you've caused. And who knows if the new king is gonna be better or worse? A better system would instead have a mechanism that replaces officials on a regular basis, say every few years, and ensure that these replacements are peaceful. Oh wait, that's liberal democracy. If we do something boring like support democracy, how will people ever think of us as special, clever thinkers with bold, contrarian thoughts?

It’s still One Person. A mortal, fleshy person. Their defence is that they’re inoffensive, things are stable, nothing is directly their fault and people are bound by law and oath.

Bro, your system involves giving all the power to one person. You cannot then say they have no responsibility or that they're "inoffensive" when they abuse it.

[–] BurgersMcSlopshot@awful.systems 7 points 2 weeks ago

"It's only monarchy if it's got a Habsburg jaw, otherwise it's just sparkling tyranny."

[–] antifuchs@awful.systems 9 points 2 weeks ago
[–] scruiser@awful.systems 8 points 2 weeks ago

Habryka defends colonialism, straight out, no qualifiers: https://www.lesswrong.com/posts/w3MJcDueo77D3Ldta/let-goodness-conquer-all-that-it-can-defend

Ok, fine, I'll go even further. I am glad about the colonization of North America. The American experiment was one of the greatest successes in history, and of course, it was a giant fucking mess. But despite it all, despite the Trail of Tears, despite smallpox ravaging the land, despite the conquistadors and the looting and the rapes — it was still worth it. America is worth it. Democracy was worth it.

A surprisingly high number of comments push back, but Habryka's post is still highly upvoted, and the pushback is the typical jargon-filled, assume-charitably rationalist mess.

[–] gerikson@awful.systems 8 points 3 weeks ago (2 children)

LW stalwart discovers kids get sniffles from daycare, obviously this means women have to stay at home to take care of kids and not work:

https://www.lesswrong.com/posts/byiLDrbj8MNzoHZkL/daycare-illnesses

BTW almost every person born after 1970 in Sweden has been to daycare as a kid; if daycare illnesses had long-term consequences, they would be showing up here

[–] CinnasVerses@awful.systems 7 points 3 weeks ago (2 children)

Sweden is an interesting example because they pioneered the let-it-rip approach to COVID. That was less disastrous than it could have been but not great even in a country with a lot of detached housing and nuclear families. https://kevinmd.com/2025/01/swedens-controversial-covid-19-strategy-lessons-from-higher-mortality-rates.html I would not have recommended putting children in daycare without strict indoor-air-quality standards between 2020 and 2024.

[–] samvines@awful.systems 7 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

Soon, at each new model of AI along the current capability curve, you will start to see large discrete jumps in ability in economically important areas, because the previous AI ability level in some aspect of the job just wasn't good enough and bottlenecked progress. When bottlenecks are released, it looks like a leap forward. It is going to look like unexpected gains in AI capacity, and, indeed there is no sign that the current exponential ability curve is slowing down so far, but it is going to be like what happened in coding: as soon as models crossed a certain threshold with Opus 4.5, GPT-5.2, and Gemini 3, suddenly Claude Code & Codex were viable. Before that, it was all about coding assistance; afterwards it was all about agents, despite relatively small gains in model ability.

There is just something so inherently smug and annoying about Mollick. He is one of those low information boosters whose posts sound intellectual until you really think about them.

Tell me more about how the pile of cursed spaghetti that is Claude code is now viable due to model breakthroughs. All I see are hype men saying "the new model is a team of PhDs in your pocket" and then releasing disappointing updates or saying "the new model is too dangerous" because they have some vaporware powered by human crowdsourcing.

Also coding is not like other areas - you can test for hallucinations by compiling and printing and running tests.

I guess my first mistake this morning was opening linkedin

[–] YourNetworkIsHaunted@awful.systems 6 points 3 weeks ago (1 children)

I've never understood how these things are supposed to gain their abilities from statistical analysis of all kinds of random writing online, including social media, fanfic, reddit, etc., and yet simultaneously end up as experts rather than a much faster and more agreeable dumbass. Like, the training data may include all the great works of literature, all the scrapable scientific studies and textbooks they could steal, and so on. But it also included every moron who ever shared conspiracy theories on Twitter, every confident-sounding business idiot on LinkedIn, and every stupid word that Scott or Yud ever wrote. Surely the bullshit has to exceed the expertise by raw volume, and if they took the time and energy to curate it out the way they would need to in order to correct for that, they wouldn't be left with a large enough sample to actually scale off of.

Basically, either I'm dramatically misunderstanding something or the best we can hope for is the Average Joe on Reddit, who may not be a complete dumbass but definitely isn't a team of PhDs.

[–] nfultz@awful.systems 7 points 2 weeks ago (3 children)

https://www.nakedcapitalism.com/2026/04/ai-reputational-crisis-violence-data-center-protests-sam-altman-openai.html

The profound ignorance of tech on the part of most American lawmakers is no joke. In a prior life, I was once responsible for updating a future Vice Chair of the Senate Intelligence Committee on tech issues and it was like showing an alarm clock to a chicken.

haha

That same senator went on to be a huge RussiaGater and played a central role in Twitter and other social media titans upping their censorship game at the behest of US politicians.

oh :(

[–] gerikson@awful.systems 6 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

That shift suggests Virginians now consider data centers almost as undesirable as nuclear power plants,

bah! Virginian voters need to read more LessWrong, where the benefits of both are explained beneath impenetrable layers of posts.

Also this evisceration of Zvi:

As for his argument regarding political violence, I’d point him toward John Locke, Nelson Mandela, Franz Fanon, or Walter Benjamin, but what’s the point, none of them printed their arguments on Magic: The Gathering cards.

[–] o7___o7@awful.systems 7 points 2 weeks ago

Some guys who steal other people's work choose to market their stolen wares using the name of a woman whose work was stolen.

https://openai.com/index/introducing-gpt-rosalind/

[–] scruiser@awful.systems 7 points 2 weeks ago (2 children)

A detailed analysis of why Anthropic's claims about Mythos's cybersecurity implications are bs: https://www.flyingpenguin.com/the-boy-that-cried-mythos-verification-is-collapsing-trust-in-anthropic/

And a followup post about why Anthropic's Glasswing project violates cybersecurity community norms and is an attempt to form a cartel: https://www.flyingpenguin.com/cartel-or-not-anthropic-mythos-is-a-curious-case/

[–] antifuchs@awful.systems 7 points 2 weeks ago (3 children)

I’ve never seen a more compelling reason to enforce strict and short retention rules for every corp communication medium, holy shit https://www.forbes.com/sites/annatong/2026/04/16/ais-new-training-data-your-old-work-slacks-and-emails/

[–] YourNetworkIsHaunted@awful.systems 7 points 2 weeks ago* (last edited 2 weeks ago)

To distract us from the ongoing cycle of violence and discourse about violence that neither cracks down on nor addresses its causes, may I offer the fruit of today's YouTube rabbit hole:

AI isn't the future. It's medieval alchemy.

[–] gerikson@awful.systems 7 points 3 weeks ago* (last edited 3 weeks ago) (5 children)

There's a pretty wide divide in the speculation about the motives of the alleged arsonist who targeted Sam Altman's SF residence last week.

LW has handled the issue obliquely, but the main concern seems to be that they are pretty convinced the dude acted out of fear of AI-induced x-risk. The optics worry is that his actions paint EA in a bad light.

HN (based on this big heap of comments https://news.ycombinator.com/item?id=47724921) is more focused on the idea that Altman and co. are fomenting class hatred and that the attack is more akin to Luigi Mangione's attack on a health insurance CEO. (Searching for "extinction" and "doom" in the threads doesn't turn up much.)

Neither forum links to the dude's alleged slobslack.

My conclusion is that "AGI-driven X-risk" is a position too extreme at the moment for HN.

Also I believe the alleged attacker is not an avatar of a popular movement but a confused individual self-radicalized online.

Edit It's good to know that if you are a radicalized person thinking about committing violence against people or property, LW will be happy to provide you with a safe space to vent, with guarantees on your anonymity. C.f. habryka's comment here https://www.lesswrong.com/posts/igEogGD9TAgAeAM7u/jimrandomh-s-shortform?commentId=zdMRHRqWDcjswhA3i

[–] dgerard@awful.systems 9 points 3 weeks ago (1 children)

writing this up for today, will be mentioning Ziz

[–] scruiser@awful.systems 7 points 3 weeks ago* (last edited 3 weeks ago) (6 children)

Eliezer joins the trend of condemning "political" violence with confidence from the far end of the Dunning-Kruger curve: https://www.lesswrong.com/posts/5CfBDiQNg9upfipWk/only-law-can-prevent-extinction

I've already mocked this attitude down thread and in the previous weekly thread, so I'll try to keep my mockery to a few highlights...

He's admitting nuke the data centers is in fact violence!

It would be beneath my dignity as a childhood reader of Heinlein and Orwell to pretend that this is not an invocation of force.

But then drawing a special case around it.

But it's the sort of force that's meant to be predictable, predicted, avoidable, and avoided. And that is a true large difference between lawful and unlawful force.

I don't think Eliezer has checked the news if he thinks the US government carries out violence in predictable or fair or avoidable ways! Venezuela! (It wasn't fair before Trump, or avoidable if you didn't want to bend over for the interests of US capital, but it is blatantly obvious under Trump.) The entire lead-up to Iran consisted of ripping up Obama's attempts at treaties and trying to obtain regime change through surprise assassination! Also, if the Stop AI doomers used some clever cryptography scheme to make their policy of property destruction (and assassination) sufficiently predictable and avoidable, would that count as "Lawful" in Eliezer's book? ~~If he kept up with the DnD/Pathfinder source material, he would know Achaekek's assassins are actually Lawful Evil~~

The ASI problem is not like this. If you shut down 5% of AI research today, humanity does not experience 5% fewer casualties. We end up 100% dead after slightly more time.

His practical argument against non-state-sanctioned violence is that we need a total ban (and thus the authority of state driving it), because otherwise someone with 8 GPUs in a basement could invent strong AGI and doom us all. This is a dumb argument, because even most AI doomers acknowledge you need a lot of computational power to make the AGI God. And they think slowing down AGI (whether through violence or other means) might buy time for another sort of solution that is more permanent (like the idea of "solve alignment" Eliezer originally promised them). Lots of lesswrong posts regularly speculate on how to slow down the AI race and how to make use of the time they have, this isn't even outside the normal window of lesswrong discourse!

Statistics show that civil movements with nonviolent doctrines are more successful at attaining their stated goals

Sources cited: 0

One of the comments also pisses me off:

Which reminds me about another point: I suspect that "bomb data centers" meme causal story was not somebody lying, but somebody recalling by memory without a thought that such serious allegation maybe is worthy to actually look up it and not rely on unreliable memory.

"Drone strike the data centers even if starts nuclear war" is the exact argument Eliezer made and that we mocked. It is the rationalists that have tried to soften it by eliding over the exact details.

[–] YourNetworkIsHaunted@awful.systems 9 points 2 weeks ago (2 children)

This feels somehow tied to the whole "agentic" thing I've ranted about previously. Like, individual acts of violence are strictly destructive because the people doing it aren't sufficiently "agentic" to change things, even though American history is full of cases where (usually racist) vigilante violence had a huge impact on people's decision-making. But when the government does it, it's different, because people in government got there by proving their agency and ability to actually impact the world. Like, it feels almost like he's offended that the NPCs might try and do something as drastic as killing someone without GM permission.

Meanwhile in reality, people legitimately do feel like they don't have a lot of options to protect themselves from the real harms this industry is doing, to say nothing of the people who buy his line about the oncoming class-K end-of-life scenario. Anger is an appropriate response to the circumstances we find ourselves in, and in a nation that has been quietly cultivating a culture of heroic violence for decades we shouldn't be surprised to see people trying to inflict that fear and rage upon the outside world.

[–] Evinceo@awful.systems 10 points 2 weeks ago

in a nation that has been quietly cultivating a culture of heroic violence for decades we shouldn’t be surprised to see people trying to inflict that fear and rage upon the outside world.

Nay, a culture where every citizen is entitled to one armed crashout, and threats of such have been an important lever used by the party that believes in that entitlement for decades.

[–] scruiser@awful.systems 8 points 2 weeks ago

Eliezer complaining about vigilante actions is really ironic considering one of the main themes of Harry Potter and the Methods of Rationality was "heroic responsibility" and complaining about how ordinary people default to doing nothing. I guess what he actually meant was for right-thinking people (people who agree with him) to take the actions he approves of.

[–] fullsquare@awful.systems 9 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

eliezer misses that (as used in decolonization/civil rights era) nonviolence is effectively a sophisticated propaganda strategy that takes existing injustices and violence and uses it to bait opponent into attacking you, all while your own people take photos and show to entire world carefully crafted messaging that appeals to general public conscience. the messaging part is extremely important in this. there's no fucking way this could work for him because his cause is comprehensible only to those who already buy his cult messaging as ground truth. he's in just for the moral superiority of being nonviolent. he's never gonna get it because comprehending it requires touching grass

[–] gerikson@awful.systems 7 points 2 weeks ago (1 children)

Yeah both non-violence and pure terrorism are communication forms at the root. I remember reading long ago that the Rote Armee Fraktion's master plan was:

  1. commit horrific acts of violence against pillars of the community / rob banks to get money
  2. said acts would unleash a repressive wave of violence from the state
  3. the proletariat would see this repressive wave, wake up, and cause the revolution

It kinda stopped at stage 2, because the BRD's security services were a bit less ex-Nazi than they expected, and also there was basically no proletariat.

Also the Southern police chief who correctly deduced that mass arrests were what the civil rights activists wanted, got the go-ahead from neighboring county jails, and then politely and non-violently arrested everyone protesting and spread them out over a wider area, thus preventing the media-friendly repression that was the goal.

[–] Soyweiser@awful.systems 8 points 2 weeks ago

But it’s the sort of force that’s meant to be predictable, predicted, avoidable, and avoided. And that is a true large difference between lawful and unlawful force.

Remember the cartoon of bombs being dropped on people while the people say 'I hear the next bombs will be sent by a woman'? This, but with 'lawful force'.

[–] blakestacey@awful.systems 6 points 2 weeks ago (2 children)

It would be beneath my dignity as a childhood reader of Heinlein and Orwell

Life is too short to be that pompous
