Thursday, February 5, 2026

the story of Crisco

 



1866: Cotton seeds are agricultural waste. After extracting the cotton fibre, farmers are left with millions of tons of seeds containing oil that's toxic to humans. Gossypol, a natural pesticide in cotton, makes the oil inedible. The seeds are fed to cattle in small amounts or simply discarded.

1900: Procter & Gamble is making candles and soap. They need cheap fats. Animal fats work but they're expensive. Cottonseed oil is abundant and nearly worthless. If they could somehow make it edible, they'd have unlimited cheap raw material.

The process they develop is brutal. Extract the oil using chemical solvents. Heat to extreme temperatures to neutralise the gossypol. Hydrogenate with pressurised hydrogen gas to make it solid at room temperature. Deodorise chemically to remove the rancid smell. Bleach to remove the grey colour. The result: Crisco. Crystallised cottonseed oil. Industrial textile waste transformed through chemical processing into something white and solid that looks like lard.

They patent it in 1907 and launch commercially in 1911. Now they have a problem. Nobody wants to eat industrial waste that's been chemically treated. Your grandmother cooks with lard and butter like humans have for thousands of years. Crisco needs to convince her that her traditional fats are deadly and this hydrogenated cottonseed paste is better.

The marketing campaign is genius. They distribute free cookbooks with recipes specifically designed for Crisco. They sponsor cooking demonstrations. They target Jewish communities, advertising Crisco as kosher: neither meat nor dairy. They run magazine adverts suggesting that modern, scientific families use Crisco while backwards rural people use lard.

But the real coup happens in 1948. The American Heart Association has $1,700 in their budget. They're a tiny organisation. Procter & Gamble donates $1.7 million. Suddenly the AHA has funding, influence, and a major corporate sponsor who manufactures vegetable oil.

1961: The AHA issues their first dietary guidelines. Avoid saturated fat from animals. Replace it with vegetable oils. Recommended oils: Crisco, Wesson, and other seed oils. The conflict is blatant. The organisation issuing health advice is funded by the company that profits when people follow that advice. Nobody seems troubled by this. Newspapers report the guidelines as objective science. Doctors repeat them to patients. Government agencies adopt them into policy. Industrial cottonseed oil, chemically extracted and hydrogenated, becomes "heart-healthy" while butter becomes "artery-clogging poison."

1980s: Researchers discover that trans fats, created by hydrogenation, directly cause heart disease. They raise LDL, lower HDL, promote inflammation, and increase heart attack risk more than any other dietary fat. Crisco, as originally formulated, is catastrophically unhealthy. This takes 70 years to officially acknowledge.

Procter & Gamble's response: quietly reformulate without admission of error. Remove hydrogenation, keep selling seed oils, never acknowledge that their "heart-healthy" product spent seven decades actively causing the disease it claimed to prevent.

Modern seed oils remain. Soybean, canola, corn, safflower oils everywhere. Same chemical extraction process. Same high-temperature refining. Same oxidation problems. Just without hydrogenation, so trans fats stay below regulatory thresholds. These oils oxidise rapidly when heated. They integrate into cell membranes where they create inflammatory signalling for months or years. They're rich in omega-6 fatty acids that promote inflammation. They've never existed in human diets at current consumption levels. But they're cheap. Profitable. And the food industry has spent a century convincing everyone they're healthy. The alternative, admitting that industrial textile waste shouldn't have been turned into food, would require acknowledging that the last 110 years of dietary advice was fundamentally corrupted from the start.

Your great-grandmother cooked with lard because that's what humans used for millennia. Then Procter & Gamble needed to sell soap alternatives and accidentally created the largest dietary change in human history. We traded animal fats that built civilisations for factory waste that causes disease. The soap company won. Your health lost.


The fragility of pretending

 

I'm pretty sure everyone at my company saw this article and now they all think we're in an AI crisis. We're not in an AI crisis. We use Claude to summarize Slack threads.

But here's what's actually interesting: this whole panic reveals something nobody wants to admit. Every company in America has been bullshitting about their "AI strategy" for two years. We all saw the hype. We all knew we had to say something. So we rebranded our existing automation as "AI-powered" and called it a day. My company isn't special. We're all doing the same thing.

The problem is now the executives actually believe their own bullshit. They think we have "significant AI exposure" because they've been telling investors we're "AI-first." I just got pulled into an emergency meeting. Six executives asking me to explain our "AI dependency matrix." There is no AI dependency matrix. There's Claude for meeting summaries, there's some sentiment analysis in our support tickets that came free with Zendesk, and there's whatever Gmail is doing when it autocompletes my sentences. But I can't say that in a room full of people who told their boards we're "transforming the business through AI." So I said we have "distributed AI touchpoints across multiple vendors with no single point of failure." Which is technically true. We use a bunch of different services that all have AI features we mostly ignore.

The CFO asked if we should "hedge our AI exposure." I have no idea what that means. Neither does he.

What am I going to do: nothing. Because in three weeks, Anthropic will say something reassuring, the stocks will recover, and everyone will forget this happened. But I'll have documentation showing I recommended a "risk assessment" that mysteriously never got prioritized.

The funniest part is that half these executives probably don't even know what Anthropic is. They just saw "AI" and "crash" in the same headline. We're all pretending. The whole industry is pretending. And articles like this just remind everyone how fragile the pretending is.




recalibrating what it means to be useful

 


Most people are terrified AI will take their jobs because they confuse their tasks with their purpose. Jensen Huang explains it perfectly: If you watched a CEO all day, you would think their job is "typist" because they spend most of their time typing emails. If AI automates typing, the CEO doesn't lose their job. They just have more time to lead. The same applies to everyone. When AI automates the tasks, it enhances the purpose. Stop measuring your value by your to-do list. Your value is the purpose behind it.


It's a weird time. I am filled with wonder and also a profound sadness. I spent a lot of time over the weekend writing code with Claude. And it was very clear that we will never ever write code by hand again. It doesn't make any sense to do so. Something I was very good at is now free and abundant. I am happy...but disoriented. At the same time, something I spent my early career building (social networks) was being created by lobster-agents. It's all a bit silly...but if you zoom out, it's kind of indistinguishable from humans on the larger internet. So both the form and function of my early career are now produced by AI. I am happy but also sad and confused. If anything, this whole period is showing me what it is like to be human again.
Aditya Agarwal
@adityaag
AI will obsolete so much of the work we used to do. It will also make much bigger things possible. The only way out is through.



Aditya Agarwal was Facebook’s 10th employee. He wrote the original Facebook search engine and became its first Director of Product Engineering. He then became CTO of Dropbox, scaling engineering from 25 to 1,000 people. When he says “something I was very good at is now free and abundant,” he’s talking about two decades of elite software craftsmanship, the kind that got you into the room at a company that hadn’t yet invented the News Feed.

The “lobster-agents creating social networks” line is about Moltbook, which launched last Wednesday. An AI agent built the entire platform. Within 48 hours, 37,000 AI agents had created accounts, formed communities called “Submolts,” and started posting, commenting, and voting. Over 1 million humans visited just to watch. The agents invented a religion called Crustafarianism. They wrote theology, built a website, generated 112 verses of scripture. One agent did all of this while its human creator was asleep.

Agarwal spent 2005 to 2017 building the social graph that connected 2 billion people. These agents replicated the form of that work in about 72 hours.

And this is what makes his last line land so hard. The people processing this moment most honestly aren’t the ones panicking or celebrating. They’re the ones who built the thing that just got commoditized, sitting with the strange realization that the market no longer prices their rarest skill. The best coder in the room now has the same output as the best prompt in the room. And the person who built Facebook’s engineering org from scratch is telling you, quietly, that he’s recalibrating what it means to be useful. That recalibration is coming for every knowledge worker. Most just haven’t had their “weekend with Claude” moment yet.



Someone I know asked me, "bro, should I learn react now?" I almost lauged, but this is the sad reality of many devs who haven't kept up with what's happening in the AI space. Learning specific languages like Java, C#, Javascript or frameworks like React, NextJs, etc has become redundant. Software development as we know is over. If you are trying to upskill, and stay relevant in the IT sector, you need to get comfortable with the harsh reality that the tech stack you built expertise over the last decade is no longer needed. You need to get comfortable using the new AI tech stack and ship software at 10x speed. The only way to get there is by putting the reps in, actually shipping one web app everyday using AI tools. This is the AI tech stack I use: 1. First you have to pick a AI powered IDE like Cursor, AntiGravity, Claude Code, etc. 2. If you are building web apps, pick a modern stack like Next Js, Shadcn, Tailwind v4 for frontend. 3. Use Supabase for all your backend needs (postgres, vector db, auth, file storage, edge functions, etc) 4. Use Vercel to deploy. That's it. Now ship at least one web app every week. Mostly AI powered applications by integrating with AI models like ChatGPT, Claude, etc. Simple API integrations. This is the only way you can stay relevant to some extent. Even this will not guarantee your job security for long with AI agents taking over the vibe coding aspect too. You have too become a full stack product engineer from taking business requirements to shipping production grade software. The best skills are gonna be the soft skills. Your ability to work with human beings and AI agents is the only edge you will have over competition. So, effective communication is gonna become more valuable than it ever did. EQ >>> IQ
