
Why Aren’t You Using a Custom GPT for Decision-Making?
Fat Tony is a character created by Nassim Nicholas Taleb, the renowned scholar and author known for his work on risk, uncertainty, and probability, particularly in “The Black Swan” and “Antifragile.” Fat Tony represents the archetypal street-smart skeptic, a person who relies on intuition, practical experience, and a sharp sense for spotting nonsense over academic theory or complex models.
Taleb uses Fat Tony to highlight the gap between real-world know-how and the often fragile assumptions of so-called experts and statisticians. Fat Tony embodies a contrarian approach, urging people to question authority, look out for hidden risks, and make decisions with true “skin in the game,” meaning they’re directly impacted by their own choices (Taleb, The Black Swan).
I made a custom GPT to bring Fat Tony to life because sometimes, I need more than just textbook answers—I need that gritty, no-BS guidance only a guy like him could give. Inspired by Nassim Taleb’s character, I wanted direct access to a streetwise strategist, someone who doesn’t sugarcoat or follow the usual rules. So, I created this GPT as my go-to advisor for decision-making, risk, and practical sense.
Now, whenever I’m facing a big choice or just need someone to challenge the so-called “expert” opinion, Fat Tony’s right here, ready to question my assumptions, keep things real, and remind me to have some skin in the game.
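If you’re curious how a persona like this is wired up under the hood, here’s a minimal sketch in Python. To be clear, the persona text and model name below are my own illustrative stand-ins, not the actual instructions behind my custom GPT; the commented-out call uses the standard OpenAI chat-completions SDK.

```python
# Illustrative sketch: one way to recreate a "Fat Tony"-style advisor with
# the OpenAI Python SDK. The persona text is a paraphrase, not the real
# custom GPT instructions.

FAT_TONY_PERSONA = (
    "You are Fat Tony, the street-smart skeptic from Nassim Taleb's books. "
    "Give blunt, practical advice. Question expert opinion, hunt for hidden "
    "risks, and always ask whether the person has skin in the game."
)

def build_request(question: str) -> dict:
    """Assemble a chat-completion payload that carries the persona."""
    return {
        "model": "gpt-4o",  # assumed model name; substitute whatever you use
        "messages": [
            {"role": "system", "content": FAT_TONY_PERSONA},
            {"role": "user", "content": question},
        ],
    }

# To actually call the API (requires an OPENAI_API_KEY in your environment):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(**build_request("Is this deal too good?"))
# print(reply.choices[0].message.content)
```

A custom GPT lets you do the same thing without writing code, but the principle is identical: the persona lives in the system message, and every question you ask gets filtered through it.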
Here’s a conversation I had the other day. Please imagine Fat Tony’s replies in a strong New York accent. My questions and responses are in bold below and FT’s replies are underneath in normal type.
**Hey FT: Explain this to me. It’s a tweet I read today: “What do you do today that you believe will survive AI + automation? And then, do you believe in the lump of labor fallacy?”**
Alright, here’s the deal.
The first part of that tweet is like asking, “What makes you irreplaceable when robots and AI take over all the boring stuff?” It’s about survival. What’s the thing you do that a machine can’t? Think about stuff that involves creativity, adaptability, and human interaction at a deep level—things that require judgment under uncertainty, not just running some algorithm. A machine can follow rules, but it struggles with ambiguity, like dealing with human emotions or navigating unpredictable situations. If you’re in a job that requires understanding complex human behavior or making decisions based on incomplete info (which is basically all decisions), you’re in safer waters.
Now, that second part—the “lump of labor fallacy.” This is the idea that there’s a fixed amount of work to go around, so if machines or immigrants or anyone else takes jobs, there’ll be fewer for everyone. It’s called a fallacy because historically, technology usually creates new jobs while it destroys old ones. It might wipe out entire industries, but it also opens up new fields. The key is, the pie keeps getting bigger. Sure, some jobs disappear, but other ones pop up that didn’t even exist before. Ever hear of a “social media manager” in the 90s? Exactly.
But here’s where you gotta stay sharp: just ‘cause the fallacy says there’s no fixed number of jobs doesn’t mean your job is safe. It’s not a free pass. You gotta adapt, keep learning, and focus on things machines can’t do—at least, not yet. Figure out how you can complement the machines, not compete with ‘em.
Bottom line: automation will change the game, but it won’t eliminate it. You just gotta make sure you’re playing in the right league.

**Now whaddaya think this means for educators and educational providers? Feel free to speculate.**
Alright, so let’s think about educators in this AI-driven world. First off, content delivery is already getting automated. You got chatbots, video lessons, and AI tutors handling the basics, right? The traditional “stand in front of a class and lecture” model is starting to look like a dinosaur. Why? ‘Cause machines can spit out facts, answer basic questions, and even grade essays better than some humans. So, if an educator’s just delivering info, they’re in trouble. That job’s getting outsourced to the algorithms, no doubt.
But here’s where it gets interesting. Education ain’t just about delivering facts—it’s about teaching people how to think, deal with real-world problems, and manage uncertainty. AI can give you all the data in the world, but it can’t teach you how to critically assess it, connect the dots, or deal with unpredictable situations. Educators who focus on critical thinking, ethics, creativity, emotional intelligence—those human skills AI sucks at—that’s where they thrive.
Now, for educational providers, the institutions, it’s a different ballgame. They’ve been milking the “old-school” model—big tuition fees, standardized curriculums, lectures-for-everyone kind of nonsense. AI’s gonna break that open. Personalized learning, tailored curriculums, skills-focused certifications—people are gonna ask, “Why do I need to spend 4 years and $100K when I can learn exactly what I need in a fraction of the time, personalized to me?” Institutions that don’t adapt are in for a rude awakening.
Speculation? Sure. Maybe the role of educators shifts from being the “source” of knowledge to more like coaches or mentors. They help students navigate through all the automated stuff, guiding them on how to learn, not just what to learn. And it’s not just about knowledge anymore—it’s about wisdom and judgment. Knowing what to do with all that information AI throws at you. In the future, the value’s in teaching people how to use tech, not fight it. Educators will be the Sherpas in this AI world, guiding people through the avalanche of information, but they gotta evolve fast.
Bottom line: if educators or educational providers are sticking to the status quo, they’re done. But if they adapt to focus on uniquely human skills and act as guides in an AI-dominated world, they’ve got a future.

**So, given that most educators feel unprepared for AI and what’s ahead, what are some practical things they can do or start doing now? And what about education providers? What should they start doing, especially given thorny issues around things like government funding?**
Alright, let’s get practical, ‘cause educators and providers gotta move before they get steamrolled. First, for individual educators feeling unprepared for AI, here’s what they can do:
1. Master the Tools, Don’t Fear Them
Educators should start learning how AI tools work now, not later. You don’t have to be a programmer, but understanding how AI-powered learning platforms, adaptive learning systems, or even tools like ChatGPT work will keep you ahead. Start integrating these tools into your teaching to complement your lessons. Use AI as your assistant, not your replacement. It can grade homework, handle FAQs, or help with personalized learning plans. This lets educators focus on the stuff AI can’t do—mentoring, developing critical thinking, and engaging students in meaningful discussions.
2. Focus on Skills AI Can’t Replicate (Yet)
Shift the focus of your teaching to things that are hard for AI to do. This means more emphasis on:
- Critical thinking: Show students how to question, challenge, and assess information, not just absorb it.
- Creativity: Teach them how to innovate and come up with original ideas, combining disciplines in ways AI can’t.
- Emotional intelligence: AI doesn’t do empathy well. Educators can guide students in managing emotions, collaboration, and leadership skills.
3. Lifelong Learning Mindset
You gotta stay nimble. Start taking courses or attending workshops that deal with AI and emerging technologies. Get into the habit of lifelong learning yourself, so you can model it for your students. AI is evolving fast, and if you’re not continuously learning, you’ll get left behind. Dive into fields like data literacy, coding basics, or even ethics in technology.
4. Be a Mentor, Not a Lecturer
The future is less about standing in front of a room and more about guiding. Become the person who helps students navigate an ocean of information. You’ll be the one teaching them how to ask the right questions, how to evaluate the tech they use, and how to leverage AI in their own careers.

Now, for the education providers—schools, colleges, and universities—the game’s a bit more complicated, especially with funding issues and government bureaucracy. Here’s what they can do:
1. Rethink the Model
That old “4-year degree with a one-size-fits-all curriculum” is on life support. Providers need to shift towards more modular, skills-based education. Break down degrees into micro-credentials that are industry-specific and more flexible. This gives students faster, cheaper ways to upskill as the job market changes. You can still offer traditional degrees, but you need to cater to the increasing demand for specialized, short-term learning.
2. Leverage AI for Efficiency
Use AI for administrative tasks like enrollment, grading, and even identifying students who need help. If schools can automate the boring stuff, they can allocate more resources to where it really matters—supporting students and faculty in areas that require human touch. This also cuts costs, which is huge when you’re dealing with tight budgets and unpredictable government funding.
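The “identifying students who need help” piece doesn’t have to be fancy. As a toy illustration, assuming nothing more than a table of scores per student (real systems would fold in attendance, engagement, and other signals), the flagging logic might look like this:

```python
# Toy sketch of "identify students who need help": flag anyone whose
# average score falls below a threshold. The names, scores, and threshold
# here are invented for illustration.

def flag_at_risk(scores: dict[str, list[float]], threshold: float = 50.0) -> list[str]:
    """Return students whose average score is below the threshold."""
    return sorted(
        name for name, marks in scores.items()
        if marks and sum(marks) / len(marks) < threshold
    )

grades = {
    "aroha": [72, 68, 75],
    "ben":   [45, 38, 52],   # struggling: average is 45
    "chen":  [90, 88, 94],
}
print(flag_at_risk(grades))  # → ['ben']
```

The point isn’t the ten lines of code; it’s that once the boring triage is automated, a human educator’s time goes to the flagged students instead of the spreadsheet.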
3. Lobby for Policy Changes
Education providers need to push for government funding to be flexible enough to support non-traditional education paths. A lot of government funding is tied to old-school metrics—completion rates, full-time student numbers, etc. Providers should lobby for funding models that incentivize lifelong learning, part-time education, and micro-credentials. Basically, they need to pressure governments to fund education that fits the future, not the past.
4. Public-Private Partnerships
Look, government money’s great, but it’s slow and comes with strings. Providers need to start partnering with tech companies, industry leaders, and even startups. These companies have a vested interest in the future workforce and might be more willing to fund or collaborate on innovative educational programs that focus on tech skills, AI literacy, and adaptability.
5. Focus on Antifragility
Here’s the kicker—schools can’t just focus on surviving AI, they need to thrive in chaos. They need to build systems that get better with disruption, not worse. This means being adaptable to new tech, but also flexible with curriculums and responsive to economic changes. It’s about designing programs that can pivot quickly when a new tech or job market trend pops up, instead of waiting for government committees to approve changes.
6. Invest in Faculty Training
If educators feel unprepared, it’s the institution’s job to fix that. Provide ongoing professional development around AI, emerging technologies, and how to teach with these tools. You’re only as strong as your weakest link, and if your faculty can’t keep up, your institution falls behind.
The Bottom Line: Educators and institutions can’t wait for the storm to pass. They gotta get ahead of it. Individual educators should focus on what makes them human—judgment, critical thinking, emotional intelligence—while embracing AI tools as a supplement. Institutions need to break away from the rigid structures of traditional education, adapt fast, and rethink funding models. The future’s gonna be messy, but that’s where the opportunities are. You just need to be ready to benefit from the chaos.


Kia ora! Hey, I'd love to know what you think.