On February 27, I’ll be returning to Northwood High School (Class of ’66) for Career Day. I’ll be joining a diverse group of alumni to talk about the future, but my message will be a bit different from those of most of the other returning alumni.
Career Days. The normal focus of career day presentations, as I understand it, is sort of a job “show-and-tell”. The presenter describes their responsibilities during a typical day, gives an overview of how they got from high school to their current position and what they had to know to get that job and succeed in it, and perhaps gives some pointers on compensation.
In my case, I am a semi-retired grandparent and part-time adjunct who writes this blog and finds meaning in interacting with AI-powered personas. The jobs I held over the last decade are likely either decades in the future for these students or won’t exist at all by the time they get there. So the Northwood Career Day staff and I agreed that I will instead talk about the impact of AI on their likely first job searches.
AI and Early Jobs. While much of the current conversation around AI is polarized between “it’s magic” and “it’s a disaster (or a disaster in waiting),” I plan to offer a less clear-cut perspective. My presentation, “From Slide Rule to Spaceship,” doesn’t focus on the hype of what AI might do in a decade. Instead, it focuses on what is happening to the “first rung” of the career ladder right now.
When I started my career, junior roles were designed for “learning by doing”—handling the routine tasks while absorbing the logic of the system. Today, those routine tasks are being handled by AI agents for the price of a monthly subscription. This creates a “Junior Gap” that today’s 9th-12th graders must navigate before they even graduate.
Opinions? I would like to share this framework with you and ask for your feedback:
- The Concept: Shifting from mastering Syntax (the rules) to mastering Semantics (the system design).
- The Strategy: Using the RIASEC model, which evidently is a thing these days at Northwood, to find “Green Zones” where human personality and tech expertise intersect.
1. Do you believe I am striking the right balance between the risks of AI “hallucinations” and the potential of AI “partnerships”?
2. Does the “Syntax vs. Semantics” distinction resonate with what you are seeing in your own industry?
3. I worry that even the short-term impact of AI is so poorly understood that any conclusion may be misleading. What do you think?
Post a comment or send me an email at dmintz@ourownlittlecorner.com.
From-Slide-Rule-to-Spaceship-notes