An upskill battle. The professional reconversion journey to becoming a prompt engineer
I spent half an hour in therapy today crying my heart out because someone at the office is writing better AI prompts than I am. At $175 per therapy session, I’m seriously considering sending the invoice to OpenAI.
We are turning half of the white-collar job market into prompt engineers. Professional reconversion pushes a lot of buttons for a lot of people: status, resistance to change, meaning, job security.
Here are a few absolutely reasonable thoughts to help everyone push through the intimidating wall of adoption.
You are not lagging behind
AI went spectacularly mainstream less than 6 months ago, and we’re already feeling like we’re slow to adapt. No employer is going to expect five years of experience as a prompt engineer in 2023 (okay, some might, but that’s beside the point). Reasonably speaking, there is no senior qualification for this job.
If you’re reading this by 2024, you’re early. You are going to be in the first 25% of adopters. Here’s why:
- AIs are not yet fully fledged business tools. If the advanced features of your AI tools are still in beta, why do we expect humans to be in their final Pokémon form?
- A lot of people are resistant to anything new. If you step outside of your bubble, you will meet a lot of non-users. They get the technology behind it outlandishly wrong, and their knowledge of AI stops at “stranger-danger”. Not everyone is frantically looking for prompt engineering tips and tricks.
- This “stranger-danger” reaction extends to some potential clients and employers. I’ve seen one or two job offers for “honest art, no AI, will verify”. In some circles there is apprehension, and a belief that hand-made work is a mark of superior quality. Both employers and employees will arrive late to this market, and that’s okay. Nature is testing different types of humans, and we’ll see who fares better. Because of the late bloomers, we all get a bit more leeway to adapt.
The learning curve for prompt engineers is not a dot
The most popular AIs have an input field and a GO! button. It looks deceptively simple. People believe the advanced course in AI looks like this: open the AI tool. Congratulations, you’re qualified.
- Artificially intelligent tools rely on natural language to communicate with humans. AI doesn’t change the fact that some people are bad communicators. The people best qualified to be prompt engineers are still those who know the process behind drawing a character, writing for a brand, or validating a study. Prompting is an additional skill, not a separate industry.
- AI development has hit a strange and unexpected wall in the middle of the field, known as the black box problem. We know the input, and we know the output. How the learning happens inside the system is not a complete mystery, but it’s not perfectly clear either. The AI has the predictability of a smart toddler: you have no idea how it got on the roof, and you have no idea how to get it down from there. Learning how to speak to the AI is going to be a process, and it will take months to master. This adds to the steepness of the learning curve.
- Despite understanding natural language and context, AIs are also formulaic. Every day we find new and quirky techniques to get better outputs out of them. You can improve your results by telling the model some very weird things, like reminding it to work only with the English database or to start from scratch with no artifacts (yes, even when the conversation is new); a minimal example follows below.
*An article on that is coming; subscribe to our Facebook, LinkedIn, or Instagram channels to stay in the loop.
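To make that last point concrete, here is a minimal sketch of how such reminders can be front-loaded into a request. It assumes the OpenAI Python SDK (openai >= 1.0) with an API key in the environment; the model name and the mug-description task are placeholders for illustration, not something from this article.

```python
# A minimal sketch: front-loading the "quirky" reminders as a system message.
# Assumes the OpenAI Python SDK (openai >= 1.0) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {
            "role": "system",
            # The "weird" reminders mentioned above: English only, no carried-over artifacts.
            "content": (
                "Work only with English-language material. "
                "Start from scratch and ignore any earlier context or artifacts."
            ),
        },
        {
            "role": "user",
            # Hypothetical task, purely for illustration.
            "content": "Write a 50-word product description for a ceramic coffee mug.",
        },
    ],
)

print(response.choices[0].message.content)
```

Whether any given reminder helps is something you discover by trial and error; the point is that these nudges live in the prompt itself, not in a settings menu.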
Artificial Intelligence is brilliantly smart and weirdly dumb at the same time
Yesterday we tried to convince MidJourney to change the background from black to white on one picture. It pouted, slammed the door, and yelled “You’re not my real mom, you can’t tell me what to do!!”. It’s now blasting Imagine Dragons upstairs at full volume.
AI needs sophisticated human input before and after the fact. We were perfectly clear about what we wanted MidJourney to do; apparently, the magic words for AI are not “pretty please”. AI will get more refined, but the black box problem means we don’t control the timeline. If it learns, it learns, but there is no guarantee. AI is more of an eager intern than a glistening rockstar employee.
- Humans need to control the process minutely. Whatever skill you had as a professional before AI ramped up is still in demand on the market today. Understanding artistic forms, currents, and composition gives you an edge as a prompt engineer.
- In a YouTube video from Wired, a DJ asked the AI to create a playlist for a disco-themed party. Most songs were on point, except for “Gotta Keep On” by Sweet Cream, which was a complete AI fabrication. While the names capture the spirit of the disco era, neither the band nor the song ever existed. AI hallucinates: it serves you false claims with great confidence. High-stakes businesses need a professional to catch hallucinations before they end up in official research studies or university textbooks. The higher the stakes, the greater the need for an extra round of validation.
- Adjustments to the output are hit-or-miss. We spent 45 minutes trying to convince MidJourney to give us a lily-white background on our almost perfect picture. Eventually, we went to our design colleagues. I can’t describe the grin on Luca’s face as we were prompting him on what we wanted. Yes, our colleague Luca takes requests in natural language too, and “pretty please” worked. It took him 15 minutes to deliver the output, and two hours to boast about it.
In closing
Most of our team has been working in creative industries for 10-15 years on average. Every two or three years, a smarter mousetrap comes out. “The new software update is going to cut your work in half,” “You can now singlehandedly do the work of an entire team,” “With version 9.1.1, you can just tap with your nose, fingers are obsolete.” Updates were always welcomed, but in reality, none of them eliminated crunch time, tight deadlines, or overtime. Our output just became better and better, and the bar on quality was raised every time. The deadlines got tighter because, hey, I heard you can now singlehandedly do the work of an entire team.
There is a lot of uneasiness about allowing AI to do certain human tasks. Vanderbilt University apologized after it “...used ChatGPT to craft a consoling email after the mass shooting at Michigan State University,” ABC News reports. With some tasks, the human effort in and of itself carries weight. The “human in the loop” expectation is often in place because of our innate sentimentality.
Yes, humans are still a thing.
Here is another relevant article on how the job market might evolve. Legislators in the EU, the US, and China are planning to restrict AI.