For creatives, automation and AI have already made our lives easier in a lot of ways, but as we further incorporate them into our workflows to tackle the menial stuff, we might only be left with the hardest tasks. We look at an example of how introducing AI into Kickstarter’s approval process took away some of the joy of the job, and at some of the darker consequences of leaving humans with only the toughest tasks.
No More “Slam Dunks”
For some time, we’ve tried to take a more optimistic stance on how AI will actually free us up creatively by saving us the effort of doing menial, “soul-sucking” tasks, leaving us to focus on the more enjoyable creative ones. But an article in The Atlantic by former Kickstarter vice president of data Fred Benenson suggests our optimism might be misplaced. As it turns out, the most creative tasks are also the hardest ones, and whether they stay enjoyable depends on how much challenge you can still enjoy when tackling only these becomes your “one job”.
Benenson describes how, five years ago, it was company staff who decided which projects were approved to start soliciting money from the public. But when the number of both backers and creators seeking funding exploded, the company turned to AI to deal with the increased volume, including periodic surges of new submissions.
- Holiday weekends meant the approvals team was backed up with hundreds of proposals and the same number of frustrated creators waiting for a response.
- Benenson oversaw the development of an automated system that considered each project’s stated purposes and its creator’s track record, among other factors.
- High-scoring projects would immediately get approval, meaning the system soon took over 40 to 60 percent of what had been manual approvals for incoming projects (a rough sketch of this triage pattern follows the list).
- Although it sped up the process, it also meant a dramatic drop in the average quality of projects that human reviewers would see. These were the ones that needed more nuanced consideration.
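To make that triage pattern concrete, here is a minimal, hypothetical sketch in Python of how a system like this routes work. The scoring features, weights, and approval threshold are purely illustrative assumptions on our part; Benenson’s article doesn’t publish Kickstarter’s actual model.

```python
from dataclasses import dataclass

@dataclass
class Project:
    title: str
    purpose_clarity: float        # 0..1, how clearly the stated purpose reads (assumed feature)
    creator_track_record: float   # 0..1, creator's history on the platform (assumed feature)

# Hypothetical weights and threshold, chosen only to illustrate the routing logic.
WEIGHTS = {"purpose_clarity": 0.6, "creator_track_record": 0.4}
AUTO_APPROVE_THRESHOLD = 0.8

def triage(project: Project) -> str:
    """Auto-approve confident cases; route everything else to a human reviewer."""
    score = (WEIGHTS["purpose_clarity"] * project.purpose_clarity
             + WEIGHTS["creator_track_record"] * project.creator_track_record)
    return "auto-approved" if score >= AUTO_APPROVE_THRESHOLD else "human review"

# The human queue ends up holding only the ambiguous, lower-scoring submissions.
incoming = [Project("Obvious hit", 0.95, 0.9), Project("Hard call", 0.4, 0.3)]
for p in incoming:
    print(p.title, "->", triage(p))
```

The point of the sketch is structural: nothing that scores as an obvious yes ever reaches the human queue, so reviewers only ever see the ambiguous middle.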
The outcome? No more “slam dunks”: no more instances where a project was so strong, or excited staffers so much, that approving it was a no-brainer. What was left for them were the tough calls and the projects of dubious promise.
The Balance of Mastery, Challenge and Enjoyment
In an email to Benenson, author and clinical psychologist Alice Boyes talked about the need for balance: “Decision making is very cognitively draining, so it’s nice to have some tasks that provide a sense of accomplishment but just require getting it done and repeating what you know, rather than everything needing very taxing novel decision making.”
Humans need a blend of mastery, challenge and enjoyment for a healthy mood. The exact mix differs from person to person, but all of those ingredients need to be there in some measure. What throws off the ratio is the introduction of AI systems that pass only the unclear or low-confidence decisions to humans, which, according to Benenson, is exactly what the best of these systems do.
Small, regular challenges that are well within our abilities give us a sense of mastery and continuity (the feeling of progression) in our work, as well as needed breaks from harder tasks. On the flip side, if every task we complete is finished by the skin of our teeth because of its difficulty, we never develop a sense of regularity or of gaining ground. We’re constantly stumbling over one finish line, unsure of where the next one is or how to reach it.
The Darkest Implications
To see how bad this could get once there are no more “easy calls,” consider the job of a Facebook content moderator. While the platform’s AI is pretty adept at flagging content that obviously violates its standards, the technology isn’t up to par for everything else. What’s left for the human moderators is not just a lot of the NSFW or NSFL content that ends up there, but also plenty of material that isn’t objectionable at first glance and needs a nuanced understanding of different contexts to determine whether it actually violates Facebook’s Community Standards.
In several articles for The Verge, Casey Newton has written extensively about the misery these contracted workers experience when they constantly need to tackle tasks this granular or downright traumatic, and to perform them with a high degree of accuracy and efficiency. The issue is not so much that humans are being made to do these tasks (someone has to, to be sure) as the sheer volume of them combined with a lack of ways to offset the stress.
The Takeaway: AI as Subcontractor
Technology has always been about enabling us to do more, but in the context of creativity, whether we’re artists or any other kind of creative, perhaps “more” should really mean “greater”.
- There is still some enjoyment in small challenges and even some of the menial work, because they reinforce our sense of mastery.
- From plugins to machine transcription, automation and AI still have a valuable role in saving time on repetitive, menial or low-difficulty-but-tedious tasks. But the question is what we do with the time saved (or gained, depending on how you look at it).
- It’s tempting to feel obligated to fill that extra time indiscriminately with more projects or tasks, but what about stepping back, switching to other tasks within your workload, or further refining the work that’s already completed?
To simplify this: where we’re sure we can do so, we subcontract parts of bigger projects to handle the workload. If we treat AI as a subcontractor (albeit a very efficient one), then we retain the power in the relationship as we should.
The problem comes in “no excuses” scenarios where we work as subcontractors alongside AI and the attitude becomes: “it’s already done the ‘easy’ work for you, so why aren’t you handling an increased load of the harder stuff?”