Generative AI, large language models, chatbots… As the chatter about AI grows every day, there has been a barrage of new terms to learn. One has turned out to be a very apt description of some AI-generated content: workslop.
Research published by the Harvard Business Review defines “workslop” as “AI generated work content that masquerades as good work but lacks the substance to meaningfully advance a given task.” Workslop can also be riddled with mistakes. We’ve all seen the extremely polished images that are just a little off, or the social posts stuffed with adjectives that say nothing.
There have been a number of embarrassing headline-grabbing examples this year. In May, several large, national U.S. newspapers published a summer book reading list that included books that didn’t exist. More recently, the leading newspaper in Pakistan suffered worldwide ridicule after printing a story that included a note at the end that was clearly from the generative AI they used to write the story.
If you’re creating content for your small business, association, or nonprofit, it sure is nice to use AI to get some help, save time, and possibly save money. But how can you take advantage of AI to create content that isn’t “workslop” — watered down, generic, and lacking substance or creativity? Or worse, false and misleading?
Why Is AI-Generated Content So Lacking?
Generative artificial intelligence platforms, like ChatGPT and Microsoft’s Copilot, respond to your prompts to provide answers and create content such as text, images, and videos. AI search, like Google’s “AI Mode,” provides information based on your search terms. All of them get their information from the same place: the internet. That includes the good, the bad, and the ugly.
An analysis by Semrush showed that Reddit, the web forum and discussion platform, is by far the top source for AI answers, followed by Wikipedia and YouTube. That’s a sobering finding, given that none of the three is known for providing a high level of factual accuracy.
Put In Guardrails
Organizations must set standard rules or policies around AI usage. It’s essential to ensure everyone at your organization is on the same page about when it’s OK to use a third-party AI platform and when it’s not. There are some great templates available to get you started, but you must then customize the policy to your organization to uphold your values and protect your relationships with your customers, clients, and employees. A few important things to consider:
- Platform Selection: Researching and selecting the one platform that best fits your needs will allow for consistency and better oversight. Many organizations have opted to require their employees to use Copilot because they’re already using the Microsoft 365 platform in their business.
- Ethics, Privacy, and Accuracy: You must set guidelines for employees on what they are allowed to enter into the AI platform and how to use the output. Platforms like Claude are large language models (LLMs) and, unless you opt out, may use your conversations and inputs to train their systems. That can include your proprietary or client information. In addition, it’s key to be very clear about how to use the output to prevent the embarrassment of workslop. As Forbes points out, workslop ends up costing MORE in the long run due to the time spent cleaning up the content.
- Governance and Oversight Rules: Any good policy will be clear about who is responsible for overseeing the platform and holding everyone accountable. This will prevent misuse and protect your organization.
Check out this blog from Diligent for much more about AI governance and how to implement it at your organization.
You Will Always Be the Key
I recently attended Jason Barger’s Thermostat Cultures Live event focused on authentic leadership. One speaker, a workplace culture strategist, emphasized the importance of emotional intelligence, or the emotional quotient (EQ), in the workplace. She said that EQ, your ability to manage your own emotions as well as understand others’, is the “one thing AI can’t take away from us.”
Your emotional intelligence feeds your creativity and gives you your gut feelings. It informs how you draw on your experiences in your day-to-day work and how you make decisions. While AI is sourcing the internet for its information, you’re sourcing your own knowledge, experiences, and feelings. AI can’t generate original ideas, but YOU can.
Humanizing Your Content: A Simple Checklist
Whether you’re at a large company with strict policies or at a nonprofit hoping to get some help with a huge task list, you — your EQ and your knowledge — are the key to using AI responsibly and productively, protecting your business, and preventing workslop.
Before you enter a prompt into your chosen platform, ensure you’re following your organization’s policy, and then ask yourself:
- What is your brand voice?
- What tone do you want? Informative? Fun?
- Who is your audience?
- What is the “big idea” and how can AI enhance it?
- What is the goal of your content?
Your prompt should include the answers to all of these questions before you ask the platform for what you need. This will make the output as useful as possible. Once you have the AI-generated content, your work is not over. Take a close look at the output and ask yourself:
- Is this right for my brand?
- Will this resonate with my audience?
- How can I personalize this to my brand?
- What AI red flags should I fix?
It’s a simple checklist, but it can help ensure your content is right for your business and doesn’t get lost in the sea of AI-generated content we see everywhere we turn.