Springboard for Business recently hosted a pair of webinars aimed at helping executives learn how they can bring their entire team up to speed on AI in the workplace.
Attendees asked some great questions during those events. This blog post covers the questions we didn’t have time to address directly during the webinars.
AI in the workplace: challenges and opportunities
The following questions came from guests who attended our AI webinars.
How do we help employees overcome a fear of AI? Some eagerly embrace it, while others are highly fearful. What’s the balance?
Anytime new technology is introduced into the workforce, you’ll see two camps form: those who will embrace it and those who will fear it.
What makes AI different, however, is how quickly it arrived. ChatGPT was released on November 30, 2022, but within a few months, it (and other Large Language Models) seemed to be everywhere. Combine this dynamic with a seemingly endless parade of clickbaity headlines, and you’ve got a perfect storm stoking the fears of even some tech enthusiasts.
As we highlighted during the webinar, the first order of business for any manager should be to demystify the tech and create opportunities for AI enthusiasts and AI pessimists alike to learn, collaborate, and share insights.
Some simple ways to begin doing this today include:
- Host “prompt-a-thons”
Similar to the hack-a-thons prevalent in tech companies, prompt-a-thon participants share a common task or goal, then compete to see who can develop the best, most innovative, or most productive prompt to achieve it. It’s a great way to give your team hands-on time with the tool, and it helps them think creatively – and work collaboratively – to get the tool to produce the desired result.
- Explore the use of prompt aid tools
A new crop of resources, such as Promptbase, helps newer users get more out of tools like DALL-E, ChatGPT, and Midjourney and achieve better first-pass quality. This takes the edge off of authoring prompts, and getting positive results from the get-go makes for a more positive experience.
- Create open communication channels
Whether you’re using Slack or another tool, encourage team members to exchange prompts, highlight the ways they’ve used AI to improve performance or outcomes, and even share mistakes and missteps so teammates don’t repeat them. Anything that grounds the tech and encourages a spirit of exploration and collaboration helps.
- Celebrate victories
As teams collect wins, celebrate them – even if they initially seem small. The name of the game is to encourage everyone – even those who are reluctant to embrace the tools – to participate in making AI part of the team.
To truly make AI work for the team – and to smooth out the “lumpiness” in adoption – educating the team as a cohort rather than individually is the way to go. (You can read all about how Springboard can help here.)
What are some uses of AI that you see becoming front and center in the workplace that haven’t been fully integrated yet?
Commercially, universally available AI is still in its infancy, so many organizations are still getting their arms around the technology’s full potential. We also see some teams within an organization – notably marketing – leverage AI more heavily than others. This stands to reason because many of the tools fall along creative lines – ChatGPT, Midjourney, and the like.
But when it comes to “AI for the masses,” Intelligent Virtual Assistants (IVAs) are where it’s at.
In its current incarnation, the universe of data from which many AI platforms draw their results can be limited – such as capping at a certain date, as is the case with ChatGPT. As those limitations disappear, however, AI’s utility and function can take us to some pretty unique and compelling places.
Think about how much easier a function like procurement can be when you can ask your favorite AI tool to “get a list of the top three most reliable solid-state hard drives currently available, see which local vendors can deliver two dozen to us tomorrow, send them an email inquiring about bulk discount rates, then create a spreadsheet reflecting what you’ve found.” Admittedly, this is an overly simplistic example, but you get the idea. Tasks that used to take hours can be done in minutes.
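The request above decomposes into a chain of discrete steps an assistant would execute in sequence. The sketch below illustrates that decomposition; every function name and data value here is a hypothetical stub for illustration, not a real vendor or AI API.

```python
# Hypothetical sketch: one natural-language procurement request broken into
# the chained steps an AI assistant would carry out. All functions are stubs.

def rank_products(category: str, top_n: int) -> list[str]:
    """Stand-in for an AI-driven product-reliability search."""
    return ["Drive A", "Drive B", "Drive C"][:top_n]

def find_vendors(products: list[str], quantity: int) -> dict[str, list[str]]:
    """Stand-in for checking which local vendors can fulfill the order."""
    return {p: ["Vendor X", "Vendor Y"] for p in products}

def draft_inquiry(vendor: str, quantity: int) -> str:
    """Stand-in for the AI drafting a bulk-discount email."""
    return f"Hi {vendor}, can you quote a bulk rate on {quantity} units?"

# The assistant chains the steps the way a person would, just far faster.
drives = rank_products("solid-state hard drives", top_n=3)
vendors = find_vendors(drives, quantity=24)
emails = [draft_inquiry(v, 24) for vendor_list in vendors.values() for v in vendor_list]
```

The point is less about any single step and more about the chaining: each output feeds the next step without a human shuttling data between tools.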
Marrying this with Natural Language Processing (NLP) empowers workers to simply speak the above request as quickly as they ask Siri or Alexa to play their favorite song. Relying on AI to automatically respond to emails, book and move meetings, and even make phone calls on our behalf (as we’ve seen Google Duplex demonstrate as far back as 2018) is just about here.
All signs indicate AI-powered IVAs will continue to gain steam. According to market analyst ReportLinker, the IVA market will continue to expand through 2028 at a CAGR of 32.72%, ultimately hitting $45.83 billion.
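As a back-of-the-envelope check on what a 32.72% CAGR implies, compounding works as follows. The base year and starting market size are not stated in the report summary above, so the five-year window used below is an assumption for illustration only.

```python
def cagr_projection(start_value: float, rate: float, years: int) -> float:
    """Project a value forward at a constant compound annual growth rate."""
    return start_value * (1 + rate) ** years

# Working backward from the $45.83B endpoint at 32.72% CAGR: if the window
# is five years (an assumption; the report's base year isn't given here),
# the implied starting market size is roughly $11B.
implied_base = 45.83 / (1 + 0.3272) ** 5
```

In other words, a ~33% CAGR roughly quadruples a market over five years, which is the scale of growth the forecast implies.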
The truth is that we’re just at the beginning of the journey with AI, and now that it is widely and commercially available, innovative end-users will continue to push the boundaries.
How much AI capability will become embedded in the tools we use vs. calling it in as needed?
AI already appears in so many places that it’s hard to imagine this trend won’t continue. Tools like Grammarly, for example, have recently introduced a virtual writing assistant that rides along with its traditional language-checking tools. LinkedIn’s AI can lend a hand in writing job descriptions. A host of AI email assistants is available via the Chrome Web Store. Everyone is getting in on the act.
This isn’t a passing fad, so we will likely continue to see AI integration as the norm. Still, as it increasingly becomes an “easy button” for content creation, it’s important to remember that genuineness counts.
No matter how advanced the AI becomes or how well-trained the model, in 1:1 interactions, creativity and authenticity will trump speed and scale.
How do we protect creator rights?
This is a huge question, and it’s at the heart of the most recent Screen Actors Guild and Writers Guild strikes impacting the entertainment industry. It’s an incredibly nuanced issue, but it seems creators have to fight this battle each time new technology rolls out. In the 90s, we saw a spate of legal activity surrounding sampling songs to create or augment new music. The early aughts caught peer-to-peer file sharing in the crosshairs. And today, we’re trying to get our arms around AI.
What’s particularly tricky about AI is that its use is inherently a creative process. When an AI produces something, who is the true owner of the output? Is it the individual who developed and fine-tuned the prompts? Is it the individuals who trained the models? Is it the creators of the source material used to train the AI? These are all critical questions to address, and, at least for now, the answer to each is likely “it depends.”
As we discussed during the webinar, taking a multi-faceted approach that’s based on understanding the ethical use of AI is a great place to start for organizational leaders. However, for it to really take root, we need policies with teeth.
In late 2022, the White House Office of Science and Technology Policy (OSTP) released the Blueprint for an AI Bill of Rights aimed at safeguarding US citizens’ civil liberties and mitigating opportunities for human bias to enter into digital algorithms.
More specifically for digital media creators, more code-based prohibitions – such as the NoAI and NoImageAI tags proposed by Raptive – can prevent LLMs from using content bearing those tags as a source of model training.
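These directives are typically expressed in a page’s robots meta tag, and a compliant crawler would check for them before ingesting the content. The sketch below shows one way such a check might look, using Python’s standard-library HTML parser; note that the directives are an opt-out proposal, so honoring them is voluntary on the crawler’s side.

```python
from html.parser import HTMLParser

class NoAIDirectiveChecker(HTMLParser):
    """Collect robots meta directives (e.g., noai, noimageai) from an HTML page."""

    def __init__(self) -> None:
        super().__init__()
        self.directives: set[str] = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "").lower()
            self.directives.update(d.strip() for d in content.split(","))

# Example page opting out of AI training via the proposed directives.
page = '<html><head><meta name="robots" content="noai, noimageai"></head></html>'
checker = NoAIDirectiveChecker()
checker.feed(page)
# A compliant training crawler would skip this page if "noai" is present.
opted_out = "noai" in checker.directives
```

Because these tags only work if model builders choose to respect them, they are a complement to, not a substitute for, the policy measures discussed above.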
Stay in the loop with Springboard
As with all things AI, changes arrive on a weekly basis, so watch this space as we continue to monitor trends and tactics in this rapidly evolving field.
If you’re ready to level up your team’s data-driven decision-making and AI skills, Springboard would be happy to help. Just reach out to one of our team members, and we’ll get a call on the books to discuss your unique needs.