What Small Business Owners Aren’t Considering (But Need to!) When it Comes to Generative AI
Small business owners increasingly use advanced generative artificial intelligence tools to enhance digital marketing strategies and maintain a competitive edge. iStock’s VisualGPS research reveals that 42 percent of SMBs and SMEs already leverage AI-generated content to strengthen their online presence, highlighting a growing recognition of AI's potential for e-commerce success.
Many businesses forget to learn about the data used to train these tools, leaving individuals and companies vulnerable to copyright issues. At the same time, the “fine print” of a contract may not include the necessary protections when information is uploaded to an AI tool or platform.
With that in mind, here are a few things small businesses should consider when evaluating generative AI tools so they can maximize efficiencies and resources while ensuring the tools are safe to use.
Understand the Extent of the Tool’s Protection
When evaluating an AI tool, ask, “Can I safely use the AI-generated images commercially?”
Not all AI tools offer the same level of protection, and not all were trained the same way. Some train their models on images pulled directly from the internet rather than images they own the rights to. You’ll need to understand how a specific AI tool was trained to ensure its outputs don’t create unintended risks.
Commercially safe outputs include AI-generated content without biases or unauthorized use of a celebrity likeness, famous landmark, or trademarked logo. Confirming the images are commercially safe will help foster trust among key stakeholders and audiences and encourage engagement.
Also, be skeptical of free (or practically free!) tools. A low- or no-cost tool may seem ideal for a quick turnaround, but the potential risks can be large, especially for SMBs. Before using one, confirm the copyright owner and the source of the images to be sure the tool doesn’t violate existing protections, and understand what indemnification or legal protection is offered.
Prioritize Brand Integrity When Using AI Tools
Striking a balance between AI-generated and authentic content helps ensure brand consistency, and aligning content with a brand’s identity cultivates trust and credibility among consumers. For example, creative agency McCann created a game called Mucus Masher for its client Reckitt’s cold medicine, Mucinex. The game uses generative AI to let players destroy Mr. Mucus, simulating how Mucinex attacks cold and flu symptoms and creating a connection between the product and the game.
Brand connection and authenticity are incredibly important when engaging with consumers. iStock’s VisualGPS research found that 46 percent of consumers believe AI can pass as authentic if it accurately represents a real-life scene/object and if it's indistinguishable from human-created art/photographs. AI can be powerful, but if it’s too outlandish and disconnected from the brand, consumers will turn away.
Balance AI With Human Involvement
Before embarking on AI initiatives, discuss the intended use with relevant stakeholders. This openness facilitates better decision making and aligns everyone on the objectives and potential implications. At the same time, balance AI-generated content with existing visuals, such as authentic real-world imagery and video. Generative AI is just one of many tools at a business’s disposal, so think about when AI-generated images are appropriate versus human-created content before committing to a strategy.
AI is a powerful tool capable of maximizing efficiencies but should be used in tandem with other services and offerings depending on what makes the most sense for a specific project or business. Working with in-house legal counsel or a consultant well-versed in AI policies and compliance standards will help proactively address legal considerations and minimize risks. Don’t be afraid of AI — use it smartly and you’ll quickly experience its benefits.
Karissa Liloc (she/her) is a principal product manager at Getty Images and iStock focused on generative AI and other tools that make it easier for customers to find and use imagery. Previously at Disney and Gap, Inc., she has a track record of bringing cross-functional teams together to solve customer problems and business challenges. Her most recent work has been bringing Generative AI by Getty Images to market, offering commercially safe, easy-to-use AI image generation that compensates creators.