While this heavy science was being created out of sight of everyday Americans, a new technological craze was transforming the way marketers approached the business of creating desire around a product. In the 1950s, television broadened marketers’ reach and forced them to learn how to target audience segments, conduct sophisticated demographic research, and systematically apply psychological principles to marketing and advertising. These skills would be critical 60 years later with the advent of digital marketing.
Television legitimized marketing as a business science, and advertising revenues skyrocketed. In 1950, gross annual ad industry billings sat at around $1.3 billion. Ten years later, the industry was moving $6 billion a year in revenue. Marketing was officially big business.
Fast forward to 2006. Digital marketing was just getting into the swing of things with online content creation, email marketing, ecommerce, and A/B testing. Podcasting was brand-new, people were starting to count clicks, and there was a lot of excitement about this new gizmo called the RSS reader. Marketers thought they were getting the hang of this digital thing.
Meanwhile, unbeknownst to most marketers (or nearly anyone outside of computer technology), Geoffrey Hinton of the University of Toronto published a paper called “Learning Multiple Layers of Representation,” which presented artificial intelligence in terms of neural networks that could do more than just classify sensory data like speech or images. These new networks could be programmed to form associations among pieces of information and use those associations to generate new data. In essence, Hinton said, these neural networks could “learn.” It was the beginning of deep learning — and yet another game changer for marketing.
New AI applications increasingly narrow the gap between the data collection and deployment stages, providing solutions that support the strategic decision-making process of marketing. The beauty of AI is that in many cases it’s self-teaching, or cognitive, meaning the longer it’s in use, the more accurate and beneficial its decisions become. These applications are programmed not only to replicate how the human brain works, but also to continually evolve to better mimic intelligent processes and automate them.
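The “self-teaching” idea can be made concrete with a minimal sketch: a model that updates itself from each new interaction, so its accuracy after a period of use exceeds its accuracy before. Everything here is an illustrative assumption — the scenario (predicting an ad click from two made-up signals), the synthetic data, and the simple online perceptron stand in for whatever proprietary models a real marketing platform would use.

```python
# Minimal sketch of a self-teaching system: an online perceptron whose
# accuracy improves the longer it is "in use." All data and signals below
# are hypothetical, for illustration only.
import random

random.seed(0)

def make_example():
    # Hypothetical scenario: will a customer click an ad, given two
    # made-up signals (e.g. past engagement, time on site)?
    x1, x2 = random.random(), random.random()
    label = 1 if x1 + x2 > 1.0 else 0  # the true rule, unknown to the model
    return (x1, x2), label

weights = [0.0, 0.0]
bias = 0.0
lr = 0.1  # learning rate

def predict(x):
    return 1 if weights[0] * x[0] + weights[1] * x[1] + bias > 0 else 0

def accuracy(n=1000):
    # Estimate accuracy on n fresh examples.
    return sum(predict(x) == y for x, y in (make_example() for _ in range(n))) / n

early = accuracy()  # accuracy before any learning

# "Use" the system: each interaction supplies feedback it learns from.
for _ in range(5000):
    x, y = make_example()
    err = y - predict(x)
    weights[0] += lr * err * x[0]
    weights[1] += lr * err * x[1]
    bias += lr * err

late = accuracy()  # accuracy after a period of use should be higher
print(early, late)
```

The design point the passage is making is visible in the loop: nothing about the decision rule is hard-coded; the rule emerges, and keeps improving, purely from the feedback the system receives while in use.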