Trend Detection – Delineating possibilities and utopia

In times when seemingly every second Special Report in “The Economist” focuses on either Artificial Intelligence (AI) or Big Data, general expectations regarding current technological capabilities are higher than ever. Rightly so, as there have been many notable advances in recent years. But what does this mean for trend detection?

The Big Data Hype and its Fallacy

If software lets us talk to our smartphone applications, computers beat humans at various board games and self-driving cars navigate the streets more safely than we do, it is fairly easy to extrapolate these advances to other domains.

Given the digitalization of life and interaction (e.g. through social media), it is a fairly widespread belief that applying sophisticated analytical tools to this abundance of data should enhance our understanding of reality and its direction. Detecting trends both major and minor should thus be possible and enable better decision making. However, despite all our technological progress (and the lofty claims some companies put forward), things are nowhere near such a vision.

In this long read, we will try to delineate possibility from utopia in trend detection and show how companies can still use existing technology to their benefit.

What are trends? And where can we spot them?

As a starting point, it is worth looking back at how individuals and companies detected (and still detect) trends before we attributed omnipotent power to big data; it serves as a good reminder of how difficult the endeavor is. While the existence of more data is a good thing per se, it does not eliminate the complexity of the tasks we want to tackle with it.

In the absence of powerful analytical tools, trend researchers devised valid short-cuts that still allowed them to make sense of all the information that could bear analytical benefit on the path towards trend detection. A few of the most common techniques were (and to some degree still are) the following: (1) media monitoring, (2) expert circles and (3) focus groups.

Techniques of media monitoring primarily relied on summaries by third-party providers and their editorial capabilities. However, the scope of media outlets included in such monitoring is always limited by one’s financial capabilities. Naturally then, in order to limit the scope, outlets are ranked by attributed authority (e.g. through proxies like circulation). Newspapers as such are already filters for “reality”, but some of them are viewed as more reliable or impactful depending on their public image.

Expert circles are, to some degree, an even more simplified method, as they rely on an – arbitrarily – defined group of experts. Whoever is interested in ‘the trend’ defines a panel of people who are viewed as having sufficient expert authority on the issue, sector or industry being monitored. This panel periodically gives its assessment of the relevant trends and developments and ideally converges towards a consensus.

Focus groups usually refer to a broader group of people who share some characteristic or socioeconomic background. They are used to observe representatives of a broader societal spectrum that is of interest to the trend researcher. By collecting data and organizing voiced opinions, the trend researcher plays a very active part in aggregating the information, which can be distilled into a trend valid for the respective focus group.

Even this short glimpse into conventional ways to detect trends points towards various pitfalls:

  • Who is viewed as having authority on an issue? And who makes that determination?
  • Who gets to voice their opinion in newspapers? How do newspapers decide what information to collect and publish?
  • With what certainty can we say that someone is representative of a wider spectrum, especially when we want to observe trends in very specific areas?

These issues all derive from the same situation: not all primary sources of information can be accessed and analyzed, so one depends on secondary sources (experts, newspapers, etc.).

What kind of trend do we want to detect? And what do we want to use it for?

There are many different types of trends that could be of interest to organizations, companies and individuals.

Fashion retailers, for example, are in the business of detecting the respective local clothing trends and tailoring their product portfolio accordingly. What are women aged 25-32 wearing in Northern France? How do male students in urban centers dress in winter? These trends may appear rather quickly rather than building up over a long time, which requires close monitoring.

At the same time, some companies are interested in monitoring and understanding the energy consumption patterns of single households, as well as shifts in their attitudes towards the environment. Such shifts do not appear from one day to the next, but are the result of gradual changes within the overall tectonics. Detecting them is therefore a continuous challenge, but they might affect the very fundamentals of the business.

Independent of the nature of the trend and its general embodiment – be it short-term, long-term, quantifiable or qualitative – there is always a central, very much human element of interpretation involved. When is a trend supposed to be meaningful? When should a trend be acted upon?

Only if it is sufficiently clear what shape a trend will take is it possible to set up an action plan upfront. In a rather open-ended process, newly detected trends might require customized responses, forcing organizations to re-conceptualize actions from the ground up.

Certainly, in highly quantified settings such as the stock market, it is possible to establish rules on when to buy or sell stock depending on price and trading behavior, as the sketch below illustrates. In other environments, though, we can rarely establish this if-this-then-that logic.
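
To make the contrast concrete, here is a minimal sketch of such an if-this-then-that rule: a toy moving-average crossover in Python. The rule, window sizes and price series are all illustrative assumptions, not an actual strategy.

```python
# Toy if-this-then-that trading rule: buy when the short-term moving average
# sits above the long-term one, sell otherwise. Illustrative only.
def moving_average(prices, window):
    """Average of the last `window` prices."""
    return sum(prices[-window:]) / window

def signal(prices, short=5, long=20):
    """Emit a rule-based decision from recent price behavior."""
    if len(prices) < long:
        return "hold"  # not enough history to apply the rule
    return "buy" if moving_average(prices, short) > moving_average(prices, long) else "sell"

prices = [100 + 0.5 * i for i in range(30)]  # a steadily rising toy series
print(signal(prices))  # -> 'buy'
```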

With this in mind, we can turn back to our initial question regarding trend detection and its feasibility given big data analytics.

Rapid Trend Detection in a Controlled Environment

When it comes to making sense of primary information sources, social media (Facebook, Twitter, etc.) is lessening our dependence on intermediary filtering authorities.

With around 600 million tweets per day, making sense of this vast bulk of 140-character messages is by no means an easy endeavor, as there is always a trade-off between the scope of a search and its speed. When companies define a basket of keywords they want to track, this limits the scope of relevant tweets to be analyzed, but it also limits the flexibility regarding which trends can be detected.
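
A minimal sketch of such keyword-basket tracking might look as follows; the tweet format and the keyword basket are hypothetical, and a production system would sit on a real ingestion pipeline rather than a plain list.

```python
# Sketch of keyword-basket filtering on a stream of tweets.
# Tweets are represented as plain dicts here; the fields are assumptions.
import re

KEYWORDS = {"our_brand", "our_product", "competitor_x"}

def matches_basket(text, keywords=KEYWORDS):
    """True if any tracked keyword occurs in the tweet text."""
    tokens = set(re.findall(r"[\w']+", text.lower()))
    return not keywords.isdisjoint(tokens)

def filter_stream(tweets):
    """Keep only tweets touching the keyword basket."""
    return [t for t in tweets if matches_basket(t["text"])]

sample = [
    {"text": "Loving the new release from our_brand!", "followers": 5200},
    {"text": "The weather is great today", "followers": 80},
]
print(filter_stream(sample))  # only the first tweet survives
```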

Indeed, it’s established practice within large corporations to track keywords related to their products and brands in order to mitigate negative press, reviews or anything else hampering their reputation.

Broader trend detection is a more difficult endeavor. Twitter itself has an in-house data science unit that tries to uncover the nature and dynamics of rapidly rising topics and to make this knowledge accessible to corporations, organizations and governments. Indeed, it could contribute to understanding the inherent “virality” of topics and thus enable companies to appeal to a larger audience.

With clustering algorithms, sentiment analysis and natural language processing, it is at least possible to give decision makers a more intuitive feel for the nature of the conversation. And since the importance of a topic is – more or less – a function of the number of tweets and the sum of followers of the involved Twitter accounts, one can establish relevancy thresholds, sketched below. This enables companies to navigate the ocean of relevant content more effectively. However, this only applies to the Twitter world (and there may be a certain bias in who is registered on the platform); once expanded to more social media platforms, the complexity of the task skyrockets.
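
A hypothetical relevancy threshold along these lines could be as simple as the following sketch; the scoring function, weights and numbers are all made up for illustration.

```python
# Toy relevancy scoring: a topic's importance as a function of tweet volume
# and the combined follower count of the accounts involved.
from dataclasses import dataclass

@dataclass
class Topic:
    name: str
    tweet_count: int
    follower_sum: int  # followers of all accounts tweeting about the topic

def relevance(topic, volume_weight=1.0, reach_weight=0.001):
    """Linear blend of volume and reach; the weights are illustrative."""
    return volume_weight * topic.tweet_count + reach_weight * topic.follower_sum

topics = [
    Topic("new_gadget", tweet_count=1200, follower_sum=4_500_000),
    Topic("local_meetup", tweet_count=40, follower_sum=9_000),
]
THRESHOLD = 2000.0
relevant = [t.name for t in topics if relevance(t) >= THRESHOLD]
print(relevant)  # -> ['new_gadget'] (1200 + 4500 = 5700 clears the threshold)
```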

Long-Term Trend Detection

While social media is the best pulse for rapid trend detection, it might not be the suitable choice for enhancing our understanding of long-term trends. To discuss what kind of opportunities are out there, let’s turn back to our initial, conventional techniques and see how sophisticated data science can help fine-tune them for enhanced insight.

(1) Media monitoring

While social media produces large streams of constant information, traditional media online is rather slow-moving. Additionally, there is – depending on the industry or sector of relevance – not an unlimited number of sources. As long as these sources can be specified, algorithms can be trained to resemble human monitoring and filtering techniques.
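
As a minimal sketch of such a trained filter, consider a small text classifier. We assume a handful of hand-labelled example articles and use scikit-learn for illustration; the actual tooling and labels in any real engine would of course differ.

```python
# Training a toy relevance filter that mimics an editor's judgement.
# Articles and labels are made up; 1 = relevant, 0 = irrelevant.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = [
    "Gallery opens a new contemporary art exhibition",
    "Quarterly steel production figures released",
    "Auction house reports record sale of a modern painting",
    "Local football club wins the derby",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus a linear classifier approximate the human filter.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(articles, labels)

print(model.predict(["Museum announces retrospective of a modern artist"]))
```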

For UBS and Razorfish, for example, idalab has conceptualized and implemented a highly calibrated content filtering engine for the modern art sphere. It allows UBS to provide an exclusive news feed to its private clients, incorporated into the application “Planet Art”. In practice, algorithms screen the news sources for relevant articles; the suggestions are reviewed by a panel of art experts, and their input is subsequently used to refine the algorithm. Effectively filtering a diverse set of sources simulates human monitoring while taking advantage of human judgement where appropriate.
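
The review-and-refine loop can be sketched in the same vein. To be clear, this is not the actual Planet Art implementation, just the general pattern, reusing the toy model from the previous sketch and a hypothetical `expert_review` callable.

```python
# One iteration of the screen -> expert review -> refine loop.
# `model` is any classifier with fit/predict (e.g. the pipeline above);
# `expert_review` is a placeholder returning True/False per article.
def refine_filter(model, texts, labels, candidates, expert_review):
    # 1. The algorithm screens the sources and proposes relevant articles.
    proposals = [a for a in candidates if model.predict([a])[0] == 1]
    # 2. Experts accept or reject each proposal.
    verdicts = [(a, expert_review(a)) for a in proposals]
    # 3. Their judgement flows back into the training data ...
    texts = texts + [a for a, _ in verdicts]
    labels = labels + [1 if ok else 0 for _, ok in verdicts]
    # 4. ... and the filter is refitted on the enlarged corpus.
    model.fit(texts, labels)
    return model, texts, labels
```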

These filtering engines, designed with sufficient flexibility, enable companies to stay informed about trends and topics within their industry cost-effectively. They also help strategy and business development teams stay on top of the game.

(2) Expert circles

Just as filtering techniques can be supported (or, to put it drastically, taken over) by intelligent algorithms, expert judgement can also be incorporated into algorithmic solutions.

Companies and organizations form hypotheses about trends, based on media monitoring, which they aim to validate externally. Turning to industry experts is a valid approach in this regard, but incorporating their judgement is a time-consuming – and oftentimes costly – route. Nevertheless, with sufficient structured information about various experts, their backgrounds and their past judgements on trends and trend likelihoods, it can become possible to reliably predict expert judgements on new trends.

Having access to such a simulation engine enables companies to quickly assess how various stakeholders would react to or comment on a trend. Judging the experts’ credibility will still remain the corporation’s task; only if time-series data is available can experts actually be ranked by the accuracy of their past judgements.
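
A toy version of such a simulation engine could look like this; the trend features, the past verdicts and the choice of a per-expert classifier are all illustrative assumptions.

```python
# Predicting how one expert would judge a new trend hypothesis, based on
# their past judgements. All numbers are made up for illustration.
from sklearn.tree import DecisionTreeClassifier

# Trend features: [estimated market size (0-1), regulatory risk (0-1)]
past_trends = [[0.9, 0.2], [0.3, 0.8], [0.7, 0.5], [0.1, 0.9]]
verdicts_expert_a = [1, 0, 1, 0]  # 1 = "this trend is real", 0 = sceptical

# One small model per expert approximates that expert's judgement.
expert_a = DecisionTreeClassifier(max_depth=2).fit(past_trends, verdicts_expert_a)

# Simulate expert A's likely verdict on a new trend hypothesis.
print(expert_a.predict([[0.8, 0.3]]))  # likely [1]
```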

(3) Focus groups

Focus groups have largely been substituted by A/B tests. These do not completely resemble focus groups, as in an online setting not all information about the respective user is known (even though rather precise guesses about age, heritage, income, etc. can be made).

Nevertheless, A/B tests allow for rapid observation of user groups confronted with different “realities”. How do they react to items displayed differently? Which item is viewed the longest? While this approach is applied especially in e-commerce settings, it can contribute insights for most corporations.
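
Evaluating such a test boils down to comparing conversion behavior across the two groups. A minimal sketch, assuming made-up click-through counts and a standard two-proportion z-test:

```python
# Evaluating an A/B test on click-through rates with a two-proportion z-test.
from math import sqrt
from statistics import NormalDist

def ab_z_test(clicks_a, views_a, clicks_b, views_b):
    """Return the z-score and two-sided p-value for CTR(A) vs CTR(B)."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Variant B shows the item more prominently; did it change behaviour?
z, p = ab_z_test(clicks_a=120, views_a=2400, clicks_b=165, views_b=2380)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests a real difference
```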

A/B tests on a company’s website face the general problem that only the existing user base takes part in the experiment. This becomes a problem if the company wants to develop a new product that caters to a different segment of the population than its existing user base.

But even in such a situation, advertising across various platforms (often referred to as performance marketing) and the analytics solutions that come with it offer many possibilities for rapidly checking trend hypotheses. And in this very process, user behavior can point towards new trend hypotheses.

Conclusion

Trend detection remains a hot topic within the strategy departments of companies, political parties and NGOs alike. Aligning with the market, spotting new hypotheses and validating them is an essential task in steering the respective organization towards the future.

Against this backdrop, the abundance of data and sophisticated new big data analytics tools evoke the immediate thought that they should somehow be “the solution” to trend detection.

Discussing the nature of trends, though, revealed that this is not necessarily the case. Nevertheless, data science can contribute significantly to developing the support tools needed to help companies make more informed decisions cost-effectively, validate hypotheses more rapidly, and thereby solidify their medium- and long-term market position.

Contact the author
Niels Reinhard
+49 (30) 814 513-13