By Kleber Gallardo and Paul McLaughlin
Today, more than ever, the world runs on data: not just for governments and large corporations, but for small businesses, service industries, educational institutions, and individuals. Within this universe is a subset of users, decision makers, who need to analyze huge volumes of data, hundreds of millions or even billions of transactions, to identify risk, risk-laden behaviors, or other conditions of interest. There is so much data that we need tools to find and identify these conditions, and predictive models to help determine when, and if, to intervene. With the right tools and strategies, data can be made to work for the benefit of individuals, private organizations, and government agencies.
The human brain cannot absorb all of the data available today and process it in a way that maximizes the quality of decision making. Each of us has a personal health care record that is enormous; over a lifetime it can grow to one hundred gigabytes. How do physicians and other caregivers access this information? How do they analyze it? How do they know when to act on it? They will need analysis tools to help them uncover the patterns within the data.
The Internet of Things[i] (IoT) is the network of physical objects, or "things," embedded with electronics, software, sensors, and network connectivity, which enables these objects to collect and exchange data. In healthcare this means that machine-to-machine data, data from appliances, medical images, and doctors' electronic notes can be collected, assimilated, analyzed, and presented to physicians and practitioners as the summarized information they need to make decisions in real time. The objective of the new breed of analytic tools that use statistics, predictive analytics, and cognitive intelligence is to analyze, make inferences, and garner insights from data that is too vast for an individual to absorb.
The benefit of these tools extends well beyond healthcare. Most importantly, they need to provide insights, identify anomalies, and predict future behaviors. In healthcare, this new breed of analytics tools analyzes the vast amount of data available and directly provides business insight and actionable intelligence. The same technology can be employed to analyze sales results and identify the most successful salespeople so others can emulate them; to identify fraudulent behaviors and disrupt them; or to identify high-risk behaviors and patterns and intercept them where appropriate. We are at a unique point in time: we are creating new, more powerful, more intuitive systems, systems capable of cognitive analytics.
For decades we have been developing rules-based approaches to solving problems, with deterministic models as the tool of choice. A deterministic model relies on the programmer's knowledge of both the expected outcomes and the process: by and large, trails or paths through data that use rules, if-then-else statements, and algebraic formulas to articulate the outcome (Petrocelli). The new strategy employs probabilistic models. A probabilistic model analyzes information and identifies trends, clusters, and behaviors that previously would have remained buried in the data. It does this using statistical analysis that projects the probability of future behavior based on past history. This new approach is called cognitive computing.
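The contrast can be made concrete with a small sketch. Here a hypothetical fraud check is written both ways: a hand-coded threshold rule versus a probability estimated from past transactions. The records, threshold, and similarity band are all invented for illustration:

```python
# Hypothetical claim records: (amount, was_fraud) pairs.
history = [(120, False), (950, True), (130, False), (980, True),
           (110, False), (900, False), (125, False), (970, True)]

def deterministic_flag(amount, threshold=500):
    """Rule-based: flag anything over a hand-coded threshold."""
    return amount > threshold

def probabilistic_flag(amount, history, band=100):
    """Data-driven: estimate P(fraud) from similar past transactions."""
    similar = [fraud for amt, fraud in history if abs(amt - amount) <= band]
    if not similar:
        return 0.0
    return sum(similar) / len(similar)

print(deterministic_flag(940))                      # True
print(round(probabilistic_flag(940, history), 2))   # 0.75
```

The rule answers only yes or no; the probabilistic version reports how often similar past transactions turned out to be fraud, a quantity that improves as more history accumulates.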
Cognitive computing utilizes an open approach, machine learning, and sophisticated natural language processing. Characteristic of cognitive applications is the capability to understand language; apply logic; interpret, intuit, and relate information; predict; evaluate; and make decisions. This provides a whole new class of computational solutions.
The Four Big Strengths of Cognitive Computing
- Identifying and Isolating Unknown Patterns of Behavior
One strength of cognitive computing is that it can identify patterns of behavior, transactions, or trends within data that are invisible to detection given the rarity of the transaction and the volume of data. Most software is deterministic: it assumes that there is an existing, recognized, human-designed blueprint, or schema. A schema is merely a model. Where deterministic solutions require that the model be identified and understood in advance, cognitive computing adapts itself to situations where the schema is not known beforehand. "When faced with an unknown information, humans build new schema naturally while most software needs to have it spelled out ahead of time" (Petrocelli). Cognitive computing has closed the gap between the two.
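As an illustration of finding structure with no schema defined in advance, a minimal one-dimensional k-means pass can separate two behavior clusters in transaction amounts without any hand-written rule. The values and the fixed two-cluster assumption are invented for this sketch:

```python
def kmeans_1d(values, k=2, iters=20):
    """Tiny 1-D k-means sketch; assumes k=2 for the crude initialization."""
    centers = [min(values), max(values)]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            groups[nearest].append(v)
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Two distinct spending behaviors hide in this stream of amounts.
amounts = [12, 14, 11, 13, 410, 395, 12, 405]
centers, groups = kmeans_1d(amounts)
print(sorted(round(c) for c in centers))   # [12, 403]
```

No one told the algorithm where the boundary lies; the two clusters, routine small purchases and a handful of large outliers, emerge from the data itself.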
- Assimilate and Employ Vast Amounts of Data
Bounded rationality “assumes that people, while they may seek the best solution, normally settle for much less, because the decisions they confront typically demand greater information, time, processing capabilities than they possess. They settle for ‘bounded rationality’ or ‘limited rationality’ in decisions” (Chand). It is not just the possession of data that creates bounded rationality, but the ability to consume, absorb, and make rational decisions based on the information available. Cognitive computing helps address the limits of bounded rationality.
Humans are limited in the amount of information they can synthesize and utilize during any decision event. Computers have no such limitation, and can absorb new information from additional sources while fully engaged in a decision event. Cognitive computing enhances the learning abilities of a human being with the processing power of big data technologies.
- Data-Driven and Probabilistic Recommendations
Decision making is greatly enhanced by cognitive computing’s ability to access and synthesize additional information, and to have an almost infinite appetite for new data. Cognitive computing also finds patterns and trends buried within the data and uses that information to further a solution to the problem at hand. Finding the patterns in the data is only one aspect of what makes this type of computing cognitive. Projecting future behaviors or results based on prior histories and incorporating that information into its recommended solution is what truly separates probabilistic solutions from deterministic ones.
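A toy example of projecting a future result from prior history: a least-squares trend line fitted to invented monthly sales figures, then extrapolated one period ahead. The numbers and the linear model are illustrative assumptions, not a real forecasting method:

```python
def fit_line(ys):
    """Ordinary least-squares fit of y against its index 0..n-1."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

monthly_sales = [100, 104, 109, 113, 118]       # invented past five months
slope, intercept = fit_line(monthly_sales)
forecast = slope * len(monthly_sales) + intercept   # project month six
print(round(forecast, 1))   # 122.3
```

The deterministic part (the arithmetic) is trivial; what makes the recommendation probabilistic in spirit is that the projection is derived from observed history rather than from a rule someone wrote down.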
- Scalability of Computing and Human Resources
The unpredictable nature of data, and specifically the amount of data required for a particular project or problem, lends itself to a cloud solution. In the cloud, users have access to computing resources as well as storage. There is a second type of scalability that is very important: the human infrastructure required to maintain an ongoing cognitive computing initiative. The mathematics required to fully exploit the power of cognitive computing is beyond the comprehension and skill levels of most programmers, and those who can handle both the mathematics and the programming are very expensive. A cloud solution allows both areas of scalability to be addressed. A properly applied cognitive computing strategy frees analysts to do what they do best: analyze and discover.
Who Can Benefit from Cognitive Computing
Judith Hurwitz notes in a recent article “…that a cognitive approach to advanced analytics will have a dramatic impact on hundreds of different market segments. When we have the ability to gain insights that is hidden and then apply learning to that data there is a potential to transform industries ranging from healthcare, to financial services, metropolitan area planning, security, and IT itself. At the heart of business transformation is the ability to make sense.” Better decisions result from better, timelier information; cognitive computing provides freedom from bounded rationality and from the constraints of assimilating additional data on the fly. Industry, business, government, and education are all drowning in data; a cognitive computing strategy will allow them to harness that data rather than be overwhelmed by it.
Virtually any enterprise that needs to make data-driven decisions will benefit from cognitive computing. Absolute Insight is a product that Alivia Technology is developing to take advantage of these new technologies and apply them to healthcare, government, and industry.
- Chand, Smitri. "Models of Decision Making." YourArticleLibrary, n.d. Accessed November 29, 2015.
- Hurwitz, Judith. "Judith's Balancing Act." Hurwitz & Associates, November 11, 2015. Accessed November 29, 2015.
- Petrocelli, Tom. Neuralytix, February 24, 2014. Accessed November 28, 2015.
The importance of social media chatter cannot be ignored anymore when making trading decisions. Since the beginning of time, traders relied on whispers and rumors for trading: stocks got bought on rumors and sold on news. That's how the market worked before the introduction of electronic trading.
All of a sudden there was a race to capture and analyze market data, write algos, and put servers next to exchanges for speedy execution of trades. Whoever could shave off a few milliseconds and get to the exchange first could make money. Then the technology that allowed firms with deep pockets to build expensive trading infrastructure got cheap, really cheap; now anyone can set up an HFT operation in their bedroom and do program trading. Trouble is, no one can make money when everyone does the same thing. The advantages of HFT and program trading are pretty much gone. They are utilities now.
In the old days, the majority of news and rumors came from the trading floors and nearby coffee shops and bars.
Today, it comes from all over the world through social media chatter and blog posts. The process involves assessing the sentiment of the content, for example whether comments are positive or negative regarding a particular company, sector, or industry, and the relevance of the content: whether a company is the subject of a news article or blog, who is writing the blog, and how many followers are chatting about and retweeting it.
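The sentiment side of that process can be sketched with a toy lexicon score. The word lists and posts below are invented for illustration; real systems use far richer language models and relevance weighting:

```python
# Invented toy lexicon; a production system would use a trained model.
POSITIVE = {"improving", "great", "growth", "beat", "strong"}
NEGATIVE = {"recall", "lawsuit", "miss", "weak", "fraud"}

def sentiment(post):
    """Score = positive word count minus negative word count."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = [
    "clean tech product is improving lives strong growth ahead",
    "rumors of a lawsuit and a product recall",
]
signals = [sentiment(p) for p in posts]
print(signals)   # [3, -2]
```

Even this crude score separates bullish chatter from bearish chatter; layering on relevance signals (who wrote it, how widely it is retweeted) is what turns raw chatter into a usable trading input.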
This is frontier territory for most, but not for the technically advanced firms. They are using big data technology to sniff out and capture investor, worker, and consumer sentiment, all in real time from hundreds of thousands of blogs and social media chats, then neatly bundling it with other trading utilities to create real-time trading signals.
Now we have come full circle. Traders again try to capture sentiment before anyone else can. This time, however, they can combine trading utilities with unstructured data and take advantage of a whisper 10,000 miles away, from an Indian village where someone just tweeted about how a clean tech company's product is improving lives, a company that happens to be on your watch list.
Next post will discuss the tools of the trade.
Big Wall Street firms are finding ways to harness the power of big data in the decision-making process, from customer acquisition to portfolio management. Here is a look at what SunGard is forecasting for Wall Street.
SunGard has identified 10 trends shaping big data initiatives across all segments of the financial services industry in 2012:
- Companies require larger market data sets and deeper granularity to feed predictive models, forecasts and trading throughout the day.
- New regulatory and compliance requirements are placing greater emphasis on governance and risk reporting, driving the need for deeper and more transparent analyses across global organizations.
- Financial institutions are ramping up their enterprise risk management frameworks to help improve enterprise transparency, auditability and executive oversight of risk.
- Financial services companies are looking to leverage large amounts of consumer data across multiple service delivery channels to uncover consumer behavior patterns and increase conversion rates.
- Emerging markets like Brazil, China and India are outpacing Europe and America as significant investments are made in local and cloud-based data infrastructures.
- Advances in big data technology will help financial services firms unlock the value of data in operations to help reduce costs and discover new revenue opportunities.
- Traditional data warehouse systems will need to be re-engineered with big data technologies to handle growing volumes of information.
- Predictive credit risk models that tap into large amounts of payment data are being adopted in consumer and commercial collections practices to help prioritize collections activities.
- Mobile applications, tablets and smartphones are creating greater pressure for company networks to consume, index and integrate structured and unstructured data from a variety of sources.
- Big data initiatives are driving increased demand for algorithms to process data, and emphasizing challenges around data security and access control as well as minimizing impact on existing systems.
With the rise of electronic trading, algorithms, and high-frequency trading, market data and messaging volumes have exploded. Investment firms need new tools and strategies to handle the crush of data, identify trading signals, and predict market movements to make investment decisions.
Wall Street knows how to handle structured data. Today, however, unstructured data rules the Internet, an ocean of content swirling with documents, news, blogs, buzz, speculation, and rumor.
How does a firm exploit the web's flood of unstructured data and combine it with structured data to gain an edge?
Fortunately, firms like Alivia, with deep financial knowledge and big data mining capabilities, are working with investment managers to develop new strategies and decision-making analytics that harness the promise of big data.
Talk to us and see how Alivia can help improve your performance with big data analytics and decision making tools.