There are three important concepts in a cognitive system:
- contextual insights
- hypothesis generation
- continuous learning over time
Cognitive computing enables the examination of a wide variety of data types and the interpretation of that data to provide insights and recommend actions. The essence of cognitive computing is acquiring and analyzing the right amount of information in the context of the problem being addressed. A cognitive system must be aware of the context that surrounds the data in order to deliver value. Once that data is acquired, curated, and analyzed, the cognitive system must identify and remember patterns and associations in it. This iterative process enables the system to learn and deepen its scope, so that its understanding of the data improves over time. One of the most important practical characteristics of a cognitive system is its capability to provide the knowledge seeker with a series of alternative answers, along with an explanation of the rationale or evidence supporting each one.
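The idea of returning alternative answers ranked by supporting evidence can be sketched in a few lines. This is a minimal illustration with hypothetical data structures and made-up candidate answers, not the API of any actual cognitive system:

```python
# Sketch: a cognitive system returns ranked alternative answers, each with a
# confidence score and the evidence that supports it. All names and values
# here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Answer:
    text: str                                      # the candidate answer
    confidence: float                              # score in [0, 1]
    evidence: list = field(default_factory=list)   # supporting passages

def rank_answers(candidates):
    """Return candidates sorted by confidence, highest first."""
    return sorted(candidates, key=lambda a: a.confidence, reverse=True)

candidates = [
    Answer("Diagnosis A", 0.72, ["journal article", "patient history"]),
    Answer("Diagnosis B", 0.55, ["sensor trend"]),
    Answer("Diagnosis C", 0.81, ["clinical guideline", "lab result"]),
]

for ans in rank_answers(candidates):
    print(f"{ans.text}: {ans.confidence:.2f} (evidence: {', '.join(ans.evidence)})")
```

The point is the shape of the output: not a single answer, but a set of alternatives, each carrying the evidence a user would need to judge it.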
A cognitive computing system brings together tools and techniques including Big Data and analytics, machine learning, the Internet of Things (IoT), Natural Language Processing (NLP), causal induction, probabilistic reasoning, and data visualization. Cognitive systems have the capability to learn, remember, provoke, analyze, and resolve in a manner that is contextually relevant to the organization or to the individual user. Solutions to highly complex problems require the assimilation of all sorts of data and knowledge available from a variety of structured, semi-structured, and unstructured sources, including but not limited to journal articles, industry data, images, sensor data, and structured data from operational and transactional databases.
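To make the idea of assimilating mixed sources concrete, the toy sketch below links an unstructured note to structured records by simple term matching. The record layout and note are invented for the example; a real system would use NLP and entity resolution rather than word overlap:

```python
# Illustrative only: connect free text to structured rows by matching terms.
records = [  # structured rows, e.g. from a transactional database
    {"id": 1, "device": "pump", "status": "ok"},
    {"id": 2, "device": "valve", "status": "fault"},
]

note = "Technician reports the valve is leaking near unit 2."  # unstructured text
terms = set(note.lower().strip(".").split())

# Keep structured records whose device name appears in the free-text note.
matches = [r for r in records if r["device"] in terms]
print(matches)
```

Even this crude join shows why combining source types matters: the structured row alone says "fault", the note alone says "leaking", and only together do they describe the problem.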
For a cognitive system to be relevant and useful, it must continuously learn and adapt as new information is ingested and interpreted. Gaining insight from this information requires tools that can understand the data whatever its form. Today, much of the required data is text-based, and Natural Language Processing (NLP) techniques are the primary means of capturing the meaning of unstructured text from documents or from communications with the user. Deep learning tools are required to capture meaning from nontext sources such as videos and sensor data: time series analysis is applied to sensor data, for example, whereas a variety of image analysis tools interpret images and videos.

All these types of data have to be transformed so that they can be understood and processed by a machine, and in a cognitive system these transformations must be presented in a way that allows users to understand the relationships among the data sources. Visualization tools and techniques are critical here; visualization is one of the most powerful ways to make massive, complex data accessible and to reveal the patterns within it.

As we evolve toward cognitive computing, we may need to bring together structured, semi-structured, and unstructured sources to continuously learn and gain insights from data. How these data sources are combined with processes for producing results is key to cognitive computing. The cognitive system therefore offers its users a different experience in the way it interacts with data and processes.
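The time series analysis of sensor data mentioned above can be illustrated with a simple moving average. This is a deliberately minimal sketch with synthetic readings and an arbitrarily chosen deviation threshold, standing in for the far richer techniques a production system would use:

```python
# Smooth sensor readings with a moving average, then flag readings that
# deviate strongly from the local trend. Readings and threshold are invented.
def moving_average(values, window):
    """Average each consecutive length-`window` slice of `values`."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

readings = [20.1, 20.3, 20.2, 25.9, 20.4, 20.2, 20.1]  # one spike at 25.9
smoothed = moving_average(readings, window=3)

# Pair each reading with the average of the window centered on it.
anomalies = [r for r, s in zip(readings[1:], smoothed) if abs(r - s) > 2.0]
print(anomalies)
```

The same pattern, at scale and with better statistics, is how a cognitive system would turn raw sensor streams into events worth reasoning about.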