Cognitive computing

From Wikipedia, the free encyclopedia

Cognitive computing (CC) describes technology platforms that, broadly speaking, are based on the scientific disciplines of artificial intelligence and signal processing. These platforms encompass machine learning, reasoning, natural language processing, speech recognition and vision (object recognition), human–computer interaction, dialog and narrative generation, among other technologies.[1][2]

Definition

At present, there is no widely agreed upon definition for cognitive computing in either academia or industry.[3][4]

In general, the term cognitive computing has been used to refer to new hardware and/or software that mimics the functioning of the human brain[5][6][7][8][9][10] and helps to improve human decision-making.[11] In this sense, CC is a new type of computing with the goal of producing more accurate models of how the human brain/mind senses, reasons, and responds to stimuli. CC applications link data analysis and adaptive page displays (AUI) to adjust content for a particular type of audience. As such, CC hardware and applications strive to be more affective and more influential by design.
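The linkage between data analysis and an adaptive display can be illustrated with a short, purely hypothetical Python sketch; the profile fields, audience categories, and helper functions below are invented for illustration and do not correspond to any particular CC product.

    from dataclasses import dataclass

    @dataclass
    class AudienceProfile:
        """Hypothetical profile inferred from a user's interaction data."""
        expertise: str       # e.g. "novice" or "expert"
        reading_time: float  # average seconds spent per page

    # Hypothetical content variants keyed by audience type.
    CONTENT_VARIANTS = {
        "novice": "Step-by-step explanation with glossary links.",
        "expert": "Condensed technical summary with raw data tables.",
    }

    def classify_audience(profile: AudienceProfile) -> str:
        """Data-analysis step (stand-in): infer the audience type."""
        if profile.expertise == "expert" and profile.reading_time < 30:
            return "expert"
        return "novice"

    def select_content(profile: AudienceProfile) -> str:
        """Adaptive-display step: pick the content variant for this audience."""
        return CONTENT_VARIANTS[classify_audience(profile)]

    # Example: a quick-reading expert sees the condensed variant.
    print(select_content(AudienceProfile(expertise="expert", reading_time=12.0)))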

IBM describes the components used to develop, and the behaviors resulting from, "systems that learn at scale, reason with purpose and interact with humans naturally". According to IBM, while cognitive computing shares many attributes with the field of artificial intelligence, it differentiates itself through the complex interplay of disparate components, each of which is a mature discipline in its own right.[1][3][12][13]

Some features that cognitive systems may express (illustrated by the sketch after this list) are:

  • Adaptive: They may learn as information changes, and as goals and requirements evolve. They may resolve ambiguity and tolerate unpredictability. They may be engineered to feed on dynamic data in real time, or near real time.[13]
  • Interactive: They may interact easily with users so that those users can define their needs comfortably. They may also interact with other processors, devices, and cloud services, as well as with people.
  • Iterative and stateful: They may aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete. They may "remember" previous interactions in a process and return information that is suitable for the specific application at that point in time.
  • Contextual: They may understand, identify, and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user’s profile, process, task and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).[14]
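A minimal, purely illustrative Python sketch of how these four properties might appear together follows; the CognitiveAssistant class, its methods, and the sample data are hypothetical and are not taken from any cognitive computing platform.

    from datetime import datetime

    class CognitiveAssistant:
        """Toy illustration of the four features listed above."""

        def __init__(self, user_profile):
            self.user_profile = user_profile   # contextual: who is asking, and from where
            self.history = []                  # stateful: remembers previous interactions
            self.knowledge = {}                # adaptive: updated as information changes

        def learn(self, topic, fact):
            """Adaptive: ingest new or changed information at any time."""
            self.knowledge[topic] = fact

        def ask(self, question):
            """Interactive and iterative: answer, or ask for clarification."""
            self.history.append((datetime.now(), question))
            if len(question.split()) < 3:
                # Iterative: the problem statement is too vague to act on.
                return "Could you describe what you need in more detail?"
            # Contextual: combine the question with stored facts and the user's profile.
            relevant = [fact for topic, fact in self.knowledge.items()
                        if topic in question.lower()]
            if not relevant:
                return "I have no information on that yet for a " + self.user_profile["role"] + "."
            return " ".join(relevant)

    assistant = CognitiveAssistant({"role": "analyst", "location": "Berlin"})
    assistant.learn("sales", "Sales rose 4% in the last quarter.")
    print(assistant.ask("sales?"))                            # too vague: clarification request
    print(assistant.ask("what happened to sales recently"))   # answers from stored knowledge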

Use cases

Cognitive computing has been subject to a great deal of marketing hype over the years, and a non-proprietary definition remains elusive, but as cognitive computing platforms have emerged and become commercially available, evidence of real-world applications is starting to surface. Organizations that adopt these platforms purpose-build applications to address specific use cases relevant to their internal and external users, with each application using whatever combination of available functionality the use case requires.

Examples of such real-world use cases include the following:

These and many more examples are available on the blogs of the respective cognitive computing platform providers, helping to show how these possibilities translate into real-world applications today. This is important, since as recently as April 8, 2016, in a Fortune magazine article,[15] Meg Whitman cast doubt on IBM Watson's present-day capabilities; IBM's CEO Virginia Rometty responded with "We are building an era, a platform, an industry, and making a market with it. We have competitors who don't disclose for a decade, [so] I'm going to protect it and nurture it—we will disclose eventually".

See also

References

  1. ^ a b Kelly III, Dr. John (2015). "Computing, cognition and the future of knowing" (PDF). IBM Research: Cognitive Computing. IBM Corporation. Retrieved February 9, 2016.
  2. ^ "Augmented intelligence, helping humans make smarter decisions" (PDF). Hewlett Packard Enterprise. http://h20195.www2.hpe.com/V2/GetPDF.aspx/4AA6-4478ENW.pdf
  3. ^ a b "IBM Research: Cognitive Computing".
  4. ^ "Cognitive Computing".
  5. ^ "Hewlett Packard Labs".
  6. ^ Terdiman, Daniel (2014). "IBM's TrueNorth processor mimics the human brain". CNET. http://www.cnet.com/news/ibms-truenorth-processor-mimics-the-human-brain/
  7. ^ Knight, Shawn (2011). "IBM unveils cognitive computing chips that mimic human brain". TechSpot, August 18, 2011.
  8. ^ Hamill, Jasper (2013). "Cognitive computing: IBM unveils software for its brain-like SyNAPSE chips". The Register, August 8, 2013.
  9. ^ Denning, P.J. (2014). "Surfing Toward the Future". Communications of the ACM. 57 (3): 26–29. doi:10.1145/2566967.
  10. ^ Ludwig, Lars (2013). "Extended Artificial Memory: Toward an integral cognitive theory of memory and technology" (PDF). Technical University of Kaiserslautern. Retrieved 2017-02-07.
  11. ^ "Research at HP Labs".
  12. ^ Kelly, J.E. and Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing.
  13. ^ a b Ferrucci, D. et al. (2010) Building Watson: an overview of the DeepQA Project. Association for the Advancement of Artificial Intelligence, Fall 2010, 59–79.
  14. ^ Deanfelis, Stephen (2014). "Will 2014 Be the Year You Fall in Love With Cognitive Computing?" Wired, April 21, 2014.
  15. ^ "HPE Chief Meg Whitman Disses IBM Watson".

Further reading