Cognitive manufacturing is trending on Industry 4.0 roadmaps because it helps organizations improve fundamental business metrics, including product reliability, safety, quality, and productivity, while reducing downtime and lowering costs at the same time. A cognitive engine can analyze the symptoms of a breakdown in a machine, which helps reduce downtime expenses. Additionally, cognitive manufacturing creates new value from manufacturing data by looking deeply into the manufacturing process and business environment to derive information that matters to a manufacturer. Businesses are increasingly deploying cognitive manufacturing techniques to improve the performance of their plant operations, which also helps to reduce costs.
According to a Metrology News article: Cognitive manufacturing uses cognitive computing, the Industrial IoT, and advanced analytics to optimize manufacturing processes in ways that were not previously possible. It helps organizations improve fundamental business metrics such as productivity, product reliability, quality, safety, and yield, while reducing downtime and lowering costs. Cognitive technologies can find meaning in manufacturing data in ways that, until now, only the human brain could comprehend. This level of understanding will be considered essential for success in the modern manufacturing era, as heightened competitiveness and cost sensitivities demand new levels of agility, responsiveness, and innovation from manufacturers.
On the other hand, an Analytics Insight report argues that cognitive technologies were not possible in industrial solutions before the age of big data. Cognitive systems need lots and lots of data to analyze. For most manufacturers, having enough data is no longer a problem. In fact, most manufacturers have access to more data than they can analyze using older methods, and probably more data than they actually need. Conventional computing will make way for new technologies that can handle the large influx of data and the complexity of the analytics, paving the way for cognitive manufacturing. Furthermore, cognitive manufacturing fully utilizes the data residing across equipment, systems, and processes to derive actionable insight across the entire value chain, from design through manufacture to support activities. Built on the foundations of IoT and employing analytics combined with cognitive technology, Industry 4.0, or cognitive manufacturing, drives its key productivity improvements in the reliability, quality, and efficiency of the manufacturing environment.
In spite of all the information and business propositions around cognitive manufacturing today, probably the most common mistake is to equate cognitive function with simply increasing the volume of data we collect, or with applying more advanced AI techniques such as deep learning to the same problems using more powerful IoT infrastructures, machines, and methods. If we go back to the basics of cognitive science, cognitive solutions should bring to the industrial ecosystem a new dimension in terms of machine cognition, understanding, and self-capabilities, transforming current manufacturing systems into autonomous production and orchestration environments.
If we talk about cognitive functions, Natural Language Processing, which includes Natural Language Understanding, Natural Language Interaction, and Natural Language Generation, is one of the best-known forms of AI, giving machines and robots the capability to interact with humans in a common language in production and social environments. The current NLP zeitgeist arose from half a decade of steady improvements under the standard evaluation paradigm. Systems' ability to comprehend has generally been measured on benchmark data sets consisting of thousands of questions, each accompanied by passages containing the answer. When deep neural networks swept the field in the mid-2010s, they brought a quantum leap in performance. Subsequent rounds of work kept inching scores ever closer to 100% (or at least to parity with humans).
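The standard evaluation paradigm mentioned above can be sketched very simply: each benchmark question has a gold answer, and a system's score is the fraction of its predictions that match. Below is a minimal, hypothetical illustration (the questions, answers, and system outputs are invented for this sketch, not taken from any real benchmark):

```python
# Minimal sketch of benchmark-style evaluation: exact-match accuracy
# over (question, gold answer) pairs. All data here is hypothetical.

def exact_match_score(predictions, gold_answers):
    """Fraction of predictions that exactly match the gold answer
    (case- and whitespace-insensitive)."""
    normalize = lambda s: " ".join(s.lower().split())
    matches = sum(
        normalize(p) == normalize(g)
        for p, g in zip(predictions, gold_answers)
    )
    return matches / len(gold_answers)

# Hypothetical benchmark items and one system's answers.
benchmark = [
    ("Where was the turbine installed?", "plant 7"),
    ("What year did production start?", "2016"),
    ("Which sensor reported the fault?", "vibration sensor"),
]
system_output = ["Plant 7", "2016", "temperature sensor"]

gold = [answer for _, answer in benchmark]
score = exact_match_score(system_output, gold)  # 2 of 3 correct
```

Real benchmarks add refinements (token-level F1, answer aliases), but the leaderboard logic is essentially this: a single aggregate number per system.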
Researchers would continue publishing new data sets of even trickier questions, only to see even bigger neural networks quickly post impressive scores. But many people in the field are growing weary of such leaderboard-chasing. What has the world really gained if a massive neural network improves on some benchmark by a point or two? It's not as though anyone cares about answering these questions for their own sake; winning the leaderboard is an academic exercise that may not make real-world tools any better. Indeed, many apparent improvements emerge not from general comprehension abilities, but from models' extraordinary skill at exploiting spurious patterns in the data. Do recent "advances" really translate into helping people solve problems?
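To make the "spurious patterns" point concrete, here is a toy sketch with entirely hypothetical data: a degenerate "model" that never reads the premise and simply predicts "contradiction" whenever the hypothesis contains a negation word. On a biased benchmark where the cue happens to correlate with the label, it scores perfectly; on a probe set where the cue is decoupled from the label, it collapses:

```python
# Toy illustration of a model exploiting a spurious surface cue.
# Hypothetical NLI-style data: in the biased set, negation words in
# the hypothesis happen to co-occur with the "contradiction" label.

def cue_model(hypothesis):
    """Ignores meaning entirely: predicts from a surface cue."""
    negations = {"not", "no", "never"}
    words = set(hypothesis.lower().split())
    return "contradiction" if words & negations else "entailment"

def accuracy(dataset):
    correct = sum(cue_model(h) == label for h, label in dataset)
    return correct / len(dataset)

# Biased benchmark: the cue and the label are perfectly correlated.
biased = [
    ("The machine is not running", "contradiction"),
    ("The valve never opened", "contradiction"),
    ("The line produced 40 units", "entailment"),
    ("Output met the daily target", "entailment"),
]

# Debiased probe: same task, but the cue no longer tracks the label.
debiased = [
    ("The machine is not idle", "entailment"),
    ("No defects were found", "entailment"),
    ("The valve opened early", "contradiction"),
    ("Output tripled overnight", "contradiction"),
]

biased_acc = accuracy(biased)      # 1.0 on the biased set
debiased_acc = accuracy(debiased)  # 0.0 once the cue is decoupled
```

Large neural models exploit subtler correlations than a single keyword, of course, but the failure mode is the same: a high leaderboard score that does not certify comprehension.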
According to an MIT Technology Review article, today's models are nowhere close to achieving that level of comprehension, so how did the NLP community end up with such a gap between on-paper evaluations and real-world ability? To borrow a line from the paper, NLP researchers have been training to become professional sprinters by glancing around the gym and adopting any exercises that look hard. To bring evaluations more in line with the targets, it helps to consider what holds today's systems back. A human reading a passage will build a detailed representation of entities, locations, events, and their relationships: a "mental model" of the world described in the text. The reader can then fill in missing details in the model, extrapolate a scene forward or backward, or even hypothesize about counterfactual alternatives.
Nevertheless, the cognitive computing market was valued at $8.87 billion in 2018 and is projected to reach $87.39 billion by 2026, growing at a CAGR of 31.6% from 2019 to 2026. Cognitive computing is a next-generation technology that interacts in human language and helps experts make better decisions by understanding the complexity of unstructured data. The global cognitive computing market encompasses technologies such as natural language processing, machine learning, automated reasoning, and information retrieval, which are used in translating unstructured data to sense, infer, and predict the best solution (see figure).
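As a quick sanity check on projections like this, the compound annual growth rate implied by two endpoint values is easy to compute. A minimal sketch (note that the report states its 31.6% figure against a 2019 base year, so the 2018-to-2026 calculation below comes out a little higher):

```python
# Compound annual growth rate (CAGR) implied by two endpoint values.

def cagr(start_value, end_value, years):
    """Annualized rate r such that
    start_value * (1 + r) ** years == end_value."""
    return (end_value / start_value) ** (1 / years) - 1

# Market-size figures from the text: $8.87B (2018) to $87.39B (2026).
rate = cagr(8.87, 87.39, 2026 - 2018)  # about 0.331, i.e. ~33.1%/year
```

In other words, the market is projected to grow roughly tenfold over eight years, which is the kind of figure driving the investment behavior discussed next.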
As you can appreciate at this point, companies are still proactively investing huge amounts of money in advanced AI solutions. But, as we said before, it is not just about solving the same problems with more powerful tools. At some point, sooner rather than later, industry and AI experts should sit down together and redefine the ultimate goals of Industry 4.0, if the common ambition is to use AI as a disruptive force to transform the industry of the future.
In future posts, we will continue writing about technology and business trends for enterprises. Furthermore, we recommend consulting the following literature to continue your digital transformation journey:
- Designed for Digital: How to Architect Your Business for Sustained Success, by MIT Press
- The Future Is Faster Than You Think: How Converging Technologies Are Transforming Business, Industries, and Our Lives, by Simon & Schuster
- Artificial Intelligence: The Insights You Need, by Harvard Business Review
- The Year in Tech, 2021: The Insights You Need, by Harvard Business Review
- The Deep Learning Revolution, by MIT Press
- Competing in the Age of AI, by Harvard Business Review Press
The objective of this blog is to provide a personal vision of how digital transformation trends will impact our daily activities, businesses, and lifestyles.
An Industry 4.0 and smart-mobility expert, his research interests include Industry 4.0, smart maintenance, process optimization, machine learning, AI engineering, and cloud-based solutions.