The advancement of technology has driven productivity gains since the start of the Industrial Revolution. It has changed conventional production relationships, encouraged closer collaboration between suppliers, producers and customers, and improved the integration and automation of the industrial environment. Rising industrial productivity has in turn spurred economic growth, fostered industrial development and helped upskill the workforce.
When Industry 4.0 comes of age within the next decade or so, the key objective is to drive manufacturing forward: to be faster, more efficient and more customer-centric, while pushing beyond automation and optimisation to discover new business opportunities and models. Companies and countries will engage in a production race that leads to the introduction of disruptive technologies into manufacturing lines.
While a sustaining technology relies on incremental improvements to an already established technology, a disruptive technology, although it often lacks refinement and faces performance challenges of its own, can improve production by creating faster, more flexible and more efficient processes that produce higher-quality goods at lower cost.
A combination of several major disruptive technology innovations maturing at the same time is expected to significantly shift the landscape of the manufacturing industry. These technologies—advanced robotics, artificial intelligence, sophisticated sensors, the internet of things (IoT), cloud computing and big data analytics—already exist in this field today in some form, but as they integrate with one another, the physical and virtual worlds will interlink and transform the industry. These disruptive technologies are anticipated to become the enablers of Industry 4.0 applications.
The Nine Enabling Technologies of Industry 4.0
The nine disruptive technologies listed in the table are the technological advancements that have been identified as the backbone of Industry 4.0. A fully implemented Industry 4.0 manufacturing system is assumed to integrate most of these nine advancements.
Many companies might have already implemented some of these technologies but are only using them in isolation rather than in interconnected autonomous systems in which the technologies can cooperate and communicate freely with one another. To stay competitive in Industry 4.0, it is important to gain a clear understanding of these nine technological advancements, start evaluating how they will impact any specific industry, and identify the benefits that can be gained from implementation.
Big data has been a popular buzzword for the last few years. It is a platform that brings useful insight to the industrial environment by drawing on data collected from multiple sources. Massive sets of data in the industrial world remain untapped, and companies have started collecting and extracting valuable information from them to optimise production quality, save energy and reduce production costs.
The big data platform can collect, store and analyse massive amounts of data from across the industrial value chain. Information such as trends, patterns and the relationships between inputs, processes and outputs is gathered and scrutinised to provide insights into how improvements can be made across the industrial value chain.
Data analytics is especially useful in predictive manufacturing and is a major theme of industrial technology development. It helps manufacturers maintain a competitive edge in operational management control as well as improve their production efficiency and yield rates.
Subsequently, it helps to improve the whole value chain by optimising products and pricing strategies. For instance, the data generated can shed light on new trends, giving the company time to develop strategies to take advantage of them. The collection and comprehensive evaluation of data from many different sources—production equipment and systems as well as enterprise- and customer-management systems—will also become standard practice to support real-time decision making.
Big data applications can also improve other aspects of the whole value chain. Among the possibilities are:
- a) Lowering the number of defects by correlating all data captured in the operational process. This allows patterns to be identified that help discard faulty products early in the process, thus improving production quality.
- b) Real-time decision making via comprehensive evaluation of data from various sources, such as production equipment and systems as well as enterprise- and customer-management systems.
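As a rough sketch of point (a), the idea is to correlate a captured process parameter with historical defect outcomes and use the learned relationship to discard risky units early. The readings, field names and threshold rule below are illustrative assumptions, not a specific vendor's method.

```python
# Minimal sketch, assuming one process parameter (press temperature)
# is strongly correlated with defects in the historical record.

def pearson(xs, ys):
    """Pearson correlation between two equal-length number lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical history: press temperature (C) and defect outcome (1 = defective).
temps   = [180, 182, 185, 199, 201, 183, 198, 181, 200, 184]
defects = [0,   0,   0,   1,   1,   0,   1,   0,   1,   0]

r = pearson(temps, defects)   # high r confirms the parameter matters
THRESHOLD = 195               # cut-off suggested by the history above

def flag_early(temp):
    """Discard a unit before downstream steps if its reading looks risky."""
    return temp > THRESHOLD
```

In practice many parameters would be correlated at once and the cut-off learned statistically, but the pattern is the same: historical data identifies the signal, and the rule is applied in real time on the line.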
With the increased connectivity and use of standard communications protocols that come with Industry 4.0, the need to protect critical industrial systems and manufacturing lines from cybersecurity threats escalates dramatically. Most manufacturers will want to protect their most valuable data, including intellectual property as well as data on customers and products. As a result, secure, reliable communications and sophisticated identity and access management for machines and users become essential.
Cybersecurity is crucial in protecting valuable data from being exploited by other parties. The use of new technologies such as the internet of things (IoT) and cloud computing may present risks to the overall industrial environment, so investments are required to strengthen the security of these environments. As of 2017, the cybersecurity market is worth at least $120 billion, and global spending on cybersecurity is estimated to surpass $1 trillion by 2021.
Since current cybersecurity solutions have largely been developed for IT-centric devices and environments, the new challenge is to embed security into the IoT devices that will reside in these new Industry 4.0 manufacturing ecosystems. An ideal cybersecurity system would therefore combine traditional network security with embedded security on the IoT devices to safeguard systems against cyber-attacks. The system should be equipped with advanced interaction and visualisation as well as artificial intelligence, building autonomic capability into the security systems for perceptive operations.
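One widely used building block of embedded device security is message authentication: a device signs its telemetry with a keyed hash so a controller can reject tampered or spoofed readings. The sketch below illustrates the idea with Python's standard `hmac` module; the device key and message format are invented for illustration, and real deployments would keep per-device keys in secure hardware.

```python
# Illustrative sketch of HMAC-based message authentication for an IoT
# sensor. Key provisioning and rotation are simplified for brevity.
import hashlib
import hmac

DEVICE_KEY = b"per-device-secret"   # assumption: provisioned at manufacture

def sign(payload: bytes) -> bytes:
    """Device side: produce an authentication tag for a telemetry payload."""
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    """Controller side: accept only payloads with a valid tag."""
    # compare_digest is constant-time, resisting timing attacks
    return hmac.compare_digest(sign(payload), tag)

msg = b'{"temp": 71.5, "machine": "press-07"}'
tag = sign(msg)

assert verify(msg, tag)                     # genuine message accepted
assert not verify(b'{"temp": 99.9}', tag)   # altered message rejected
```

This addresses integrity and authenticity only; confidentiality and access management would be layered on top, as the paragraph above describes.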
Augmented reality (AR) turns our real-life environment into a digital interface by placing virtual objects in it in real time. It uses the existing environment and overlays new information on top of it, unlike virtual reality, which creates a totally artificial environment and can be experienced in a variety of ways.
AR technology brings added value to the real world by enriching physical objects with virtual information. Between 2015 and 2016, investment in AR and virtual reality (VR) increased by 85 percent, and IDC projects AR/VR spending to climb by almost 95 percent in 2018, growing to a worldwide total of $17.8 billion from $9.1 billion in 2017.
In a manufacturing environment, human error and inefficient production methods are the biggest pain points that lead to increased operational costs. AR is attempting to address these issues by combining both the virtual and physical world. When combined, users will be provided with real-time information to improve decision-making and work planning.
With the aid of AR devices, which use cameras and motion sensors to gather information and display images in the real world during work, users can see the planned model as it will look in real life and make better adjustments in a shorter time span.
By allowing customers to experience products virtually, manufacturers can demonstrate products, highlighting design, function and actual size, without creating a physical copy. Based on customers’ feedback, new iterations can be made even before the first real production version.
Additive manufacturing, better known as 3D printing, is a new technology with applications across many manufacturing phases, from preproduction to end products. A process that used to take weeks to accomplish at a small scale can now construct large and complex structures, such as housing, in less than a day. In essence, additive manufacturing techniques create three-dimensional objects from digital models by printing successive layers of material. Organic compounds, custom plastics and a host of other materials can be used in the process, harmonising product and nature through innovation.
Additive manufacturing encompasses many technologies, including rapid prototyping (RP), direct digital manufacturing (DDM), layered manufacturing and additive fabrication. The 3D printer market alone is predicted to grow from $2.7 billion in 2016 to $7.7 billion in 2025, at a CAGR of 12 percent.
This technology makes prototyping much simpler, allowing approaches to be modified to meet diverse needs. Additive manufacturing techniques cut down waste from overproduction, leading the way to more efficient manufacturing practices. The advent of additive manufacturing is also enabling products to be introduced to the market more efficiently: it saves time and money while letting designers pay more attention to streamlining product function.
Most of today’s IT systems are not fully integrated. Companies, suppliers and customers are rarely closely linked, and neither are departments within a company, such as engineering, production and service. With Industry 4.0, companies, departments, functions and capabilities will become much more cohesive. The platform will pave the way for a cross-company approach to using and sharing data, benefiting the internal business environment as well as producers and suppliers.
This platform integration occurs in vertical systems (within the industry value chain) and in horizontal systems (across multiple value chains). Vertical integration links the components within the value chain, such as business processes, ICT systems, the communication layer and the field layer. Horizontal integration, meanwhile, connects multiple value chains across avenues such as processes, data and companies.
This technology allows different computer systems to be linked together, enabling real communication and the passing of data between systems so that software can act as a coordinated whole. This is ideal for manufacturing companies, as machinery from across the factory can be linked up along the production line, ensuring everything runs smoothly and efficiently.
By 2020, more than 26 billion devices will be able to communicate and signal via the Internet. Sensors and smart appliances will be everywhere, enabling both M2M (machine-to-machine) and M2H (machine-to-human) communication. This statistic highlights the potential for deep integration within the industrial environment.
In a nutshell, VR is a computer-generated simulation of a 3D image that leverages real-time data to reflect reality, or the physical world, in virtual form. It is a 100 percent virtual, immersive and digital experience. Recent developments in VR technology, and the release of low-cost VR headsets, have opened up a variety of uses that see the technology being integrated into everyday engineering and manufacturing.
There are many benefits to using simulation in industry, especially in plant or factory operation, including major improvements in industrial production. Operators will be able to provide safer environments for training, allowing workers to understand the risks of the work without having to be physically present in a dangerous environment.
Integration of VR into manufacturing is still in its infancy; however, total spending on virtual reality products and services is expected to increase from $11.4 billion in 2017 to nearly $215 billion in 2021.
While the public usually associates virtual reality (VR) with gaming, VR has long been used in the engineering and manufacturing industries. For product development, designers can build VR prototypes to check for accuracy and improve communication with customers as well as with teams in different physical locations. Process simulations can also be used to identify bottlenecks, maximise efficiency, fine-tune workflows, and mitigate risks.
Manufacturers in many industries have long used robots to tackle complex assignments, but now robots are evolving for even greater utility. The key idea in Industry 4.0 is for robots to become more autonomous, flexible, and cooperative. Eventually, they will interact with one another and work safely side by side with humans and learn from them. These robots will cost less and will have a greater range of capabilities than those used in manufacturing today.
Autonomous robots improve the accuracy of routine tasks by reducing the effort and time required for each task. For example, autonomous robots can perform inspections on inbound goods and provide real-time data to suppliers. The efficiency of industrial processes will improve as mistakes are reduced through the adoption of collaborative robots.
A projection echoed by a 2015 Bank of America report stated that by 2025 robots could take over 45 percent of manufacturing jobs, cutting labour costs by $9 trillion. As technology becomes cheaper or is offered in more cost-effective ways, robotics-as-a-service will help perform dull, dirty and dangerous tasks, allowing humans to channel their energies into other value-added activities.
Cloud computing helps businesses in all industries adapt to today’s rapidly changing technology. In simple terms, cloud computing means storing and accessing data and programmes over the Internet instead of on your computer’s hard drive; the cloud is simply a metaphor for the Internet. With artificial intelligence and automation being integrated into production, cloud computing is a way for businesses to change with the times without losing data.
Currently, companies are already using cloud-based software for some enterprise and analytics applications, but with Industry 4.0 more production-related undertakings will require increased data sharing across sites and company boundaries. A cloud-based platform becomes essential as data sharing escalates across industrial environments, since it offers a network of remote servers to store, manage and process data.
At the same time, the performance of cloud technologies will improve, achieving reaction times of just several milliseconds. As a result, machine data and functionality will increasingly be deployed to the cloud, enabling more data-driven services for production systems.
Cheaper and faster technology is making cloud computing increasingly viable, both technically and economically. Digital storage, for example, has plunged from $300,000 per gigabyte of data in 1981 to $0.03 per gigabyte in 2014. Internet speeds have improved similarly: files that would have taken days to download over a 28.8 kbps dial-up connection can be transferred in mere seconds over today’s broadband connections.
As machine learning techniques mature, the door opens to prescriptive analytics, where cloud computing will be able to predict when failures are likely to happen based on historical machine data, usage information and likely environmental conditions—rather than merely flagging that a potential issue has been detected in real time. Vendors of manufacturing execution systems are among the companies that have started to offer cloud-based solutions.
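The predictive idea above can be sketched very simply: fit a trend to historical condition data and estimate when the machine will cross a failure threshold, instead of only alarming once the threshold is reached. The vibration readings and failure level below are invented illustrative values, and real systems would use far richer models.

```python
# Hedged sketch: project time-to-failure from a linear trend in
# historical vibration readings (mm/s). Values are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

days      = [0, 1, 2, 3, 4, 5]
vibration = [2.0, 2.2, 2.5, 2.6, 2.9, 3.0]   # trending upward
FAILURE_LEVEL = 4.5                          # assumed failure threshold

slope, intercept = fit_line(days, vibration)
days_to_failure = (FAILURE_LEVEL - intercept) / slope

# Maintenance is scheduled before the projected crossing,
# rather than after an alarm fires on the shop floor.
```

The same pattern scales up: cloud platforms aggregate such histories across whole fleets of machines, which is what makes the predictions far more accurate than any single site's data allows.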
The internet of things (IoT) is a rapidly growing topic of conversation. Described as a network of devices with local intelligence that enables communication between facilities, machines, and finished as well as unfinished products, it is a concept with the potential to change not only how we live but also how we work.
At its core, IoT is simple—it is about connecting devices over the Internet, letting them talk to us, the applications, and each other. It incorporates machine learning, big data technology, sensor data, machine-to-machine communication and automation technologies. Compared to humans, connected intelligent machines have the ability to collect and process large data sets with more accuracy and consistency.
Today, only some manufacturers’ sensors and machines are networked and make use of embedded computing. They are typically organised in a vertical automation pyramid, in which sensors and field devices with limited intelligence, together with automation controllers, feed into an overarching manufacturing process control system.
The IoT gained prominence through the availability of the Internet, both wired and wireless, coupled with the accelerating decline in the cost of sensors, which has fallen 100-fold in the last 10 years. IoT applications in manufacturing and factory settings are estimated to generate $1.2 trillion to $3.7 trillion of economic value annually by 2025.
In the context of Industry 4.0, IoT has the potential to create smart factories that would improve overall control in the production process. For example, production facilities that have been equipped with semi-automated control are able to decentralise the production process. Products are identified by radio frequency identification codes, and workstations know which manufacturing steps must be performed for each product and can adapt to perform the specific operation.
With the IoT, more devices—sometimes including even unfinished products—will be enriched with embedded computing and connected using standard technologies. It allows industrial devices to communicate and interact both with one another and with more centralised controllers, as necessary. It also decentralises analytics and decision-making, enabling real-time responses.
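The RFID-driven routing described above can be sketched in a few lines: each product tag maps to its remaining manufacturing steps, and a workstation adapts by looking up what an arriving product still needs. The tag IDs, step names and `Workstation` class are hypothetical illustrations, not a real factory protocol.

```python
# Minimal sketch of decentralised, product-driven routing. Each
# product's RFID tag keys its remaining steps; stations only perform
# steps they are capable of and pass everything else through.

ROUTINGS = {
    "RFID-0001": ["drill", "deburr", "paint", "inspect"],
    "RFID-0002": ["mill", "inspect"],
}

class Workstation:
    def __init__(self, capabilities):
        self.capabilities = set(capabilities)

    def process(self, tag):
        """Perform the product's next step if this station can do it."""
        steps = ROUTINGS[tag]
        if steps and steps[0] in self.capabilities:
            return steps.pop(0)   # step completed; routing advances
        return None               # product passes through unchanged

drill_station = Workstation({"drill", "mill"})
done = drill_station.process("RFID-0001")   # performs "drill"
```

Note how the intelligence sits with the product and station rather than a central scheduler, which is the decentralisation of analytics and decision-making the paragraph above describes.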
The Impact of Adopting Technology Advances
Industry 4.0 will allow manufacturers to bypass current structural constraints and leapfrog to greater efficiency in energy use and value production. It must be noted that such developments will also bring painful consequences for people in industries that are phased out, so both the positive and negative sides of the issue must be addressed with equal importance.
This new technology adoption can be analysed from four perspectives: digital infrastructure and ecosystem, human capital development, business environment and strategic investment. The focus is not on specific types of technology, but rather how introducing new tools changes how the workforce is put together to deliver goods and services.
i) Digital Infrastructure & Ecosystem
The impact of adopting technologies in the digital infrastructure and ecosystem is evident in the efficiency of the infrastructure itself. Through technology adoption, the digital ecosystem offers better ways of leveraging information to unlock its potential value. For example, the combination of big data and cloud computing can turn raw data into useful information that helps decision makers. Technologies such as 3D printing, augmented reality and simulation convert digitally designed data into tangible workpieces, while the integration of manufacturing systems and IoT uses captured information to schedule ideal maintenance times.
ii) Human Capital Development
With the advent of technology comes demand for employees with higher competencies, especially in software development and IT. New technical, information and communication technologies, and the workforce changes associated with them, can support and drive innovative work practices, improve processes, and raise efficiency and productivity.
A different skill set will be required, one built on more developed thinking skills. In the near term, automation will displace some low-skilled jobs; the new challenge in human capital development is to ensure that workers can be upskilled to remain viable in the Industry 4.0 era.
iii) Business Environment
The introduction of new technologies creates entirely new ways of serving existing needs and significantly disrupts existing value chains. Consumer behaviour is also changing to adapt to these technological advancements, which in turn generates new ways to design, market and deliver products and services. This will bring innovative opportunities to businesses, especially SMEs, to boost revenue growth. SMEs with enhanced equipment and new data applications will be better placed to understand the wide range of consumer behaviour.
iv) Strategic Investment
Adopting technologies will certainly benefit companies and people in various ways. One of the most important improvements is helping companies create a foundation for a greater exchange of data and expertise, which will lead to better relationships with international business partners and competitors. It is a perfect opportunity for companies to make strategic investments that enhance their products, technologies and processes.
Next Emerging Technology on the Horizon
Industry 4.0 and the technology advancements it brings have entered the business mainstream, and it is a hot topic everywhere, from the boardroom to the breakroom. The vigour and influence of these technologies is multiplied by the rapidly burgeoning connectivity between billions of people through enhanced mobility and the ease of accessing the growing nexus of data and knowledge.
Looking ahead, several significant trends are expected to shape how technological advances integrate into businesses. Listed below are three identified technologies that are rapidly emerging on the horizon and will have a huge impact on industry: artificial intelligence (AI), blockchain and quantum computing.
i) Artificial intelligence (AI)
Everyone is talking about AI, but what does AI really mean? Even though AI is not strictly an emerging technology, recent advancements have improved it by leaps and bounds. Artificial intelligence (AI) is an umbrella term for smart technologies that are aware of, and can learn from, their environments, enabling them to take autonomous action.
Advances in machine learning have allowed the development of programmes that can teach themselves to learn, understand, reason, plan, and act when exposed to new data in the right quantities. In a broader definition, AI is a collective term for computer systems that can sense their environment, think, learn, and initiate action in response to what they are sensing and their objectives.
AI technology will supplement the effort to build the smart networked factory, in which data from supply chains, design teams, production lines and quality control are linked to form a highly integrated, intelligent creation engine. AI can also use past information and preprogrammed rules to make sound decisions and execute appropriate actions in fields such as e-commerce, smart homes and personal digital assistants. It is estimated that AI will be a mainstream market offering by 2020, with International Data Corporation (IDC) predicting that AI revenues will exceed $10 billion.
ii) Blockchain
There has been a great deal of hype surrounding blockchain, a distributed electronic ledger that uses software algorithms to record and confirm transactions with reliability and anonymity. The record of events is shared between many parties, and information once entered cannot be altered, as the downstream chain reinforces upstream transactions. Blockchain is seen as the heart of Industry 4.0, allowing millions of smart devices in the IoT universe to perform transparent and frictionless financial transactions fully autonomously, without human intervention.
Blockchain technology will empower people and encourage growth on a large scale as they become their own bank and take full control of their own money. The main global effect on growth will be more liquidity, greater diversity, and a substantial increase in the number of participants in all sectors of the economy.
Among the many foreseen uses for blockchain are blockchain-enabled smart contracts, which allow parties to automatically negotiate terms and conditions (such as price, quality level and delivery date) without the need for a middleman. Blockchain is expected to provide transparency in many transaction processes and is also touted as secure and safe from hacking because of its large, distributed nature.
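The claim that "the downstream chain reinforces upstream transactions" can be illustrated with a toy hash chain: each block stores the hash of its predecessor, so editing any earlier record invalidates every hash that follows. This sketch shows the tamper-evidence property only, not consensus, mining or a real ledger format.

```python
# Toy hash chain demonstrating tamper evidence. Block layout and the
# "genesis" seed are invented for illustration.
import hashlib

def block_hash(prev_hash, payload):
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], "genesis"
    for p in payloads:
        h = block_hash(prev, p)
        chain.append({"payload": p, "prev": prev, "hash": h})
        prev = h
    return chain

def verify_chain(chain):
    prev = "genesis"
    for blk in chain:
        if blk["prev"] != prev or block_hash(prev, blk["payload"]) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain(["pay A 10", "pay B 5", "pay C 2"])
assert verify_chain(chain)
chain[0]["payload"] = "pay A 1000"   # tampering with an early record...
assert not verify_chain(chain)       # ...breaks every later link
```

In a real blockchain the same linkage is combined with wide replication of the ledger, which is why altering history would require overpowering most of the network rather than editing one copy.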
It is estimated that blockchain technology will become mainstream by 2020, starting in the financial industry, where 20 percent of global trade finance will incorporate blockchain/distributed ledger technology.
iii) Quantum computing
Quantum computing is a new approach to processing information that uses new types of algorithms in a more holistic way. In classical computing, information is encoded in bits, which take the value 0 or 1 (binary) and act as on/off switches for the computer to function. Quantum computing uses two properties of quantum systems, quantum superposition and quantum entanglement, to move beyond the binary. Quantum superposition allows quantum bits (qubits) to be 0 and 1 at the same time, while quantum entanglement entwines multiple qubits, allowing a greater number of calculations.
Using these two principles, qubits can act as far more sophisticated switches, enabling quantum computers to solve difficult problems that are not feasible with current computing technology. Experts predict that quantum computing will become mainstream in 10 to 20 years, since hardware development is still immature at this stage.
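Superposition can be given a back-of-envelope illustration with a two-amplitude state vector: applying a Hadamard gate to a qubit initialised to |0⟩ yields an equal superposition, so a measurement returns 0 or 1 with probability 1/2 each. This is a classical simulation for intuition only, not quantum hardware, and it omits complex amplitudes and entanglement.

```python
# Single-qubit sketch: state = (amplitude of |0>, amplitude of |1>).
# Real amplitudes suffice for the Hadamard gate.
import math

def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

qubit = (1.0, 0.0)                     # qubit initialised to |0>
qubit = hadamard(qubit)                # now in equal superposition

p0, p1 = qubit[0] ** 2, qubit[1] ** 2  # Born rule: measurement probabilities
# p0 and p1 are both 0.5: "0 and 1 at the same time" until measured

qubit = hadamard(qubit)                # H is its own inverse: back to |0>
```

Entanglement requires a four-amplitude state for two qubits and is what gives quantum computers their exponential state space; the single-qubit picture above is only the first step.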
Among the many envisioned uses of quantum computing is the ability to process the massive amounts of data produced daily (2.5 exabytes), since quantum computers could operate 100,000 times faster than current computers. Quantum computers are also expected to change the landscape of data security, with the potential to provide a more secure computing environment.
No of Pages: 60
Published Date: 26 February 2018