Artificial Intelligence and the Corporate World Transformation

Worldwide Analytics, Cognitive AI, and Big Data Predictions

Worldwide, companies collect and hold huge amounts of data in the form of documents. Due to a lack of digitization, these documents often cannot feed business processes at all, or only with enormous manual effort. They usually contain important, business-critical information, so losing that information, or even a delay in retrieving it, can have a major impact on the success of a business.


However, with the rapid advances in cognitive technology for automated text capture, organizations are now able to easily digitize, classify, and automatically read their unstructured business documents and feed them into the relevant business processes. With such fully automated solutions, companies not only save time and money, but also greatly improve the data quality in their systems and massively accelerate response times and important decisions.
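The classification step can be sketched with a deliberately tiny keyword model. The classes and keyword lists below are illustrative assumptions; a real system would learn them from labelled documents rather than hard-code them:

```python
# Toy document classifier: assign the class whose keyword set
# overlaps the document's words the most. Classes and keywords
# are made-up examples, not a production lexicon.
KEYWORDS = {
    "invoice": {"invoice", "amount", "due", "payment", "vat"},
    "contract": {"agreement", "party", "term", "liability", "signature"},
    "complaint": {"complaint", "refund", "dissatisfied", "claim"},
}

def classify(text: str) -> str:
    tokens = set(text.lower().split())
    scores = {label: len(tokens & words) for label, words in KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # Fall back to "unknown" when no keyword matches at all.
    return best if scores[best] > 0 else "unknown"
```

A learned model would replace the keyword sets with weights estimated from training data, but the routing logic around it stays the same.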


Computer vision in particular has evolved enormously in recent years. The ability to quickly recognize and process text on any device has greatly improved since the days when documents had to be scanned and analysed with OCR technology. This rapid development is also reflected in the industry's numbers: IDC predicts that the world market for content analytics, discovery, and cognitive systems software will reach $9.2 billion by 2019, more than twice as much as in 2014. To make the most of these market changes, IT solution providers need to better serve the rapidly growing demand for machine learning and artificial intelligence (AI). Only then can they meet the customer requirements of tomorrow and remain relevant.


Employees in the center of each business


There is a widespread but groundless fear that artificial intelligence automation solutions could replace skilled employees in companies. Despite, or indeed because of, solutions based on artificial intelligence, well-trained employees are needed who understand the core values of the company as well as the technological processes. People have qualities that AI solutions lack, such as empathy, creativity, judgment, and critical thinking. That is why qualified employees will remain essential to a company's success in the future as well.


Companies as drivers of digital transformation


Businesses first and foremost require systems that support their professionals and relieve them of day-to-day routine work, enabling them to work more productively and creatively. Above all, modern systems must be capable, on the basis of past experience, of learning behaviour independently and of making suggestions for future courses of action. To get there, companies need professionals who can train these systems and thus enable automated workflows in the first place.


Robotic Process Automation (RPA) and machine learning drive the automation of routine, repetitive tasks. RPA software is a powerful way to make manual, time-consuming, rule-based office activities more efficient: it reduces throughput times at a lower cost than other automation solutions. In addition, artificial intelligence will make more types of these tasks automatable. The combination of RPA and machine learning will undoubtedly create a large market segment with high demand, namely for the identification of processes and their intelligent implementation.
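The rule-based core of such an RPA bot can be sketched in a few lines. The document fields, thresholds, and queue names below are illustrative assumptions, not a real RPA product's API:

```python
# Minimal sketch of rule-based document routing, the kind of
# explicit business logic an RPA bot automates.
def route(doc: dict) -> str:
    if doc.get("type") == "invoice" and doc.get("amount", 0) > 10_000:
        return "manual-approval"   # high-value invoices still need a human
    if doc.get("type") == "invoice":
        return "auto-booking"      # routine invoices run fully automated
    return "triage"                # everything else goes to a person
```

Machine learning enters the picture when the `type` field itself must be inferred from unstructured content rather than supplied upfront.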


The next five years


It is expected that once companies have automated various tasks through the use of artificial intelligence, they will increasingly want to monitor and understand the impact of these processes on their organization. As a result, they will undergo a fundamental change over the next three to five years. This is mainly due to the convergence of RPA and AI in the following areas:

The use and advancement of RPA will bring a wave of machine learning advancements, for example in task automation and document processing. Even processes that involve basic decision-making benefit greatly from RPA. Use cases traditionally associated with capturing data from documents, meanwhile, will converge with ever new document-based RPA use cases. AI technology is now being used more widely and offers advantages for the identification and automation of processes as well as their analysis.


AI will also lead to the automation of basic tasks performed today by qualified staff. It will have a major impact on the composition and size of companies' workforces, especially in the fintech, health, transport, and logistics sectors. Above all, companies across all industries benefit from optimized processes for customer relations; public authorities, too, can offer citizens quicker response times and improved service through intelligent automation.

And finally, robotics is much more than just R2-D2 or C-3PO. Software robots will think much faster than most people and will penetrate the corporate work environment, in data and document capture, RPA, analytics, and monitoring and reporting, intelligently and situationally.


Ready for change


Businesses need to prepare for the age of AI today in order to stay successful. This requires a significant shift in the skills required within the company. Above all, it is up to employees to be open to the new technologies and to see them as an opportunity to gain competitive advantages.

In general, intelligent systems will take on more work in the future. In lending, for example, the role of the person in charge will continue to decline, because the system will be able to make intelligent decisions independently based on the borrower's previous financing behaviour. Ultimately, the clerk only has to handle rule-based exceptions. This relieves loan officers of many routine tasks, allowing them to spend more time on customer care. Overall, this significantly increases bank productivity.
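A toy version of such a lending rule makes the division of labour concrete: routine cases are decided automatically, and only flagged exceptions reach the officer. The score thresholds and the debt-to-income rule are invented for illustration:

```python
# Sketch of automated loan triage with rule-based exceptions.
# Thresholds are illustrative, not real credit policy.
def loan_decision(score: int, income: float, amount: float) -> str:
    if amount > 5 * income:      # unusual request: rule-based exception
        return "refer-to-officer"
    if score >= 700:
        return "approve"         # routine case, decided by the system
    if score < 500:
        return "decline"         # routine case, decided by the system
    return "refer-to-officer"    # grey zone also goes to a human
```

In practice the score itself would come from a model trained on past financing behaviour; the exception rules are what keep a human in the loop.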

A further shift in competence results from the fact that these processes require less human control and expertise. As the software becomes increasingly knowledgeable, it becomes less dependent on employees. Employees' duties become fewer, but at the same time carry more responsibility.

Cisco: Cloud networking trends

Essential characteristics of cloud

The annual Cisco Global Cloud Index (2016-2021) shows that data-center traffic is growing rapidly due to increasingly used cloud applications. According to the study, global cloud traffic will reach 19.5 zettabytes (ZB) in 2021, up from 6.0 ZB in 2016: a 3.3-fold increase, corresponding to an annual growth rate of 27%. By 2021, cloud traffic will account for 95 % of total data-center traffic, compared to 88 % in 2016.
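Taking Cisco's endpoints as roughly 6.0 ZB (2016) and 19.5 ZB (2021), the quoted ~27% annual growth rate can be verified with a small compound-annual-growth-rate (CAGR) calculation:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

# Cisco's figures: 6.0 ZB (2016) -> 19.5 ZB (2021), i.e. 3.3x over 5 years.
growth = cagr(6.0, 19.5, 5)   # about 0.27, matching the quoted ~27% per year
```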

According to the study, both B2C and B2B applications contribute to the growth of cloud services. For consumers, video streams, social networking, and web search are among the most popular cloud-based apps. For employees, it’s ERP, collaboration and analysis solutions.


Security and IoT as a growth driver

Increasing IoT applications, such as smart cars, smart cities, connected healthcare, and digital care, require highly scalable server and storage solutions to meet new and expanded data-center needs. In 2021, there will be 13.7 billion IoT connections, compared to 5.8 billion in 2016, the study says.

In the past, security concerns were a major barrier to cloud usage. Improvements in data-center governance and data control reduce the risk to businesses and better protect customer information. New security features, coupled with tangible benefits of cloud computing such as scalability and efficiency, play an important role in cloud growth.


Hyperscale data center growth

The increasing demand for data-center and cloud capacity has led to the development of hyperscale public clouds built on hyperscale data centers. The study predicts that there will be 628 hyperscale data centers worldwide in 2021, compared to 338 in 2016, nearly double. By 2021, hyperscale data centers will account for:

  • 53 % of all data center servers (2016: 27 %)
  • 69 % of the computing power of data centers (2016: 41 %)
  • 65 % of data stored in data centers (2016: 51 %)
  • 55 % of all datacenter traffic (2016: 39 %)

The growth of data center applications is exploding in this new multi-cloud world. The predicted increase requires further innovation, especially in the public, private and hybrid cloud sectors.


Virtualization of data centers and cloud computing growth

By 2021, 94 % of workloads and server instances will be processed in cloud data centers, with the remaining 6 % in traditional data centers. Total data-center workloads and server instances will more than double (2.3x) between 2016 and 2021, while cloud-based workloads and server instances will almost triple (2.7x) over the same period.

The density of workloads and server instances per physical server in cloud data centers was 8.8 in 2016 and will rise to 13.2 by 2021. In traditional data centers, density increases from 2.4 to 3.8 over the same period.


Big Data and IoT fuel data explosion

Worldwide, the amount of data stored in data centers will increase almost fivefold, from 286 exabytes (EB) in 2016 to 1.3 ZB in 2021 (4.6x, with annual growth of 36%). Big data will grow almost eightfold, from 25 to 403 EB; in 2021 it will make up 30 % of all data stored in data centers, compared to 18 % in 2016.

The amount of data stored on devices in 2021, at 5.9 ZB, will be 4.5 times higher than the data stored in data centers. Mainly due to the IoT, the total amount of generated data (which will not necessarily be saved) will reach 847 ZB by 2021, up from 218 ZB in 2016. That is more than 100 times the amount of data that is stored.
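Assuming the generated-data figures are in zettabytes, like the storage figures, the "more than 100 times" claim is easy to check against the quoted numbers:

```python
# Sanity check of the quoted Cisco figures (all in zettabytes).
generated_2021 = 847          # data created, not necessarily saved
stored_2021 = 1.3 + 5.9       # data-center storage + device storage
ratio = generated_2021 / stored_2021   # roughly 118, i.e. >100x
```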


Applications contribute to data growth

By 2021, big data will account for 20 % of data center traffic (2.5 ZB annually, 209 EB monthly), compared to 12 % (593 EB annually, 49 EB monthly) in 2016. Video streaming will account for 10 % of data center traffic, compared to 9 % in 2016, and video will make up 85 % of data center traffic to users, up from 78 % in 2016. Internet search will account for 20 % of data center traffic, down from 28 %, while social networks will account for 22 %, up from 20 % in 2016.


SaaS is the most popular cloud service model by 2021

By 2021, 75 % (402 million) of all cloud workloads and server instances will be SaaS-based, compared to 71 % (141 million) in 2016 (23 % annual growth).

16 % (85 million) of all cloud workloads and server instances will be IaaS-based, compared to 21 % (42 million) in 2016 (15 % annual growth).

9 % (46 million) of all cloud workloads and server instances will be PaaS-based, compared to 8 % (16 million) in 2016 (23 % annual growth).


As part of the study, cloud computing includes platforms that provide continuous, on-demand network access to configurable resources (e.g., networks, servers, storage, applications, and services). These can be quickly deployed and shared with minimal management effort or interactions with service providers. Deployment models include Private, Public, and Hybrid Clouds. Cloud data centers can be operated by both service providers and private companies.


The key differences between cloud data centers and traditional data centers are virtualization, standardization, automation, and security. Cloud data centers offer higher performance, higher capacity, and easier management. Virtualization acts as an enabler of hardware and software consolidation, greater automation, and an integrated approach to security.

When is a Cloud Service Provider GDPR-suitable?

Cloud providers are far more bound by the General Data Protection Regulation (GDPR) than before. As of 25 May 2018, the new regulation on the processing of personal data applies – but what exactly does that mean for us as cloud users? How do you know whether a service or provider meets GDPR requirements? And when does a cloud service actually qualify as GDPR-compliant?


The principles governing the processing of personal data are set out primarily in Article 5(1) of the GDPR; further provisions can be found, inter alia, in Articles 25 and 32. The following sections explain the main requirements, particularly as they relate to cloud services.


Data must be processed lawfully and fairly – GDPR Art. 5


The processing of personal data in the cloud is lawful only if the data subject has consented or if another legal basis exists. The data processing must take place in a manner that is comprehensible to the person concerned, i.e. the cloud provider must be able to give clear guarantees, since transparency is now included as a fundamental aspect of these principles.


Confidentiality, integrity and availability – GDPR Art. 5.1 f & Art. 32


The data must be processed in a manner that ensures adequate security, including protection against unlawful processing, loss, or damage. Furthermore, the processing must not be expected to violate the dignity of the persons concerned or to restrict their freedoms.


Security and state-of-the-art processing – GDPR Art. 32


During processing, a sufficiently high level of security must be guaranteed. The legislator demands that the level of security be constantly improved and always based on so-called “state of the art” methods.


Privacy by Design and Privacy by Default – GDPR Art. 25


Taking into account the state of the art, data protection must be guaranteed by privacy-friendly technology design (Privacy by Design) and privacy-friendly default settings (Privacy by Default).


Accountability – GDPR Art. 5.2, Art. 28, Art. 30 & Art. 35


Basically, the controller is responsible for compliance with all the requirements mentioned and must be able to demonstrate this (accountability). The controller must include the cloud processing in their record of processing activities and, if necessary, conduct a risk analysis, a so-called data protection impact assessment. The controller now shares this responsibility with the cloud provider, who in turn must provide sufficient guarantees that the requirements of the GDPR are complied with.


Processing – GDPR Art. 28


In cloud computing, the user commissions the provider to process data. For the cloud user to live up to their responsibility to the data subjects in this case too, they must conclude a data processing agreement with the cloud provider that fulfils the requirements of the GDPR. Part of such an agreement must be that the cloud provider supplies all information necessary to demonstrate compliance with the requirements.


Proof by certificates


Of course, for you as a cloud user it is difficult and hardly reasonable to check compliance with these requirements yourself. It helps that cloud providers can use an “approved certification process” in accordance with Article 42 to demonstrate compliance with the above requirements. Although no “approved” certificate is available yet, this does not mean that certificates specifically aimed at the requirements of the GDPR cannot already be used as evidence of GDPR conformity.


For example, the Trusted Cloud Data Protection Profile (TCDP) was developed with the GDPR in mind. Certifications according to the TCDP are to be converted into certificates according to the GDPR standard once the procedure and testing standard have been extended. With the research project “AUDITOR” there is also a follow-up project to the TCDP, whose goal is the design and implementation of an EU-wide applicable data protection certification for cloud services. The first catalog of certification criteria should be completed by the end of April 2018.


So, if you choose a cloud service that is TCDP-certified, you are already on the safe side; from the 25 May deadline onwards, you should additionally ensure that the conversion into a certificate according to the GDPR standard actually takes place, or that the service demonstrates GDPR compliance with another suitable certificate.

Data Analytics Trends for 2018

Using data profitably and creating added value is a key factor for companies in 2018. The world is becoming increasingly networked, and ever larger amounts of data are accumulating. BI and analytics solutions, combined with the right strategies, can be used to generate real competitive advantage. Below are the top data analytics trends for 2018.


How new technologies support analysis

Machine learning (ML) technology is improving day by day and is becoming the ultimate tool for creating in-depth analyses and accurate predictions. ML is a branch of AI that uses algorithms to derive models from structured and unstructured data. The technology supports analysts with automation and thus increases their efficiency. The data analyst no longer has to spend time on labor-intensive tasks such as basic calculation, but can focus on the business and strategic implications of an analysis and develop appropriate next steps. ML and AI will therefore not replace the analyst, but make their work more efficient, effective, and precise.
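The kind of model ML derives from historical data can be as simple as a line fitted to past observations. A minimal ordinary-least-squares sketch, offered only to make the "derive models from data" idea concrete:

```python
# Fit y = a*x + b by ordinary least squares: the simplest case of
# deriving a predictive model from structured historical data.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx   # slope and intercept
```

Real ML systems fit far richer models, but the workflow is the same: past data in, a reusable predictive model out.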


Natural Language Processing (NLP)

According to Gartner, by 2020 every second analytical query will be generated via search, natural language processing (NLP), or voice. NLP will allow more sophisticated questions to be asked of data and will return relevant answers that lead to better insights and decisions. At the same time, research is making progress by exploring the ways in which people ask questions. The results of this research will benefit data analysis, as will results in NLP's fields of application. The new technology does not make sense in every situation; its benefit lies rather in supporting the appropriate work processes in a natural way.
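At its crudest, a natural-language query interface is keyword matching over data. The toy below, with an invented sales dataset, shows the shape of the idea rather than real NLP:

```python
import re

SALES = {"berlin": 120, "munich": 95, "hamburg": 80}   # toy dataset

def answer(question: str):
    """Keyword-driven 'natural language' lookup: find a known city
    in the question and return its sales figure."""
    q = question.lower()
    for city, value in SALES.items():
        if re.search(rf"\b{city}\b", q):
            return value
    if "total" in q:
        return sum(SALES.values())
    return None   # question not understood
```

Production NLP systems replace the keyword scan with parsing and intent recognition, but the contract is the same: a question in, a data-backed answer out.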


Crowdsourcing for modern governance

With self-service analytics, users from a wide range of areas gain valuable insights, which also inspires them to adopt innovative governance models. The decisive factor here is that the data is only available to the respective authorized users. The impact of BI and analytics strategies on modern governance models will continue in the coming year: IT departments and data engineers will provide data only from trusted sources. With the parallel trend towards self-service analytics, more and more end users gain the freedom to explore their data without security risk.


More flexibility in multi-cloud environments

According to a recent Gartner study, around 70% of businesses will implement a multi-cloud strategy by 2019 in order to stop being dependent on a single legacy solution. With a multi-cloud environment, they can also quickly determine which provider offers the best performance and support for a given scenario. However, the added flexibility of a multi-cloud environment also adds the cost of allocating workloads across vendors and of onboarding internal development teams onto a variety of platforms. In a multi-cloud strategy, cost estimates for deployment, internal usage, workload, and implementation should therefore be listed separately for each cloud platform.


Increasing importance of the Chief Data Officer

With data and analytics now playing a key role for companies, a growing gap is emerging between responsibility for insight and responsibility for data security. To close it, more and more organizations are anchoring analytics at board level. In many places there is now a Chief Data Officer (CDO) or Chief Analytics Officer (CAO) whose task is to establish a data-driven corporate culture: to drive the change in business processes, overcome cultural barriers, and communicate the value of analytics at all levels of the organization. Due to the results orientation of the CDO/CAO, the development of analytical strategies is increasingly becoming a top priority.


The IoT innovation

The so-called Location of Things, a subcategory of the Internet of Things (IoT), refers to IoT devices that can determine and communicate their geographical position. On the basis of the collected data, users can take into account the location of the respective device, as well as its context, when evaluating activities and usage patterns. In addition to tracking objects and people, the technology can also interact with mobile devices such as smartwatches, badges, or tags, enabling personalized experiences. Such data makes it easier to predict which event will occur where and with what probability.
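A basic building block behind such location analytics is the great-circle distance between two reported device positions. A standard haversine sketch (coordinates in the example are simply Berlin and Munich):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon positions,
    the kind of calculation a Location-of-Things backend runs constantly."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))   # mean Earth radius ~6371 km
```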


The role of the data engineer is gaining importance

Data engineers make a significant contribution to companies' use of data for better business decisions. No wonder demand continues to rise: from 2013 to 2015, the number of data engineers more than doubled, and in October 2017, LinkedIn listed more than 3,500 vacancies under this title. Data engineers are responsible for extracting data from a company's foundational systems so that the insights can serve as a basis for decision-making. The data engineer not only has to understand what information is hidden in the data and what it means for the business; he also has to develop the technical solutions to make the data usable.
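A minimal example of the extract-and-aggregate step such a role builds, assuming a hypothetical CSV export of order records:

```python
import csv, io

def top_customer(order_csv: str) -> str:
    """Tiny extract/aggregate pipeline: read raw order records and
    surface the highest-revenue customer as a decision-making input."""
    totals = {}
    for row in csv.DictReader(io.StringIO(order_csv)):
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + float(row["amount"])
    return max(totals, key=totals.get)
```

Real pipelines add scheduling, validation, and loading into a warehouse, but the core task of turning raw system output into a usable answer looks just like this.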


Analytics brings science and art together

The use of technology is getting easier. Today anyone can “play” with data without needing deep technical knowledge. Researchers who understand the art of storytelling are sought after for data analysis. More and more companies see data analysis as a business priority, and they recognize that employees with analytical thinking and storytelling skills can deliver competitive advantage. Thus data analysis brings together aspects of art and science, and the focus shifts from simple data delivery to data-driven stories that lead to concrete decisions.


Universities are intensifying data science programs

For the second year in a row, data scientist ranked first in Glassdoor's annual ranking of the best jobs in America. The current report by PwC and the Business-Higher Education Forum shows how strongly employers favor applicants with data knowledge and analytical skills: 69% of the companies surveyed indicated that over the next four years they would prefer suitably qualified candidates over candidates without these competencies. In the face of growing demand from employers, training competent data experts is becoming ever more urgent. In the United States, universities are expanding their data science and analytics programs or establishing new institutes for these subjects. In Germany, too, some universities have begun to expand their offerings.

#IoT 2018: The Three Most Important #SecurityTrends


You don’t have to wait long for predictions about the massive growth of the IoT. At the end of 2017, Gartner analysts predicted that we already had around 8.4 billion IoT devices worldwide, an increase of 31% compared to 2016. By 2020, there should be around 20.4 billion IoT devices globally.

That’s not surprising: in addition to the ever-growing number of products equipped with ever-broader networking capabilities, there are a variety of new associations, technology partnerships, standards committees, and industry initiatives, founded and established with the goal of enabling companies to truly benefit from the competitive advantages of the IoT.


IoT data breaches will not only continue to increase, but the consequences will be more severe than before.



As with any emerging and rapidly advancing technology, such development rarely takes place without challenges, and in the case of the IoT, security is one of them. At their core, three trends will accompany us in 2018.

In contrast to previously discovered vulnerabilities, which targeted particular brands and models in the automotive industry, in 2017 we were confronted with weaknesses within the Controller Area Network (CAN) bus protocol itself: a bus protocol that is used not only in the vast majority of vehicles but also widely in industrial production and, as another example, in health care.

The vulnerability was discovered by the U.S. Industrial Control Systems Cyber Emergency Response Team (ICS-CERT). Under certain conditions, attackers were able to disable on-board security systems. To make matters worse, it is not a vulnerability that could be fixed by a patch, because it is an inherent weakness of the protocol design itself.


2017 also brought another first for IoT security: for the first time, an implantable medical device was recalled due to IT security issues. The US Food and Drug Administration initiated a huge recall campaign affecting over 465,000 patients who had been fitted with a particular networked pacemaker. The FDA asked the patients to visit their doctor and get a firmware update for the affected pacemaker. The device has a vulnerability that could potentially be exploited for attacks: hackers, for example, would be able to alter the pacing of the pulse generator or prematurely switch it into energy-saving mode. Unlike the CAN bus protocol flaw, this vulnerability comes with a patch. Patients must consult their doctor in person, but surgery is not necessary. Part of the update limits the number of wireless commands the device can receive while preventing the transmission of unencrypted data. With that said, we are obviously well on our way to an age in which doctors are patch managers as well. It is a disturbing and irreversible trend that vulnerabilities in protocols and devices are increasingly likely to endanger human lives when those protocols and devices are used in an environment for which they were not originally designed.


More security awareness, yes, but secure implementation takes time


It is expected that IoT device manufacturers, especially of end-user devices, will continue to bring to market devices that are poorly or not fully secured. Consumers' security awareness is growing, although not yet strongly enough to change buying behavior: cool features and an affordable price still make the difference. For the first time, Amazon Echo and Google Home are high on the wish lists of technology-savvy consumers. On the other hand, there is a small but growing group of consumers who have major concerns about the security of these products. The first major waves of attacks, such as the Mirai botnet, have received the attention of security experts; for the average consumer, the scope of this type of attack has not yet become apparent. Nevertheless, the pressure on manufacturers is growing, and with it the demand for better security and data protection measures.


Building security into equipment from the start will be more difficult and time-consuming than expected. This applies equally to IoT devices intended for end users and to those used in companies. Take encryption as an example. The data an IoT device collects can be encrypted both while it sits on the device and when it is sent to another device or aggregated and analyzed in the cloud. At first glance, this looks like a suitable and straightforward approach: there are many good recommendations as to which algorithms are suitable, and several open-source encryption solutions exist. So far so good. It is much more difficult to protect and manage the associated keys. Insufficient key management invalidates the entire encryption process, and a badly managed key can render the encrypted data unusable, for example if the key used to encrypt the data is not available within an authentication process. The sheer variety of devices in the IoT compounds the challenges of encryption and key management exponentially. To date, only a few have the necessary expertise and suitable technologies to deal with this.
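The asymmetry the paragraph describes is visible even in a minimal sketch: deriving a strong key is one library call, while everything around it, storing the salt, distributing and rotating the key, is the actual hard part. The passphrase and iteration count below are illustrative assumptions:

```python
import hashlib, os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte symmetric key via PBKDF2-HMAC-SHA256.
    The easy step; protecting and managing the result is the hard one."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt = os.urandom(16)                     # must be stored with the ciphertext
key = derive_key("device-secret", salt)   # lose the key -> data is unusable
```

Note what the snippet does not solve: where `salt` and `key` live, who may read them, and how they are rotated across thousands of heterogeneous devices. That is exactly the key-management gap the text warns about.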


The consolidation has begun.


At the moment, analytics and visualization tools are particularly promising for companies, especially in the context of IIoT, the Industrial Internet of Things. These tools attempt to make sense of vast amounts of data and produce results that help in day-to-day business. In 2017 especially, providers and users of IoT technologies faced more questions about how they handle the various aspects of data protection. All in all, it makes little sense to collect and analyze data, or worse, to act on that analysis, if you ultimately cannot trust the data. To be able to trust it, you must be able to authenticate its origin and source. This begins with verifying the device identity (and whether that device runs legitimate, validated software from a trusted source), protecting the collected data from the start, and, of course, securing the entire communication and transmission path. That these questions are being asked about security is one of the signs of consolidation in the IoT: manufacturers have left the phase of prototypes and feasibility studies and are moving into production with real users who increasingly ask critical questions.
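Authenticating a reading's origin can be sketched with a per-device HMAC. The device registry, key, and message layout are illustrative assumptions, not a real IoT platform's API:

```python
import hmac, hashlib

# Hypothetical registry of shared per-device keys held by the backend.
DEVICE_KEYS = {"sensor-42": b"per-device-secret"}

def sign(device_id: str, payload: bytes) -> str:
    """Tag a reading so the backend can verify which device produced it."""
    return hmac.new(DEVICE_KEYS[device_id], payload, hashlib.sha256).hexdigest()

def verify(device_id: str, payload: bytes, tag: str) -> bool:
    """Accept a reading only if its HMAC matches a known device key."""
    return hmac.compare_digest(sign(device_id, payload), tag)
```

`compare_digest` is used instead of `==` to avoid timing side channels; a full deployment would add key rotation and signed firmware attestation on top.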


And consolidation will continue to accelerate. Specifically, the market for enterprise/cloud IoT platforms is unhealthily bloated with an unsustainable number of products. It's safe to assume that just about every developer would be happy to see the list of available IoT platforms shortened, preferring better artificial intelligence to be built into the remaining ones. Add to this a healthy, or perhaps rather unhealthy, number of security standards and associations that want to create a solid security baseline. A variety of initiatives seem to be heading in the same direction, when in fact they often have different goals. Governments and legislators are also searching for ways to create the necessary security conditions better than before.


Consolidation and standardization will help to better integrate IoT devices into industrial multi-cloud environments. And these efforts will ensure that basic security techniques are easier to implement, in particular those that provide sufficient confidence in an IoT-based environment.

The IoT is a fascinating, fast-growing, emerging field that will increasingly become the backbone of digital transformation. And it promises considerable competitive advantages to those who understand how to use it within their entrepreneurial visions, goals, and implementations.

Requirements include a strong trust anchor, efficient implementation of the necessary IT security measures, risk assessments in an IoT ecosystem, and meaningful results from IoT projects. The year 2018 will bring us some decisive progress here.

#CyberSecurity Landscape in 2018 – The focus is on vertical industries

It is well known that the fourth industrial revolution opens up a multitude of new business opportunities. At the same time, however, the danger of cyber-attacks is also increasing. It is imperative that companies prepare themselves to stay out of the danger zone.

Not only should they consider security solutions directly when planning IT technology; they should also develop a keen security awareness in the corporate culture, which requires significant investment. According to Gartner's estimates, security spending will continue to rise sharply globally in 2018, reaching around $93 billion. For the coming year, this means that cyber security will be shaped by a few key trends.

IT security experts are still in demand

As technology evolves, security expertise needs to adapt to changing needs. The challenge is to train cyber security specialists to acquire and develop the skills needed to become companies' “superheroes”. Cybersecurity Ventures predicts there will be 3.5 million unfilled cybersecurity jobs by 2021. The responsibility for meeting this need lies in the hands of governments, universities, schools, and companies.

Protection and resilience

In this day and age it is difficult to avoid security gaps completely. Therefore, you should not dismiss them as improbable, but make appropriate arrangements. As a result, the resilience of the IT infrastructure will come more into focus, not just prevention. For companies, it is important to talk openly about their own weak points, to raise awareness of them and to show responsibility. Funds that until now were used to prevent cyber-attacks must be redistributed to detect security threats in a timely manner and to remain operational in the event of an attack.

Next-generation security solutions are driven by digital ecosystems

In times of the internet of things the protection of customer data becomes more and more important. Vulnerabilities exposing sensitive data can have serious consequences as companies will be held accountable for personal data in the future. This ownership is a major challenge for companies, and it is the responsibility of technology manufacturers to ensure a degree of security for their users. As the need for cyber security solutions and regulations grows, companies need to develop appropriate strategies to minimize any risk. These strategies should not only meet today’s expectations, but also incorporate new business models promoted by new technologies.

Cyber-attacks increasingly sophisticated

Of the e-mails received, around 70% are spam, and the majority of them contain phishing messages. Other known threats include rogue programs such as Trojan horses, malware and DDoS attacks. Over the past few months, they have led to massive data loss and continued to expose company and customer data to cybercriminals. For 93% of attackers, money is the main motivation, as the latest report from Verizon shows. Hackers try to gain the highest possible profit through simple tricks and are often successful at smaller companies with inadequate security solutions.

New technologies: a blessing and a curse at the same time

Innovative technologies enable cybercriminals to use sophisticated methods for their attacks. But these innovations can also help build and strengthen defence and protection against hackers. A major threat, for example, comes from artificial intelligence (AI) applications; however, AI can also be used to detect potential risks faster. How important AI is for IT security is shown by the outlook for the global market for artificial intelligence solutions: according to a recent study, it will grow to $18.2 billion by 2023. Likewise, the Internet of Things, with an estimated 22.5 billion networked devices in circulation, is both a driver of innovation and a door opener for increased threat potential, according to a Business Insider report. On the one hand, security becomes a challenge; on the other, the data gained through Internet-enabled devices can help detect breaches early.

The focus is on vertical industries

While cyberattacks affect all sectors of the economy, there are still some key sectors that are likely to be particularly vulnerable to cyberattacks:


The Financial Sector, BFSI (Banking, Financial Services and Insurance): The BFSI sector is under increasing pressure, both from competitors with digital offerings and from the constant need to modernize existing systems. The value of customer data is increasing as customers demand more convenient and personalized service. Nevertheless, trust remains crucial: according to a recent study, about 50% of customers would change banks as a result of a cyber-attack, while 47% would completely lose confidence. Large-scale cyber-attacks have already claimed a large number of banks as victims, which shows that the sector has to adapt to these risks. It is therefore important that banks invest more in security solutions to ensure 24/7 protection. Shared ledgers will significantly shape the future of the banking sector. The best-known such technology, the blockchain, is the backbone of cryptocurrencies like Bitcoin. The blockchain method provides permanent records of transactions that cannot be manipulated; it is thus part of the accounting control procedures and has the potential to completely redesign the BFSI sector.


Healthcare: More and more patient data is digitized. In addition, artificial intelligence and Internet-enabled devices will increase the speed of diagnosis and improve patient care. However, the integration of personal data and Internet-enabled devices also entails risks. Earlier this year, Experian predicted that the healthcare sector would be the market most affected by cyber-attacks, as examples such as the WannaCry ransomware have already shown. This means that the health sector should invest in risk analysis in the same way as the banking sector. In addition, the implementation of industry-wide standards is needed.


Retail: In the retail market, customized shopping experiences are becoming increasingly important, and data analysis tools help merchants implement them. However, with them comes a great responsibility to protect this data, which can include not just shopping habits and login data but also account details and addresses. Thanks to Internet technologies, augmented reality and face recognition, the shopping experience is becoming increasingly networked, but stronger networking also entails a greater risk of data loss. Therefore, creating a resilient strategy, as in the banking and healthcare sectors, is also crucial for retail.


Telecommunications: As Internet service providers, telecommunications companies are among the industries at increased cyber security risk. They should build security measures into network infrastructure, software, applications and endpoints to minimize the risk of customer vulnerabilities and data loss. Nowadays, consumers increasingly wonder whom they entrust their data to. For service providers, this is a good opportunity to offer additional security services. In addition, collaboration between competitors may increase resilience against cyberattacks.


Manufacturing industry: The manufacturing industry, too, is not safe from hacker attacks. According to an IBM study, it is the third most attacked sector. Here, hackers focus mainly on spying on data, which is very lucrative. The main targets are networked machines, robots and 3D printers. Vulnerabilities enable attackers to obtain production plans; in addition, they can intervene in processes and sabotage production. Such vulnerabilities not only cause high financial damage, but can also put the lives of factory workers at stake. Manufacturers should therefore continuously monitor their production lines for vulnerabilities and implement control mechanisms that limit access to other areas of the production system once one area is affected.


Authorities: No organization is immune to security breaches and data misuse, not even government agencies. The main target of attack is data stored in the ministries, from voter information to military defense plans. While governments around the world are increasing their cybersecurity budgets and striving to deploy them as quickly as possible, there are still opportunities for criminals to evade them. Some organizations are already funding programs that pay white-hat hackers to test IT systems and identify potential vulnerabilities. With the growing number of hacker attacks per year, investment in security is becoming more and more important to governments around the world.

What does this mean for the year 2018?

Cybercrime – Get ready to anticipate

Overall, it is clear that companies in all industries, as well as individuals, need to refine their cybersecurity awareness, recognize the risks, and take appropriate countermeasures. Companies that invest in security solutions gain key competitive advantages. At the same time, cyber security must also become an issue for national governments and at the international level, and laws and regulations must be adapted accordingly. In addition, governments need to invest in education about cyber-threats and in their disclosure. New regulations also play an important role here, enabling, for example, telecommunications providers to develop and implement suitable solutions against cyberattacks.

#MobileApp Forecasts 2018

Mobile Services Will Soar Globally To $32.4 Billion By 2018

In its forecasts for 2018, the app analytics company App Annie picks out the key changes in applications and the mobile app market. App Annie sees the European PSD2 legislation as a trigger for the development of the financial sector and fintech companies. More than that, the PSD2 directive will bring about one of the most important disruptions in the banking sector.
For those who don’t know, the PSD2 Directive is an extension of the first Payment Services Directive (PSD) adopted by the European Commission in 2007. The aim is to regulate the activities of payment service providers and to create a harmonized framework across the EU. This regulation is expected to increase the number of providers in the ecosystem and increase competition, with a view to offering consumers greater choice and greater transparency.
Danielle Levitas, SVP Research at App Annie, explains: “The European Open Banking legislation will unbundle the value chain of European banks. Aggregated apps are increasingly becoming the primary channel for private finance activities. Once the benefits of these changes have spread, we anticipate similar innovations outside the European markets. Nonetheless, traditional retail banking will continue to develop innovative ideas around the world. For example, in November 2017, Wells Fargo announced the launch of Greenhouse in the first half of 2018 – a standalone app that combines mobile-first bank accounts with spend analysis.”


P2P Payment – A FinTech App Revolution


In this context, App Annie also sees a total change in how consumers handle their money, especially among Generation Y, also called Millennials, who were teenagers around the year 2000. These young people hardly carry cash any more; instead, they rely on P2P transfers. Peer-to-peer (P2P) platforms facilitate payments directly between peers by allowing people to transfer funds from their own bank accounts into the bank accounts of others through online technology or mobile phones. P2P options are increasingly becoming available, including on social media networks like WeChat (through its ‘WeChat Wallet’ offering, in partnership with Standard Bank).


What makes the P2P method interesting is that the rates and terms of P2P transactions are more favourable for consumers, both in price and in convenience. It reduces the reliance on intermediaries or third-party institutions in day-to-day dealings, along with the need to physically travel to a bank to make payments, deposit cash, or access other key financial services.
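At its core, a P2P payment simply moves value directly between two users' balances, with no bank in the middle. The sketch below illustrates this idea with an in-memory ledger; the account names and the `p2p_transfer` function are invented for illustration and do not reflect any real platform's API.

```python
# Minimal sketch of a P2P transfer between two account balances,
# with no intermediary institution in the loop. All names hypothetical.

class InsufficientFunds(Exception):
    """Raised when the sender's balance cannot cover the transfer."""
    pass

def p2p_transfer(balances, sender, receiver, amount):
    """Move `amount` directly from sender's balance to receiver's."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if balances.get(sender, 0) < amount:
        raise InsufficientFunds(sender)
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
    return balances

accounts = {"alice": 100, "bob": 20}
p2p_transfer(accounts, "alice", "bob", 30)
print(accounts)  # {'alice': 70, 'bob': 50}
```

A real platform adds authentication, settlement with the underlying bank accounts, and an audit trail on top of this direct-transfer core.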
A well-known example of a P2P platform technology is the blockchain, which allows peers to transact with one another and records their transactions. Bitcoin, a virtual currency built on a blockchain, can be used to conclude online purchases: buyers simply pay the Bitcoin amount at checkout and sellers receive that payment in their own currency.

Experts foresee only further growth in transaction volume for P2P payment apps. App Annie's forecast is based on the increased number of instant transfers as well as payment service providers. According to App Annie, this is also because retailers and sellers are increasingly offering these services as a payment option.


Paying via social networks – WeChat from China makes it happen

In addition, App Annie notes that the tasks of banks and payment services are also influenced by very different players, such as messaging services and social networks. “WeChat stands out among these companies,” says App Annie. “For many users, WeChat is China’s main distribution channel for services and the central hub for many business activities.”


The future of the App Economy


Both the Apple App Store and the Android market will celebrate their 10th anniversary in 2018. They can look back on steady growth in apps: at the end of October 2017, more than 2 million apps were available on the iOS App Store and more than 3.5 million on Google Play. This trend will continue in the new year. App Annie expects worldwide consumer spending across all mobile app stores to increase by about 30% to more than $110 billion in 2018. As in previous years, the Chinese app stores will remain an important market in 2018 – especially the iOS App Store. App Annie writes: “In fact, we expect the Chinese growth rate to leave the rest of the world behind.”


The future of App Stores


As the number of apps grows rapidly, individual apps are becoming harder to find. This problem can be addressed, for example, through categorization and editorial support. In June 2017, both Apple and Google announced updates for the iOS App Store and Google Play to address the “lack of attention through app curation and editorial content”. These features will help users find the apps best suited to their requirements, with editors managing select categories based on their areas of professional expertise. Existing apps stand to benefit from this, especially in the entertainment and leisure sectors.


Augmented reality in apps is powered by Facebook, Google, Apple, Alibaba


Augmented reality has long sounded like a wildly futuristic concept, but the technology has actually been around for years. It grows stronger and more seamless with each passing year, providing a surprising means of overlaying computer-generated images on a user's view of reality to create a combined view rooted in both the real and virtual worlds.
The available selection of augmented reality apps is diverse, but it was thanks to Pokémon GO and Snapchat that AR technology reached worldwide recognition. Today, AR technology is used across numerous industries, such as healthcare, public safety, oil and gas, tourism and marketing. Worldwide shipments of smart augmented reality glasses are forecast to reach around 5.4 million units by 2020, and the global augmented reality market is expected to grow significantly, to about 90 billion U.S. dollars by 2020.


Video Streaming – iOS and Google Play


For video streaming services, 2017 was another outstanding year, with new audiences and critical praise for in-house productions. From January to October 31, 2017, usage of video streaming apps increased significantly: the entertainment category grew by over 85% on iOS and over 70% on Google Play compared to the same period in 2016 – a record increase.


The New Paradigm Of Retail


Retail in 2018 will be characterized by an interactive, technology-enabled shopping experience. Ideally, retailers will focus on gathering more business intelligence about consumer buying habits across all channels, collecting new insights into shoppers' habits in order to plan merchandising, pricing and promotional strategies designed to meet customers' individual needs.
These digital innovations will also mark a big change in consumers' shopping habits in 2018. The overall changes will be in the nature of existing retail channels (e.g., mobile app, web, physical store). For example, following a pattern already visible in China, App Annie reports that private customers in Western markets are increasingly using physical stores only to pick up their mobile-bought goods. At the same time, the risk that cash registers in shops will disappear is very high.


The changing market for food delivery


Worldwide, the market for food delivery stands at €83 billion, or 1% of the total food market and 4% of food sold through restaurants and fast-food chains. It has already matured in most countries, with an overall annual growth rate estimated at just 3.5% for the next five years.
These numbers show how ordering food via platforms is already so popular that more partnerships are expected in 2018, according to App Annie. Food-delivery service providers such as UberEATS, Deliveroo or Takeaway are currently trying to gain market share in premium markets, where customers are more willing to pay extra for good food.


2017 Digital Evolution Report – CyberCrime, Digitization, Blockchain and Artificial Intelligence

Cyber-crime, smart cities, digitization, blockchain and artificial intelligence are the words that generated real hype in IT in 2017. Cybercriminals struck many companies, many times. Digitization is progressing despite slow internet connections. Blockchain became a gold chain, and artificial intelligence is experiencing an incredible revival.

Key Technologies 2017

Ransomware: The ransom and the cyber blackmailer


Ransomware remains a leader among digital security threats. According to the ITRC Data Breach report, in 2015 more than 177,866,236 personal records were exposed via 780 data security breaches, and that number rose by about 30% in 2016. With security breaches arising on multiple fronts – companies, healthcare systems, governmental and educational entities – individuals started to realize how real the threat of cybersecurity attacks was. So far, 2017 has been a prominent year for cyber-crime: 519 cyber-attacks were recorded from January to September 2017, affecting the financial sector, healthcare and gaming companies, and exposing credit card information and health data of billions of people around the world. Alongside these attacks, phishing and spying via webcams or networked household appliances (IoT) remain risky.


Most prominent on this year's cyber-attack list are the #WannaCry and Equifax data breaches. These attacks disabled 300,000 computer systems for four days, affected financial data on more than 800 million customers and 88 million businesses worldwide, and accounted for more than 45% of all detected ransomware.

Cyber policies are currently very much in vogue, but in which cases of damage do these insurances actually pay out? The American Bankers Association (ABA) explains how companies should best go about finding a suitable policy and what makes a good cyber insurance.


The General Data Protection Regulation (GDPR): What needs to be changed?


Companies only have a few months left to prepare for the new European #DataProtection Regulation. On 25 May 2018, all companies managing personal data of citizens of the European Union will be required to comply with the new regulations and requirements of the General Data Protection Regulation (GDPR).

This regulation will impose significant new obligations on companies that manage personal data, as well as severe penalties for those who violate these rules, including fines of up to 4% of global turnover or €20 million, whichever is higher. But what has to change concretely? Here is a “Guide to compliance with the EU GDPR” and a framework for becoming GDPR-fit step by step.


Digital Transformation: Slow Internet connections as a brake pad


Digitization is progressing, but most users still complain about slow Internet connections. Despite its 7th place in the worldwide internet ranking, Belgium still lags far behind the world's fastest internet countries. Notwithstanding all the shortcomings of the national IT infrastructure, companies are tackling the technical and organizational challenges that result from the digital IT transformation.


The crazy rise of Bitcoin


In the space of a year, the value of bitcoin has multiplied by ten: a bitcoin was worth “only” 1,000 dollars on January 1, 2017 … and 8,000 dollars ten days ago. In April 2017, Japan officially recognised bitcoin and virtual currencies as legal methods of payment. Note that Bitcoin represents less than 50% of the money supply of all cryptocurrencies in circulation; this is partly explained by the network situation and the rise of the Ethereum currency. And even though bitcoin is legal in the vast majority of countries around the world, only a few governments have given it a defined regulatory status.


IoT Projects: The 5 Biggest Mistakes and the Five Steps to Success


Closely linked to digital change are Internet of Things (IoT) and Industry 4.0 projects. Pioneers have already pointed out the five biggest mistakes in IoT projects. If a company wants to exploit the potential of the IoT, it faces a lot of work and often frustration – the technical, commercial and cultural challenges are manifold. Many decisions have to be carefully considered before an IoT solution is successfully established on the market.

But how does an IoT project succeed? Five steps are needed to make an IoT project a success.


Blockchain: The new gold chain

The blockchain is a much-debated technology with disruptive potential and three key characteristics: decentralization, immutability, and transparency. It could help to automate business processes, increase the security of transactions and replace intermediaries such as notaries or banks. Blockchain turns out to be the silent revolution that will change our lives. On top of that, it can turn into a gold chain for early adopters.
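The immutability property can be illustrated with a toy hash chain: each block stores the hash of its predecessor, so tampering with any past record invalidates everything that follows. This is only a sketch of the principle; real blockchains add consensus, digital signatures and peer-to-peer distribution on top.

```python
# Toy hash chain illustrating blockchain-style immutability.
import hashlib, json

def _digest(data, prev_hash):
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data, prev_hash):
    # Each block commits to its own data AND the previous block's hash.
    return {"data": data, "prev_hash": prev_hash, "hash": _digest(data, prev_hash)}

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        # Recompute the hash; any edited block no longer matches.
        if block["hash"] != _digest(block["data"], block["prev_hash"]):
            return False
        # Each block must point at its actual predecessor.
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("genesis", "0")]
chain.append(make_block("alice pays bob 10", chain[-1]["hash"]))
chain.append(make_block("bob pays carol 5", chain[-1]["hash"]))
print(chain_is_valid(chain))              # True
chain[1]["data"] = "alice pays bob 1000"  # tamper with history
print(chain_is_valid(chain))              # False
```

Because rewriting one block would require recomputing every subsequent hash (and, in a real network, convincing the other participants), the chain acts as a tamper-evident record, which is exactly the property that makes it interesting for notaries, banks and accounting controls.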


Cloud: Companies use public cloud despite security concerns

For years, companies avoided the public cloud, as its security was difficult to get a grip on. This year, however, companies in the EMEA region increased their investment in the public cloud despite ongoing security concerns and a lack of clarity over who is responsible for data security. Caution is still needed to prevent attacks such as WannaCry.


Artificial intelligence

In 2016, Gartner put artificial intelligence and advanced machine learning in first place in its forecast for 2017, and this trend was indeed pronounced during 2017. Roughly 80% of companies have already invested in artificial intelligence (AI). Nevertheless, one in three decision-makers believes that their organization needs to spend more on AI technology in the coming years to keep pace with competitors. Artificial intelligence penetrates all areas of life. But how does it work?

One example is the automated and personalized customer approach to AI. With personalized campaigns and individual customer approach, the marketing of the future wants to win the battle for the buyer. As a rule, the necessary data are already available in companies, but the resources and software tools for their profitable use are not.
In 2018, businesses will have a wealth of AI-supported applications at their disposal and should therefore focus on the commercial results achieved through applications that exploit narrow AI technologies – leaving AI in the general sense to researchers and science fiction writers.


The future of the human worker

AI systems undoubtedly have their place. The world is becoming increasingly complex, which requires a thoughtful and wise use of our human resources – something high-quality computer systems can support, including in applications that require intelligence. The flip side is that many people are frightened by the prospect of smart machines, arguing that intelligence is something unique that characterizes Homo sapiens. Many also still see artificial intelligence as a new threat to employment: it will replace humans, steal all the jobs, and make the future look dark.

Yet technological progress has never caused unemployment. On the contrary, since the industrial revolution employment has multiplied. But with each advance, fears resurface – today it is artificial intelligence that scares, or is used to scare. Economic history and economic science therefore invite us to remain calm in the face of technological progress in general, and artificial intelligence in particular. By enabling the invention of new things to exchange and by stimulating entrepreneurship, it is not a danger but an opportunity.


DATA based business models

A data-driven business model puts data at the center of value creation. This central place of data in the business model can be translated in different ways: analysis, observation of customer behaviour, understanding of customer experience, improvement of existing products and services, strategic decision-making, and marketing of the data itself.

These data can be gathered from different sources, generated directly by the company, processed and enriched by various analyses, and highlighted by data access and visualization platforms. Once data is collected, it is essential to manage the multiple sources and identify which areas will bring the most benefit. Tracking the right data points within an organization can pay off during decision-making, allowing management to make data-driven decisions while amplifying synergy within day-to-day operations.
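As a minimal illustration of tracking data points across sources, the sketch below consolidates records from two invented feeds into one ranked view that could feed a merchandising decision; the source names and figures are entirely hypothetical.

```python
# Hypothetical sketch: consolidating data points from several sources
# into one ranked view to support a data-driven decision.
from collections import defaultdict

# Invented feeds: (product, observed demand signal) per source.
sources = {
    "web_analytics": [("product_a", 120), ("product_b", 45)],
    "store_sales":   [("product_a", 80),  ("product_b", 200)],
}

# Merge the signals from every source into one total per product.
totals = defaultdict(int)
for source, records in sources.items():
    for product, count in records:
        totals[product] += count

# Rank products by combined signal to guide merchandising or pricing.
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # [('product_b', 245), ('product_a', 200)]
```

The point is not the arithmetic but the pattern: each channel alone gives a partial picture (web traffic favours product_a, store sales favour product_b), and only the consolidated view supports a sound decision.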
As for revenue models, these can be based on a direct sale of data, a license, a lease, a subscription or a free provision financed by advertising.


#GDPR – Reform of EU Data Protection: 5 months left to be Fully Prepared


Companies only have a few months left to prepare for the new European #DataProtection Regulation. On 25 May 2018, all companies managing personal data of citizens of the European Union will be required to comply with the new regulations and requirements of the General Data Protection Regulation (GDPR).

This regulation will impose significant new obligations on companies that manage personal data, as well as severe penalties for those who violate these rules, including fines of up to 4% of global turnover or €20 million, whichever is higher.

With only a few months left before the Regulation enters into force, many companies have not yet started preparations and will have to develop and implement a compliance strategy. To facilitate their journey, we have listed below eight rules to follow.


Understand your Data


The first step to comply with the GDPR is to understand how personal data is stored, processed, shared and used within the company. Through a careful audit, you will need to compare existing practices with the requirements of the new regulation and identify the changes needed to bring your business into compliance in the way that best suits it. Remember that the obligations of the GDPR do not only apply to the strategies and measures put in place by your company, but also extend to the providers who process personal data on your behalf.


Determine who is responsible for data protection


While only some companies will have to appoint a data protection officer, everyone working within the company will have to adopt a data protection compliance program. The data protection officer may need to strengthen the company's strategies in this area and train its staff.

Please note that not all companies will necessarily have to appoint a Data Protection Officer, but good practice suggests that such a delegate is essential for companies that engage in two types of activities: large-scale processing of special categories of data and large-scale monitoring of individuals, such as behavioural advertising targeting.


Ensure a legal basis for Data processing


Your company will want to examine the legal basis on which your strategy for handling various types of personal data rests. If it is based on consent, you will need to identify the method used to obtain that consent and be able to clearly demonstrate how and when it was given. Relying on consent means that the data subject can withdraw his or her consent at any time, and that the data controller must then stop any processing of that subject's data.
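As an illustration of what such consent bookkeeping might look like in practice, the sketch below records when and how consent was obtained and blocks processing once it is withdrawn. The class and field names are invented for illustration, not an official or legal schema.

```python
# Sketch of consent bookkeeping under the GDPR: record when and how
# consent was given, and stop processing once it is withdrawn.
from datetime import datetime, timezone

class ConsentRegistry:
    def __init__(self):
        self._records = {}

    def record_consent(self, subject_id, purpose, method):
        # Store WHEN consent was granted and HOW it was obtained,
        # so both can be demonstrated later.
        self._records[(subject_id, purpose)] = {
            "granted_at": datetime.now(timezone.utc),
            "method": method,
            "withdrawn": False,
        }

    def withdraw(self, subject_id, purpose):
        rec = self._records.get((subject_id, purpose))
        if rec:
            rec["withdrawn"] = True

    def may_process(self, subject_id, purpose):
        # Processing is allowed only while unwithdrawn consent exists.
        rec = self._records.get((subject_id, purpose))
        return bool(rec) and not rec["withdrawn"]

reg = ConsentRegistry()
reg.record_consent("user-42", "marketing", method="signup checkbox")
print(reg.may_process("user-42", "marketing"))  # True
reg.withdraw("user-42", "marketing")
print(reg.may_process("user-42", "marketing"))  # False
```

The design point is that every processing path checks `may_process` before touching the data, so a withdrawal takes effect immediately rather than depending on manual clean-up.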


Understand the rights of the people concerned


In accordance with the GDPR, any person whose data you process is given new rights, including the right of access to personal data, the right to correct and delete such data, or the right to portability of personal data.

Can your business easily locate, delete, and move customer data? Is it able to respond quickly to requests for personal data? Does your company, and the third parties that work for it, keep track of where these data are stored, how they are processed, and who they were shared with?


Ensure confidentiality from conception


As part of the GDPR, companies are required to build confidentiality in from the design stage when developing a new project, process, or product. The goal is to ensure the confidentiality of a project's data from the moment it is launched, rather than bolting on confidentiality measures retrospectively, with the aim of reducing the risk of violation.

Have you limited access to personal data to those who need it in your business? A data protection impact assessment is sometimes necessary before processing personal data.


Be prepared for violation


Your company will need to implement appropriate policies and processes to handle data breaches. Make sure you know to which authorities you will need to report any data breach, as well as the deadlines; any breach may result in a fine. Put in place clear policies and well-practiced procedures to ensure that you can react quickly to any data breach and notify in time where required.
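The GDPR's headline deadline (notify the supervisory authority without undue delay and, where feasible, within 72 hours of becoming aware of a breach) can be wired into incident tooling as a simple deadline check. A minimal sketch, with invented function names:

```python
# Sketch of a breach-notification deadline check based on the GDPR's
# 72-hour window for notifying the supervisory authority.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(detected_at):
    """Latest moment to notify the authority after breach awareness."""
    return detected_at + NOTIFICATION_WINDOW

def hours_remaining(detected_at, now=None):
    """Hours left until the deadline (negative once it has passed)."""
    now = now or datetime.now(timezone.utc)
    return (notification_deadline(detected_at) - now).total_seconds() / 3600

detected = datetime(2018, 5, 25, 9, 0, tzinfo=timezone.utc)
print(notification_deadline(detected))  # 2018-05-28 09:00:00+00:00
```

In practice the incident-response runbook would page the responsible team well before `hours_remaining` hits zero, since drafting the notification itself takes time.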


Communicate the main information


In accordance with the GDPR, you will be required to provide data subjects with the legal basis for the processing of their data and to ensure that they know with which authorities they may lodge a complaint in the case of a problem. Make sure your online privacy policy is up to date.


Collaborate with your suppliers


GDPR compliance requires an end-to-end strategy that covers vendors processing personal data on your behalf. The use of a third party for data processing does not exempt companies from the obligations incumbent on them under the GDPR.


With any international data transfers, including intra-group transfers, it will be important to ensure that you have a legitimate basis for transferring personal data to jurisdictions that are not recognized as having adequate data protection regulation. Verify that the third party processing data on your behalf has established strict data protection standards, has extensive experience in large-scale data security management, and has tools to help improve data governance and reduce the risk of breach.


Ensure your vendor meets globally recognized standards for security and data protection, including ISO 27018, the code of practice for protecting personal data in the cloud. Ask your vendor for full information about the security of its network and the data that resides there (for example, its encryption policies and the controls in place at the application level), its security policies, and its training, risk analysis, and testing strategies.

Vertiv: #DataCenter of Industry 4.0

Data Center Trends

When it comes to deploying IT capacity at the edge, many environmental factors can influence an organization's needs: each business will vary in deployment size, environmental isolation requirements and ease of transferability, to name just a few. So, with the Industry 4.0 revolution, there are many options for your edge data center investment, as the next generation of data centers will no longer be limited to central, large-scale facilities but will seamlessly integrate the edge of networks that are becoming increasingly intelligent and mission-critical.


These 4.0 data centers are currently under construction and will significantly shape the IT networks of the 2020s. The emergence of this edge-based infrastructure is one of the top five data center trends identified by a global panel of experts from Vertiv, formerly Emerson Network Power, for 2018.


The main motivation behind new IT infrastructure is the growing volume of data driven by smartphone use and the #InternetOfThings in particular – infrastructure must keep pace to meet the growing demands of consumers. While companies can take many paths to accommodate this growth, most IT executives choose to bring the infrastructure they need closer to the end user – the edge of the network. Whichever approach companies choose, speed and consistent service will be critical to consumers. That infrastructure needs to be scalable to accommodate accelerating data growth and flexible enough to allow new uses of real-time data analytics.


In the past, Vertiv had identified trends around cloud, integrated systems, and infrastructure security. For 2018, the company expects the following five trends that will shape the data center ecosystem:


Development of the 4.0 Data Center


Generally, large data centers have been placed where energy costs are lower and space is inexpensive. To overcome speed, space, and availability issues, edge data centers must be placed much closer to users. These data centers will generally be smaller, but there will be more of them, in a kind of mesh network arrangement. Whether it is a traditional IT cabinet or a 150-square-foot micro data center, companies rely more and more on the edge of the network. 4.0 data centers will give companies the ability to integrate edge and core holistically and harmoniously. These new data center architectures are therefore much more than simple distributed networks.


This development is made possible by innovative architectures that provide real-time capacity through scalable and cost-effective modules. These data centers will be cost-effective through the use of optimized cooling solutions and high-density power supplies, as well as lithium-ion batteries and innovative power distribution units. In addition, these concepts integrate state-of-the-art monitoring and management technologies that allow the simultaneous operation of hundreds or even thousands of distributed IT nodes. As a result, complex structures dissolve, latency and acquisition costs decrease, and utilization is optimized. In addition, enterprises can add network-based IT capacity when needed.


Cloud providers focus on colocation


Even though revenue from the worldwide wholesale and retail data center colocation market is expected to reach 33 billion U.S. dollars, cloud usage – fueled by the increasing number of IoT devices – is growing so rapidly that cloud providers often cannot meet the demand for capacity. In most cases, providers focus on delivering services and other priorities rather than building new data centers, and they compensate for capacity shortfalls with offers from colocation providers. By focusing on efficiency and scalability, colocation vendors can quickly meet the increasing demand for data center capacity while further reducing costs. Cloud providers, in turn, are free to choose the colocation partners that best meet their end users' needs and enable edge computing.


Reconfiguration of the data center middle class


With the evolution of the market and rapidly changing consumer needs, it is no secret that the biggest growth areas in the data center market will be the hyperscale environment – typically cloud or colocation providers – and edge computing. Thanks to the growth of colocation and cloud resources, traditional data center operators now have the opportunity to reorganize and reconfigure the facilities and resources that are critical to local operations.

Companies with multiple data centers will continue to consolidate their internal IT resources. They will likely exhaust their options for outsourcing to the cloud or working with colocation providers and reduce their own infrastructure. Their motivation for this transformation will be configurations that can be implemented rapidly and scaled at short notice. These new facilities will certainly be smaller, but more efficient and safer – with high availability at the same time. This matches perfectly with the extremely critical nature of the data that these companies want to protect.


High-density Data Center


High density has been a hotly debated subject in the world of data centers for years. But with global data consumption in the zettabytes – and the subsequent demand on IT resources – it is finally time to start building up, not out. The data center is all about power and cooling, and high density is how you maximize the use of both.

Over the past decade, data centers have made huge advances to support today's high-density computing. Traditionally, data centers used multiple racks of low-power systems that could not get work done efficiently. This is about to change!

Although densities below 10 kW per rack remain the standard, 15 kW is no longer a rarity in hyperscale facilities, and some even approach 25 kW. The main driver of this transformation is the introduction and proliferation of hyper-converged computing systems, an expansion driven by products offering new levels of automation, tighter integration between technologies, and, in many cases, software-defined solutions based on scale-out architectures.
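To make the density figures concrete, here is a minimal back-of-the-envelope sketch of how per-rack density shrinks a facility's footprint for a fixed IT load. The 10, 15, and 25 kW densities come from the text above; the total load and per-rack floor space are illustrative assumptions, not Vertiv figures.

```python
# Hypothetical illustration: rack count and floor space for a fixed IT load
# at the per-rack densities cited in the text (10, 15, 25 kW).
import math

def racks_needed(total_load_kw: float, density_kw_per_rack: float) -> int:
    """Racks required to host a given IT load at a given per-rack density."""
    return math.ceil(total_load_kw / density_kw_per_rack)

TOTAL_LOAD_KW = 1500       # assumed 1.5 MW IT load (illustrative)
RACK_FOOTPRINT_SQFT = 10   # assumed floor space per rack incl. aisle share

for density in (10, 15, 25):
    n = racks_needed(TOTAL_LOAD_KW, density)
    print(f"{density:>2} kW/rack -> {n:>3} racks, ~{n * RACK_FOOTPRINT_SQFT} sq ft")
# 10 kW/rack -> 150 racks, ~1500 sq ft
# 15 kW/rack -> 100 racks, ~1000 sq ft
# 25 kW/rack ->  60 racks, ~600 sq ft
```

Under these assumptions, moving from 10 kW to 25 kW per rack cuts the required footprint by more than half, which is the "build up, not out" argument in numbers.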


The world is reacting to edge computing


By bringing processing and computing capacity to the edge of the network, businesses reduce latency by localizing processing and lightening the load on the primary network, supporting better, faster decision-making. The decision to implement an edge computing architecture is typically driven by the need for location optimization, security, and, most of all, speed. Three main reasons for the transition to edge computing are the growth of IoT, the pace of technology-empowered business, and evolving user expectations. As today's data management systems require the most immediate information to support "real-time" decisions that can have an impact of millions of dollars on the bottom line, more and more companies are shifting their computing capacity to the edge of their networks. Location optimization also reduces data processing times from minutes and hours to milliseconds and microseconds – as close to real time as you can currently get.
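The latency benefit of location optimization can be sketched with a simple propagation-delay model. The distances, processing budget, and the ~200 km/ms speed of light in optical fiber are illustrative assumptions, not measurements from the source.

```python
# Hypothetical model: round-trip latency to a distant central data center
# versus a nearby edge site. All figures are assumptions for illustration.

SPEED_IN_FIBER_KM_PER_MS = 200  # light travels roughly 200 km per ms in fiber

def round_trip_ms(distance_km: float, processing_ms: float = 1.0) -> float:
    """Round-trip propagation delay plus a fixed processing budget, in ms."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS + processing_ms

central = round_trip_ms(distance_km=2000)  # assumed distant central facility
edge = round_trip_ms(distance_km=10)       # assumed nearby edge site

print(f"central: {central:.1f} ms, edge: {edge:.1f} ms")
# central: 21.0 ms, edge: 1.1 ms
```

Even in this idealized model, which ignores queuing, routing hops, and last-mile delays, moving compute from a distant facility to a nearby edge site removes most of the round-trip time, which is the core case the paragraph makes for edge architectures.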


About Vertiv
Vertiv designs, builds and services critical infrastructure that enables vital applications for data centers, communication networks and commercial and industrial facilities. For more information, visit