Physical & Cloud #DataProtection: Best Practices for your #Backup and #RecoveryProcess

Data has become one of the most valuable assets an organization holds; massive data is the new currency. Thanks to advances in technology and connectivity, data creation is skyrocketing. According to IDC, this data is expected to double every two years for the next decade, hitting 45,000 exabytes in 2020. All of it is stored across ever-expanding environments and connected devices, so the backup and restore capability of an information system is a real challenge for ensuring business continuity and the availability of the associated data.

Data Protection

What must IT departments do to fulfill the data security mission? The data security policy sits at the heart of every business's concerns and should be a fundamental part of its security strategy. Planned security measures can then be turned into tactical and operational rules through the joint efforts of the security and storage teams. To this end, storage must be an integral part of the company's security strategy.

 

To achieve these objectives, a company must structure its approach around the following five essential aspects:
• Allocation of responsibilities;
• Risk Assessment;
• Development of a data protection procedure;
• Communication of data protection procedure;
• Execution and testing of the data protection procedure.

 

  1. Allocation of responsibilities

The goal is to make storage security a fully fledged part of the IT security architecture. Even if the company decides that responsibility for backup or storage security rests with the storage team, it must still integrate any security measures in this area with the work done to secure the rest of the infrastructure. This integration contributes to defense in depth. It is also advisable to share responsibility for extremely sensitive data: it is better to ensure that the person authorizing access is not the same as the person responsible for enforcement.

 

  2. Assessment of storage risks in the area of IT security

Managers must review each step of their backup methodology to identify security vulnerabilities. Can an administrator secretly make copies of backup tapes? Are tapes stored in boxes accessible to everyone? Is there a rigorous end-to-end chain of custody for backup tapes? If critical data is backed up and transported, vulnerabilities of this nature make it easy prey. If the risk analysis reveals many vulnerabilities, the company should seriously consider encrypting its data.

 

  3. Development of an information protection program that guarantees the security of company data, at all times, wherever it is

Multi-level protection should be adopted by taking existing best practices from the data network and applying them to the storage network, while adding specific layers adapted to the characteristics of the archived data, for example:

  • Authentication: application of multi-level authentication and anti-spoofing techniques (protection against identity or address spoofing).
  • Authorization: access rights assigned according to roles and responsibilities (as opposed to blanket administrative access).

It is imperative to duplicate backup tapes, because it is never good to depend on a single copy of the data. Despite the longevity of tapes, they remain exposed to environmental and physical damage. A common but risky practice is to perform nightly backups and send the tapes off-site without any verification. Best practice is to duplicate the backup tapes, verify the copies, and only then store a copy off-site, as in the sketch below.
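Here is a minimal Python sketch of that "duplicate, then verify" practice, applied to disk-based backup images (the file paths are hypothetical): copy the backup, then compare SHA-256 checksums before the off-site copy leaves the building.

```python
# Minimal sketch: duplicate a backup image and verify the copy by checksum
# before shipping it off-site. Paths are hypothetical examples.
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in chunks so large backup images don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def duplicate_and_verify(primary: Path, offsite_copy: Path) -> None:
    shutil.copy2(primary, offsite_copy)          # duplicate the backup
    if sha256(primary) != sha256(offsite_copy):  # verify before it ships
        raise RuntimeError(f"Checksum mismatch for {offsite_copy}")

duplicate_and_verify(Path("backups/nightly.img"), Path("offsite/nightly.img"))
```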

Magnetic tapes remain the preferred storage medium for backups because they are economical and offer enough capacity to back up an entire system on a single cartridge. When stored properly, archival tapes have a lifetime of more than 30 years, making them an exceptionally reliable storage medium.

 

  4. Communication of the procedure to be applied for the protection and security of information

Once the procedure for protecting and handling sensitive data has been defined, it is important to ensure that those responsible for its security are informed and trained. Security rules are the most important aspect of assigning responsibilities. Functional managers need to be aware of the risks, the countermeasures and their costs.

Data loss and intellectual property theft affect the entire enterprise, not just the IT department. As such, the Director of Security must lead the data security approach by training the various functional managers in the risks, threats and potential harm arising from security breaches, as well as the cost of the possible countermeasures. In this way, company executives can weigh the cost/benefit of investments in data security.

 

  5. Implementation and testing of the data protection and security plan

Securing data is not about technology but about procedure. This is why it is essential to test the procedure. In addition, as the growth of the company is accompanied by an evolution in security and data protection needs, IT security practices must also evolve.

Only once the complete security plan has been developed, defined and communicated to the teams concerned is it time to implement it. The IT team must ensure the deployment of the tools, technologies and methodologies necessary for classifying information. New technologies may be required to classify information or label it with metadata so that it is backed up according to the appropriate rules and procedures.

Once in place, the procedure must be tested, for both backup and restore. The test consists of injecting into the process every conceivable danger, whether it is the loss of a tape or a server, network problems, equipment failure, data corruption or any other scenario that could affect the company's operations.
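As an illustration, here is a hypothetical sketch of such a restore drill in Python; the scenario list and the restore_backup stub stand in for the real runbook steps, which would of course be specific to each environment.

```python
# Hypothetical restore-drill harness: inject one failure per scenario and
# check that the documented restore procedure still succeeds.
SCENARIOS = ["lost_tape", "dead_server", "network_outage", "corrupt_data"]

def secondary_copy_available() -> bool:
    # Stands in for checking the duplicated off-site copy discussed earlier.
    return True

def restore_backup(scenario: str) -> bool:
    """Stub for the documented restore runbook executed under one failure."""
    if scenario == "lost_tape":
        return secondary_copy_available()
    return True  # a real drill would exercise each runbook branch here

for scenario in SCENARIOS:
    ok = restore_backup(scenario)
    print(f"{scenario:15s} -> {'restored' if ok else 'FAILED'}")
```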

It is advisable to carry out tests with personnel who are less familiar with the procedure, to ensure that it can nevertheless be applied without difficulty in the absence of the usual supervisor (due to illness, holidays or departure).

How is Artificial Intelligence Impacting the Tourism Sector?

Artificial intelligence has existed for several years, yet it is now reaching another dimension thanks to more powerful computers and the multiplication of available data. Given its capacity to lift every sector of activity, it undeniably holds great interest for tourism. With the wealth of data available to professionals, there is today a multitude of technologies: recommendation applications, real-time chatbots and personalized concierge services. The aim is to simplify the work of tourism professionals so that they can return to their core business with powerful tools and make a real difference in terms of profit and customer satisfaction. But the question one must ask is: how do we use artificial intelligence wisely?

Artificial Intelligence and Tourism

The first point: if we think about the future of tourism in terms of types of travelers, it is certain that we will be dealing with several categories of profiles, which may overlap. The first category, for example, will consist, as it does today, of travelers wishing to disconnect radically from their "everyday" environment in order to immerse themselves in another culture, by all possible means.

The second category, more cautious, will want simple trips, without risks, even without surprises, good or bad. This does not exclude, quite the contrary, the survival of adventure tourism.

For the last profile, the purpose of a journey will be less the destination than the experience to be had there. These travelers will go somewhere to learn how to cook a rare product or to pick up a new activity, based on recommendations from their peers. The purpose of their travel will be learning.

Whatever the size of the group and the number of establishments it counts, it seems to me that we are moving towards a world where the tourist offering will continue to grow, thanks to two levers: new destinations and new traveler profiles. Companies will need to be extremely flexible about customer expectations, responding with innovative services that accompany travelers at each stage of their journey: before, during and after their stay.

 

How can AI add value to tourism?
Through customization. That is what profoundly changes the ins and outs. Rather than delivering the same experience for the same type of trip, artificial intelligence offers the possibility of matching the desires, habits and preferences of the tourist with the proposed product. Artificial intelligence makes a data pool meaningful: by learning what the customer searches for, buys and loves, it makes it possible to generate customized, targeted offers that are more likely to be converted into a purchase.
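As a toy illustration of that matching idea, the sketch below scores invented offers against a traveler's preference profile with cosine similarity; the feature axes and weights are assumptions, not a production recommender.

```python
# Toy preference-matching sketch: rank offers by cosine similarity between
# an invented traveler profile and invented offer features.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Preference axes: [culture, adventure, relaxation, gastronomy]
traveler = [0.9, 0.1, 0.3, 0.8]
offers = {
    "cooking class in Lyon": [0.6, 0.0, 0.2, 1.0],
    "jungle trek":           [0.2, 1.0, 0.1, 0.1],
    "beach resort week":     [0.1, 0.0, 1.0, 0.3],
}

for name, features in sorted(offers.items(), key=lambda kv: -cosine(traveler, kv[1])):
    print(f"{name:22s} score={cosine(traveler, features):.2f}")
```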

Today, cognitive systems are capable of interacting in natural language; they can process a multitude of structured and unstructured data enriched with geolocated content, and learn from each interaction. These systems will rapidly become essential to the industry's strategic topics, such as the "smarter destination", the personalization of the customer experience and customer loyalty, and the provision of management, analysis and marketing services, all powered by Big Data. These services will be an asset in making the whole tourism sector more efficient by helping the actors and structures in place.

 

How far can artificial intelligence push the tourism industry?
Not to the point of replacing humans. Robots are used for certain tasks, but not as a replacement for people; in the long term this could happen, but the problem of the energy that robots consume must first be solved. Talk of artificial intelligence often invites comparison with human intelligence, so it is important to note that the aim of cognitive systems is NOT to replace human beings: robots cannot reason or learn the way a human being can. They serve the needs and imagination of tourism professionals who, with the help of partners, benefit from them through their domain knowledge.

 

As I mentioned above, AI is not a new technology; we have been interested in it since the 1950s and 1960s. If the subject seems quite new today, it is because the data has only now become available. Tourism, like every industry, has been digitized and yields a wealth of data to which machine learning can be applied. So AI is a revolution in progress, to the extent that it leads to new ways of thinking about the supplier's offer.

Understanding the #Blockchain Economic Revolution

The Blockchain is a revolution that is undoubtedly leading to a complete overhaul of economic activity. It is not a simple geek trend, yet most people have absolutely no idea what the blockchain stands for. It is essential to distinguish clearly between bitcoin, crypto-currencies and the breakthrough technology underlying them: the #Blockchain. You should know that there are several types of blockchains on the market; bitcoin is just one implementation, one that has seen huge success in recent years.

 

In short, the #Blockchain is an information storage and transmission technology that is transparent, secure and operates without a central control unit. Transactions between network users are grouped into blocks. Each block is validated by the nodes of the network, called "miners", using techniques that depend on the type of blockchain. This process establishes trust between market players without going through a central authority. It is an open-source system in which each link in the chain carries its own legitimacy. The decentralized nature of the chain, coupled with its security and transparency, suggests a revolution of unimaginable scope. The fields of opportunity open up far beyond the monetary sector.
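To make the chaining concrete, here is a deliberately simplified Python sketch (not Bitcoin's actual protocol, and with no mining or consensus): each block commits to the hash of its predecessor, so tampering with any block invalidates every later link.

```python
# Simplified chain-of-blocks sketch: each block stores its predecessor's
# hash, so altering one block breaks the verification of all later blocks.
import hashlib
import json

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(transactions: list, prev_hash: str) -> dict:
    return {"transactions": transactions, "prev_hash": prev_hash}

chain = [new_block(["genesis"], "0" * 64)]
for txs in (["alice->bob: 5"], ["bob->carol: 2"]):
    chain.append(new_block(txs, block_hash(chain[-1])))

# Verify: every block must reference the hash of the block before it.
valid = all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
            for i in range(1, len(chain)))
print("chain valid:", valid)  # True; edit any transaction and it turns False
```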


 

In fact, it is a revolution comparable, in human history, to the advent of commerce. When individuals bought and sold their products face to face, trust was established with a handshake. Then globalization created new needs: entities were set up to protect sellers and buyers, and laws and legal services developed around financial exchanges. Every market came to need intermediaries, since it was impossible to assess or quantify the degree of trust between people. What changes with the blockchain is not only its decentralized aspect, but also the absence of intermediaries. Blockchains could replace most centralized "trusted third parties" with distributed computing systems. More than that, many observers see the blockchain as an alternative to back-office systems in the banking sector. It could also help eradicate corruption in global supply chains.

 

The boom of the Internet offers some good indications of how the blockchain could develop. The Internet reduced communication and distribution costs: a WhatsApp message, for example, is much cheaper than an SMS, just as selling software or a service through an online platform is cheaper than selling products through a physical store. Thanks to the Web, marginal operating costs have been reduced to almost zero, causing profound changes in the telecommunications, media and software markets. Blockchains, in turn, could drive marginal transaction costs close to zero.

 

Blockchains are a low-cost disruptor for any business that acts as an intermediary in a market. They enable things that were never possible with existing infrastructure and financial resources. We can exchange things that were not previously considered assets: data, reputation, or unused capacity. The possibilities are as vast as they are unimaginable, but that does not mean every type of asset will be profitable for a company.

 

It is preferable not to dwell first on the technological aspect. It is much better to focus on the root of your customer's problem. Successful businesses know how to identify what their prospects are missing or worried about, and how to solve it. Blockchain technology is valuable in settings where data has to be shared and edited by many parties that do not fully trust one another. That is the infrastructure. The added value comes from the services built around it, as applications or modules.

Currently we are in the infrastructure phase of the market; standards and platforms to democratize blockchain technology are still emerging. In the near future, given the frantic pace of development of this ecosystem, it will be easier for developers and entrepreneurs to use the blockchain on a daily basis, as easily as the MySQL or MongoDB databases we use today. Once the infrastructure stage is over, the evolution of blockchains will become really exciting. The infrastructure will be a huge database on which companies will be able to operate all kinds of connected objects and devices. Connected devices will collect data; blockchains will secure, share and process it; artificial intelligence applications will automate activities.

 

Just imagine farms where produce is grown and harvested by robots and delivered to your home by drone, with a connected refrigerator that alerts you when you are running low on something. An artificial intelligence system manages preset objectives to perfectly match supply and demand. Blockchains are much more than just bitcoin: they are real building blocks of our future world.

The Impact and Challenges of Artificial Intelligence for Next-Gen Enterprises

Artificial Intelligence (AI) is not a new phenomenon. It continues to develop, and its applications are already very present in our personal daily lives (gaming, robotics, connected objects …), arousing as much enthusiasm as fear. This complex concept earned its fame in the world of science fiction. Although AI still evokes a more or less fantasized imaginary, it is an integral part of reality, and it can be found in many services, systems and applications.

 

What can be the role of artificial intelligence in the enterprise of the future? Will AI make organizations smarter? These are the main questions that have motivated big companies, with the objective of analyzing and anticipating the impacts of this revolution in progress. In this post, I’ll be discussing organizational, legal and ethical issues related to the governance of artificial intelligence in large enterprises.

 

A critical factor in adapting the company to the evolutions and challenges of the AI environment is to rethink the relationship with the company's stakeholders, in particular with the customer. This does not mean merely proclaiming that "the customer is important" but emphasizing interaction with the customer. Put differently, the client exists only through the interest and interactions developed with them.

So the question companies should ask is: how can they develop successful interactions with the help of artificial intelligence? What does this mean concretely in terms of channels, content, customer knowledge and, above all, commitment to the customer?

 

Some companies have an "innovation and foresight" unit to analyze and reflect on the impact of AI within the company. The point is to take this step without neglecting the employees, who are at the center of the subject. These units allow the sharing of ideas, since the applications of artificial intelligence within the company are diverse: augmenting human expertise through virtual assistants; optimizing certain products and services; opening new perspectives in research and development through self-learning programs. The objective of such a unit is to exchange ideas and carry out this foresight work in a participatory way, through conferences, round tables, written reports or scenarios, depending on the choices of each organization.

 

The Impact of Artificial Intelligence for Enterprises

 

Artificial intelligence technologies are already anchored in our daily lives. These technological advances deeply challenge the managerial and organizational practices around innovation in large companies. Many surveys show that, in general, companies do not have a dedicated budget for artificial intelligence. Nevertheless, there are investment projects, and resources can be allocated to artificial intelligence teams integrated into wider data teams. Be that as it may, the subject of artificial intelligence is present in large enterprises; it may remain theoretical, but it may also be the subject of initial experiments, notably around predictive algorithms. Artificial intelligence does not fundamentally change everything in the company; rather, it will "augment" performance, automating or improving certain processes and/or operations.

 

Benefits for organizations:

 

Today, artificial intelligence already generates many benefits for organizations, notably by:

  • Responding to Big Data issues: artificial intelligence relies in large part on the mass collection and analysis of data from which it can learn;
  • Augmenting human decision-making expertise and online help assistants: the Hong Kong-based company Deep Knowledge Ventures (DKV), for example, has an artificial intelligence on its board of directors, Vital (Validating Investment Tool for Advancing Life Sciences), which makes investment recommendations and is entitled to vote;
  • Optimizing services and products: improving customer knowledge, decision-making and operational processes;
  • Strengthening systems security: in cybersecurity, artificial intelligence is becoming a structuring element of IT infrastructures used to secure networks. Automatic recognition is well established for fraud detection, and work is under way on algorithms that will identify threats that human brains and traditional security mechanisms fail to recognize;
  • Helping to make discoveries: some companies in the health field analyze all the scientific publications related to a particular area of research, which allows them to look for new properties and new molecules.

 

Challenges:

 

The challenges for large companies are numerous, starting with cultural and organizational changes. As noted in the Telecom Foundation's Watchbook No. 8: "The craze for artificial intelligence has been accelerated by the availability of AI capabilities in the form of APIs (vision or predictive, on the one hand), and by the release of the source code of Machine Learning platforms by the major Internet players, on the other."

These technology enablers will keep pushing companies to expose their resources as APIs in order to optimize them. It is therefore necessary to understand the world of APIs in this transversal, cross-enterprise approach, which is not without posing a number of challenges for large companies. To succeed, one must develop the following roadmap strategy:

  • Build stronger relationships with clients;
  • Optimize internal processes;
  • Accelerate the development of new offerings.

 

To conclude, I will say that we are living in a golden age of artificial intelligence, boosted by the web giants' growing interest in the stakes of Big Data. The first AI investors are indeed the internet pure players and the main software vendors. The movement has been launched, and it is our responsibility to anticipate the effects of this revolution on large companies.

IT Challenge: Reinvent business models … or disappear!


Technologies have significantly lowered barriers to market entry, and the development of free-of-charge models has favored the appearance of business models that have destabilized the positions of incumbents in most sectors.

As a matter of fact, in this digital age, existing approaches to developing, elaborating and describing business models are no longer appropriate for new connected business models. Technologies and services become obsolete faster than in the past, consumers are hungry for innovation and customer experience, and the need for agility weighs on production capacities and information systems; cooperation therefore becomes a must.

 

In this changing context, the risk of disappearing has never been more present for a company: this is what motivates collaboration and alliances between often competing actors. Cooperation can be seen as a positive competitive response to the decline phase that threatens businesses of all sizes. This context justifies a reflection on identifying a large organization's strengths and weaknesses compared to its competitors. By 2020, the ability to renew its business model will be critical to the growth and profitability of large firms.

 

Capitalizing on Data and the Customer Experience

 

Data is the black gold of the present and the future. Yet according to Gartner, 85% of the Fortune 500 will fail to gain a real competitive advantage from Big Data by 2020. The majority of them will still be at the experimental stage, while those who have capitalized on Big Data will enjoy a competitive advantage of 25% or more over their competitors. The development of new products and services, facilitated by the intelligent use of data, creates real change and new business opportunities.

In a context where the risk of disintermediation is major, control of customer relations, mass customization, co-design with consumers will be fundamental to the success of companies in 2020.

 

Challenges: Differentiating and Innovating

 

  • Understand: it is essential to keep track of the business models and strategies of digital-world competitors, the latter being both potential threats and powerful levers for development.
  • Transform: Large groups, especially if they are economically powerful, often find it difficult to transform their organization and integrate innovation, because of their complexity.
  • Listen: anticipating the needs of consumers and focusing on the customer experience means constantly evolving business models in order to develop business agility.
  • Collaborate: creating strategic partnerships with the company’s ecosystem, especially suppliers, accelerates innovation processes and reduces time to market.
  • Adjust: the digital transformation must take into account the context and the business challenges of the company.
  • Innovate: know-how in terms of software development can become more and more strategic for the company.

#BusinessIntelligence: for a better Control of Data

Business intelligence (BI) is a subject in full evolution, addressing general management as well as the business lines. BI helps decision-makers get an overview of the company's different activities and its environment. This cross-cutting view requires knowledge of the various business lines and involves certain organizational and managerial specificities. From the exploitation of business data to IT governance, Business Intelligence and its decision-support tools, such as reporting, dashboards and predictive analysis, are essential to the success of a business.

The organization of BI in the company is highly dependent on the organization of the company itself. However, BI can have a structuring impact for the company, notably through the formalization of data repositories and the setting up of a competence center.


What is the purpose of Business Intelligence?

 

Business Intelligence (BI) encompasses the IT solutions that provide decision support to professionals, with end-to-end reports and dashboards for tracking the company's activities, both analytically and prospectively.

 

This notion appeared at the end of the 1970s with the first infocentres. In the 1980s, the arrival of relational databases and the client/server model made it possible to separate production computing from decision-support systems. At the same time, various vendors positioned themselves as specialists in "business" semantic layers designed to mask the complexity of the underlying data structures. In the 1990s and 2000s, BI platforms were built around a data warehouse that integrates and organizes information from enterprise applications (extraction, transformation and consolidation). The objective was to respond optimally to queries from reporting tools and indicator dashboards, and to make the results available to operational managers.
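A minimal sketch of that warehouse-loading pattern, assuming a hypothetical orders table and SQLite stand-ins for the application store and the warehouse:

```python
# Toy ETL sketch: extract application rows, transform them to the grain the
# reporting tools query, and consolidate them into a warehouse table.
import sqlite3
import pandas as pd

src = sqlite3.connect("app.db")        # hypothetical application database
dwh = sqlite3.connect("warehouse.db")  # hypothetical decision-support warehouse

# Extract
orders = pd.read_sql("SELECT region, amount, order_date FROM orders", src)

# Transform: aggregate sales by region and month
orders["month"] = pd.to_datetime(orders["order_date"]).dt.to_period("M").astype(str)
monthly = orders.groupby(["region", "month"], as_index=False)["amount"].sum()

# Load / consolidate
monthly.to_sql("fact_monthly_sales", dwh, if_exists="replace", index=False)
```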

 

How do decision-support tools work today?

 

Over the past few years, BI platforms have benefited from NoSQL databases, enabling them to process unstructured data directly. Today, Business Intelligence applications also benefit from more powerful hardware, with the emergence of 64-bit, multi-core and in-memory (RAM) architectures. They can therefore execute more complex processes, such as data mining and multidimensional analyses, which consist in modeling data along several axes (turnover by geographical area, customer, product category, etc.).
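For example, a small pivot over invented sales facts shows the "several axes" idea in miniature (region by product category):

```python
# Tiny multidimensional-analysis sketch: the same facts viewed along two
# axes at once. The data is invented for illustration.
import pandas as pd

sales = pd.DataFrame({
    "region":   ["EU", "EU", "US", "US", "US"],
    "category": ["books", "games", "books", "games", "games"],
    "turnover": [120, 80, 200, 150, 90],
})

cube = sales.pivot_table(values="turnover", index="region",
                         columns="category", aggfunc="sum", fill_value=0)
print(cube)   # rows: region axis; columns: product-category axis
```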

 

Which fields does BI cover?

 

Traditionally focused on accounting issues (consolidation and budget planning), BI has gradually expanded to cover all major areas of the company, from customer relationship management to supply chain management and human resources.

 

  • Finance, with financial and budgetary reporting, for example;
  • Sales, with analysis of sales outlets, profitability and the impact of promotions, for example;
  • Marketing, with customer segmentation and behavioral analysis, for example;
  • Logistics, with optimization of inventory management and delivery tracking, for example;
  • Human resources, with optimization of resource allocation, for example.

 

Specialized vendors have developed ready-to-use indicator libraries to monitor these different activities. Finally, with the emergence of new web technologies (including HTML5 and JavaScript/AJAX graphical interfaces), new players have appeared offering BI in the cloud or in SaaS mode.

 

Today, information is omnipresent; the difficulty is not collecting it, but making it available in the right form, at the right time, to the right person, who will know how to exploit it and derive added value. The BI market thus offers fairly comprehensive solutions for data reporting and consolidation, in both the proprietary and open-source domains. Likely developments in the short to medium term include proactive analysis and simulation tools, more interactive and user-friendly data access, and the combination of structured and unstructured data from internal and external sources.

Secure #IOT: what if #BigData were the key?

By 2020, the planet will have more than 30 billion connected objects, according to IDC. The security of these objects is a major topic of discussion. Ensuring the security, reliability, resilience and stability of these devices and services should be a critical concern not only for manufacturers and the companies using them, but also for end users. Security solutions abound on the market, but has anyone thought of Big Data?

 

The Internet of objects is the third industrial technological revolution, enabling companies to work smarter, faster and, of course, more profitably. IOT represents endless and challenging opportunities and, above all, shows that a fully fledged ecosystem is being created. This is very different from big data as most companies know it, where data is considered static: it is generated in logs that have utility only where they sit, because there is no connectivity. With the Internet of objects, the data is mobile.

 

A good example of the potential created by the Internet of objects is the work done by Deloitte and a medical device manufacturer to optimize the management of chronic diseases in patients with implanted devices. They established remote data transmission from patients' pacemakers: the pacemakers communicate via low-frequency Bluetooth and contact the healthcare provider through a handset. With this connected object, the physician can obtain real-time information to better determine treatment protocols.

 

However, one critical issue still needs to be addressed to facilitate the adoption of the Internet of objects by every organization: IOT security, along with all the elements that make it up. With billions of objects and terminals connected to the Internet, including cars, homes, toasters, webcams, parking meters, wearables, factories, oil platforms, energy networks and heavy equipment, the Internet of objects abruptly multiplies the threat surface, increasing the number of vulnerabilities and creating millions of opportunities for threats and attacks.

IOT Risk Management

The recent DDoS attack illustrates the alarming dangers and risks associated with unsecured devices and components of the Internet of objects. It should raise awareness among businesses and individuals and lead them to act on the security of the Internet of objects. According to a recent study released by the computer security firm ESET and the NCSA (National Cyber Security Alliance), about 40% of respondents in the US have no confidence in the security and privacy of connected objects. These security issues will remain at the forefront as long as manufacturers do not seriously fix security vulnerabilities and companies do not strengthen their internal cybersecurity measures to effectively detect and counter future threats. Although securing the Internet of objects involves many parameters (endpoint security, network security, etc.), one of the key pieces of the puzzle is determining how to take advantage of the massive quantities of data continuously generated by the devices.

 

A data-driven approach to prevent IOT cyber attacks

 

Big data plays a crucial role in protecting a company and its assets against cyber threats. The future of the fight against IOT cybercrime will be based on the use of data for cybersecurity. According to a recent Forrester report: "Internet of Things security means monitoring at least 10 times, if not 100 times, more physical devices, connections, authentications and data-transfer events than today. The ability to collect event data and intelligently analyze it across huge data sets will be crucial to the security of connected systems."

Given all this, companies need to think about the two following things to prepare for this new era.

 

The first is that companies need to rethink the security perimeter. Recent attacks targeting connected objects have made clear that the "security perimeter" is now more conceptual than physical. The constantly evolving nature of our hyperconnected world also means constantly evolving threats. As the technical community continues to connect the world and contribute innovations that improve home security, medical care and transport, it is clear that hackers will seek to exploit these same innovations for harmful purposes. We need to rethink the security perimeter as the corporate edge continues to expand beyond the traditional boundaries we were used to.

 

Second, threat detection must scale to the magnitude of the connected objects. As the world continues to hyper-connect, the number of security events that any enterprise must store, consult and analyze also increases significantly. A cybersecurity platform capable of handling billions of events is essential to ensure full supervision of all devices connecting to and accessing a company's network. Technologies such as #MachineLearning for anomaly detection will allow companies to detect suspicious behavior on endpoints without human intervention. ML's scalability, coupled with the Internet of objects, will be the key to the early detection of IOT-specific threats.
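As a hedged sketch of that anomaly-detection idea, the snippet below runs scikit-learn's IsolationForest over made-up per-device event features (events per hour, bytes sent, distinct destinations); a real deployment would stream far richer telemetry.

```python
# Illustrative anomaly detection over invented IOT event features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# 500 well-behaved devices: [events/hour, bytes out, distinct destinations]
normal = rng.normal(loc=[60, 2e4, 3], scale=[10, 5e3, 1], size=(500, 3))
botlike = np.array([[900, 5e6, 120]])   # one device behaving like a bot
events = np.vstack([normal, botlike])

model = IsolationForest(contamination=0.01, random_state=0).fit(events)
flags = model.predict(events)           # -1 marks anomalous devices
print("devices flagged:", int((flags == -1).sum()))
```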

 

As we know, by 2020 the planet will have more than 30 billion connected objects. To get the most out of these revolutionary innovations and prevent them from becoming an IT security nightmare, organizations will have to learn how to manage, process, store, analyze and redistribute a vertiginous volume of data in real time, all while respecting security norms. We increasingly depend on these devices for essential services, and their behavior may have global reach and impact.

 


Big Data: 2017 Major Trends


Over the past year, we have seen more and more organizations store, process and exploit their data. In 2017, systems that support large amounts of structured and unstructured data will continue to grow. These systems should enable data managers to ensure the governance and security of Big Data while giving end users the ability to analyze this data themselves.

Here are the hot predictions for 2017.

 

The year of the Data Analyst – According to forecasts, the Data Analyst role is expected to grow by 20% this year. Job offers for this occupation have never been more numerous, and the number of people qualified for these jobs is also higher than ever. In addition, more and more universities and other training organizations offer specialized courses and deliver diplomas and certifications.

 

Big Data becomes transparent and fast – It is obviously possible to implement machine learning and perform sentiment analysis on Hadoop, but what about the performance of interactive SQL? After all, SQL is one of the most powerful approaches for accessing, analyzing and manipulating data in Hadoop. In 2017, the options for accelerating Hadoop will multiply. This change has already begun, as evidenced by the adoption of high-performance databases such as Exasol or MemSQL, storage technologies such as Kudu, and other products enabling faster query execution.

 

Big Data is no longer confined to Hadoop – In recent years, several technologies have developed around Big Data to cover the need for analysis on Hadoop. But for companies with complex, heterogeneous environments, the answers to their questions are distributed across multiple sources, ranging from simple files to cloud data warehouses, and from structured data in Hadoop to other systems. In 2017, customers will want to analyze all of their data. Platforms for analytics across sources will develop, while those designed specifically for Hadoop will prove undeployable for many use cases and will soon be forgotten.

 

An asset for companies: exploiting data lakes – A data lake is like a huge reservoir: you build a cluster and fill it with data, to be used for different purposes such as predictive analysis, machine learning and cybersecurity. Until now, only filling the lake mattered to organizations, but in 2017 companies will find ways to use the data gathered in their reservoirs to become more productive.

 

Internet of Objects + Cloud = the ideal Big Data application – The magic of the Internet of Objects relies on Big Data cloud services. The expansion of these cloud services will make it possible not only to collect all the sensor data but also to feed the analyses and algorithms that will exploit it. Highly secure IOT cloud services will also help manufacturers create new products that can safely act on the gathered data without human intervention.

 

The convergence of IoT, Cloud and Big Data creates new opportunities for self-service analytics – It seems that by 2017 all objects will be equipped with sensors sending information back to the "mother server". Data gathered from IoT is often heterogeneous and stored in multiple relational and non-relational systems, from Hadoop clusters to NoSQL databases. While innovations in storage and integrated services have accelerated the capture of information, accessing and understanding the data itself remains the final challenge. We will see huge demand for analytical tools that connect natively to, and combine, a large variety of cloud-hosted data sources.

 

Data Variety matters more than Velocity or Volume – For Gartner, Big Data is defined by the 3 Vs: high Volume, high Velocity and a large Variety of data. Although all three Vs are evolving, Variety is becoming the main driver of Big Data investment. In 2017, analytics platforms will be evaluated on their ability to provide a direct connection to the most valuable sources in the data lake.

 

Spark and Machine Learning make Big Data undeniable – In a survey of data architects, IT managers and analysts, almost 70% of respondents favored Apache Spark over MapReduce, which is batch-oriented and does not lend itself to interactive applications or real-time processing. These large-scale processing capabilities have pushed Big Data platforms toward compute-intensive uses: Machine Learning, AI and graph algorithms. Self-service software vendors will be judged on how they make this data accessible to users, since opening ML up to the greatest number will lead to more models and applications, which will in turn generate petabytes of data.
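For a flavor of what that looks like in practice, here is a minimal PySpark sketch (the file name, columns and label are all hypothetical) that assembles features and fits a logistic regression with spark.ml:

```python
# Minimal spark.ml sketch on hypothetical data: assemble feature columns
# and fit a logistic regression on a Spark cluster.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("ml-sketch").getOrCreate()

# Hypothetical dataset with numeric features and a binary 'label' column.
df = spark.read.csv("events.csv", header=True, inferSchema=True)

assembler = VectorAssembler(inputCols=["sessions", "spend"], outputCol="features")
train = assembler.transform(df).select("features", "label")

model = LogisticRegression(maxIter=10).fit(train)
print("intercept:", model.intercept)
```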

 

Self-service data preparation is becoming increasingly widespread as end users begin to work in a Big Data framework – The rise of self-service analytics platforms has made Hadoop more accessible to business users, but those users still want to reduce the time and complexity of preparing data for analysis. Agile self-service data preparation tools not only let Hadoop data be prepared at the source but also make it accessible for faster, easier exploration. Companies specializing in end-user data preparation tools for Big Data, such as Alteryx, Trifacta and Paxata, are innovating and steadily lowering the barriers to entry for those who have not yet adopted Hadoop, and they will continue to gain ground in 2017.

 

Data management policies move in the hybrid cloud's favor – Knowing where data comes from (not just which sensor or system, but which country) will make it easier for governments to enforce national data management policies. Multinationals using the cloud will face divergent interests. Increasingly, international companies will deploy hybrid clouds with servers located in regional datacenters, as the local component of a wider cloud service, to meet both cost-reduction objectives and regulatory constraints.

 

New security classification systems ensure a balance between protection and ease of access – Consumers are increasingly sensitive to the way data is collected, shared, stored and sometimes stolen, an evolution that will push toward more regulatory protection of personal information. Organizations will increasingly use classification systems that organize documents and data into groups, each with predefined rules for access, redaction and masking (a minimal sketch of such rules follows below). The constant threat posed by increasingly aggressive hackers will encourage companies both to tighten security and to monitor access to and use of data.
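Here is an illustrative sketch of such predefined rules; the class labels, roles and masked fields are invented, and each classification simply carries the access and masking policy that downstream tools would enforce.

```python
# Invented classification policy: each class defines who may read the data
# and which fields are masked on the way out.
from dataclasses import dataclass

@dataclass(frozen=True)
class Policy:
    allowed_roles: frozenset
    mask_fields: frozenset

POLICIES = {
    "public":       Policy(frozenset({"anyone"}), frozenset()),
    "internal":     Policy(frozenset({"employee"}), frozenset()),
    "confidential": Policy(frozenset({"finance", "legal"}), frozenset({"ssn", "salary"})),
}

def read_record(record: dict, classification: str, role: str) -> dict:
    policy = POLICIES[classification]
    if "anyone" not in policy.allowed_roles and role not in policy.allowed_roles:
        raise PermissionError(f"role '{role}' may not read {classification} data")
    return {k: ("***" if k in policy.mask_fields else v) for k, v in record.items()}

print(read_record({"name": "Ada", "salary": 100000}, "confidential", "finance"))
```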

 

With Big Data, artificial intelligence finds a new field of application – 2017 will be the year in which Artificial Intelligence (AI) technologies such as machine learning, natural language recognition and property graphs are used routinely to process data. While they were already accessible for Big Data via API libraries, we will gradually see these technologies multiply within the IT tools that support applications, real-time analyses and the scientific exploitation of data.

 

Big Data and big privacy – Big Data will have to face immense privacy challenges, in particular the new regulations introduced by the European Union. Companies will be required to strengthen their confidentiality control procedures. Gartner predicts that by 2018, 50% of violations of a company's ethical rules will be data-related.

 

Sources:

Top 10 Big Data Trends 2017 – Tableau

Big Data Industry Predictions for 2017 – Inside Bigdata

Value Creation with #BigData and #ConnectedObjects

The Internet of Things and Big Data have extended the digital revolution to every part of the economy. With the Internet of objects (IoT) and the data it gathers, we are at the dawn of a new digital revolution: if #BigData helps companies understand the behavior and expectations of their customers, connected objects feed that process.

 

Three aspects of the digital revolution in particular are shaking up technology, industry and the economy, with profound social consequences: "the fall in computing and telecommunications costs, which are gradually becoming cheap resources easily accessible to everyone; IOT evolutions ushering in an era of continuous innovation and the desire to create outside the box; and new economic mechanisms that enable the development of increasing-returns activities that redefine the competitive rules of the game".


 

One by one, all economic sectors are switching to the digital age, threatening with disappearance the businesses that do not evolve. Companies must consider their positioning in this new paradigm and rethink their business model in order to develop new competitive advantages (those of the previous era becoming partially obsolete), and then transform themselves to implement the new vision.

 

Positioning and competitive advantages: companies must first understand the potential value creation of connected objects and Big Data in their markets. Here are four key capabilities of connected objects combined with Big Data (a minimal monitoring sketch follows the list):

 

  • Monitoring: sensors placed on connected objects provide more information and control, making it possible to identify and fix problems. The data can also be used indirectly to inform the design of future objects, to better segment the market and set prices, or to provide more efficient after-sales service;
  • Control: algorithms in the product or in the cloud use the gathered data to control objects remotely when they are equipped with actuators;
  • Optimization: analyzing an object's current and past operating data, cross-referenced with other environmental data, together with the ability to control it, makes it possible to optimize the object's efficiency;
  • Autonomy: combining all the previous capabilities with the latest developments in artificial intelligence makes it possible to achieve a high level of autonomy for individual objects (such as household vacuum robots) or complete systems (such as smart grids).
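A toy Python sketch of the monitoring capability, with an invented read_temperature stand-in for a real sensor driver and an arbitrary alert threshold:

```python
# Toy monitoring loop: poll a (simulated) sensor, keep history for later
# analysis, and alert when a threshold is crossed. All values are invented.
import random
from collections import deque

THRESHOLD_C = 55.0
history = deque(maxlen=1000)   # retained for design/optimization work later

def read_temperature() -> float:
    """Stand-in for a real sensor driver."""
    return random.gauss(45.0, 5.0)  # degrees Celsius

for _ in range(20):
    reading = read_temperature()
    history.append(reading)
    if reading > THRESHOLD_C:
        print(f"ALERT: {reading:.1f} C exceeds {THRESHOLD_C} C")
```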

In addition, connected objects require companies to re-evaluate their environment, as the data produced, and the services and platforms that accompany it, allow for system optimization on a large scale. For example, public transport is already being considered in the context of a wider mobility market, in which the aim is no longer to operate a bus or subway network but to help a customer get from point A to point B.

The ecosystem then expands to include transportation options in and around the city (buses, metros, private cars, taxis, car-sharing, etc.), GPS and mobile applications, users' social networks and city infrastructure (roads, car parks, etc.).


Transformation of the business model: once the emergence of connected objects and their impact on a given market have been assessed, companies must plan their transformation to excel in this new paradigm. First, the company must evolve most of its functions and their expertise, in terms of:

 

  • Design: connected objects must be more scalable, more efficient and less energy-hungry. Greater collaboration is needed between software and hardware teams to design new products and services that integrate more intelligence, sensors and remote capabilities in the cloud using Big Data;
  • Marketing: the new data created by connected objects makes it possible to segment the market more finely and individualize the customer relationship. This individualized marketing also makes it possible to design more adaptable services while preserving economies of scale;
  • Customer service: the role of customer service is gradually evolving toward preventing breakdowns, sometimes remotely. Analyzing the data also allows these services to understand the causes of breakdowns, in particular to improve the design.

 

We are witnessing a new era of the Internet of Things which, along with Big Data and cloud computing, is one of the key foundations for the companies of the future. To make the most of it, companies will have to acquire a much more robust technological infrastructure, since these objects must be created within a safe environment where digital technology can be trusted. More fundamentally, companies need to evolve their structure and governance to gain agility and adaptability.

First step in the #Cloud? Questions you must ask your Cloud Service Provider before moving to the Cloud

Services based on the cloud computing model are gaining more and more importance and are considerably changing the way companies manage their data. Some see the cloud as a practical solution for automatic storage, others as a way to guard against data loss, and still others as the way to have constant access to their data. Cloud computing brings companies smart, tangible benefits, such as on-demand storage, cost savings and access to multiple servers.

 

Studies have revealed that over half of the companies adopting cloud computing resources see significant improvements in their productivity, and studies conducted by the Cloud Industry Forum show that 90% of UK and European companies run at least two cloud services. However, it is not enough to register with a provider, migrate data to a remote server, and then passively enjoy the savings. The cloud certainly has its advantages, but companies must learn about the challenges of the transition before taking any further step in its direction.

 

A cloud failure is a good example of the critical problems any organization may face when operating cloud services. Cloud failures can make data inaccessible to companies, or can even cause the loss of valuable data. As a business, what can you do to prevent data loss? And what questions should entrepreneurs ask before choosing a cloud solution? Here is a list of important questions you must ask any cloud vendor before signing the contract!

 

    • Who is responsible for my data?
      Although the data centres belong to the cloud service provider, ultimate responsibility for the stored data usually remains with the client. Therefore, if data is lost for whatever reason, it is the company, not the provider, that will be held accountable.

 

    • Does your cloud solution allow access 24/7, 365 days a year?
      Your company operates in real time and runs on its computer data: customer files, emails, accounting and important administrative documents. In the event of a disaster such as a computer crash, a virus or a damaged hard drive, you will not have time to save all your documents and files before the accident causes economic damage and data loss.
      You will be happy to know that the cloud is the ideal solution, giving you access to your data anywhere, any time. But is that so certain?
      Before outsourcing your data to the cloud, check its availability rate, in other words the maximum uptime, in hours or days, guaranteed by your cloud provider. An outage, whatever its duration, can cost the company money. This availability rate is a percentage, and it is the third figure AFTER the decimal point that matters. A guaranteed uptime of 99.9% may sound good on paper, but over a whole year it can leave you with roughly nine hours of downtime without the provider violating its obligations. A 99.999% guarantee, by contrast, equates to only about five minutes of downtime per year (see the quick calculation below).
      So we advise you to avoid any hosting guarantee below 99.995% (about 26 minutes of downtime per year)!
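The arithmetic behind those figures is easy to check; this snippet converts an availability percentage into a maximum yearly downtime:

```python
# Convert an SLA availability percentage into maximum downtime per year.
HOURS_PER_YEAR = 24 * 365  # 8760

def max_downtime_hours(availability_pct: float) -> float:
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for sla in (99.9, 99.995, 99.999):
    hours = max_downtime_hours(sla)
    print(f"{sla}% uptime -> {hours:.2f} h/year ({hours * 60:.0f} min)")
# 99.9% -> 8.76 h (~nine hours); 99.995% -> ~26 min; 99.999% -> ~5 min
```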

 

    • In case of natural disaster, can your cloud provider immediately restore all your data WITHOUT loss?
      Ask the cloud provider what type of restitution method it will implement if such an incident occurs. Can you recover all your files on simple request? How quickly, and in what format?
      Insist that your provider return ALL of your data on demand, whatever the reason, within a reasonable time and in a format your servers can read! You should know that most cloud services include a backup by default. It remains important to carefully review the provider's backup policies, checking the data retention period (this allows you to recover your data as it was at a particular point in time).

 

    • Your data is "in the cloud", but where exactly?
      Do not let anyone play with your data privacy! To guarantee the confidentiality of your data, check which country it is hosted in. The jurisdiction of certain countries allows various government services free access to your data; hosting your data in Europe, America or Asia therefore subjects it to different legislation. The Privacy Shield ("Personal Data Protection Shield"), the agreement regulating the use of European citizens' personal data by companies on American soil, was validated by the Member States last July 8 but leaves many gray areas in practice.

 

    • What security measures are in place to protect your data?
      All cloud providers have invested, to a greater or lesser extent, in security measures to protect against external attacks. Do not hesitate to meet your future provider's technical team and ask them directly about their security measures.

 

    • What level of support is available?
      One of the most important points when choosing a cloud partner is the level of service it offers, as defined in the SLA (Service Level Agreement). Small businesses will be more dependent on advice and expert assistance in these matters than large groups: big organizations can set up a public or private solution and manage it themselves, which is too complex and time-consuming for small structures.
      Firms that depend on this type of technical assistance need to determine what technical input they can reasonably expect from their cloud service provider. This assistance will focus mainly on maintenance and automatic data updates. The choice of supplier is therefore essential on this point, and can make the difference between a successful deployment and a failed rollout in which the customer is unable to take advantage of the tools at their disposal.

 

    • What about the general terms and conditions?
      Companies should take the time to read the small print at the bottom of the contract, since it often contains essential clauses that can have important implications later on. For example, a standard contract will often state the level of compensation a client can claim if the service guarantees are not met, as well as information about data recovery procedures.
      The contract should also explain what happens to the company's data at the end of the contract. The last thing a customer wants to discover is that the supplier has not completely erased its data at the end of the contract, which can be a source of potential threats.

 

If you have asked yourself these questions and obtained all the answers, then you are on the right path to a successful cloud project and to finding a serious supplier with whom you can maintain a good relationship for years, knowing that your data will be safe and always available when you need it.