#Data: An Important Piece of “The #InternetOfThings” Puzzle


Every day, connected objects generate billions of pieces of information that must be processed and analyzed to become usable. Thanks to the spread of connectivity across devices, the arrival of inexpensive sensors and the inflation of the data they transmit, the IoT has taken an irreplaceable place in our daily lives. IDC forecasts the worldwide IoT market to grow to more than $7.1 trillion by 2020. The number of devices will more than double from the current level, with 40.9 billion forecast for 2020.


These very serious estimates do not, however, capture the full extent of this digital revolution. If the design of connected objects is the showcase of the IoT and its vast possibilities, exploiting them still requires strong skills in processing the data collected from sensors, terminals, machines and platforms, and in interpreting it to boost productivity and increase performance.


Just as in the jewelry market the big winners are the gold and diamond dealers, in the IoT domain this role is played by companies able to manage the mountains of data generated by connected devices, because the collected data is profoundly changing the way businesses operate. Almost every day, new applications are imagined, with consequences at all levels of organizations, because the real added value of connected objects comes only from their uses and from companies’ ability to create new services.


Several studies show that companies still face a gap between collecting new data and presenting the analyzed information in a way that can be understood and explored in detail, whether for a connected house, a connected car, a portable terminal or an industrial solution.


Below is a list of points companies must consider before implementing any IoT project:


  • Sort valuable information out of a large volume of data:
    Exploiting the IoT means generating a huge amount of data. The challenge for companies is to filter out the noise and find the information that really matters. This is why many companies combine stream analysis with process analysis: the first provides real-time insight from data streams such as navigation paths, logs and measurement data, while the second works on captures of machine data.


  • Set and manage priorities:
    The IoT implies different levels of urgency and latency, and it is important to take this into account because users expect to interact with the “real world” in real time. For example, sensors in mines must trigger an alert as soon as they detect the presence of toxic gases. Conversely, other IoT information is not needed “just in time”, such as data collected regularly to refine and improve a predictive model; this data can be collected and processed a few times a day, for example.


  • Design considerations for IoT technologies:
    Information security, privacy and data protection should systematically be addressed at the design stage. Unfortunately, in many cases they are added later, once the intended functionality is in place. This not only limits the effectiveness of the added-on security and privacy measures, it also makes them more expensive to implement. Although industries are actively working to address this, it remains a major IoT problem.


  • Cross the data:
    In the case of preventive maintenance, for example, companies want to collect data from objects (such as smart meters) and cross-reference it with relevant relational data, such as maintenance agreements, warranty information and component life cycles. It is therefore essential that companies can rely on the data from which they make important decisions.


  • Tracing the data:
    The increased collection of data can raise issues of authentication and trust in the objects. It should also be noted that, by combining information collected about and from multiple objects related to a single person, that person may become more easily identifiable and better known. So, to fully exploit the potential of the IoT, tools must be much more flexible and allow users to shape and adapt data in different ways, depending on their needs or those of their organization.
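The first tip above, sorting valuable readings out of a noisy stream, can be sketched in a few lines. This is a minimal illustration using a hypothetical reading format (a dict with a "sensor" id and a numeric "value"), not a production filter:

```python
# Minimal sketch of stream filtering for IoT readings (hypothetical schema).
def filter_stream(readings, low, high):
    """Keep only well-formed readings whose value falls in the expected range."""
    for r in readings:
        value = r.get("value")
        if not isinstance(value, (int, float)):
            continue  # drop malformed payloads instead of polluting analytics
        if low <= value <= high:
            yield r

readings = [
    {"sensor": "t1", "value": 21.5},
    {"sensor": "t2", "value": None},   # broken payload
    {"sensor": "t3", "value": 999.0},  # out-of-range spike
    {"sensor": "t4", "value": 19.0},
]
kept = list(filter_stream(readings, low=-40, high=85))
print([r["sensor"] for r in kept])  # ['t1', 't4']
```

A real deployment would apply this filtering on the message bus rather than on an in-memory list, but the principle, dropping malformed and out-of-range readings before analysis, is the same.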


Collaboration between the IT team and business experts is more critical than ever in analyzing IoT data. In addition to people who understand the data, it takes experts on the specific devices or sensors it was gathered from. While any analyst can understand the data in the context of a company’s performance indicators, only a data specialist can explain what wealth of information the hidden data contains and how, with the right tools, companies can unleash that potential.

From Data to Knowledge: #BigData and #DataMining

The increasing digitization of our activities, the ever-growing capacity to store digital data and the resulting accumulation of data of all kinds have generated a new sector of activity whose purpose is the analysis of large quantities of data. New approaches, new methods and new knowledge are emerging, and ultimately, no doubt, new ways of thinking and working. This very large amount of data (big data) and its processing (data mining) affect sectors as varied as the economy and marketing, but also research and knowledge.

The economic, scientific and ethical implications of this data are quite significant. The fact that this is a constantly evolving sector, where changes are frequent and rapid, does not make analysis easy… A deep knowledge of data is nevertheless necessary in order to better understand what data mining is.


1 – What is data mining?             


Data mining means exploring very large amounts of data. Its purpose is to extract knowledge from large quantities of data using automatic or semi-automatic methods. Data drilling and Knowledge Discovery from Data (KDD) are other names for data mining.


  • How and why are such quantities of new data generated? Every minute, 149,519 e-mails are sent worldwide, 3.3 million posts are published on Facebook, 3.8 million queries are made on Google, 65,000 photos are uploaded to Instagram, 448,000 tweets are sent, 1,400 posts are published via WordPress, 500 videos are uploaded to YouTube and, last but not least, 29 million messages are sent via WhatsApp. These numbers can make one’s head spin, but the important thing to note is that humans are not the only producers of data: machines also contribute, with their SIM cards, their sensors, and so on.
  • What to do with this data? If the contemporary phenomenon of data accumulation is easy to grasp, it is perhaps harder to perceive how this data is changing the world. It depends on how one is able to treat it. Science, IT and the medical sector rely heavily on statistics, on counting, and so on. From the moment a data set can be dealt with exhaustively, where cross-referencing and sorting can be carried out on a scale scarcely imaginable a few decades ago, our analyses of our environment change and multiply. In short, data is a tool for management, decision support and evaluation in every sector, and the raw material of information, allowing the understanding of a phenomenon, a reality.
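As a toy illustration of what “extracting knowledge” can mean, the sketch below counts item co-occurrences in a set of hypothetical purchase baskets, a drastically simplified version of the association-rule mining used in real data-mining work:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets; real mining works on millions of records.
baskets = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "milk"},
    {"butter", "jam"},
]

# Count how often every pair of items appears together.
pairs = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pairs[(a, b)] += 1

print(pairs[("bread", "butter")])  # 2: a "bought together" pattern emerges
```

Scaled to real transaction logs (and with support/confidence thresholds), this is the kind of pattern extraction that turns raw records into actionable knowledge.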


2 – Value of Data


While IT organizations are best placed to grasp the market potential of data accumulation and processing, this is not the case everywhere: the idea that data is the new oil is making its way more slowly than one might have imagined.

  • What is the market value of the data? The data built up through a variety of IT operations is a valuable potential asset that companies are not always aware of or do not always use. Even if they do not necessarily know how to exploit the data themselves, they hold resources that are not yet profitable for them. This gathered data and its use are a key issue for companies: big data is a real source of marketing opportunities.
  • Data that must be protected is complex to exploit: personal data poses many problems for researchers specialized in its analysis. First, they point to the need to protect it better and to ensure its conservation. Moreover, producing interesting results from it requires very specialized skills.


3 – Data mining and targeted marketing 


One of the most significant applications of data mining is undoubtedly the reinvention of marketing, because data mining allows companies to reach consumers very precisely by establishing accurate and reliable profiles of their interests, purchasing habits, standard of living, etc. Moreover, there is no need to go through a complicated search process: every Internet user leaves enough traces when surfing, tweeting or publishing on Facebook for profiling to be possible, most of the time without their knowledge…

  • A new space for social science research: viewed from another angle, this accumulated data is a gold mine for researchers. Some behavioral researchers have studied the attitudes of Internet users on dating sites. In addition to finding that the data they use is more reliable than that obtained by interviewing individuals (it is easier to lie to an investigator than to a machine…), they can carry out analyses that are not politically correct but very informative!


4 – The data mining forecast tool

Data mining is also a tool that multiplies the possibilities offered by probability calculations. Because it makes it possible to cross-reference a large volume of data and, above all, to apply these calculations to many different fields, it now appears capable of making forecasts. Data mining for forecasting offers the opportunity to turn the numerous sources of time-series data, both internal and external, available to the business decision-maker into actionable strategies that can directly impact profitability. Deciding what to make, when to make it and for whom is a complex process. Understanding what factors drive demand, and how these factors interact with production processes and change over time, is key to deriving value in this context. Today, some scientists do not hesitate to announce that they will soon be able to predict the future. All this, thanks to data!
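A deliberately simple sketch of the forecasting idea: predicting the next value of a hypothetical monthly demand series with a trailing moving average. Real forecasting models are far richer, but the principle of learning from past time-series values is the same:

```python
# Naive time-series forecast: average of the last `window` observations.
def moving_average_forecast(series, window=3):
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

demand = [100, 104, 98, 110, 107, 111]  # hypothetical monthly demand
print(moving_average_forecast(demand))  # mean of the last three months
```

In practice one would also model trend and seasonality, but even this baseline is a useful yardstick: a fancier model that cannot beat the moving average is not adding value.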

  • Probabilities and predictions: today, predictive statistics tackle all sorts of issues: natural disasters, health, delinquency, climate… Statistical tools are numerous and are combined to improve outcomes, as when using “random checks”. Even more fascinating, some software is capable of improving itself, accumulating ever more data to boost its performance… In the meantime, it is possible to rely on these analyses to try to avoid the flu or to get vaccinated wisely.
  • Anticipating or preventing crimes: if the idea of software able to predict crimes and misdemeanors reminds one of Spielberg’s film “Minority Report”, reality has now caught up with fiction: the PredPol (predictive policing) software estimates, better than any other human technique or analysis, the places where crime is likely to occur, so that police patrols and other preventive measures can be better placed.
  • Preventing fraud: another perspective offered by data mining is improving the fight against fraud and “scams” in the insurance sector. Here again, it is a matter of better targeting the controls, and apparently it works: this technique gives very clear results. In more than half of the cases where a controller performs a targeted check on the basis of data mining, he finds something. Insurance companies also apply this type of analysis to detect scams.
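The fraud-targeting idea above can be sketched with the simplest possible rule: flag claims whose amount is statistically far from the rest. The figures are hypothetical, and real systems combine many variables, but the targeting logic is the same:

```python
import statistics

# Flag values more than `threshold` standard deviations from the mean.
def flag_outliers(amounts, threshold=2.0):
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    return [a for a in amounts if stdev and abs(a - mean) / stdev > threshold]

claims = [120, 135, 128, 140, 131, 2500]  # one wildly out-of-line claim
print(flag_outliers(claims))  # [2500]
```

Controllers then review only the flagged cases, which is exactly what “better targeting the controls” means in practice.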

Physical & Cloud #DataProtection: Best Practices for your #Backup and #RecoveryProcess

Data has become one of the most valuable assets of organizations: massive data is the new currency. Thanks to advances in technology and connectivity, data creation is skyrocketing. According to IDC, this data is expected to double every two years for the next decade, hitting 45,000 exabytes in 2020. This data is stored in ever-growing environments and connected devices, so the backup and restore capability of an information system is a real challenge for ensuring business continuity and the availability of the associated data.

Data Protection

What must IT departments do to fulfill the data security mission? The data security policy is at the heart of every business’s concerns and should be a fundamental part of its security strategy. Planned security measures can then be turned into tactical and operational rules through the joint efforts of the security and storage teams. To this end, storage must be an integral part of the company’s security strategy.


To achieve these objectives, a company must organize itself around the following five essential aspects:
• Allocation of responsibilities;
• Risk Assessment;
• Development of a data protection procedure;
• Communication of data protection procedure;
• Execution and testing of the data protection procedure.


  1. Allocation of responsibilities

The goal is to make storage security a fully-fledged part of the IT security architecture. Even if the company decides that responsibility for backup or storage security rests with the storage team, it must nevertheless integrate any security measures in this area with the task of securing the rest of the infrastructure. This integration will contribute to establishing defense in depth. It is also advisable to share responsibility for extremely sensitive data: it is better to ensure that the person authorizing access is not the same as the person responsible for enforcement.


  2. Assessment of storage risks in the area of IT security

Managers must review each step of their backup methodology to identify security vulnerabilities. Can an administrator secretly make copies of backup tapes? Are the tapes stored in boxes accessible to everyone? Is there a rigorous end-to-end chain of custody for backup tapes? If critical data is backed up and transported, vulnerabilities of this nature could make it easy prey. If the risk analysis reveals many vulnerabilities, the company must seriously consider encrypting its data.


  3. Development of an information protection program that guarantees the security of company data, at all times, wherever it is

Multi-level protection should be adopted, taking existing best practices for the data network and applying them to the storage network, while adding specific layers adapted to the characteristics of the archived data, for example:

  • Authentication: application of multi-level authentication and anti-spoofing techniques (against identity or address spoofing).
  • Authorizations: access rights according to roles and responsibilities (as opposed to total administrative access).

It is imperative to duplicate backup tapes, because it is never good to depend on a single copy of the data. Despite the longevity of tapes, they remain exposed to environmental and physical damage. A common practice is to perform nightly backups and then store the tapes off-site without any verification. Recommended best practice is to duplicate the backup tapes, verify them, and then store the copies off-site.
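The “duplicate, then verify” practice can be automated. The sketch below (plain Python, hypothetical file names) copies a backup file and compares SHA-256 digests before a copy would be shipped off-site; the same check applies to tape images:

```python
import hashlib
import os
import shutil
import tempfile

# Compute a file's SHA-256 digest in chunks, so large images fit in memory.
def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

workdir = tempfile.mkdtemp()
primary = os.path.join(workdir, "backup.img")        # hypothetical backup image
with open(primary, "wb") as f:
    f.write(b"pretend this is a nightly backup")

offsite = os.path.join(workdir, "backup-offsite.img")
shutil.copyfile(primary, offsite)

# Refuse to ship a copy whose digest does not match the original.
assert sha256_of(primary) == sha256_of(offsite), "copies diverge: do not ship!"
print("copy verified")
```

Storing the digest alongside the off-site copy also lets you re-verify the medium years later, which matters given tape lifetimes.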

Magnetic tape remains the preferred storage medium for backups because it is economical and offers enough capacity to back up an entire operating system on a single cartridge. When stored properly, archival tapes have a lifetime of more than 30 years, making them an exceptionally reliable storage medium.


  4. Communication of the procedure to be applied with regard to the protection and security of information

Once the procedure for protecting and handling sensitive data has been defined, it is important to ensure that those responsible for its safety are informed and trained. Safety rules are the most important aspect of assigning responsibilities. Functional managers need to be aware of the risks, the countermeasures and their costs.

Data loss and intellectual property theft affect the entire enterprise, not just the IT department. As such, the Director of Security must drive the data security approach by training the various functional managers in the risks, threats and potential harm arising from security breaches, as well as the cost of the various possible countermeasures. In this way, company executives can be made aware of the cost/benefit trade-offs of investments in data security.


  5. Implementation and testing of the data protection and security plan

Securing data is not about technology but about procedure, which is why it is essential to test the procedure. Moreover, as the company grows, its security and data protection needs evolve, and IT security practices must evolve with them.

Only once the complete security plan has been developed, defined and communicated to the teams concerned is it time to implement it. The IT team must put in place the tools, technologies and methodologies necessary for classifying information. New technologies may be required to classify information or label it with metadata so that it is backed up according to the appropriate rules and procedures.
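A minimal sketch of that metadata-driven classification, assuming a simple keyword rule set (the labels and keywords here are hypothetical examples): documents are tagged so that the backup procedure can apply the right retention and protection policy to each class.

```python
# Hypothetical sensitivity rules: first matching label wins, default "public".
RULES = [
    ("confidential", {"salary", "password", "contract"}),
    ("internal", {"roadmap", "meeting"}),
]

def classify(text):
    words = set(text.lower().split())
    for label, keywords in RULES:
        if words & keywords:
            return label
    return "public"

print(classify("Q3 salary review"))    # confidential
print(classify("team meeting notes"))  # internal
print(classify("press release draft")) # public
```

Real classifiers use trained models and document metadata rather than bare keywords, but the output is the same kind of label the backup rules consume.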

Once in place, the procedure must be tested, for both backup and restore. Testing means injecting into the process every possible and imaginable danger, whether the loss of a tape or a server, network problems, equipment failure, data corruption or any other scenario that could affect the company’s operations.

It is advisable to carry out tests with personnel who are less familiar with the procedure, to ensure that it can nevertheless be applied without difficulty in the absence of the usual supervisor (due to illness, holidays or departure).

How Is Artificial Intelligence Impacting the Tourism Sector?

Artificial intelligence has existed for several years, yet it is now reaching another dimension, thanks to more powerful computers and the multiplication of available data. Given its capacity to lift every sector of activity, it is undeniably of great interest to tourism. With the wealth of data available to professionals, there is now a multitude of technologies: recommendation applications, real-time chatbots and personalized concierge services. The aim is to simplify the work of tourism industry professionals so that they can return to their core business with powerful tools and technologies, making a real difference in terms of profit and customer satisfaction. But the question one must ask is: how can artificial intelligence be used wisely?

Artificial Intelligence and Tourism

The first point: if we think about the future of tourism in terms of types of travelers, it is certain that we will be dealing with several categories of profiles, which may overlap. The first category, for example, will consist, as is the case today, of travelers wishing to disconnect radically from their “everyday” environment in order to immerse themselves in another culture, by all possible means.

The second category, more cautious, will want simple trips, without risks, even without surprises, good or bad. This does not exclude, quite the contrary, the survival of adventure tourism.

For the last profile, the purpose of a journey will be less the destination than the experience to be had there. These travelers will go somewhere to learn how to cook a rare product or to take up a new activity, based on information provided by their peers. The purpose of their travel will be learning.

Whatever the size of the group and the number of establishments it counts, it seems to me that we are moving towards a world where the tourism offer will continue to grow, thanks to two levers: new destinations and new traveler profiles. Players will need to be extremely flexible about customers’ expectations, responding with innovative services that accompany travelers at each stage of their journey: before, during and after their stay.


How can AI’s added value be applied to tourism?
Through customization. And that is what profoundly changes the ins and outs. Rather than offering the same experience for the same type of trip, artificial intelligence makes it possible to match the desires, habits and preferences of the tourist with the proposed product. Artificial intelligence makes a pool of data meaningful: by learning what the customer is looking for, buying and loving, it makes it possible to generate customized, targeted offers that are more likely to be converted into a purchase.

Today, cognitive systems are capable of interacting in natural language; they can process a multitude of structured and unstructured data, enriched with geo-localized content, and learn from each interaction. These systems will rapidly become essential to the industry’s strategic topics, such as the “smarter destination”, the personalization of the customer experience and customer loyalty, as well as the provision of management, analysis and marketing services, all powered by big data. These services will be an asset in making the whole tourism sector more efficient, helping the actors and structures in place.


How far can artificial intelligence push the tourism industry?
Not to the point of replacing humans. Robots are used for certain tasks, but not as a replacement for people; in the long term this could happen, but the problem of the energy that robots consume must first be solved. Discussions of artificial intelligence often compare it with human intelligence, so it is important to note that the aim of cognitive systems is NOT to replace human beings: robots cannot reason or learn as a human being can. They serve the needs and imagination of tourism professionals who, with the help of partners, benefit from them thanks to their expertise.


As I mentioned above, AI is not a new technology; we have been interested in it since the 1950s and 1960s. If the subject seems quite new today, it is because the data has only now become available. Tourism, like all industries, has been digitized and offers a wealth of data to which machine learning can be applied. So AI is a revolution in progress, to the extent that it leads to new ways of thinking about the supplier’s offer.

Understanding the #Blockchain Economic Revolution

The Blockchain is a revolution that is undoubtedly leading to a complete overhaul of economic activity. It is not a simple geek trend, yet most people still have absolutely no idea what the blockchain stands for. It is essential to distinguish clearly between bitcoin, crypto-currencies and the breakthrough technology underlying the name: the #Blockchain. You must know that there are several types of blockchains on the market, and bitcoin is one application of the technology that has had huge success in recent years.


In short, the #Blockchain is a technology for storing and transmitting information that is transparent, secure and operates without a central control unit. Transactions between network users are grouped into blocks. Each block is validated by the nodes of the network, called “miners”, using techniques that depend on the type of blockchain. This process establishes trust between market players without going through a central authority. It is an open-source system in which each link in the chain carries its own legitimacy. The decentralized nature of the chain, coupled with its security and transparency, suggests a revolution of unimaginable scope. The fields of opportunity open far beyond the monetary sector.
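The chaining idea itself fits in a few lines: each block stores the hash of the previous one, so tampering with any past transaction breaks validation. This toy sketch omits everything that makes real blockchains work at scale (consensus among miners, signatures, proof-of-work):

```python
import hashlib
import json

# Hash a block deterministically (sorted keys so the digest is stable).
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

# A chain is valid if every block really points at its predecessor's hash.
def is_valid(chain):
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, ["alice pays bob 5"])
add_block(chain, ["bob pays carol 2"])
print(is_valid(chain))  # True

chain[0]["transactions"] = ["alice pays mallory 500"]  # tamper with history
print(is_valid(chain))  # False
```

This is why the post can speak of trust without a central authority: any rewrite of history is immediately detectable by every node holding a copy of the chain.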



In fact, it is a revolution comparable, in human history, to the advent of commerce. When individuals bought and sold their products face to face, trust was established with a handshake. Later, globalization created new needs: entities were set up to protect sellers and buyers, and laws and legal services developed around financial exchanges. Every market came to need intermediaries at the grass-roots level, without it being possible to assess or quantify a degree of trust between people. What changes with the blockchain is not only its decentralized aspect but also the absence of intermediaries. Blockchains could replace most centralized “trusted third parties” with distributed computing systems. More than that, many observers highlight the blockchain as an alternative to back-office systems in the banking sector. It could also help eradicate corruption in global supply chains.


The boom of the Internet offers some good indications of how the blockchain could develop. The Internet reduced communication and distribution costs: a WhatsApp message, for example, is much cheaper than an SMS, just as selling software or services through an online platform is cheaper than selling products through a physical store. Thanks to the Web, marginal operating costs have been reduced to almost zero, causing profound changes in the telecommunications, media and software markets. Blockchains, in turn, promise to bring marginal transaction costs close to zero.


Blockchains are a low-cost disruptor for any business that acts as a market intermediary. They allow things that have never been possible with existing infrastructure and financial resources. We can exchange things that were not previously considered assets: data, reputation or unused power. The possibilities are as vast as they are unimaginable, but that does not mean that every type of asset will be profitable for a company.


It is preferable not to dwell first on the technological aspect; it is much better to focus on the root of your customer’s problem. Successful businesses know how to identify what their prospects are missing or worried about, and how to solve it. Blockchain technology is valuable in settings where data has to be shared and edited by many parties that do not trust each other. That is the infrastructure. The added value comes from the services built around it, as applications or modules.

We are currently in the infrastructure phase of the market; standards and platforms to democratize blockchain technology are still to come. In the near future, thanks to the rapid pace of development of this system, it will be as easy for developers and entrepreneurs to use a blockchain on a daily basis as it is to use the MySQL or MongoDB databases we rely on today. Once the infrastructure stage is over, the evolution of blockchains will become really exciting: the infrastructure will be a huge database on which companies will be able to operate all kinds of connected objects and devices. Connected devices will collect data, blockchains will secure, share and process it, and artificial intelligence applications will automate activities.


Just imagine farms where the produce is grown and picked by robots and delivered to your home by drone, with a connected refrigerator that alerts you when you need something. An artificial intelligence system manages preset objectives to perfectly match supply and demand. Blockchains are much more than just bitcoin: they are the real building blocks of our future world.

The Impact and Challenges of Artificial Intelligence for Next-Gen Enterprises

Artificial Intelligence (AI) is not a new phenomenon. It continues to develop, and its applications are already very present in our personal daily lives (gaming, robotics, connected objects…), arousing as much enthusiasm as fear. This complex concept owes its fame to the world of science fiction. Although AI still fuels a more or less fantasized imagination, it is an integral part of reality and can be found in many services, systems and applications.


What role can artificial intelligence play in the enterprise of the future? Will AI make organizations smarter? These are the main questions motivating big companies seeking to analyze and anticipate the impacts of this ongoing revolution. In this post, I will discuss the organizational, legal and ethical issues related to the governance of artificial intelligence in large enterprises.


A critical factor in adapting the company to the evolutions and challenges of the AI environment is rethinking its relationships with stakeholders, and in particular with the customer. This does not mean merely proclaiming that “the customer is important” but emphasizing interaction with the customer: the client exists only through the interest and interactions developed with them.

So the question companies should ask is: how can they develop a successful interaction with the help of artificial intelligence? What does this mean concretely in terms of channels, content, customer knowledge and, above all, commitment to the customer?


Some companies have an “innovation and foresight” unit to analyze and reflect on the impact of AI within the company, taking this step without neglecting the employees, who are at the center of the subject. These units allow the sharing of ideas, since the applications of artificial intelligence within the company are diverse: augmenting human expertise through virtual assistants; optimizing certain products and services; opening new perspectives in research and development through the evolution of self-learning programs. The objective of such a unit is to exchange views and build foresight, in a participatory way, through conferences, round tables, written reports or scenarios, depending on the choices of the structures.


The Impact of Artificial Intelligence for Enterprises


Artificial intelligence technologies are already anchored in our daily lives. These technological advances raise intense questions about managerial and organizational practices around innovation in large companies. Many surveys show that, in general, companies do not have a dedicated budget for artificial intelligence. Nevertheless, there are either investment projects or resources that can be allocated to artificial intelligence teams integrated into wider data teams. Be that as it may, the subject of artificial intelligence is present in large enterprises; it may remain theoretical, but it may also be the subject of initial experiments, notably with predictive algorithms. Artificial intelligence does not fundamentally change everything in the company; rather, it will “augment” performance, automating or perfecting certain processes and/or operations.


Benefits for organizations:


Today, artificial intelligence already generates many benefits for organizations, notably by:

  • Responding to Big Data issues; Artificial intelligence relies in large part of the search and mass analysis of data from which it can learn;
  • Increasing human decision-making expertise, online help assistant: a Hong Kong-based company, Deep Knowledge Venture (DKV), possesses, for example, artificial intelligence at its board of directors. Vital (Validating Investment Tool for Advancing Life Sciences) who makes investment recommendations and is also entitled to vote;
  • Optimizing services and products: improving customer knowledge, decision-making and operational processes;
  • Strengthening systems security: in the area of ​​cybersecurity, artificial intelligence becomes a structuring element of IT infrastructures, in order to secure networks. Automatic recognition is well established for the detection of fraud, and experts are under way to create algorithms that will identify threats that human brains and traditional security mechanisms fail to recognize.
  • Helping to make discoveries: some companies in the field of health analyze all the scientific publications related to a particular area of research, which allows them to look for new properties and new molecules.
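To make the fraud-detection point above concrete, here is a minimal, illustrative sketch of statistical anomaly detection. The data and threshold are invented for illustration; production systems would use far richer models, but the principle of flagging values that deviate strongly from a robust baseline is the same:

```python
import statistics

def detect_anomalies(values, threshold=3.5):
    """Flag values whose modified z-score (based on the median absolute
    deviation, which is robust to outliers) exceeds `threshold`."""
    med = statistics.median(values)
    mad = statistics.median([abs(v - med) for v in values])
    return [v for v in values if mad and 0.6745 * abs(v - med) / mad > threshold]

# Hypothetical transaction amounts; the last one stands out.
amounts = [102, 98, 110, 95, 105, 99, 101, 97, 103, 5000]
print(detect_anomalies(amounts))  # [5000]
```

Using the median rather than the mean keeps the baseline stable even when the outlier itself is extreme, which is exactly what fraud cases tend to be.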




The challenges for large companies are numerous, starting with cultural and organizational changes. As noted in the Telecom Foundation’s Watchbook No. 8: “The craze for artificial intelligences has been accelerated by the availability of AI capabilities in the form of APIs (vision or predictive, on the one hand) and the source code of the Machine Learning platforms released by major Internet operators, on the other hand”.

These technology facilitators will keep pushing companies to expose their resources as APIs in order to optimize them. It is therefore necessary to understand the world of APIs in this transversal, cross-enterprise approach, which poses a number of challenges for large companies. To succeed, one must develop the following roadmap strategy:

  • Build stronger relationships with clients;
  • Optimize internal processes;
  • Accelerate the development of new offerings.


To conclude, I’ll say that we are living in a golden age of artificial intelligence, boosted by the web giants’ growing interest in the stakes of Big Data. The first AI investors are indeed the pure players of the Internet and the main software players. The movement is launched, and it is our responsibility to anticipate the effects of this revolution on large companies.

IT Challenge : Re-invent business models … or disappear!


Technologies have significantly lowered market entry barriers and the development of free-of-charge models has favored the appearance of business models that have destabilized the positions of historical actors in most sectors.

As a matter of fact, in this digital age, existing approaches to developing, elaborating and describing business models are no longer appropriate for new connected business models. Technologies and services are becoming obsolete faster than in the past, consumers are eager for innovation and a better customer experience, and the need for agility weighs on production capacities and information systems; cooperation therefore becomes a must.


In this changing context, the risk of disappearing has never been more present for a company: this is what motivates collaboration and alliances between often competing actors. Cooperation can be seen as a positive competitive response to the decline phase that threatens businesses of all sizes. This context justifies a reflection on the identification of large organizations’ strengths and weaknesses in comparison to their competitors. By 2020, the ability to renew their business model will be critical to the growth and profitability of large firms.


Data Capitalizing and Customer Experience


Data is the black gold of the present and the future. But according to Gartner, 85% of the largest organizations in the Fortune 500 ranking won’t get a real competitive advantage from Big Data by 2020. The majority of them will still be in the experimental stage, while those who have capitalized on Big Data will gain a competitive advantage of 25% or more compared to their competitors. Therefore, the development of new products and services, facilitated by the intelligent use of data, creates real changes and new business opportunities.

In a context where the risk of disintermediation is major, control of customer relations, mass customization, co-design with consumers will be fundamental to the success of companies in 2020.


Challenges: Differentiating and Innovating


  • Understand: it is essential to keep track of the business models and strategies of one’s digital-world competitors, the latter being both potential threats and powerful levers of development.
  • Transform: Large groups, especially if they are economically powerful, often find it difficult to transform their organization and integrate innovation, because of their complexity.
  • Listen: anticipating the needs of consumers and focusing on the customer experience means constantly evolving business models in order to develop business agility.
  • Collaborate: creating strategic partnerships with the company’s ecosystem, especially suppliers, accelerates innovation processes and reduces time to market.
  • Adjust: the digital transformation must take into account the context and the business challenges of the company.
  • Innovate: know-how in terms of software development can become more and more strategic for the company.

#BusinessIntelligence: for a better Control of Data

Business intelligence (BI) is a subject in full evolution, addressing general management as well as the business lines. BI helps decision-makers get an overview of the different activities of the company and its environment. This cross-sectional view requires knowledge of the various business lines and involves certain organizational and managerial specificities. From the exploitation of business data to IT governance, the Business Intelligence point of view and its decision-making tools, such as reporting, dashboards and predictive analysis, are essential to the success of a business.

The organization of BI in the company is highly dependent on the organization of the company itself. However, BI can have a structuring impact for the company, notably through the formalization of data repositories and the setting up of a competence center.


What is the purpose of Business Intelligence?


Business Intelligence (BI) encompasses IT solutions that provide decision support to professionals with end-to-end reports and dashboards to track analytical and forward-looking business activities of the company.


This notion appeared at the end of the 1970s with the first infocentres. In the 1980s, the arrival of relational databases and the client/server model made it possible to separate production computing from decision-making systems. At the same time, various vendors positioned themselves as specialists in “business” layers designed to mask the complexity of the data structures. In the 1990s and 2000s, BI platforms were built around a data warehouse to integrate and organize information from enterprise applications (extraction, transfer and consolidation). The objective was to respond optimally to queries from reporting tools and dashboards of indicators, and to make the results available to operational managers.
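The extraction, transfer and consolidation cycle described above can be sketched in a few lines. The table, records and field names below are invented for illustration, with an in-memory SQLite database standing in for the warehouse:

```python
import sqlite3

# In-memory "warehouse"; table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_warehouse (region TEXT, revenue REAL)")

# Extract: raw records as they might arrive from an operational application.
raw = [("north", "1200.50"), ("south", "980.00"), ("north", "300.25")]

# Transform: cast amounts to numbers and normalize region labels.
rows = [(region.upper(), float(amount)) for region, amount in raw]

# Load: consolidate into the warehouse for reporting queries.
conn.executemany("INSERT INTO sales_warehouse VALUES (?, ?)", rows)

# The kind of query a reporting tool or dashboard would then run.
for region, total in conn.execute(
        "SELECT region, SUM(revenue) FROM sales_warehouse GROUP BY region"):
    print(region, total)
```

Real platforms add scheduling, incremental loads and data-quality checks, but the extract/transform/load shape is the same.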


How do decision-making tools work today?


Over the past few years, BI platforms have benefited from NoSQL databases, enabling them to directly process unstructured data. Today, Business Intelligence applications benefit from a more powerful hardware architecture, with the emergence of 64-bit, multi-core and in-memory (RAM) architectures. In this way, they can execute more complex processes, such as data mining and multidimensional analyses, which consist in modeling data along several axes (turnover by geographical area, customer, product category, etc.).
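A multidimensional analysis of this kind boils down to aggregating facts along chosen axes. A minimal sketch with invented figures, here turnover by geographical area and product category:

```python
from collections import defaultdict

# Hypothetical sales facts: (geographical area, product category, turnover).
facts = [
    ("EMEA", "hardware", 120.0),
    ("EMEA", "software", 80.0),
    ("APAC", "hardware", 60.0),
    ("APAC", "software", 90.0),
    ("EMEA", "hardware", 30.0),
]

# Aggregate turnover along two axes, as an OLAP-style analysis would.
cube = defaultdict(float)
for area, category, turnover in facts:
    cube[(area, category)] += turnover

print(cube[("EMEA", "hardware")])  # 150.0
```

Dedicated engines do this over billions of facts and many more axes, with in-memory storage making the aggregation interactive.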


Which fields are covered by the BI?


Traditionally focused on accounting issues (consolidation and budget planning), BI has gradually expanded to cover all major areas of the company, from customer relationship management to supply chain management and human resources.


  • Finance, with financial and budgetary reporting, for example;
  • Sales, with analysis of sales outlets, profitability and the impact of promotions, for example;
  • Marketing, with customer segmentation and behavioral analysis, for example;
  • Logistics, with optimization of inventory management and tracking of deliveries, for example;
  • Human resources, with optimization of the allocation of resources, for example.


Specialized publishers have developed ready-to-use indicator libraries to monitor these different activities. Finally, with the emergence of new web technologies (including HTML5 and JavaScript/AJAX graphical interfaces), we’ve seen the appearance of new players proposing a BI approach in cloud or SaaS mode.


Today, information is omnipresent; the difficulty is not to collect it, but to make it available in the right form, at the right time and to the right person, who will know how to exploit it and drive added value. The BI market thus offers fairly comprehensive solutions for data reporting and consolidation, in both the proprietary and open-source domains. Possible developments in the short to medium term include proactive and simulation analysis tools, more interactive and user-friendly data access, and the combination of structured and unstructured data from internal and external sources.

Secure #IoT: what if #BigData were the key?

By 2020, the planet will have more than 30 billion connected objects, according to IDC. The security of these objects is a major discussion topic. Ensuring the security, reliability, resilience and stability of these devices and services should be a critical concern not only for manufacturers and the companies using them, but also for end users. Security solutions abound on the market, but has anyone thought of Big Data?


The Internet of objects is the third industrial technological revolution, enabling companies to work smarter, faster and, of course, more profitably. IoT represents endless and challenging opportunities and, above all, it shows that a full-fledged ecosystem is being created. This is very different from big data, because most companies consider big data to be static: the data is generated in logs that have utility only where they are, since there is no connectivity. With the Internet of objects, the data is mobile.


A good example of the potential created by the Internet of objects is the work done by Deloitte and a medical device manufacturer to optimize the management of chronic diseases in patients with implanted devices. They established remote data transmissions from patient pacemakers. Pacemakers communicate via low-frequency Bluetooth and contact the healthcare provider using a handset. With this connected object, the physician can obtain real-time information to better determine treatment protocols.


However, there is one critical issue that still needs to be addressed to facilitate the adoption of the Internet of objects by every organization: IoT security, along with all the elements that make it up. With billions of objects and terminals connected to the Internet, including cars, homes, toasters, webcams, parking meters, portable objects, factories, oil platforms, energy networks and heavy equipment, the Internet of objects abruptly multiplies the threat surface, increasing the number of vulnerabilities and creating millions of opportunities for attacks.

IoT Risk Management

The recent DDoS attack illustrates the alarming dangers and risks associated with unsecured devices and components of the Internet of objects. This should certainly raise awareness among businesses and individuals, and should lead them to take action on the security of the Internet of objects. According to a recent study released by computer security firm ESET and the NCSA (National Cyber Security Alliance), about 40% of respondents in the US have no confidence in the security and privacy of connected objects. These security issues will remain at the forefront as long as manufacturers do not seriously address security vulnerabilities and companies do not increase their internal cybersecurity measures to effectively detect and counter future security threats. Although many parameters must be taken into account to secure the Internet of objects (terminal security, network security, etc.), one of the key pieces of the puzzle is determining how to take advantage of the massive quantities of data continuously generated by the devices.


A data-driven approach to preventing IoT cyber attacks


Big data plays a crucial role in protecting a company and its assets against cyber threats. The future of the fight against IoT cybercrime will be based on the use of data for cybersecurity. According to a recent Forrester report, “Internet of things security means monitoring at least 10 times, if not more than 100 times, more physical devices, connections, authentications and data transfer events than today. Having a better ability to collect event data and intelligently analyze it through huge data sets will be crucial to the security of connected systems.”

Given all this, companies need to consider the two following things to prepare for this new era…


The first is that companies need to rethink the security perimeter. Recent attacks targeting connected objects have made clear that the “security perimeter” is now more conceptual than physical. The constantly evolving nature of our new hyperconnected world also leads to the constant evolution of threats. As the technical community continues to connect the world and contribute to innovations that improve home security, improve medical care and transform transport, it is clear that hackers will seek to exploit these same innovations for harmful purposes. We need to rethink the security perimeter as the corporate edge continues to expand beyond the traditional borders we were used to.


Then, threat detection must adapt to the scale of connected objects. As the world continues to hyper-connect, the number of security events that any enterprise must store, consult and analyze is also increasing significantly. Having a cybersecurity platform capable of supporting billions of events is essential to ensure total supervision of all devices connecting to and accessing a company’s network. The use of technologies such as #MachineLearning for anomaly detection will allow companies to detect suspicious behavior on workstations without any human intervention. ML scalability, coupled with the Internet of objects, will be the key to the early detection of IoT-specific threats.
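As a toy illustration of supervising device events at scale, the sketch below flags devices whose authentication failures cross a simple baseline. Device names, event types and the cutoff are all invented; a real platform would feed such aggregates into ML models rather than a fixed rule:

```python
from collections import Counter

# Hypothetical stream of (device_id, event_type) security events.
events = [
    ("cam-01", "auth_ok"), ("cam-02", "auth_fail"), ("cam-02", "auth_fail"),
    ("cam-01", "auth_ok"), ("cam-02", "auth_fail"), ("cam-03", "auth_ok"),
    ("cam-02", "auth_fail"), ("cam-02", "auth_fail"),
]

FAIL_THRESHOLD = 3  # illustrative cutoff before a device is flagged

# Count failures per device, then flag the ones above the baseline.
fails = Counter(dev for dev, kind in events if kind == "auth_fail")
suspects = sorted(dev for dev, n in fails.items() if n >= FAIL_THRESHOLD)
print(suspects)  # ['cam-02']
```

The point is the shape of the pipeline: billions of raw events reduced to per-device behavior profiles on which anomalies become visible.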


As we know, by 2020 the planet will have more than 30 billion connected objects. To get the most out of these revolutionary innovations and prevent them from becoming a nightmare in terms of IT security, organizations will have to learn how to manage, process, store, analyze and redistribute a vertiginous volume of data in real time, all while respecting security norms. We increasingly depend on these devices for essential services, and their behavior may have global reach and impact.



Big Data: 2017 Major Trends


Over the past year, we’ve seen more and more organizations store, process and exploit their data. In 2017, systems that support large amounts of structured and unstructured data will continue to grow. These systems should enable data managers to ensure the governance and security of Big Data while giving end users the possibility to self-analyze these data.

Here below are the hot predictions for 2017.


The year of the Data Analyst – According to forecasts, the Data Analyst role is expected to grow by 20% this year. Job offers for this occupation have never been more numerous. Similarly, the number of people qualified for these jobs is also higher than ever. In addition, more and more universities and other training organizations offer specialized courses and deliver diplomas and certifications.


Big Data becomes transparent and fast – It is obviously possible to implement machine learning and perform sentiment analysis on Hadoop, but what will be the performance of interactive SQL? After all, SQL is one of the most powerful approaches to accessing, analyzing and manipulating data in Hadoop. In 2017, the possibilities to accelerate Hadoop will multiply. This change has already begun, as evidenced by the adoption of high-performance databases such as Exasol or MemSQL, storage technologies such as Kudu, and other products enabling faster query execution.


Big Data is no longer confined to Hadoop – In recent years, we have seen several technologies developed with the arrival of Big Data to cover the need for analysis on Hadoop. But for companies with complex and heterogeneous environments, the answers to their questions are distributed across multiple sources, ranging from simple files to data warehouses in the cloud, and structured data stored in Hadoop or other systems. In 2017, customers will ask to analyze all their data. Platforms for data analytics will develop, while those designed specifically for Hadoop will not be deployable for all use cases and will soon be forgotten.


An asset for companies: the exploitation of data lakes – A data lake is similar to a huge reservoir: one builds a cluster to fill the tank with data in order to use it for different purposes, such as predictive analysis, machine learning, cyber security, etc. Until now, only the filling of the lake mattered for organizations, but in 2017 companies will find ways to use the data gathered in their reservoirs to be more productive.


Internet of Objects + Cloud = the ideal application of Big Data – The magic of the Internet of Objects relies on Big Data cloud services. The expansion of these cloud services will make it possible to collect all the data from sensors, but also to feed the analyses and algorithms that will exploit them. Highly secure IoT cloud services will also help manufacturers create new products that can safely act on the gathered data without human intervention.


The convergence of IoT, Cloud and Big Data generates new opportunities for self-service analysis – It seems that by 2017 all objects will be equipped with sensors that send information back to the “mother server”. Data gathered from IoT is often heterogeneous and stored in multiple relational or non-relational systems, from Hadoop clusters to NoSQL databases. While innovations in storage and integrated services have accelerated the process of capturing information, accessing and understanding the data itself remains the final challenge. We’ll see a huge demand for analytical tools that connect natively to, and combine, a large variety of data sources hosted in the cloud.


Data Variety is more important than Velocity or Volume – For Gartner, Big Data is made of three Vs: large Volume, large Velocity and large Variety of data. Although all three Vs are evolving, Variety is the main driver of investment in Big Data. In 2017, analysis platforms will be evaluated based on their ability to provide a direct connection to the most valuable data in the data lake.


Spark and Machine Learning make Big Data undeniable – In a survey of data architects, IT managers and analysts, almost 70% of respondents favored Apache Spark over MapReduce, which is batch-oriented and does not lend itself to interactive applications or real-time processing. These large processing capabilities have pushed Big Data environments toward compute-intensive uses such as Machine Learning, AI and graph algorithms. Self-service software vendors will be judged on how they make data accessible to users, since opening ML to the largest number will lead to the creation of more models and applications that will generate petabytes of data.


Self-service data preparation is becoming increasingly widespread as end users begin to work in a Big Data framework – The rise of self-service analytical platforms has improved the accessibility of Hadoop to business users. But they still want to reduce the time and complexity of preparing data for analysis. Agile self-service data preparation tools not only enable Hadoop data to be prepared at the source, but also make it accessible for faster and easier exploration. Companies specialized in data preparation tools for Big Data end users, such as Alteryx, Trifacta and Paxata, are innovating, consistently reducing entry barriers for those who have not yet adopted Hadoop, and will continue to gain ground in 2017.


Data management policies favor the hybrid cloud – Knowing where the data comes from (not just which sensor or system, but which country) will enable governments to implement national data management policies more easily. Multinationals using the cloud will face divergent interests. Increasingly, international companies will deploy hybrid clouds with servers located in regional datacenters as the local component of a wider cloud service, to meet both cost-reduction objectives and regulatory constraints.


New security classification systems ensure a balance between protection and ease of access – Consumers are increasingly sensitive to the way data is collected, shared, stored, and sometimes stolen. This evolution will push toward more regulatory protection of personal information. Organizations will increasingly use classification systems that organize documents and data into different groups, each with predefined rules for access, redaction and masking. The constant threat posed by increasingly offensive hackers will encourage companies to increase security, but also to monitor access to and use of data.


With Big Data, artificial intelligence finds a new field of application – 2017 will be the year in which Artificial Intelligence (AI) technologies such as machine learning, natural language recognition and property graphs will be used routinely to process data. While they were already accessible for Big Data via API libraries, we will gradually see these technologies multiply in the IT tools that support applications, real-time analyses and the scientific exploitation of data.


Big Data and big privacy – Big Data will have to face immense challenges in the private sphere, in particular with the new regulations introduced by the European Union. Companies will be required to strengthen their confidentiality control procedures. Gartner predicts that by 2018, 50% of violations of a company’s ethical rules will be data-related.



Top 10 Big Data Trends 2017 – Tableau

Big Data Industry Predictions for 2017 – Inside Bigdata