Smart Cities – Privacy, Security, #CyberAttacks and #DataProtection


Smart city components

“Smart cities” is a buzzword of the moment. There is currently no single accepted definition of a “smart city” and much depends on who is supplying the characteristics: industry, politicians, civil society and citizens/users are four immediately and obviously disparate sets of stakeholders. It is perhaps easier not to define smart cities but to elaborate their key features in order to better understand the concept. The connecting key infrastructure most often mentioned as making cities “smart” includes:

 

  • networks of sensors attached to real-world objects such as roads, cars, fridges, electricity meters, domestic appliances and human medical implants, which connect these objects to digital networks (the Internet of Things, IoT). These IoT networks generate data in huge volumes, known as “big data”;
  • networks of digital communications enabling real-time data streams which can be combined with each other and then mined and repurposed for useful results;
  • high-capacity, often cloud-based, infrastructure which can support and provide storage for this interconnection of data, applications, things and people (a minimal data-flow sketch follows this list).
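To make the sensor-to-cloud flow described above concrete, here is a minimal sketch in Python (standard library only) of how a single IoT reading might be serialized and pushed to a cloud ingestion endpoint. The URL, field names and payload structure are illustrative assumptions, not part of any particular smart city platform.

```python
import json
import time
import urllib.request

# Hypothetical cloud ingestion endpoint -- replace with a real collector URL.
INGEST_URL = "https://smartcity.example.org/api/v1/readings"

def publish_reading(sensor_id: str, kind: str, value: float) -> int:
    """Serialize one sensor reading as JSON and POST it to the cloud platform."""
    payload = {
        "sensor_id": sensor_id,    # e.g. an electricity meter or a road sensor
        "kind": kind,              # "energy_kwh", "traffic_count", ...
        "value": value,
        "timestamp": time.time(),  # epoch seconds; real systems often use ISO 8601
    }
    request = urllib.request.Request(
        INGEST_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.status  # 200/202 if the platform accepted the reading

# Example: a smart meter reporting its consumption once per minute.
# publish_reading("meter-0042", "energy_kwh", 1.37)
```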

 

Scanning through the numerous smart city projects and initiatives undertaken so far, eight key activities can be identified that often define a smart city, i.e. smart governance, smart infrastructure, smart building, smart connectivity, smart healthcare, smart energy, smart mobility and smart citizens.

 

A European survey shows that the benefits of smart cities are obvious, but IT security and technological challenges remain a major barrier to their acceptance. Ruckus, a network connectivity provider, has published the results of its Smart Cities Survey, conducted with the UK market research firm Atomik Research. The survey polled 380 European IT decision-makers from the public sector.

 

The aim of the study is to understand attitudes towards the implementation of smart city concepts and to learn what opportunities they offer to the industry. The majority of respondents (82%) believe that smart city technologies help to increase citizens’ security and reduce crime rates, for example via smart lighting or networked surveillance cameras. Although the benefits seem to be well known, fear of cyber attacks is a major barrier to the smart city: 58% of the IT decision-makers surveyed see it as the biggest problem, followed by a lack of technology infrastructure and funding.

 

Benefits of citywide connectivity

 

The survey results show that the infrastructure and technology platforms created for Smart Cities could be used to add significant value to the public sector and to develop innovative applications that directly address citizens’ needs. Other areas that benefit from the smart city model include local health (81%) and transport (81%), which provide greater access to public services for citizens through extensive networking. According to IT decision-makers, smart city concepts also provide crucial benefits for the security of citizens (72%), public transport (62%) and the health service (60%).

Nick Watson, vice president of EMEA at Ruckus, said: “A basic understanding of the benefits to citizens shows that policymakers are aware of the benefits of this technology. As the return on investment becomes clearer and smart cities become more and more commonplace, targeted advocacy will allow organizations to work together to make the city of the future a reality. Of course, given the amount of sensitive data that could be divulged, it is not surprising that security concerns play a big role. Only a secure, robust and reliable network will make it possible to address these concerns and create a secure foundation for smart cities.”

 

Benefits of smart cities

 

The survey shows that the public sector is well aware of the added value that smart cities have to offer. Almost two-thirds (65%) of respondents said smart cities bring benefits, and 78% said they recognize that there are strong economic reasons for investing in smart city concepts. These reasons include, first, the credibility of a smart city (20%) and future-proof infrastructure (19%). Then there is the related attractiveness, which leads companies to relocate to the area (18%) and suggests that the true value of smart cities lies in generating revenue and boosting the local economy.

These findings are a positive step towards ideal framework conditions in which smart cities can successfully develop. To make smart cities a reality across Europe, it takes an overarching approach involving all departments of a city. However, the Ruckus survey also found that isolated projects (39%) still pose a major barrier to smart cities.

Although lack of funding is seen as the third biggest obstacle to rapid implementation, 78% of respondents across countries expect to have a budget for smart city solutions by 2019. This should also be facilitated by funding announcements such as the WiFi4EU programme, which gives cities the assurance that the infrastructure will be available to support smart technologies.

 

Overcome barriers

 

To provide these services, a stable public Wi-Fi network is crucial: 76% of respondents agree that this is the most important factor in successfully implementing smart city concepts, and 34% agree that Wi-Fi is more important than a wired network. Wi-Fi is probably the preferred infrastructure because people are familiar with it and it gives everyone access to information. Cities that want to connect with their citizens and deliver their services more effectively need a suitable infrastructure to reach the public in a way that benefits them.

WLAN is the “glue” of the smart city network. It makes it easier to distribute the load and reduces connection problems. The access point at the edge of the network is the ideal interface: it can act as a message broker, forwarding traffic, performing simple data processing and returning the results, and hosting software deployed through controllers.
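As a rough illustration of the message-broker role described above, here is a toy, in-process publish/subscribe sketch in Python. A real access point would speak a proper protocol such as MQTT; the topics, handlers and processing step here are purely illustrative assumptions.

```python
from collections import defaultdict
from typing import Callable, Dict, List

class EdgeBroker:
    """Toy message broker, as an edge access point might run one: it receives
    messages on topics, applies a simple processing step, and forwards the
    result to every subscriber of that topic."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        # Simple data processing at the edge: tag the message before forwarding it.
        message = {**message, "processed_at_edge": True}
        for handler in self._subscribers[topic]:
            handler(message)

# Usage: a lighting controller subscribes to telemetry; a lamp post publishes.
broker = EdgeBroker()
broker.subscribe("lighting/telemetry", lambda msg: print("controller received:", msg))
broker.publish("lighting/telemetry", {"lamp_id": "L-17", "lux": 220})
```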

However, not all WLAN technologies are the same. Power supply (53%), interference (52%) and backhaul (45%) are the biggest obstacles to setting up a public WLAN infrastructure. 51% of IT decision-makers cited the consolidation of existing networks as another crucial obstacle. This is particularly important because the number of connected devices is increasing at a time when existing networks are not prepared for the exponential growth of data consumption. IT decision-makers have the clear task of choosing the right technology partner to meet the technological needs of their city.

For Ruckus, the findings of this study are an opportunity to engage in dialogue with various public-sector organizations on how smart city technologies and a public Wi-Fi network can add value. The survey shows that WLAN is considered necessary for the creation of smart cities because:

  • it gives everyone access to information (71%);
  • it delivers the necessary infrastructure to offer additional services (70%);
  • it overcomes the digital divide between citizens (67%);
  • it is cheaper for governments (61%);
  • it could lead to better services (37%).

The research shows that Wi-Fi is a key contributor to helping smart cities deliver reliably and sustainably, but along the way, European policymakers still have some obstacles to overcome. It is reassuring to see that there is a widespread belief that smart cities add value to society. But if the government and the public sector are not investing in the right technology, then they risk missing the numerous opportunities for cities, citizens and themselves.

#GDPR: Does your Business comply with the new #DataProtection requirements?

Our data is one of our most prized assets. As an organisation, our clients entrust us with this data. In our vision, data and its security must be central to every operation, to innovation and to competitive position. As an enterprise, you can be more successful in your respective line of business when you manage to get your data security right.

 

The EU’s GDPR brings data protection legislation into line with new, previously unforeseen ways in which data is now used. This wide-ranging regulation (EU GDPR) can be very complex and opaque. IBM Security has developed a five-phase framework to help organizations comply with the regulation, which becomes mandatory in 2018.

 

In addition, IBM Security has also worked in the past to create a service that helps companies prepare for the upcoming GDPR. Instead of relying on complicated, multi-dimensional matrices or diagrams, a simple framework was compiled.

 

Step by Step GDPR

 

Every journey begins with a first step, and so IBM Security has laid out five separate steps for the journey to GDPR readiness. This allows companies to follow step-by-step guidance through the five concise phases of the framework. The framework also takes account of the fact that each company will have its own needs during the process; therefore, it is designed as simply as possible.

 

Based on the main focus of the GDPR, the five steps within the framework are subdivided into the areas of data protection and security. Since both areas are closely interwoven, IBM Security uses the following definitions: data protection is about what data is collected and why, and how it is managed, shared, processed and moved around. Security, on the other hand, is concerned with how data can be controlled and protected. This also means that, within a company, security can be achieved without data protection, but no data protection can be guaranteed without adhering to security standards.

 

The five-phase framework for the GDPR

IBM’s GDPR Framework

 

The approach for achieving basic GDPR readiness in five steps is the following:

 

Phase 1: the first step is the company assessment. It is necessary to examine which of the collected and stored data are affected by the GDPR guidelines; a plan is then drawn up to discover and map this data.
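As a purely illustrative sketch of this assessment step (not IBM’s tooling), the following Python snippet flags fields in a record that look like personal data. The field names, hints and regular expression are assumptions for the example; a real assessment would rely on a proper data catalogue and legal review.

```python
import re

# Hypothetical record pulled from one of the company's data stores.
record = {
    "customer_id": "C-1029",
    "full_name": "Jane Doe",
    "email": "jane.doe@example.com",
    "invoice_total": 149.90,
}

# Very rough indicators of personal data, used only for this illustration.
PERSONAL_FIELD_HINTS = {"name", "email", "phone", "address", "birth"}
EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def flag_personal_data(rec: dict) -> list:
    """Return the fields of a record that look like GDPR-relevant personal data."""
    flagged = []
    for field, value in rec.items():
        if any(hint in field.lower() for hint in PERSONAL_FIELD_HINTS):
            flagged.append(field)
        elif isinstance(value, str) and EMAIL_RE.search(value):
            flagged.append(field)
    return flagged

print(flag_personal_data(record))  # ['full_name', 'email']
```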

 

Phase 2: is about the company’s own approach, a solid plan that governs the collection, use and storage of data. This approach is grounded in an architecture and strategy that weigh risks against company objectives. Privacy by design, data management and security management are the top priorities.

 

Phase 3: the company’s ways of working are rethought. It is important to understand that the data gathered so far is as valuable to the individuals concerned as it is to the company. At this point, sustainable data protection guidelines have to be developed. It is also about introducing security and administrative controls (TOM – Technical and Organizational Measures) and appointing a Data Protection Officer so that GDPR training can be delivered to the right people for the job.

 

Phase 4: in this phase, companies are ready to implement their data protection approach. From this phase on, data streams are continuously checked and access to data is monitored. In addition, security checks are performed and data that is no longer needed is deleted.

 

Phase 5: the company is ready to comply with the GDPR guidelines. From then on, all requests for access, correction, deletion and transmission of data are met. In addition, by documenting all activities, the company is prepared for possible audits and can, in the case of a data leak, inform regulators and affected parties.
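To make the request types listed in Phase 5 concrete, here is a deliberately naive sketch of a data-subject-request handler over an in-memory store. Everything here (store, field names, audit log) is a hypothetical stand-in; a real implementation would operate on the company’s actual systems and keep tamper-proof audit records.

```python
# Toy in-memory "customer database" used only for this illustration.
customers = {
    "C-1029": {"name": "Jane Doe", "email": "jane.doe@example.com"},
}
audit_log = []

def handle_request(kind: str, customer_id: str, changes: dict = None):
    """Handle the GDPR request types listed above: access, correction,
    deletion and transmission (data portability)."""
    audit_log.append((kind, customer_id))            # documentation for audits
    if kind in ("access", "transmission"):
        return dict(customers.get(customer_id, {}))  # hand back a copy of the data
    if kind == "correction":
        customers[customer_id].update(changes or {})
        return customers[customer_id]
    if kind == "deletion":
        return customers.pop(customer_id, None)
    raise ValueError("unknown request type: " + kind)

handle_request("correction", "C-1029", {"email": "jane@new-domain.example"})
handle_request("access", "C-1029")
handle_request("deletion", "C-1029")
print(audit_log)  # [('correction', 'C-1029'), ('access', 'C-1029'), ('deletion', 'C-1029')]
```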

 

The above is IBM Security’s direct approach to making companies fit for the GDPR. The way to get there is not always easy, but the framework should at least make it clearer. Companies are themselves responsible for compliance with the applicable regulations and laws, including the EU GDPR. Note that IBM does not provide legal advice and does not warrant that IBM’s services or products comply with applicable laws or regulations.

#Data : An Important Piece To “The #InternetOfThings” Puzzle

Internet of things

Every day, connected objects generate billions of pieces of information that must be processed and analyzed to make them usable. Thanks to the development of connectivity across multiple devices, the arrival of inexpensive sensors and the resulting inflation of data, the IoT has taken an irreplaceable place in our daily lives. IDC forecasts the worldwide IoT market to grow to more than $7.1 trillion by 2020. The number of devices will more than double from the current level, with 40.9 billion forecast for 2020.

 

These estimates do not, however, capture the full extent of this digital revolution. If the design of connected objects is the showcase of the IoT and its vast possibilities, it still requires strong skills in processing the data collected from sensors, terminals, machines and platforms, and in interpreting it in order to boost productivity and increase performance.

 

Just as the big winners in the jewellery market are the gold and diamond dealers, in the IoT domain this role is played by companies able to manage the mountains of data generated by connected devices, because the collected data is profoundly changing the way businesses operate. Almost every day, new applications are imagined, with consequences at all levels of organizations, because the real added value of connected objects only comes from their uses and from the ability of companies to create new services.

 

Several studies demonstrate that companies are still facing a gap between the collection of new data and the presentation of the analyzed information so that it can be understood and explored in great detail, whether it is for a connected house, connected car, a portable terminal or an industrial solution.

 

Below is a list of tips companies should consider before any IoT project implementation:

 

  • Sort valuable information among a big volume of data:
    Exploiting the IoT means generating a huge amount of data. The challenge for companies is to filter out the noise and find the information that is really important. This is why many companies integrate both a stream analysis and a process analysis: the first provides real-time information from data streams such as navigation paths, logs and measurement data, while the second works on captures of machine data.

 

  • Set and manage priorities:
    The IoT implies different levels of necessity in terms of urgency and latency. It is important to take this into account because one expects to interact with the “real world” in real time. For example, sensors in mines must trigger an alert as soon as they detect the presence of toxic gases. Conversely, other IoT information may not be needed “just in time”, such as regularly collected data used to further refine and improve a predictive model. This data can potentially be collected and processed several times a day, for example.

 

  • Design considerations for IoT technologies:
    Information security, privacy and data protection should systematically be addressed at the design stage. Unfortunately, in many cases, they are added on later once the intended functionality is in place. This not only limits the effectiveness of the added-on information security and privacy measures, but is also less efficient in terms of the cost to implement them. Although industries are actively working to address this, it remains a major IoT problem.

 

  • Cross the data:
    In the case of preventive operations, for example, companies want to collect data from objects (such as smart meters) and cross it with relevant relational data, such as maintenance agreements, warranty information and life-cycle components (see the sketch after this list). It is therefore essential that companies can rely on the data from which they make important decisions.

 

  • Tracing the data:
    The increased collection of data may raise issues of authentication and trust in the objects. In addition, it should also be noted that by using information collected about and from multiple objects related to a single person, that person may become more easily identifiable and better known. So in order to fully exploit the potential of IoT, tools must be much more flexible and allow users to shape and adapt data in different ways, depending on their needs or those of their organization.
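Picking up the “cross the data” tip above, here is a small sketch of what such a cross-reference could look like, assuming the pandas library is available. The meter identifiers, readings, contract fields and the 40 kWh decision rule are all invented for the illustration.

```python
import pandas as pd

# Readings collected from smart meters (the IoT side) -- illustrative values.
readings = pd.DataFrame({
    "meter_id": ["M-1", "M-2", "M-3"],
    "kwh_last_24h": [12.4, 48.9, 7.1],
})

# Relational data held in back-office systems (maintenance agreements).
contracts = pd.DataFrame({
    "meter_id": ["M-1", "M-2"],
    "under_warranty": [True, False],
    "next_maintenance": ["2018-03-01", "2018-05-15"],
})

# Cross the two sources on the meter identifier: the IoT measurement only
# becomes actionable next to the contractual and life-cycle information.
joined = readings.merge(contracts, on="meter_id", how="left")
print(joined)

# Example decision rule on the combined view: flag meters with unusually high
# consumption that are no longer under warranty.
flagged = joined[(joined["kwh_last_24h"] > 40) & (joined["under_warranty"] == False)]
print(flagged["meter_id"].tolist())  # ['M-2']
```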

 

Collaboration between the IT team and business experts is more critical than ever in analyzing IoT data. In addition to those who understand the data in general, it takes experts to analyze the data gathered from specific devices or sensors. While any analyst can understand the data in the context of a company’s performance indicators, only a data specialist can explain which hidden data contains a wealth of information and how, with the right tools, companies can unleash that potential.

From Data to Knowledge: #BigData and #DataMining

The increasing digitization of our activities, the constantly growing capacity to store digital data and the resulting accumulation of data of all kinds have generated a new sector of activity whose purpose is the analysis of large quantities of data. New approaches, new methods and new knowledge are emerging, and ultimately no doubt new ways of thinking and working. Thus, this very large amount of data (“big data”) and its processing (“data mining”) affect different sectors such as the economy and marketing, but also research and knowledge.

The economic, scientific and ethical implications of this data are quite significant. The fact that we are in a constantly evolving sector, where changes are frequent and rapid doesn’t make the analysis easy … However, a deep knowledge of data is necessary in order to better understand what data mining is.


1 – What is data mining?             

 

Data mining means exploring very large amounts of data. Its purpose is to extract knowledge from large quantities of data by automatic or semi-automatic methods. Data mining is also referred to as data drilling or Knowledge Discovery from Data (KDD).
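As a tiny, hedged illustration of what “extracting knowledge automatically” can mean in practice, the following Python sketch counts which items are most often bought together in a toy transaction log, a miniature version of association-rule mining. The baskets and items are invented for the example.

```python
from collections import Counter
from itertools import combinations

# Toy transaction log; each row is one customer's basket.
baskets = [
    {"bread", "milk", "butter"},
    {"bread", "milk"},
    {"milk", "cereal"},
    {"bread", "butter"},
    {"bread", "milk", "cereal"},
]

# Count how often each pair of items is bought together.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The "knowledge" extracted from the raw data: the strongest co-occurrences.
for pair, count in pair_counts.most_common(3):
    support = count / len(baskets)
    print(f"{pair}: bought together in {support:.0%} of baskets")
```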

 

  • How and why are such quantities of new data generated? Every minute, 149,519 e-mails are sent worldwide, 3.3 million posts are published on Facebook, 3.8 million queries are made on Google, 65,000 photos are uploaded to Instagram, 448,000 tweets are sent, 1,400 posts are published via WordPress, 500 videos are uploaded to YouTube and, last but not least, 29 million messages are sent via WhatsApp. These numbers can make one’s head spin, but the important thing to note is that humans aren’t the only producers of data: machines also contribute with their SIM cards, their sensors, and so on.
  • What to do with this data? If the contemporary phenomenon of data accumulation is easy to understand, it is perhaps harder to perceive in what way this data is changing the world. It depends on how one is able to treat it. Science, IT and the medical sector rely heavily on statistics and counting. From the moment a set of data can be dealt with exhaustively, where cross-referencing and sorting can be carried out on a scale scarcely imaginable a few decades ago, it is our analyses of our environment that are changing and multiplying. In short, data is a tool for management, decision support and evaluation in every sector, and the raw material of the information that allows us to understand a phenomenon, a reality.

 

2 – Value of Data

 

While IT organizations are best placed to grasp the market potential of data accumulation and processing, this is not the case everywhere: elsewhere, the idea that data is the new oil is making its way more slowly than one might have imagined.

  • What is the market value of the data? The data built up through a variety of IT operations is a valuable potential asset that companies are not always aware of or do not always use. Even if they do not necessarily know how to exploit the data themselves, they hold resources that are not yet profitable for them. This gathered data and its use are a key issue for companies: big data is a real source of marketing opportunities.
  • Data that must be protected and is complex to exploit: personal data poses many problems for the researchers who specialize in its analysis. First, they point to the need to better protect it and ensure its preservation. Moreover, very specialized skills are required to process it and produce interesting results.

 

3 – Data mining and targeted marketing 

 

One of the most significant applications of data mining is undoubtedly the reinvention of marketing, because data mining allows companies to reach consumers very precisely by establishing accurate and reliable profiles of their interests, purchasing habits, standard of living, etc. Moreover, there is no need to go through a complicated search process: each internet user leaves enough traces when surfing, tweeting or publishing on Facebook for profiling to be possible, without their knowledge most of the time…
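As a hedged sketch of how such profiling can work from raw traces, the snippet below turns a hypothetical click-stream into a crude interest profile and applies a targeting rule. The pages, categories and threshold are all assumptions made up for the example.

```python
from collections import Counter

# Hypothetical click-stream traces left by one internet user while surfing.
events = [
    {"page": "/flights/paris", "category": "travel"},
    {"page": "/hotels/paris", "category": "travel"},
    {"page": "/reviews/camera-x", "category": "electronics"},
    {"page": "/flights/rome", "category": "travel"},
]

def build_profile(trace: list) -> dict:
    """Turn raw navigation traces into a rough interest profile."""
    interests = Counter(event["category"] for event in trace)
    total = sum(interests.values())
    return {category: round(count / total, 2) for category, count in interests.items()}

profile = build_profile(events)
print(profile)  # {'travel': 0.75, 'electronics': 0.25}

# A targeted campaign might then only address users whose 'travel' affinity
# exceeds some threshold.
if profile.get("travel", 0) > 0.5:
    print("eligible for the city-break promotion")
```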

  • A new space for social science research: viewed from another angle, this accumulated data is a gold mine for researchers. Some behavioral researchers have looked at the attitudes of internet users on dating sites. In addition to finding that the data they use is more reliable than that obtained by meeting individuals (it is easier to lie to an investigator than to a machine…), they can produce analyses that are not politically correct but very informative!

 

4 – The data mining forecast tool

Data mining is also a tool that multiplies the possibilities offered by the calculation of probabilities. Because it makes it possible to cross a large volume of data and, above all, to apply these calculations to many different fields, it now appears capable of making forecasts. Data mining for forecasting offers the opportunity to turn the numerous sources of time-series data available to the business decision-maker, both internal and external, into actionable strategies that can directly impact profitability. Deciding what to make, when to make it and for whom is a complex process; understanding what factors drive demand, how these factors interact with production processes, and how demand changes over time is key to deriving value in this context. Today, some scientists do not hesitate to announce that they will soon be able to predict the future, all thanks to data!
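To keep the forecasting idea concrete, here is a deliberately simple one-step-ahead forecast on an invented demand series, using simple exponential smoothing in plain Python. Real forecasting systems would add seasonality, external drivers and model selection; the numbers and the smoothing factor here are assumptions.

```python
# Monthly demand for one product (illustrative numbers).
demand = [120, 132, 128, 141, 150, 147, 158, 163]

def exponential_smoothing_forecast(series: list, alpha: float = 0.4) -> float:
    """One-step-ahead forecast with simple exponential smoothing:
    each new observation partially corrects the previous forecast."""
    forecast = series[0]
    for value in series[1:]:
        forecast = alpha * value + (1 - alpha) * forecast
    return forecast

print(round(exponential_smoothing_forecast(demand), 1))  # ~154.4 for this series
```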

  • Probabilities and predictions: today, predictive statistics tackle all sorts of issues: natural disasters, health, delinquency, climate… Statistical tools are numerous and are combined to improve outcomes, such as when using “random checks”. Even more fascinating, software is capable of improving itself and accumulating ever more data to boost its performance… In the meantime, it is possible to rely on these analyses to try to avoid the flu or get vaccinated wisely.
  • Anticipating or preventing crimes: if the idea of software able to predict crimes and misdemeanors reminds one of Spielberg’s film “Minority Report”, reality has now caught up with fiction: the PredPol (predictive policing) software makes it possible to estimate, better than any other human technique or analysis, the places where crime is likely to occur, and consequently to better position police patrols and other preventive measures.
  • Preventing fraud: data mining also offers new ways to improve the fight against fraud and “scams” in the insurance sector. Here again, it is a matter of better targeting the controls, and apparently it works: this technique gives very clear results. In more than half of cases, when a controller performs a targeted check on the basis of data mining, he finds something. Insurance companies also apply this type of analysis to detect scams.

Physical & Cloud #DataProtection: Best Practices for your #Backup and #RecoveryProcess

Data has become one of the most valuable assets of organizations; massive data is the new currency. Thanks to advancements in technology and connectivity, data creation is skyrocketing. According to IDC, this data is expected to double every two years for the next decade, hitting 45,000 exabytes in 2020. This data is stored across an ever-increasing number of environments and connected devices, so the backup and restore capability of an information system is a real challenge to ensure business continuity and the availability of the associated data.

Data Protection

What must IT departments do to fulfill the data security mission? The data security policy is at the heart of every business’s concerns and should be a fundamental part of its security strategy. Planned security measures can then be translated into tactical and operational rules through the joint efforts of the security and storage teams. To this end, storage must be an integral part of the company’s security strategy.

 

To achieve these objectives, a company must organize its efforts around the following five essential aspects:
• Allocation of responsibilities;
• Risk Assessment;
• Development of a data protection procedure;
• Communication of data protection procedure;
• Execution and testing of the data protection procedure.

 

  1. Allocation of responsibilities

The goal is to make storage security a fully-fledged feature of the IT security architecture. Even if the company decides that responsibility for backup or storage security rests with the storage team, it must nevertheless integrate any security measures in this area with the task of securing the rest of the infrastructure. This integration will contribute to the establishment of defence in depth. It is also advisable to share responsibility for extremely sensitive data: it is better to ensure that the person authorizing access is not the same as the person responsible for enforcement.

 

  2. Assessment of storage risks in the area of IT security

Managers must review each step of their backup methodology to identify security vulnerabilities. Can an administrator secretly make copies of backup tapes? Are they stored in boxes accessible to everyone? Is there a rigorous end-to-end monitoring chain for backup tapes? If critical data is backed up and transported, vulnerabilities of this nature could make it easy prey. If the risk analysis reveals many vulnerabilities, the company must seriously question the encryption of its data.

 

  3. Development of an information protection program that guarantees the security of company data, at all times, wherever it is

Multi-level protection should be adopted by taking existing best practices for the data network and applying them to the storage network, while adding specific layers adapted to the characteristics of the archived data, for example:

  • Authentication: application of multi-level authentication and anti-spoofing techniques (against identity or address spoofing);
  • Authorization: access rights according to roles and responsibilities (as opposed to total administrative access).

It is imperative to duplicate backup tapes, because it is never good to depend on a single copy of the data. Despite the longevity of tapes, they are still exposed to environmental and physical damage. A common practice is to perform nightly backups and then store the tapes off site without any verification; recommended best practice is to duplicate backup tapes, verify them, and then store the copies off site.
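One simple, widely used way to add that verification step is to compare checksums of the original backup and its duplicate before the copy is shipped off site. The sketch below uses Python’s standard library; the file paths are placeholders, not a prescribed layout.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a file in chunks so even large backup images fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(original: Path, offsite_copy: Path) -> bool:
    """Return True only if the duplicate is byte-identical to the original."""
    return sha256_of(original) == sha256_of(offsite_copy)

# Example (paths are placeholders):
# ok = verify_copy(Path("/backups/db_2018-01-31.dump"),
#                  Path("/mnt/offsite/db_2018-01-31.dump"))
# print("offsite copy verified" if ok else "ALERT: copies differ, duplicate again before shipping")
```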

Magnetic tapes remain the preferred storage mode for backups because they are economical and offer sufficient capacity to back up an entire operating system on a single cartridge. When stored properly, archival tapes have a lifetime of more than 30 years, making them an exceptionally reliable storage medium.

 

  4. Communication of the procedure to be applied with regard to the protection and security of information

Once the procedure for protecting and handling sensitive data has been defined, it is important to ensure that those responsible for its security are informed and trained. Security rules are the most important aspect of assigning responsibilities. Functional managers need to be aware of risks, countermeasures and costs.

Data loss and intellectual property theft affect the entire enterprise, not just the IT department. As such, the Director of Security must drive the data security approach by training the various functional managers on the risks, threats and potential harm arising from security breaches, as well as on the cost of the various possible countermeasures. In this way, company executives can be made aware of the cost/benefit trade-off of investments in data security.

 

  5. Implementation and testing of the Data Protection and Security Plan

Securing data is not about technology but about procedure. This is why it is essential to test the procedure. In addition, as the growth of the company is accompanied by an evolution in security and data protection needs, IT security practices must also evolve.

Once the complete security plan has been developed, defined and communicated to the teams concerned, only then is it time to implement it. The IT team must ensure the implementation of the tools, technologies and methodologies necessary for the classification of information. New technologies may be required to classify information or label it with metadata so that it is backed up according to the appropriate rules and procedures.

Once in place, the procedure must be tested, for both backup and restore. The test consists of introducing into the process every possible and imaginable incident, whether it is the loss of a tape or a server, network problems, equipment failure, data corruption or any other scenario that could affect the company’s performance.

It is advisable to carry out tests with personnel who are less familiar with the procedure, to ensure that it can nevertheless be applied without difficulty in the absence of the usual supervisor (due to illness, holidays or departure).

How is Artificial Intelligence impacting the Tourism Sector?

Artificial intelligence has existed for several years, yet we are now witnessing it reach another dimension, thanks to more powerful computers and the multiplication of available data. By its capacity to lift every sector of activity, it undeniably represents great interest for tourism. With the wealth of data available to professionals, there is today a multitude of technologies: recommendation applications, real-time chatbots and personalized concierge services. The aim is to simplify the work of tourism industry professionals so that they can return to their core business with powerful tools and technologies, and make an important difference in terms of profit and customer satisfaction. But the question one must ask is: how can Artificial Intelligence be used wisely?

Artificial Intelligence and Tourism

The first point: if we think about the future of tourism in terms of types of travelers, it is certain that we will be dealing with several categories of profiles, which may overlap. The first category, for example, will be made up, as is the case today, of travelers wishing to disconnect radically from their “everyday” environment in order to immerse themselves in another culture, by all possible means.

The second category, more cautious, will want simple trips, without risks, even without surprises, good or bad. This does not exclude, quite the contrary, the survival of adventure tourism.

For the last profile, the purpose of a journey will be less the destination than the experience one can have there. They will travel to learn how to cook a rare product or to learn a new activity based on information provided by their peers. The purpose of their travel will be based on learning.

Whatever the size of the group and the number of establishments it counts, it seems to me that we are moving towards a world where the tourist supply will continue to increase, thanks to two levers: new destinations and new traveler profiles. The industry will be required to be extremely flexible towards customers’ expectations, responding with innovative services that accompany them at each stage of their journey: before, during and after their stay.

 

How can AI’s added value be applied to tourism?
Through customization. And that is what profoundly changes the ins and outs. Rather than offering the same experience for the same type of trip, artificial intelligence makes it possible to match the desires, habits and preferences of the tourist with the proposed product. Artificial intelligence makes a data pool meaningful: by learning what the customer is looking for, buying and loving, it makes it possible to generate customized and targeted offers that are more likely to be converted into a purchase.
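A minimal sketch of that matching idea, with an invented catalogue, invented tags and an invented traveller profile, might look like this in Python; production recommenders would of course learn these affinities from behavioural data rather than hard-code them.

```python
# Hypothetical catalogue of offers, each tagged with descriptive attributes.
offers = [
    {"name": "Cooking week in Tuscany", "tags": {"food", "learning", "countryside"}},
    {"name": "City break in Lisbon",    "tags": {"city", "culture", "budget"}},
    {"name": "Trek in the Atlas",       "tags": {"adventure", "nature", "sport"}},
]

# What past searches and bookings tell us about one traveller's preferences.
traveller_profile = {"food": 0.9, "learning": 0.7, "culture": 0.4, "budget": 0.2}

def score(offer: dict, profile: dict) -> float:
    """Sum the traveller's affinity for each tag the offer carries."""
    return sum(profile.get(tag, 0.0) for tag in offer["tags"])

ranked = sorted(offers, key=lambda offer: score(offer, traveller_profile), reverse=True)
for offer in ranked:
    print(f"{offer['name']}: {score(offer, traveller_profile):.1f}")
# The cooking trip (0.9 + 0.7 = 1.6) would be proposed first to this traveller.
```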

Today, cognitive systems are capable of interacting in natural language; they can process a multitude of structured and unstructured data, enriched with geo-localized content, and learn from each interaction. These systems will rapidly become essential to the development of strategic topics for the industry, such as the “smarter destination”, the personalization of the customer experience and customer loyalty, as well as the provision of management, analysis and marketing capabilities, all of this by using big data. These services will be an asset in making the whole tourism sector more efficient by helping the actors and structures in place.

 

How far can artificial intelligence push the tourism industry?
Not to the point of replacing humans. Robots are used for certain tasks, but not as a replacement for people; in the long term this could happen, but the problem of the energy that robots consume must first be solved. Discussions of artificial intelligence often try to compare it with human intelligence, so it is important to note that the aim of cognitive systems is NOT to replace human beings: robots cannot reason or learn as a human being can. They serve the needs and imagination of tourism professionals who, with the help of partners, benefit from them thanks to their knowledge.

 

As I mentioned above, AI is not a new technology; we have been interested in it since the 1950s and 1960s. If the subject seems quite new today, it is because the data has only now become available. Tourism, like all industries, is digitized and offers a wealth of data to which machine learning can be applied. So AI is a revolution in progress, to the extent that it leads to new ways of thinking about the supplier’s offer.

Understanding the #Blockchain Economic Revolution

The Blockchain is a revolution that is undoubtedly leading to a complete overhaul of economic activity. It is not a simple geek trend, yet most people still have absolutely no idea what the blockchain stands for. It is essential to distinguish clearly between bitcoin, crypto-currency and the breakthrough technology underlying the name: the #Blockchain. You should know that there are several types of blockchains on the market, and bitcoin is one implementation that has had huge success in recent years.

 

In short, the #Blockchain is an information storage and transmission technology that is transparent, secure and operates without a central control unit. Transactions between network users are grouped in blocks. Each block is validated by the nodes of the network, called “miners”, using techniques that depend on the type of blockchain. This process establishes trust between market players without going through a central authority. It is an open-source system where each link in the chain carries its own legitimacy. The decentralized nature of the chain, coupled with its security and transparency, suggests a revolution of unimaginable scope. The fields of opportunity open up far beyond the monetary sector.
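To illustrate the hash-linking idea behind those blocks, here is a toy Python sketch. It deliberately leaves out the mining/consensus work that real miners perform and uses made-up transactions; it only shows why altering an old block breaks the chain.

```python
import hashlib
import json
import time

def hash_block(block: dict) -> str:
    """Fingerprint the block's content; any later tampering changes this hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(transactions: list, previous_hash: str) -> dict:
    return {
        "timestamp": time.time(),
        "transactions": transactions,    # the grouped transactions mentioned above
        "previous_hash": previous_hash,  # the link that forms the chain
    }

# Build a tiny chain: each block stores the hash of the one before it.
genesis = new_block(["genesis"], previous_hash="0" * 64)
block1 = new_block(["Alice pays Bob 5"], previous_hash=hash_block(genesis))
block2 = new_block(["Bob pays Carol 2"], previous_hash=hash_block(block1))
chain = [genesis, block1, block2]

def chain_is_valid(chain: list) -> bool:
    """Recompute the links, as any network node could do independently."""
    return all(chain[i]["previous_hash"] == hash_block(chain[i - 1])
               for i in range(1, len(chain)))

print(chain_is_valid(chain))   # True
block1["transactions"] = ["Alice pays Bob 500"]
print(chain_is_valid(chain))   # False: tampering with history breaks the chain
```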


 

In fact, it is a revolution comparable, in human history, to the advent of commerce. When individuals bought and sold their products face to face, trust was established with a handshake. Then globalization created new needs: entities were set up to protect sellers and buyers, and laws and legal services developed around financial exchanges. Each market had to have intermediaries, without it being possible to assess or quantify a degree of trust between people. What changes with the blockchain is not only its decentralized aspect, but also the absence of intermediaries. Blockchains could replace most centralized “trusted third parties” with distributed computing systems. More than that, many observers highlight the blockchain as an alternative to back-office systems in the banking sector. It could also help eradicate corruption in global supply chains.

 

The boom of the Internet offers some good indications of how the blockchain could develop. The Internet reduced communication and distribution costs: the cost of a WhatsApp message is much lower than that of an SMS, just as selling software or a service through an online platform is cheaper than selling products through a physical store. Thanks to the web, marginal operating costs have been reduced to almost zero, which caused profound changes in the telecommunication, media and software markets. Blockchains, in turn, promise to bring marginal transaction costs close to zero.

 

Blockchains are a low-cost market disruptor for any business that acts as an intermediary in a market. They allow things that have never been possible using existing infrastructure and financial resources. We can exchange things that were not previously considered assets: data, our reputation or unused power. The possibilities are as vast as they are unimaginable, but that does not mean that every type of asset will be profitable for a company.

 

It is preferable not to dwell first on the technological aspect; it is much better to focus on the root of your customer’s problem. Successful businesses know how to identify what is missing or what concerns their prospects, and how to solve it. Blockchain technology is valuable in a setting where data has to be shared and edited by many parties that do not fully trust each other. That is the infrastructure. The added value comes from the services that are built around it, with applications or modules.

Currently we are in the infrastructure phase of the market: standards and platforms to democratize blockchain technology are still emerging. In the near future, thanks to the rapid pace of development of this system, it will be easier for developers and entrepreneurs to use the blockchain on a daily basis, as easily as the MySQL or MongoDB databases we use today. Once the infrastructure stage is over, the evolution of blockchains will really become exciting. The infrastructure will be a huge database on which companies will be able to operate all kinds of connected objects or devices. The connected devices will collect data, blockchains will secure, share and process that data, and artificial intelligence applications will automate activities.

 

Just imagine farms where the produce is grown and picked by robots and delivered to your home via drones, with a connected refrigerator that alerts us when we need something. An artificial intelligence system manages preset objectives to perfectly match supply and demand. Blockchains are much more than just bitcoin: they are real building blocks of our future world.

The Impact and Challenges of Artificial Intelligence for Next-Gen Enterprises

Artificial Intelligence (AI) is not a new phenomenon. It continues to develop and its applications are already very present in our personal daily lives (gaming, robotics, connected objects…), arousing as much enthusiasm as fear. This complex concept gained its fame in the science fiction world. Although AI still evokes a more or less fantasized imaginary, it is an integral part of reality, and it can be found in many services, systems and applications.

 

What can be the role of artificial intelligence in the enterprise of the future? Will AI make organizations smarter? These are the main questions that have motivated big companies, with the objective of analyzing and anticipating the impacts of this revolution in progress. In this post, I’ll be discussing organizational, legal and ethical issues related to the governance of artificial intelligence in large enterprises.

 

A critical factor in adapting the company to the evolutions and challenges of the AI environment is to rethink its relationship with the company’s stakeholders, and in particular with the customer. This does not mean simply declaring that “the customer is important”, but emphasizing interaction with the customer. With that said, the client exists only through the interest and interactions developed with them.

So, the question companies should ask is how can they develop a successful interaction with the help of artificial intelligence? What does this mean concretely in terms of channels, content, customer knowledge and, above all, commitment to the customer?

 

Some companies have an “Innovation and foresight” unit to analyze and reflect on the impact of AI within the company. The aim is to take this step without neglecting the employees, who are at the center of the subject; these units allow the sharing of ideas. The applications of artificial intelligence within the company are diverse: augmenting human expertise through virtual assistants, optimizing certain products and services, and opening new perspectives in research and development through self-learning programs. The objective of such a unit is to exchange views and carry out foresight work in a participatory way, through conferences, round tables, written reports or scenarios, depending on the choices of each structure.

 

The Impact of Artificial Intelligence for Enterprises

 

Artificial intelligence technologies are already anchored in our daily lives. These technological advances intensely question managerial and organizational practices around innovation in large companies. Many surveys show that, in general, companies do not have a dedicated budget for artificial intelligence. Nevertheless, there are either investment projects or resources that can be allocated to artificial intelligence teams integrated into the wider data teams. Be that as it may, the subject of artificial intelligence is present in large enterprises; it may remain theoretical, but it may also be the subject of initial experiments, notably concerning predictive algorithms. Artificial intelligence does not fundamentally change everything in the company; rather, it will “augment” performance, automating or perfecting certain processes and/or operations.

 

Benefits for organizations:

 

Today, artificial intelligence already generates many benefits for organizations, notably by:

  • Responding to big data issues: artificial intelligence relies in large part on the search for and mass analysis of data from which it can learn;
  • Increasing human decision-making expertise, e.g. online help assistants: a Hong Kong-based company, Deep Knowledge Ventures (DKV), has, for example, an artificial intelligence on its board of directors, Vital (Validating Investment Tool for Advancing Life Sciences), which makes investment recommendations and is also entitled to vote;
  • Optimizing services and products: improving customer knowledge, decision-making and operational processes;
  • Strengthening systems security: in the area of cybersecurity, artificial intelligence is becoming a structuring element of IT infrastructures, in order to secure networks. Automatic recognition is well established for the detection of fraud, and work is under way to create algorithms that will identify threats that human brains and traditional security mechanisms fail to recognize (a minimal sketch follows this list);
  • Helping to make discoveries: some companies in the field of health analyze all the scientific publications related to a particular area of research, which allows them to look for new properties and new molecules.
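Tied to the cybersecurity bullet above, here is a minimal, purely illustrative sketch of statistical anomaly detection on made-up login data. Real products combine many signals and machine-learned models; this only shows the principle of flagging values far outside the historical norm.

```python
import statistics

# Hypothetical daily counts of failed logins for one account over two weeks.
failed_logins = [3, 2, 4, 3, 5, 2, 3, 4, 2, 3, 4, 3, 2, 41]

baseline = failed_logins[:-1]          # history used to model "normal" behaviour
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

def is_suspicious(value: float, threshold: float = 3.0) -> bool:
    """Flag values more than `threshold` standard deviations above the baseline."""
    return (value - mean) / stdev > threshold

print(is_suspicious(failed_logins[-1]))  # True: 41 failed logins is far outside the norm
```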

 

Challenges:

 

The challenges for large companies are numerous, starting with cultural and organizational changes. As noted in the Telecom Foundation’s Watchbook No. 8: “The craze for artificial intelligence has been accelerated by the availability of AI capabilities in the form of APIs (vision or predictive, on the one hand) and by the source code of the machine learning platforms released by major Internet operators, on the other.”

These technology facilitators will keep pushing companies to expose their own capabilities as APIs in order to optimize their resources. It is therefore necessary to understand the world of APIs in this transversal, cross-enterprise approach, which is not without posing a number of challenges for large companies. To succeed, one must develop the following roadmap strategy:

  • Build stronger relationships with clients;
  • Optimize internal processes;
  • Accelerate the delivery of new developments.

 

To conclude, I will say that we are living in a golden age of artificial intelligence, boosted by the increasing interest of the web giants in the stakes of big data. The first AI investors are indeed the pure players of the internet and the main software players. The movement is launched, and it is our responsibility to anticipate the effects of this revolution on large companies.

IT Challenge: Re-invent business models… or disappear!


Technologies have significantly lowered market entry barriers and the development of free-of-charge models has favored the appearance of business models that have destabilized the positions of historical actors in most sectors.

As a matter of fact, in this digital age, existing approaches to developing, elaborating and describing business models are no longer appropriate for new connected business models. Technologies and services become obsolete faster than in the past, consumers are eager for innovation and a strong customer experience, and the need for agility weighs on production capacities and information systems; cooperation therefore becomes a must.

 

In this changing context, the risk of disappearing has never been more present for a company: this is what motivates collaboration and alliances between often competing actors. Cooperation can be seen as a positive competitive response to the decline phase that threatens businesses of all sizes. This context justifies a reflection on identifying large organizations’ strengths and weaknesses in comparison to their competitors. By 2020, the ability to renew its business model will be critical to the growth and profitability of a large firm.

 

Data Capitalizing and Customer Experience

 

Data is the black gold of the present and the future. But according to Gartner, 85% of the largest organizations in the Fortune 500 ranking won’t gain a real competitive advantage from big data by 2020. The majority of them will still be in the experimental stage, while those who have capitalized on big data will gain a competitive advantage of 25% or more over their competitors. The development of new products and services, facilitated by the intelligent use of data, therefore creates real changes and new business opportunities.

In a context where the risk of disintermediation is major, control of customer relations, mass customization, co-design with consumers will be fundamental to the success of companies in 2020.

 

Challenges: Differentiating and Innovating

 

  • Understand: it is essential to keep track of the business models and strategies of digital-world competitors, which are both potential threats and powerful levers of development.
  • Transform: Large groups, especially if they are economically powerful, often find it difficult to transform their organization and integrate innovation, because of their complexity.
  • Listen: anticipating the needs of consumers and focusing on the customer experience means constantly evolving business models in order to develop business agility.
  • Collaborate: creating strategic partnerships with the company’s ecosystem, especially suppliers, accelerates innovation processes and reduces time to market.
  • Adjust: the digital transformation must take into account the context and the business challenges of the company.
  • Innovate: know-how in terms of software development can become more and more strategic for the company.

#BusinessIntelligence: for a better Control of Data

Business intelligence (BI) is a subject in full evolution, addressing general management as well as the business lines. BI helps decision-makers get an overview of the company’s different activities and its environment. This cross-sectional view requires knowledge of the various business lines and involves certain organizational and managerial specificities. From the exploitation of business data to IT governance, business intelligence and its decision-making tools, such as reporting, dashboards and predictive analysis, are essential to the success of a business.

The organization of BI in the company is highly dependent on the organization of the company itself. However, BI can have a structuring impact for the company, notably through the formalization of data repositories and the setting up of a competence center.

business intelligence

What is the purpose of Business Intelligence?

 

Business Intelligence (BI) encompasses IT solutions that provide decision support to professionals with end-to-end reports and dashboards to track analytical and forward-looking business activities of the company.

 

This notion appeared at the end of the 1970s with the first infocentres. In the 1980s, the arrival of relational databases and of client/server architectures made it possible to separate production computing from decision-making systems. At the same time, various vendors positioned themselves as specialists in “business” analysis layers, in order to mask the complexity of the underlying data structures. From the 1990s and 2000s, BI platforms were built around a data warehouse to integrate and organize information from enterprise applications (extraction, transformation and consolidation). The objective was to respond optimally to queries from reporting tools and dashboards of indicators and to make the results available to operational managers.

 

How does decision-making tools work today?

 

Over the past few years, BI platforms have benefited from NoSQL databases, enabling them to directly process unstructured data. Today, business intelligence applications benefit from a more powerful hardware architecture, with the emergence of 64-bit, multi-core and in-memory (RAM) architectures. In this way, they can execute more complex processes, such as data mining and multidimensional analyses, which consist in modeling data along several axes (turnover by geographical area, customer, product category, etc.).
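A small sketch of such a multidimensional view, assuming the pandas library and using invented sales records, could look like this: revenue is broken down along two axes, geographical area and product category.

```python
import pandas as pd

# Illustrative sales records (the kind of detail a BI warehouse would hold).
sales = pd.DataFrame({
    "region":   ["North", "North", "South", "South", "South"],
    "category": ["Food",  "Tech",  "Food",  "Tech",  "Tech"],
    "revenue":  [1200,    3400,    900,     2100,    1800],
})

# Multidimensional analysis: aggregate turnover along two axes at once.
cube = pd.pivot_table(sales, values="revenue", index="region",
                      columns="category", aggfunc="sum", fill_value=0)
print(cube)
# North: Food 1200, Tech 3400 / South: Food 900, Tech 3900
```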

 

Which fields are covered by the BI?

 

Traditionally focused on accounting issues (consolidation and budget planning), BI has gradually expanded to cover all major areas of the company, from customer relationship management to supply chain management and human resources.

 

  • Finance, with financial and budgetary reports, for example;
  • Sale, with analysis of sales outlets, analysis of the profitability and impact of promotions for example;
  • Marketing, with customer segmentation, behavioral analysis for example;
  • Logistics, with optimization of inventory management, tracking of deliveries for example;
  • Human resources, with the optimization of the allocation of resources for example;

 

Specialized publishers have developed ready-to-use indicator libraries to monitor these different activities. Finally, with the emergence of new web technologies (including HTML5 and the JavaScript and AJAX graphical interfaces) we’ve seen the appearance of new players proposing a BI approach in the cloud or SaaS mode.   

 

Today, information is omnipresent; the difficulty is not to collect it, but to make it available in the right form, at the right time and to the right person, who will know how to exploit it and derive added value. The BI market therefore offers fairly comprehensive solutions for data reporting and consolidation, in both the proprietary and open source domains. Possible developments in the short to medium term include proactive and simulation analysis tools, greater interactivity and user-friendliness of data access, and the combination of structured and unstructured data from internal and external sources.