Challenges of #ArtificialIntelligence


Until a few years ago, #ArtificialIntelligence (#AI) was similar to nuclear fusion in its unfulfilled promise: it had been around a long time but had not reached the spectacular heights foreseen in its early stages. Now, however, artificial intelligence is no longer the future. It is here and now. It is realizing its potential and achieving human-like capabilities, so it is the right time to ask: how can business leaders adopt AI to take advantage of the specific strengths of human and machine?


AI is swiftly becoming the foundational technology in areas as diverse as self-driving cars, financial trading and connected homes. Self-learning algorithms are now routinely embedded in mobile and online services. Researchers have leveraged massive gains in processing power and the data streaming from digital devices and connected sensors to improve AI performance. As a result, the progress in robotics, self-driving cars, speech processing and natural language understanding is quite impressive.


But with all the advantages AI can offer, there are still challenges for the companies that want to adopt #AI. As AI is a vast domain, listing every challenge is quite impossible, yet we have listed a few generic challenges of Artificial Intelligence below: AI's situated approach in the real world; learning processes with human intervention; openness to other disciplines; multitasking; and validation and certification of AI systems.


Artificial Intelligence’s Situated Approach:

Artificial Intelligence systems must operate in and interact with the real world: receiving sensor data, interpreting the environment in which they operate, and acting on it. They must behave autonomously and maintain their integrity under varied conditions. To meet these requirements, AI systems must manage unstructured data as well as semantic data.


AI Systems and Human Intervention:

AI systems are programmed to interact with human users: they must therefore be able to explain their behavior and justify the decisions they make so that human users can understand their actions and motivations. If this understanding is not forthcoming, human users will have little or no confidence in AI systems, which is not acceptable. In addition, AI systems need some flexibility and adaptability in order to manage different users and different expectations. It is important to develop interaction mechanisms that promote good communication and interoperation between humans and AI systems.
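To make the idea of explainable behavior concrete, here is a minimal sketch of a decision procedure that records a human-readable justification for every decision it makes. The loan-screening rules and thresholds are purely hypothetical, chosen only to illustrate the pattern:

```python
from dataclasses import dataclass, field

@dataclass
class Decision:
    approved: bool
    reasons: list = field(default_factory=list)  # human-readable justifications

def decide_loan(income: float, debt_ratio: float) -> Decision:
    """Toy decision procedure: every branch records *why* it fired,
    so a human user can audit the outcome. Thresholds are illustrative."""
    reasons = []
    approved = True
    if income < 20_000:
        approved = False
        reasons.append(f"income {income:.0f} is below the 20000 threshold")
    if debt_ratio > 0.4:
        approved = False
        reasons.append(f"debt ratio {debt_ratio:.2f} exceeds the 0.40 limit")
    if approved:
        reasons.append("all criteria satisfied")
    return Decision(approved, reasons)
```

A rejected applicant gets back not just a verdict but the specific rules that caused it, which is the minimum a user needs in order to trust the system.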


AI: Opening to Other Disciplines:

An AI will often be integrated into a larger system of many other elements. Openness therefore means that AI scientists and developers will have to collaborate with specialists in other computer science disciplines (e.g., modelling and prediction, verification and validation, networks, visualization, human-machine interaction) to compose a competitive, wider system around the AI. The second aspect to consider is the impact of AI systems on many facets of our lives, our economy and our society; collaboration with non-computer specialists such as psychologists, biologists, mathematicians, economists, environmentalists and lawyers is therefore a must.



Multitasking:

Many AI systems are highly competent in a specific area but turn out to be incompetent outside it. However, systems operating in a real environment, such as robots, must be able to perform several actions in parallel, such as memorizing facts, assimilating new concepts, acting on the real world and interacting with humans.


Validation and certification: 

The certification of AI systems, or their validation by appropriate means, is a real challenge for AI's critical systems, especially if they are to meet the expectations mentioned above (adaptation, multitasking, learning processes with human intervention). The verification, validation and certification of conventional systems (which are not part of AI) is already a difficult task, even if exploitable technologies already exist. Applying these tools to complex AI systems is a daunting task that must be addressed before such systems can be used in environments such as airplanes, nuclear power plants and hospitals.


Other Generic Challenges: 

In addition to the previous challenges, the following requirements for AI systems should lead to new research activities: some are extremely complex and cannot be satisfied in the short term, but they are worth attention.
Implanting norms and values into AI systems goes far beyond existing science and technology: for example, should a robot that is buying milk for its owner stop on the way to help a person whose life is in danger? Could a powerful AI technology be used by terrorists? At present, AI research is far from being able to meet these requirements.
The privacy requirement is particularly important for AI systems confronted with personal data, such as intelligent assistants/companions or data mining systems. This requirement was already valid for conventional systems, but AI systems have the particularity that they will generate new knowledge from private data and will probably make it public if there are no technical means capable of imposing restrictions.

The final challenge concerns scaling up. AI systems must be able to manage large amounts of data and large numbers of situations. We have seen learning algorithms that absorb millions of data points (signals, images, videos, etc.) and large-scale reasoning systems, such as IBM's Watson, using encyclopedic knowledge. However, the question of scaling for the many V's (variety, volume, velocity, vocabularies, etc.) remains unanswered.
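The kind of one-pass, constant-memory learning that makes such scale feasible can be sketched with a toy online perceptron. This is illustrative only (real large-scale systems use far more sophisticated algorithms): each example is seen once and then discarded, so memory use does not grow with the length of the stream.

```python
def train_perceptron(stream, lr=0.1):
    """One-pass online learner for 2-D points: each ((x1, x2), y) pair,
    with y in {-1, +1}, is consumed once, so memory stays constant
    no matter how many millions of examples flow by."""
    w = [0.0, 0.0]
    b = 0.0
    for (x1, x2), y in stream:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
        if pred != y:                 # update weights only on mistakes
            w[0] += lr * y * x1
            w[1] += lr * y * x2
            b += lr * y
    return w, b
```

For linearly separable data the classic convergence result guarantees the mistakes stop after a bounded number of updates, regardless of how long the stream is.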


AI Applications: 

This is not strictly a challenge for AI, but it is important to highlight that AI systems contribute to resolving societal problems: AI applications cover the entire range of human activities, such as environment and energy, health, assisted living and home maintenance, and transportation and smart cities. They can be beneficial to mankind and the economy, but they can also pose threats if they are not controlled as planned.

Benefits of a User-Centric Information System

Continuous improvements in technological applications have allowed more and more organizations to develop systems with user-focused designs. Not only do such systems see higher success in day-to-day usage, they also increase the overall probability of technology adoption. With a user-centric information system, organizations get a comprehensive examination of the latest strategies and methods for creating technological systems with end users as the central point of the design process.

Below is a checklist to help you decide whether including user experience in IT project development is worth the time and resources.


01: Integrate the End-User’s Perspective


Focusing on the backbone of the information system, IT production teams must ensure that the infrastructure works: servers, switches, storage, etc. Even if the information gathered there is a good indicator of the end user's experience, it remains insufficient.

Imagine a strategic application deployed in an organization that takes an enormous amount of time to start, while in the data center all signals are green. In the absence of an end-user-centric information system, you need to rely on help desk calls to learn of the problem, and we all know that not all problems are reported.

With the right tools, you can detect problems in real time and take preventive measures before users are affected.
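As a toy illustration of such a tool (the metric and tolerance are assumptions, not any vendor's defaults), the core check can be as simple as comparing a recent average of application start times against an established baseline:

```python
def detect_slowdown(samples, baseline, tolerance=0.5):
    """Flag an application whose recent average start time (seconds)
    drifts more than `tolerance` (50% by default) above its baseline,
    before users start opening help desk tickets.
    Returns (alert?, recent average)."""
    recent = sum(samples) / len(samples)
    return recent > baseline * (1 + tolerance), recent
```

A real solution would of course correlate many metrics per user and per application; the point is that the data center being "all green" says nothing about this end-user-side measurement.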


02: Prevention is better than cure


Anticipating and resolving problems proactively before they affect end users reduces the financial impact on the IT team and the entire organization. But how can one be proactive while concentrating all one's efforts on correcting problems a posteriori? With an IS analysis solution centered on the end user, organizations are able to detect and interpret any event related to each application and connection within their environment. This high level of visibility allows them to understand what is happening and why, and also to proactively solve problems before they affect user productivity.

If a new business application requires a specific version of an Internet browser, users of previous versions may be blocked. Once the problem has been solved for a given user, an end-user-centric information system immediately identifies all the stations and users with a similar configuration, and therefore potentially at risk. IT support can solve the problem proactively, without waiting for other users to report the problem.
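The idea can be sketched as a simple inventory query (the field names and version numbers here are hypothetical): once one station's failing configuration is known, every other station matching it is flagged as at risk before its user ever calls support.

```python
def stations_at_risk(inventory, app_requirement):
    """Given the configuration known to fail (a browser older than the
    application's minimum version), list every other station that
    shares it and is therefore potentially at risk."""
    needed = app_requirement["min_browser_version"]
    return [s["host"] for s in inventory if s["browser_version"] < needed]
```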


03: Optimize the IT Investment’s ROI on business


Optimizing the operating costs of the information system does not simply mean reducing IT costs. It also allows the IS department (ISD) to better target its investments by prioritizing the projects and resources likely to increase the competitiveness of the company.

An end-user-centric information system analysis solution helps to better identify the IT resources and services used within the organization. Equipped with contextual analytical data, the ISD can better focus its priorities and reassign existing assets in order to sustain IT investments.

The consumerization of IT and the ever-increasing expectations of executives demand perfect cohesion in the relationship between the ISD and users. An efficient analytical solution for monitoring the information system is therefore a real asset for optimizing the ISD's efficiency.


04: Do not be overwhelmed by the complexity of your infrastructure


In the era of the consumerization of IT, the growing diversity of systems, applications and generated data increases the complexity of the information system every day.

Confronted with ever more complex and difficult-to-manage environments, IT teams need a clear and accurate view of past, current and future events.

A workstation-centric information system analysis solution is the missing link to ensure full visibility into the effective use of IT resources and services from the end-user’s point of view. A perfect understanding of the user context is just as important as the quality of the applications. For example, many users connect to the information system outside of office hours. What would happen if some of them sent a large print to a central printer before disconnecting? It could be a potential internal threat. Data about this type of behavior exist, but do you really know where to look for them?
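A minimal sketch of such a detection rule follows; the office hours and size threshold are assumed values, and real behavioral analytics would weigh many more signals than these two.

```python
def suspicious_prints(jobs, start_hour=8, end_hour=19, size_mb_limit=50):
    """Flag users who submit unusually large print jobs outside office
    hours: a simple example of turning raw usage data into a signal
    for a potential internal threat."""
    flagged = []
    for job in jobs:
        off_hours = job["hour"] < start_hour or job["hour"] >= end_hour
        if off_hours and job["size_mb"] > size_mb_limit:
            flagged.append(job["user"])
    return flagged
```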

The detection and reporting of this type of behavior is only one example of how the analytical data gathered from end users can be used. More and more companies are adopting a user-centric information analysis system to make sense of the enormous volumes of data related to connectivity and IS performance and to solve problems proactively.


05: Plan, organize and measure results more precisely


As a cornerstone of the IT Infrastructure Library (ITIL) framework, change management is designed to ensure that all IT infrastructure modification requests are processed efficiently through the use of standardized methods and procedures. It is important to limit service disruptions and to consider the end user's perspective before, during and after the change.

Take the example of a Microsoft Office update: the basic information available about the installed Office application is not enough to develop an action plan guaranteeing a transparent update for users.

The IT team must be fully aware of the context in which the application is used, as well as the network connections established by the application, to ensure that they remain available after the new version is deployed. IT must also ensure that all workstations are compatible with the new version, both in terms of configuration and of the required state of integrity. In fact, some workstations running MS Office 2003 might not support an upgrade to MS Office 2013 for CPU, RAM or other reasons.
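A pre-upgrade compatibility check can be sketched as follows; the minimum requirements shown are illustrative placeholders, not the official MS Office 2013 specifications, which should be checked against Microsoft's documentation.

```python
# Placeholder minimums for the new application version (illustrative only).
MIN_REQUIREMENTS = {"cpu_ghz": 1.0, "ram_gb": 2.0, "disk_gb": 3.0}

def failing_requirements(station, reqs=MIN_REQUIREMENTS):
    """Return the list of requirements a workstation does not meet;
    an empty list means the station is ready for the upgrade."""
    return [k for k, v in reqs.items() if station.get(k, 0) < v]
```

Running this against the whole inventory before the rollout gives IT the action plan the raw "Office is installed" information could not provide.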


Public, Private or Hybrid Cloud – How to make the right choice?

Cloud computing is a booming industry with significant economic benefits, including better IT efficiency and scalability of computing resources. The cloud computing concept has clearly shifted from buzz to business and, in so doing, has transformed the nature of IT service delivery. Just look at the numbers: according to Gartner, the cloud software market reached $209.2 billion in 2016 and is projected to grow to a total of $246.88 billion, not to mention the billions of dollars that will be invested in infrastructure to support private and hybrid clouds.

Cloud Services Forecast (Gartner)

The three types of cloud (private, public and hybrid) are generally grouped under the banner of cloud computing, but they are actually quite different. Choosing the right cloud in which to outsource applications, data and services can be a challenge. For organizations, the decision to use a private, public or hybrid cloud depends on the services they use and their ability to integrate the chosen model. But before moving critical systems into the cloud, a question always comes up within the IT and management teams: "Should we opt for a public, private or hybrid cloud?"


Each type of cloud has its advantages and disadvantages, which make it the best or the worst solution for a given company, situation or application. Similarly, each has an impact on application and network performance, which must be taken into account before implementation. So, let’s examine each type of cloud.


Public Cloud

In a public cloud, services and infrastructure are provided off-site, over the Internet. This means that companies do not have to procure any specific infrastructure and can subscribe and start using storage, processing and other services immediately, via an online portal.

The public cloud's flexibility and ease of use make it an ideal solution for companies that need to rapidly launch a service in the marketplace, have few regulatory constraints and use data that does not require close integration with other parts of the company.

However, concerns remain about security, the protection of confidential information and the control of data in a public cloud. Another major problem is performance. Transferring services to a public cloud means accepting that business applications are run from anywhere in the world, regardless of where the service provider’s data center is located.

Most public cloud service providers do not indicate the location of their data centers in their general terms and conditions, which gives them carte blanche to move workloads to reduce their operating costs. In short, the distance to be covered and the time needed to access applications can increase significantly for all users of the company. More surprisingly, these distances can change in an unpredictable way.


Advantages of Public cloud:
• It can be used instantly and is accessible to all budgets.
• It is suitable for development and experimentation.
• The public cloud is perfectly “elastic” in order to adapt to the increasing needs of a company.


Limits of Public cloud:
• The public cloud, although flexible, is not necessarily adapted to all the needs of a company, as it is not tailored like the private cloud.
• The more you use the public cloud, the more expensive it becomes.
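That last point can be made concrete with a simple break-even calculation between pay-per-use public pricing and a fixed private-cloud investment. All figures here are hypothetical, purely to illustrate the arithmetic:

```python
def breakeven_months(private_upfront, private_monthly,
                     public_per_unit, units_per_month):
    """Months after which cumulative public pay-per-use cost overtakes
    a private cloud's upfront investment plus running cost.
    Returns None if the public cloud never becomes more expensive."""
    public_monthly = public_per_unit * units_per_month
    if public_monthly <= private_monthly:
        return None
    return private_upfront / (public_monthly - private_monthly)
```

For example, with a hypothetical $120,000 private build-out at $2,000/month to run, versus $0.10 per unit of public usage at 70,000 units/month, the public bill overtakes the private one after 24 months.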



Private Cloud

With a private cloud, organizations own and operate internal IT services that host critical internal applications and data within the firewall. They can still transfer workloads from one server to another in case of peak usage or when deploying new applications. The private cloud can be the preferable solution for organizations that have not embraced the public cloud for critical applications and data due to security requirements, integration issues and concerns about availability. It can also be a very attractive proposition for companies in sensitive and highly regulated sectors, such as pharmaceuticals or financial services. Similarly, many companies still prefer the private cloud for their critical data because it provides total control over data and applications. This eliminates concerns about data security and control, but it is more difficult to adapt to changing needs.

Private clouds also enable IT departments to better leverage their existing infrastructure. Typically, when deploying a private cloud, companies consolidate distributed computing resources and virtualize them in the data center. The IT department can manage them more cost-effectively while providing services faster.

However, it is a double-edged sword, because deploying a private cloud can put a strain on existing resources and work processes. When IT departments consolidate resources, applications and data generally move away from many users. Their traffic must then travel a longer distance over the WAN to reach the information they need, and the resulting latency can radically reduce the performance and productivity of the enterprise.


Advantages of Private Cloud:
• It is tailored to your needs and your infrastructure
• Its cost is fixed (determined by the size of the infrastructure).


Limits of Private cloud:
• It is a costly investment, and its depreciation must be planned for.
• The time required to adapt the size of the infrastructure to the needs of the company may be too long compared to the pace at which those needs evolve.


Hybrid Cloud

In many cases, the hybrid cloud offers the best of both technologies. It is becoming the norm because it allows companies to alternate between the two models depending on circumstances.

By splitting elements into a hybrid cloud, companies can keep every aspect of their business in the right environment. However, the merging of the public cloud and the private cloud poses an additional problem: the integration of services becomes more difficult because there is a loss of data consistency. This results in additional management, as well as potential differences in the interface, security, processing and reporting systems that need to be addressed.

As a composite architecture, the hybrid cloud has a dual implication, exposing networks to the potential impacts of both a public cloud deployment and a private cloud deployment: applications delivered via a public service are still likely to be located anywhere in the world, while private cloud applications remain consolidated in a small cluster of data centers, resulting in potential bottlenecks affecting network operation.


Advantages of Hybrid cloud:
• Each data item is naturally stored in the most appropriate Cloud environment.
• This solution combines the major advantages of the public cloud (flexibility, speed of implementation, development and experimentation) and those of the private cloud (security and total control of data).

Limits of the Hybrid cloud:
• This solution is exposed to the disadvantages of both types of cloud and to the risks involved in deploying each cloud solution.
• The use of two different cloud types increases the management required.


Accelerating cloud services


Whether a company chooses a private cloud, a public cloud or (most likely) a hybrid approach, WAN optimization allows it to take advantage of cloud computing offerings in terms of cost, economies of scale and ease of management while attaining the levels of performance and visibility needed to ensure the productivity of its staff.

Given the take-off speed of cloud computing, sooner or later more companies will have to consider the benefits it can bring. Companies need to evaluate the cloud model that suits them best, but whatever model is chosen, a thorough understanding of the impact of each cloud service type on their IT infrastructure and topology is essential to ensure that it results in no degradation of performance for users.


Do you have a Cloud project? Contact us and we will support you in evolving your IT architecture by integrating a Cloud component.

#SaaS and European Legislation on #DataProtection

This article presents a summary of the legislative and regulatory aspects that European companies must take into account when choosing a SaaS provider, particularly in terms of #DataProtection. When choosing a SaaS provider, companies should apply the same checklist of controls and negotiations that they would apply when working with an offshore service provider for their IT operations.


The main features of Software as a Service are as follows:


– The user accesses the application via the Internet.

– The cost depends on the actual consumption of the service (software).

– The supplier of the application (software) is responsible for its maintenance and availability.


Typically, when a European company adopts a cloud service, it remains responsible for how the SaaS provider processes its data, not the other way around. Because of uncertainty as to how and where the SaaS provider will store the data, there is a risk that the provider may breach its customers' national or European regulations, which impose strict controls on the processing of data outside the European Union.


In SaaS solutions, the client company's data is stored on the provider's servers. This may include personal data or sensitive data such as health data. This relocation of the data implies respecting its confidentiality and ensuring its safety. The contract must frame the risks and remind each party of its obligations.


In accordance with the 1995 European Data Protection Directive, which was transposed into the national law of the 27 EU Member States, the transfer of personal data outside the European Economic Area (EEA), which comprises the countries of the European Union plus Iceland, Liechtenstein and Norway, is prohibited unless certain conditions are met. By transfer, the Directive implies that the data will be processed in one way or another in a non-EEA country; the mere transit of data via these countries, on the other hand, is authorized.


Under the Directive, personal data means any information concerning an identified or identifiable natural person. This broad definition may include various pieces of information about a person, such as name, address, IP address or credit card information.


Cloud Computing = Outsourcing?


Outsourcing is the well-known method whereby a third party takes over one or more company functions that often lack resources (time, expertise or both). It is common, for example, to outsource a project that requires a temporary increase in resources, or a function that will no longer be useful once the project is completed (a one-time need for development, software integration, etc.).


With cloud computing, companies often do not realize that they need to take the same precautions as with outsourcing. Personal data may be transferred outside the EEA if it is processed in a country on the European Commission's list of countries or territories providing adequate protection for personal data (visit the European Commission website to check the list of countries).
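A rough pre-check of this rule can be sketched as follows. The country lists here are deliberately tiny, illustrative samples, not the legal lists; the European Commission's site remains the authoritative source for adequacy decisions.

```python
# Illustrative, non-exhaustive samples only; consult the European
# Commission for the authoritative EEA and adequacy lists.
EEA_SAMPLE = {"FR", "DE", "IS", "LI", "NO"}
ADEQUACY_SAMPLE = {"CH", "CA", "AR"}

def transfer_allowed(country_code, safeguards=False):
    """Rough pre-check before sending personal data to a country:
    EEA members and adequacy-list countries pass outright; any other
    destination requires additional safeguards (e.g. contractual
    clauses, or historically the Safe Harbor scheme for the US)."""
    if country_code in EEA_SAMPLE or country_code in ADEQUACY_SAMPLE:
        return True
    return safeguards
```

This is a screening aid, not legal advice: a real compliance process would also record which safeguard applies and verify it with counsel.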


The United States is not on the list of countries approved by the Commission, but data can be transferred to US companies that have signed up to the Safe Harbor agreement, which requires them to apply seven principles for the processing of information under the supervision of the Federal Trade Commission.


If a country is not on the European Commission's list of approved non-EEA countries, companies or service providers may take other measures to provide suitable protection for personal data and enable its transfer.


The security of the SaaS provider must be evaluated by the companies


In addition to these measures, companies considering SaaS and wanting to avoid failing to comply with #DataProtection laws generally have to prove that they have evaluated the safety of the supplier and specified measures to protect personal or other sensitive data processed by the supplier.


These measures may include requesting a third-party security evaluation of the supplier, requiring that the data be encrypted in transit, checking the provider's data retention and destruction policies, setting up audit trails for the data, and obtaining information about any third-party company with which the supplier could share data.


That being said, companies should not look only at data protection legislation when they want to adopt SaaS. National financial legislation in EU countries also limits the places where companies can store financial information. For example, European companies must keep electronic invoices for five to ten years. In addition, amendments under European Council Directive 2010/45/EU stipulate that this information must be stored on servers located either in the country where the company is established or in a neighboring country providing access to the relevant tax authorities.


Confidentiality of Data


The confidentiality of data hosted in the cloud is today the biggest brake for companies wanting to use this service. The standard of confidentiality becomes very important when the hosted data has strategic content for the company or when it can be considered personal data.


The confidentiality of the data may be called into question by members of the service provider or of the client company, as well as by a person entirely outside these organizations. It is therefore necessary to put in place a high level of security for access to this data, especially if it is accessible via the Internet. The confidentiality of data can also be undermined by regulations applicable to the provider, especially if the provider is domiciled in the United States.


As the SaaS market matures, it is becoming increasingly simple to use these services without fear of breaking the law. Over the years we have seen this evolution: contracts have matured, and different models have been established on both the customer's and the service provider's side.