The Lessons of #CloudComputing – What Have We Learned So Far?


In a remarkably short period of time, cloud computing has moved from the margins to the core of IT operations. In barely a decade, CIOs have been handed an opportunity to break the rules and create a new model for delivering IT. The cloud now accounts for one-third of all spending on IT infrastructure, according to IDC. On the software side, Gartner predicts that worldwide spending on cloud business application services will grow 18% in 2017 to $246.8B, up from $209.2B in 2016.

While cloud computing is now central to IT strategy, it is worth highlighting where the cloud has delivered benefits and in which areas companies still have ground to cover.

Let's look at what is working and where there is still work to be done.

 

Using cloud computing to bridge gaps in services

Cloud computing should be seen as a form of flexible outsourcing: it is only one vector among others for the provision of services. In on-demand computing, what matters is how the company subscribes to services and benefits from them, not how they are delivered.

Cloud computing offers a very different model from the traditional way of purchasing enterprise computing, in which an IT department would acquire hardware and software for a specific location. That approach may suit services limited to a regional market, but it can run into latency problems at the global level. This can be very problematic, especially if you are managing IT for a highly transactional business such as a financial institution, or if you move large amounts of data, as in oil and gas exploration. The cloud enables CIOs to ease these performance issues by purchasing on-demand computing to create ubiquitous service delivery.

You can rely on a third party to provide the diverse services you need, while they, as experts, deliver peak capacity and performance where and when you need it. As a CIO, you can expect the platform to simply work and be available. Many IT executives now do their best to avoid owning new physical hardware while remaining assured that the service will be well provided. The cloud lets IT managers take a step toward hosted services and achieve high levels of backup and security for a defined fee.

 

Adopt a cloud computing mindset

A company should have a long-term goal of migrating as much IT capacity to the cloud as makes sense, because the cloud provides a cost-effective way to gain access to new skills and expertise. It can be difficult to keep track of all the innovations associated with the cloud, so make sure to spend enough time brainstorming and talking with up-and-coming IT professionals to get an idea of what's going to happen. They are probably more aware of the next big phenomenon that will affect the company. The culture that surrounds IT management is evolving, which is why you have to keep an eye on the new services available on the market and encourage employees to adopt a mindset that favors cloud computing.

However, a migration to the cloud must be carefully managed, particularly around governance and information security. Given their scale and expertise, major providers (such as Amazon and Google) should be much better at securing data than most in-house teams. Still, CIOs need to be aware that convincing the rest of the company of the benefits of cloud computing can be a slow process, particularly with regard to governance, security and approval issues.

 

Let the cloud take care of core domains

Cloud integration should bring a real solution to your organization's operational challenges. The starting point is a set of core areas that CIOs can easily deliver on demand.

If you are using products such as Salesforce and Office 365, you'd have to be crazy to want to host them yourself. Better to let an expert in the field take charge of these operational concerns. The cloud also serves as a one-off solution to problems involving specific operational projects.

 

Finding a balance and determining how to manage legacy systems

IT managers would be foolish to dismiss cloud computing, if only for the quality of service it offers. However, while businesses will continue to migrate to on-demand services, much remains to be done. Some companies are opting for an on-demand model, while others continue to buy more internal resources.

For CIOs, moving to an on-demand model can be a headache. You have legacy systems, and at some point you will have to consider migrating those services to the cloud. It is possible, though, as the momentum is in favor of on-demand computing, despite persistent concerns about security and governance.

 

Security and Privacy

The main challenge for cloud computing is addressing the security and privacy concerns of businesses thinking of adopting it. The fact that valuable enterprise data will reside outside the corporate firewall raises serious concerns. Since the #WannaCry attack, it is clear that hacking and other attacks on cloud infrastructure can affect multiple (potential) clients at once. These risks can be mitigated by using security applications, encrypted file systems, data loss prevention software, and security hardware that tracks unusual behavior across servers.

 

Reliability and Availability

Cloud providers still struggle to deliver truly round-the-clock service, and this results in occasional outages. It is important to monitor the service being provided using internal or third-party tools, and vital to have plans to supervise usage, SLAs, performance, robustness, and the business's dependency on these services.
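To make SLA monitoring concrete, here is a minimal sketch of the arithmetic behind the "nines" an SLA quotes, turning measured outage minutes into an availability percentage. The figures used are hypothetical, not from any specific provider.

```python
# Quick availability arithmetic: translating outage minutes into the
# percentage figure quoted in an SLA. Numbers here are illustrative.

def availability(outage_minutes, period_minutes=30 * 24 * 60):
    """Percentage of a period (default: a 30-day month) the service was up."""
    return 100 * (1 - outage_minutes / period_minutes)

# About 43 minutes of downtime in a 30-day month is roughly 99.9%,
# the classic "three nines".
print(round(availability(43.2), 2))
```

Tracking this number per service, and comparing it against the contracted SLA, is exactly the kind of supervision the paragraph above recommends.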

#DigitalTransformation: No Future Without a Functional Website

From the butcher around the corner to the largest multinationals in the world, almost every company is now actively present on the internet via a website, and that presence has become absolutely necessary. The reasoning is simple: as customers spend more and more time online, across multiple devices, it is only natural for a business to be online too.

But merely being present is not enough. A site is much more than a business card today; it has become a full communication channel, and consumers expect you to be as reachable via the site as by phone. It is also a graphic and semantic reflection of your company: the kind of site you have, the pictures on it and the language you use already tell others a great deal about the type of business you are.

 

Customers are more and more on the internet, it’s only logical that you should be there as a business

 


Daily tasks

As an online business, you don't want to confuse quantity with quality. Don't spend your time on easy, low-impact tasks instead of strategic ones, especially when your website is your only sales channel. Even if you opt to work with a specialized IT company, keeping the site running is a full-time job. Deep changes to the site and performance work may be left to specialists, but keeping information up to date, uploading product photos, keeping track of payments, and making sure that all descriptions, prices and the overall look are correct is an in-house task.

 

Basic conditions

Handling a running website isn't as easy as counting one-two-three, and a good website has to meet some basic conditions. Make sure your site's graphics don't confuse visitors and drive them away. Customers must immediately find what they are looking for, without losing time on pop-ups and similar notifications. Language selection should be automated as much as possible based on user history, and every call to action ("call us" or "click here for a chat") must work. Deeper content can live further inside the site.

 

Classic

Just as in clothing, there are trends and fashions in website design; long scrolling home pages are the best example. If a visitor has to scroll a lot, make sure it is for a good reason.

Put a functional basic menu at the top of the page and credentials or customer testimonials at the bottom. Be very economical with colors, striking fonts and bold layouts; there are good reasons why you rarely see that kind of thing: it draws attention away from the essence, it makes the site harder to read, and it looks unprofessional. As far as reading is concerned, overly long texts weigh a webpage down; limit yourself to a handful of paragraphs per page. If you have to explain difficult things, for example a particular policy as an insurance company, you can put a synthesis on the homepage and link visitors to the full text, or provide a separate PDF that they can download and print.

Nothing beats a classic, quiet layout. Maybe in 1998 you could get away with all those flashy effects, but not any more.

 

Responsive design


Another thing to keep in mind is that internet users no longer visit your website only from a PC; they use multiple devices such as tablets and phones. Responsive design was invented to make navigation work across these device formats, and it is another important trend of today: sites adapt themselves to the device they are viewed on. This way you offer optimal navigation, and all information is easily readable on a large PC screen, a tablet or a smartphone. Studies show that about 80% of customers now use a mobile device to consult company websites.

If you run into difficulties or need advice on technical issues, don't hesitate to contact us. We'll be happy to assist you.

Survey: Is #CloudSecurity Still a Concern in 2017?

The need to run business more efficiently, improve time-to-market and enhance user experience is driving more and more enterprises to embrace the cloud as part of their IT strategy. Note that "cloud" still has many different meanings: IaaS, SaaS, PaaS and so on. Equally interesting is the fact that enterprises today deploy a variety of cloud delivery models to restructure processes and increase agility. IT teams usually have good visibility into, and control over, their on-premises networks; but in cloud environments it is not as easy to see and react to threats. However your organization defines "cloud", it is important to make sure your security can adapt to your organization's cloud strategies.

 

A majority of companies remain skeptical about the security of migrating to cloud environments, and the loss of "physical control" over data remains a major concern. Companies are still wary of the risks of switching from traditional computing to cloud environments, reveals a new study by Forbes, which also notes that the massive trend is nevertheless toward migration to the cloud.


The survey reveals that even though the cloud is not a new technology, this market still has strong growth potential, provided security is strong. Forbes says that 65% of companies remain skeptical about the security of migrating to cloud environments. Specifically, 40% of companies are concerned about the loss of "physical control" over the data involved in cloud computing.

 

The study also finds that companies seem more comfortable with hybrid cloud deployments during this period of migration to the cloud: 44% of organizations prefer this approach. In addition, hybrid cloud adoption grew 3X in the last year, from 19% to 57% of the organizations surveyed. Private clouds, too, seem a safer option for many.

 

Think Security Upstream of Cloud Migration Projects


Among cloud migration concerns, unauthorized access ranks first, cited by 61% of respondents in the study; for 52%, hacking is the second-biggest fear.

Cloud security risks top the list of barriers to cloud adoption (33%). The most dramatic shift is the lack of staff and expertise to manage cloud security (28%), which moved from #5 to #2, trading places with legal and regulatory concerns (24%) as key barriers to adoption.

 

Finally, companies are increasingly considering strengthening security upstream of their cloud deployments, with a focus on new internal policies: 56% of respondents said they plan to improve identity management and authentication, 51% of companies use encryption for their move to the cloud, and 45% of medium and large companies plan to implement audits as part of a migration.

 

Only 13% of companies still reject the idea of moving to cloud computing infrastructures, and 30% admit that if they perceive security improving, they may reconsider their position. While process efficiency and network agility are key cloud drivers, enterprises of all sizes continually cite cloud security as their top concern. Despite this, cloud adoption continues to rise.

 

Cloud adoption certainly provides many benefits, but enterprise security needs to adapt to this new environment. The end goal of a cloud security strategy must be to permit organizations to realize the full benefits of the cloud without letting security slow them down.

*For the survey, more than 600 IT professionals worldwide, across various sectors, were polled.

From Data to Knowledge: #BigData and #DataMining

The increasing digitization of our activities, the ever-growing capacity to store digital data, and the resulting accumulation of data of all kinds have generated a new sector of activity whose purpose is the analysis of large quantities of data. New approaches, new methods and new knowledge are emerging, and no doubt, ultimately, new ways of thinking and working. This very large amount of data (big data) and its processing (data mining) affect different sectors such as the economy and marketing, but also research and knowledge.

The economic, scientific and ethical implications of this data are quite significant. The fact that this is a constantly evolving sector, where changes are frequent and rapid, does not make analysis easy. A deep understanding of data is nevertheless necessary in order to grasp what data mining really is.


1 – What is data mining?             

 

Data mining means exploring very large amounts of data. Its purpose is to extract knowledge from large quantities of data by automatic or semi-automatic methods. Data mining is also referred to as data drilling or knowledge discovery from data (KDD).

 

  • How and why are such quantities of new data generated? Every minute, 149,519 e-mails are sent worldwide, 3.3 million posts are published on Facebook, 3.8 million queries are made on Google, 65k photos are uploaded to Instagram, 448k tweets are sent, 1,400 posts are published via WordPress, 500 videos are uploaded to YouTube and, last but not least, 29 million messages are sent via WhatsApp. These numbers can make one's head spin, but the important thing to note is that humans aren't the only producers of data: machines also contribute through their SIM cards, their sensors, and so on.
  • What to do with all this data? While the contemporary phenomenon of data accumulation is easy to understand, it is perhaps harder to perceive how this data is changing the world; it depends on how one is able to treat it. Science, IT and the medical sector rely heavily on statistics and counting. From the moment a data set can be treated exhaustively, with cross-referencing and sorting carried out on a scale scarcely imaginable a few decades ago, it is our analyses of our environment that change and multiply. In short, data is a tool for management, decision support and evaluation in every sector, and the raw material of information, allowing us to understand a phenomenon, a reality.
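To make the idea of "extracting knowledge from data" tangible, here is a minimal sketch of one classic data-mining task: finding pairs of items that are frequently bought together in a transaction log. The transactions and the support threshold are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Toy transaction log: each row is the set of items in one purchase.
transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"milk", "butter"},
    {"bread", "milk", "eggs"},
]

# Count how often each pair of items is bought together.
pair_counts = Counter()
for basket in transactions:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# Keep pairs appearing in at least half of the transactions ("frequent" pairs).
min_support = len(transactions) / 2
frequent = {pair: n for pair, n in pair_counts.items() if n >= min_support}
print(frequent)  # e.g. bread+milk appear together in 3 of 4 baskets
```

Real data-mining systems apply the same counting-and-thresholding idea (as in the Apriori family of algorithms) to millions of transactions rather than four.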

 

2 – Value of Data

 

While IT organizations are best placed to grasp the market potential of data accumulation and processing, this is not the case everywhere: the idea that data is the new oil is making its way more slowly than one might have imagined.

  • What is the market value of the data? The data built up through a variety of IT operations represents valuable potential that companies are not always aware of or do not use. Even if they do not necessarily know how to exploit this data themselves, they are sitting on resources that are not yet profitable for them. Gathering this data and putting it to use is a key issue for companies: big data is a real source of marketing opportunities.
  • Data that must be protected and is complex to exploit: personal data poses many problems for the researchers who specialize in analyzing it. First, they point to the need to protect it better and ensure its conservation. Moreover, very specialized skills are required to process it in a way that produces interesting results.

 

3 – Data mining and targeted marketing 

 

One of the most significant applications of data mining is undoubtedly the reinvention of marketing, because data mining allows companies to reach consumers very precisely by building accurate, reliable profiles of their interests, purchasing habits, standard of living, and so on. Moreover, there is no need for a complicated search process: every internet user leaves enough traces while surfing, tweeting or publishing on Facebook for profiling to be possible, most of the time without their knowledge…

  • A new space for social science research: seen from another angle, this accumulated data is a gold mine for researchers. Some behavioral researchers have studied the attitudes of internet users on dating sites. Besides finding that the data they use is more reliable than that obtained by interviewing individuals (it is easier to lie to an investigator than to a machine…), they can carry out analyses that are not politically correct but very informative!

 

4 – The data mining forecast tool

Data mining is also a tool that multiplies what can be done with probability calculations. Because it can cross-reference large volumes of data, and above all because it can apply these calculations to many different fields, it now appears capable of making forecasts. Data mining for forecasting offers the opportunity to turn the numerous sources of time-series data available to the business decision-maker, both internal and external, into actionable strategies that can directly impact profitability. Deciding what to make, when to make it and for whom is a complex process; understanding what factors drive demand, and how those factors interact with production processes and change over time, is key to deriving value in this context. Today some scientists do not hesitate to announce that they will soon be able to predict the future, all thanks to data!
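As a minimal, deliberately simplified sketch of time-series forecasting, the following predicts the next value of a series as the trailing moving average of recent observations. The sales figures are hypothetical, and a real demand forecast would also model trend, seasonality and external drivers.

```python
# One-step-ahead forecast via a trailing moving average (illustrative only).

def moving_average_forecast(series, window=3):
    """Predict the next value as the mean of the last `window` observations."""
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

monthly_sales = [100, 120, 110, 130, 125, 140]  # hypothetical demand data
print(moving_average_forecast(monthly_sales))   # mean of the last 3 months
```

Even this naive baseline illustrates the core loop of forecasting: summarize the recent past, project it forward, then compare the projection against what actually happens.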

  • Probabilities and predictions: today, predictive statistics tackle all sorts of issues: natural disasters, health, delinquency, climate… Statistical tools are numerous and can be combined to improve outcomes, as when using "random checks". Even more fascinating, software is now capable of improving itself, accumulating ever more data to boost its performance… In the meantime, it is already possible to rely on these analyses to try to avoid the flu or get vaccinated wisely.
  • Anticipating or preventing crimes: if the idea of software able to predict crimes and misdemeanors brings to mind Spielberg's film "Minority Report", reality has now caught up with fiction: PredPol (predictive policing) software can estimate, better than other human techniques or analyses, the places where crime is likely to occur, and consequently helps position police patrols and other preventive measures.
  • Preventing fraud: data mining also offers ways to improve the fight against fraud and "scams" in the insurance sector. Here again, it is a matter of better targeting controls, and apparently it works: in more than half of cases, when a controller performs a targeted check based on data mining, he finds something. Insurance companies also apply this type of analysis to detect scams.

Challenges in #MachineLearning Adaptation 

It is quite possible that, by the time you read these lines, you have already used the results of machine-learning algorithms several times today: your favorite social network may have suggested new friends, a search engine may have ranked certain pages as relevant to your history, you may have dictated a message on your phone, or read an article selected for your news feed based on your preferences, perhaps translated automatically. Even without using a computer, you may have been listening to the news or to a weather forecast.

 

We live in a world where most of the transactions and stock decisions that make and unmake an economy, and increasingly even medical diagnoses, rest more on the qualities of an algorithm than on those of human experts, who are incapable of processing the mountain of information needed for relevant decision-making.

 

Algorithms that learn from data in order to issue predictions and make data-driven decisions are called machine learning. These automated learning algorithms and systems have made significant advances in recent years thanks to the availability of large volumes of data and intensive computing, as well as interesting advances in optimization. A major feature of deep learning is its ability to learn descriptors while clustering the data. There remain, however, many limitations and challenges, which we have classified as: data sources; symbolic vs continuous representations; continuous and endless learning; learning under constraints; computing architectures; unsupervised learning; and learning with human intervention and explanations.


Data Sources

There are many challenges in this area: learning from heterogeneous data available on multiple channels; managing uncertain information; identifying and processing rare events beyond purely statistical approaches; combining knowledge sources and data sources; integrating models and ontologies into the learning process; and finally, achieving good learning performance with little data, when massive data sources are not available.

 

Symbolic Representations vs Continuous Representations

Continuous representations allow a machine learning (ML) algorithm to approximate complex functions, while symbolic representations are used to learn rules and symbolic models. The most significant recent advances concern continuous representations. These, however, leave out reasoning, and it would be desirable to integrate reasoning into continuous representations so that inferences can be made on numerical data. Moreover, to exploit the power of deep learning, it can be useful to define continuous representations of symbolic data, as has been done for text with word2vec and text2vec representations.
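The point of continuous representations can be shown in miniature: once words become vectors, "similar meaning" becomes a numeric computation. The vectors below are hand-made toys standing in for real word2vec output, which a model would learn from a large corpus.

```python
import math

# Toy continuous word representations (invented vectors, not real word2vec
# output) to illustrate how similarity becomes a numeric computation.
vectors = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.1, 0.2, 0.95],
}

def cosine(u, v):
    """Cosine similarity: 1.0 for identical directions, lower otherwise."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Words with related meanings end up with a much higher cosine similarity.
print(cosine(vectors["king"], vectors["queen"]))  # close to 1
print(cosine(vectors["king"], vectors["apple"]))  # much lower
```

Operations like these, applied to learned rather than hand-made vectors, are what let continuous representations feed numerical inference where symbolic rules once stood alone.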

 

Continuous and endless learning

Some AI systems are expected to be resilient, able to operate 24/7 without interruption. Interesting advances have been made in lifelong learning systems that continually build new knowledge while they operate. The challenge is for AI systems to operate online in real time while independently revising existing beliefs learned from previous cases. Self-bootstrapping is one option for these systems, because it allows elementary knowledge acquired early in operation to guide future learning tasks, as in the NELL (Never-Ending Language Learning) system developed at Carnegie Mellon University.

 

Learning under constraints

Privacy protection is undoubtedly the most important constraint to be taken into account. Machine learning researchers have recently recognized the need to protect privacy while continuing to learn from personal data (records about individuals), and privacy-oriented learning systems are being developed for this purpose. More generally, machine learning must take other external constraints into account, such as decentralized data or energy limits. Research on the general problem of machine learning with external constraints is therefore necessary.
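One concrete privacy-protection idea, sketched below under illustrative assumptions, is to publish only noisy aggregates of personal data: calibrated Laplace noise is added to a mean so that no single record can be read back out (the core trick of differential privacy). The epsilon value and the age records are hypothetical, not a tuned policy.

```python
import random

def noisy_mean(values, lower, upper, epsilon=1.0):
    """Mean of `values` (each clamped to [lower, upper]) plus Laplace noise."""
    clamped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clamped) / len(clamped)
    # Changing one record shifts the mean by at most (upper - lower) / n.
    scale = (upper - lower) / (len(clamped) * epsilon)
    # A Laplace sample is the difference of two exponential samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_mean + noise

ages = [34, 29, 41, 52, 38, 45, 27, 33]  # hypothetical personal records
print(noisy_mean(ages, lower=18, upper=90))
```

The released number is still useful in aggregate (the noise averages out over many queries), but any individual's contribution is hidden inside the randomness, which is exactly the trade-off privacy-oriented learning systems formalize.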

 

Computing architectures

Modern machine learning systems require intensive computing performance and efficient data storage to scale up with data size and problem dimensions. Algorithms will run on GPUs and other powerful architectures, and data and processing must be distributed across multiple processors. New research needs to focus on improving machine learning algorithms and problem formulations to make the most of these computing architectures.

 

Unsupervised Learning

The most remarkable results in machine learning come from supervised learning, that is, learning from examples in which the expected result is provided along with the input data. This requires prior labeling of the data with the corresponding expected results, a process that is expensive at scale. Amazon's Mechanical Turk (www.mturk.com) is a perfect example of how large companies mobilize human resources to annotate data. But the vast majority of data comes with no expected result, i.e. without a desired annotation or class name. It is therefore necessary to develop unsupervised learning algorithms to handle this enormous amount of unlabeled data. In some cases, a minimal amount of human supervision can be used to guide the unsupervised algorithm.
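The contrast with supervised learning can be shown with a tiny k-means sketch: the algorithm receives only raw, unlabeled numbers and discovers the groups itself, with no expected results provided. The data and cluster count here are invented for illustration.

```python
import random

# Minimal k-means sketch: grouping unlabeled 1-D points into k clusters
# using only the raw data itself, no labels.

def kmeans_1d(points, k=2, iters=20, seed=42):
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # Move each center to the mean of its assigned points.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious groups around 1 and 10; the algorithm finds them unaided.
data = [0.9, 1.1, 1.0, 9.8, 10.2, 10.0]
print(kmeans_1d(data))
```

The same assign-then-update loop, scaled to high-dimensional vectors, underlies much of the unsupervised learning the paragraph above calls for.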

 

Learning process with human intervention, explanations

The challenge here is to establish a natural collaboration between machine learning algorithms and their users in order to improve the learning process. To do this, machine learning systems must be able to show their progress in a form understandable to humans. Moreover, the human user should be able to obtain explanations from the system for any result. These explanations would be provided during the learning process, could be linked to input data or intermediate representations, and could also indicate confidence levels where appropriate.

 

Transfer Learning

Transfer learning is useful when little data is available for learning a task. It consists of reusing, for a new task, knowledge acquired on another task for which more data is available. The idea is rather old (1993), but results remain modest because it is difficult to implement: it requires being able to extract the knowledge the system acquired in the first place, and there is no general solution to this problem (what to extract, and how to reuse it…). Another approach to transfer learning is "shaping": learning a simple task first, then making it gradually more complex until the target task is reached. There are some examples of this procedure in the literature, but no general theory.
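The reuse idea can be sketched in miniature: a feature extractor "pretrained" elsewhere is kept frozen, and only a tiny classifier head is fit on the few labeled examples available for the new task. Everything below is hypothetical; the extractor is a hand-made stand-in for a real pretrained network, not a trained model.

```python
def pretrained_embed(text):
    """Frozen stand-in for a pretrained sentence encoder (hypothetical)."""
    good = {"great", "love", "excellent"}
    bad = {"awful", "hate", "terrible"}
    words = text.lower().split()
    return (sum(w in good for w in words), sum(w in bad for w in words))

def fit_centroids(examples):
    """Train only a tiny nearest-centroid head on the new task's few labels."""
    sums, counts = {}, {}
    for text, label in examples:
        v = pretrained_embed(text)
        s = sums.setdefault(label, [0.0, 0.0])
        s[0] += v[0]
        s[1] += v[1]
        counts[label] = counts.get(label, 0) + 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl])
            for lbl, s in sums.items()}

def predict(centroids, text):
    """Classify by the nearest class centroid in the frozen feature space."""
    v = pretrained_embed(text)
    return min(centroids, key=lambda lbl: (v[0] - centroids[lbl][0]) ** 2
                                        + (v[1] - centroids[lbl][1]) ** 2)

# Far too little data to train a model from scratch; enough to fit a head.
train = [("great product love it", "pos"), ("terrible awful service", "neg")]
centroids = fit_centroids(train)
print(predict(centroids, "i love this excellent phone"))
```

In practice the frozen extractor would be a network trained on a data-rich task, but the division of labor is the same: expensive knowledge is acquired once and reused, and only the small task-specific part is learned from the scarce new data.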