We are used to seeing technological trends in the market as separate from the way companies and other types of organisation work, when in reality they are closely linked. In this article we will briefly explore this interaction.

If we look at business activity, we can see how the way a company or organisation works (e.g. its consumption models) is modified by technological trends (e.g. mobile apps) and how, in turn, the use of mobile technologies drives changes in the way we interact with service providers (such as public authorities, through their digital channels).

The reality is that the two intersect: they feed one another and achieve better results when they interact in harmony.

Something familiar to all of us is buying online, which is a good example of a business trend. This business model brings new challenges, above all in what we like to call the last mile, for organisations used to dealing with their clients in store. An essential element in a company’s relationship with its clients is the user experience: if it is good, they will come back; if it is bad, they will cross the company off their list and post about it on social media.

This is where one of the biggest challenges lies: how to guarantee that user experience when the means of service provision are out of the organisation’s hands. Generally speaking, online sales involve distribution logistics whose quality is beyond the seller’s control. We often see complaints from clients about the delivery of the products they bought, and these fall directly on the seller rather than the logistics company. It is therefore important to have mechanisms in place that allow us to monitor the quality of the end-to-end service for our client.

Examples include monitoring mechanisms, designed by the product owner, that track the product’s shipment cycle; rapid-response mechanisms for any problem the client detects, whether through a social media account or a call centre able to respond to reviews posted on those sites; and KPIs agreed with the logistics companies, measured by delivery speed and the client’s satisfaction with the service, as in the sketch below. These are just a few of the points that could be addressed along the provider’s value chain.
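
To make the KPI idea concrete, here is a minimal sketch of how such delivery metrics could be computed from order records. The field names (`promised_days`, `actual_days`, `rating`) and the rating scale are hypothetical, not taken from any particular system.

```python
from dataclasses import dataclass

@dataclass
class Delivery:
    promised_days: int   # delivery time promised to the client
    actual_days: int     # days the delivery actually took
    rating: int          # client satisfaction, 1 (poor) to 5 (excellent)

def delivery_kpis(deliveries: list[Delivery]) -> dict[str, float]:
    """Aggregate on-time rate and average satisfaction for one carrier."""
    on_time = sum(d.actual_days <= d.promised_days for d in deliveries)
    avg_rating = sum(d.rating for d in deliveries) / len(deliveries)
    return {
        "on_time_rate": on_time / len(deliveries),
        "avg_satisfaction": avg_rating,
    }

# Example: three orders handled by one logistics partner
orders = [Delivery(3, 2, 5), Delivery(2, 4, 2), Delivery(5, 5, 4)]
print(delivery_kpis(orders))
```

Tracked per logistics partner and over time, numbers like these are what turn “the client is unhappy with deliveries” into something the seller can act on.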

Handling information

This brings us to another very contemporary point: handling information, which is ever more abundant and often poorly structured. How can a company handle the amount of data generated by hundreds or thousands of separate orders? And how can useful information be extracted from this sea of data, for example to optimise sales processes? At this point we turn to technological solutions that include automation, on-demand processing in the cloud and dashboards geared towards continuous improvement. This is where technology steps in to answer the need for large-scale automation: Machine Learning (ML) is one of the first technologies that can be applied to this end, followed by Artificial Intelligence (AI) as a further step in large-scale information extraction and automation. Without a clear strategy for handling data and drawing out the information that matters, the sustainable growth of any organisation is impossible, hence the importance of having these technologies at our disposal as a response to our clients’ evolution.
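
As a toy illustration of this kind of large-scale pattern extraction, the sketch below groups synthetic order records into segments that a continuous-improvement dashboard could track. It assumes NumPy and scikit-learn are available; the features and segment shapes are invented for the example.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic order data: [order value in EUR, items per order]
orders = np.vstack([
    rng.normal([30, 2], [5, 1], size=(200, 2)),    # small everyday orders
    rng.normal([250, 12], [40, 3], size=(50, 2)),  # large bulk orders
])

# Group orders into segments that a dashboard could monitor over time
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(orders)
for label in range(2):
    segment = orders[model.labels_ == label]
    print(f"segment {label}: {len(segment)} orders, "
          f"avg value {segment[:, 0].mean():.0f} EUR")
```

The point is not the clustering itself but the workflow: raw order data in, a small number of interpretable quantities out, recomputed automatically as new orders arrive.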

Coupled with this client evolution, there is no longer a single point of contact with clients or business users, as there was in the desktop-computer era. Nowadays we have multiple devices that let us use different interfaces according to need, as has begun to happen with so-called “wearables” (smart watches, pressure monitors, etc.), which add further points of interaction with our clients and users and in turn allow us to offer different experiences depending on the moment. These devices fall within the personal IoT; there is also the industrial IoT, which requires much lower latency in order to work efficiently. Hence the need for what is known as edge computing, whereby decisions are made in situ, because acting in real time is vital.
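
The sketch below illustrates the edge computing idea under simplified assumptions: a hypothetical industrial controller takes the time-critical decision in situ, while non-urgent telemetry is buffered for the cloud, where latency does not matter. The sensor, threshold and valve action are all invented for the example.

```python
import random
import time

PRESSURE_LIMIT = 8.0  # bar; hypothetical safety threshold

def read_sensor() -> float:
    """Stand-in for a real industrial pressure sensor."""
    return random.uniform(5.0, 10.0)

telemetry_buffer = []  # readings to forward to the cloud later

for _ in range(10):
    reading = read_sensor()
    # The safety decision is taken in situ: a round trip to a remote
    # data centre would add latency the process cannot tolerate.
    if reading > PRESSURE_LIMIT:
        print(f"{reading:.2f} bar: releasing valve locally")
    # Non-urgent data is buffered and shipped to the cloud in batches
    # for historical analysis, where latency is irrelevant.
    telemetry_buffer.append((time.time(), reading))

print(f"buffered {len(telemetry_buffer)} readings for cloud upload")
```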

Autonomous devices

An example of this trend is the growing use of autonomous elements (such as drones, robots, machinery, etc.). These devices span a range of intelligence, from semi-autonomous to fully autonomous, and operate in different environments, such as air, sea or land. They capture large amounts of data and process it using AI to perform tasks that would normally require a human. This entails local processing, sending data to the cloud (or to a central location that collects historical data), communications links with broad operational coverage, strict security mechanisms and controls, and highly automated handling centres. The evolution of these systems will arguably put driverless cars on our roads in the not too distant future.
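
As a very rough sketch of the on-board side of such a device, the example below stands in for local AI inference plus selective forwarding of data to a central historical store. The “model”, scores and thresholds are placeholders, not a real perception pipeline.

```python
import random

def capture_frame() -> list[float]:
    """Stand-in for a camera frame from an autonomous drone."""
    return [random.random() for _ in range(4)]

def onboard_model(frame: list[float]) -> float:
    """Stand-in for a local AI model; returns an 'obstacle' score."""
    return sum(frame) / len(frame)

upload_queue = []  # data destined for the central historical store

for step in range(20):
    frame = capture_frame()
    score = onboard_model(frame)
    if score > 0.7:
        # Decision taken on board, without waiting for a remote operator
        print(f"step {step}: obstacle score {score:.2f}, avoiding locally")
    if score > 0.5:
        # Only interesting frames travel back over the (limited) link
        upload_queue.append((step, frame))

print(f"{len(upload_queue)} frames queued for the central data store")
```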

A trend that is starting to develop in the public cloud, to meet these needs for processing close to the point of decision, is what Gartner calls the distributed public cloud: the public cloud provider’s services are no longer centralised and migrate out of its data centres, while the provider remains responsible for every aspect of the service’s architecture. This trend will create a new operational and service model for cloud services.

This fast technological evolution, the availability of multiple channels of interaction and the demand from clients and users for greater availability and faster responses also create a crisis of trust. How can we guarantee that the information organisations gather in order to provide better services is processed appropriately? And, on the other hand, how does the organisation take on its growing responsibility to collect, store and use information in a trustworthy and auditable way, while having mechanisms in place to minimise the theft or misuse of that information?

Guaranteeing security

This is where governance and security management come into play, along with the implementation of technologies that allow security to be managed. ML and AI will also play an essential role in guaranteeing company security: the exponential growth in points of presence and data collection means there is ever more information to control and protect, and scaling through automation and pattern recognition demands a large processing capacity. It seems clear that tools such as ML and AI will be fundamental in managing the actions that this growing volume of events will demand.
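
A minimal sketch of what ML-based pattern recognition over security events can look like, using scikit-learn’s IsolationForest on synthetic data; the event features and contamination rate are illustrative assumptions, not a recipe for production monitoring.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# Synthetic security events: [logins per hour, MB transferred]
normal = rng.normal([5, 20], [2, 8], size=(500, 2))
suspicious = np.array([[40, 300], [55, 410]])  # bursts of activity
events = np.vstack([normal, suspicious])

# An unsupervised model flags outliers without hand-written rules,
# which is what lets the approach scale with the volume of events.
detector = IsolationForest(contamination=0.01, random_state=0).fit(events)
flags = detector.predict(events)  # -1 marks an anomalous event

print(f"flagged {np.sum(flags == -1)} of {len(events)} events for review")
```

The value of this kind of model is triage: out of thousands of events, only a handful reach a human analyst.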

An aspect that generally tends to be overlooked is the coverage of communications networks, their capacity to transport information and the latency introduced by the link. These are critical when services are required in real time and, although progress is steady, changes to traditional architectures are needed to meet current business needs: on the one hand, users have more residential and mobile capacity; on the other, we increasingly roll out control and management elements at points that traditionally required a visit from an operator. On top of this, we are also gravitating more and more towards cloud services.