The Attributer’s Blog – Digitally Architected

In the world of digital transformation, things move quickly. The concept of zero trust has matured significantly, and we now see a much more holistic approach, often referred to as the ‘software-defined perimeter’. One of its key features is the use of secure APIs for machine-to-machine communication.

Many web applications are now bundles of microservices rather than a single core service. The move away from monolithic applications was driven by the business need for agility in the face of digital disruption. It has also brought a security advantage: each function and procedure is segregated behind its own API. This API-driven architecture involves little human intervention. Automated app-to-app interaction allows applications to embed services from other sources and to expose raw data via the API. It is a very powerful technique in cloud environments, both private and public. ‘Compare-the-Market’ type web services are built like this, drawing data from multiple sources and collating it to present product and service ratings to human users; TripAdvisor is one of the best-known examples. A simplified sketch of the aggregation pattern follows below.
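
As a purely illustrative aside, here is a minimal sketch of that aggregation pattern; the provider URLs, product identifier and response fields are invented assumptions, not a description of any real service.

```python
# Hypothetical 'compare-the-market' style aggregator. The endpoint URLs and
# JSON field names below are illustrative assumptions only.
import requests

PROVIDER_APIS = [
    "https://api.provider-a.example/v1/ratings",   # hypothetical data source
    "https://api.provider-b.example/v1/ratings",   # hypothetical data source
]

def collect_ratings(product_id: str) -> list[dict]:
    """Call each provider API machine-to-machine and collate the results."""
    collated = []
    for base_url in PROVIDER_APIS:
        resp = requests.get(f"{base_url}/{product_id}", timeout=5)
        resp.raise_for_status()
        payload = resp.json()   # assumed shape: {"provider": ..., "score": ...}
        collated.append({"provider": payload["provider"], "score": payload["score"]})
    # Present a single collated view, best-rated first, to the human user.
    return sorted(collated, key=lambda r: r["score"], reverse=True)
```

Each provider is just another API call with no human in the loop, which is exactly why every one of those calls needs the zero trust treatment discussed below.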

The power of this architectural technique can also be its downfall. LandMark White is one of Australia’s largest independent property valuation firms. In February 2019 it announced that one of its online platforms had been compromised and that more than 100,000 records had been accessed by unauthorised third parties. The records related to property valuations and contained information on borrowers, lenders, homeowners and property agents. The reputational and financial damage has been huge. The root cause was an API-based architecture that had been poorly designed and implemented. With so many cloud applications being of this type, it is essential to use a zero trust model in which nothing is taken for granted.

External parties can participate in your applications through APIs without having direct access to them: the API acts as an application-level access broker. Access may come from a user via direct web browsing, or app-to-app via APIs with no explicit user present. A minimal sketch of the broker idea follows below.
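
To make the broker idea concrete, here is a minimal sketch, assuming a Flask service, an invented record store and invented scope names; a real implementation would validate tokens against an authorisation server rather than a hard-coded value.

```python
# Sketch of an API acting as an application-level access broker: callers never
# touch the datastore directly, and only the fields their scope permits are
# returned. Record shape, scopes and the token check are illustrative assumptions.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

VALUATIONS = {  # internal store, never exposed raw
    "12345": {"address": "1 Example St", "value": 850000, "borrower": "J. Citizen"},
}
FIELDS_BY_SCOPE = {"valuation:read": {"address", "value"}}  # partners never see borrower data

def scopes_for(token: str) -> set[str]:
    """Placeholder: in practice, validate the token with your authorisation server."""
    return {"valuation:read"} if token == "demo-partner-token" else set()

@app.get("/valuations/<valuation_id>")
def get_valuation(valuation_id: str):
    token = request.headers.get("Authorization", "").removeprefix("Bearer ")
    allowed = set().union(*(FIELDS_BY_SCOPE.get(s, set()) for s in scopes_for(token)))
    if not allowed:
        abort(403)   # no implied trust: unknown callers get nothing
    record = VALUATIONS.get(valuation_id)
    if record is None:
        abort(404)
    return jsonify({k: v for k, v in record.items() if k in allowed})
```

The calling application (or browser) only ever sees what the broker chooses to release for that identity, in that context.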

The principles of zero trust that are needed to make this work are:

  • There is never any implied or absolute trust – in every instance you must calculate the trust level based on a variety of factors;
  • All environments must be considered public and hostile – exposed to anyone and everyone;
  • Strong identification, authentication and authorisation must be applied to all entities, whether human or not;
  • Contextual authorisation must be used and must depend on dynamically changing risk factors, including identity, role, group membership, user attributes (such as trained and qualified), consent information (such as age), behaviour, history, time and day, device type, connection type, location, sensitivity of the data, reputation of the application, frequency and volume of access, session lifetime and refresh period, application-specific rules (such as bank employee versus customer), and more (a simplified policy sketch follows this list);
  • Trust is dynamic – a contract between two or more entities that want to do business, valid only in the current instance – not static – shifting as time passes and context changes;
  • Consider endpoint security: device type and inventory, device health, device reputation, device management;
  • Consider network security – dynamic routing, micro-segmentation, software-defined perimeters, traffic introspection;
  • Consider workload security – API security, context-aware authorisation, web application security in the business logic; the identity context of API traffic is needed to authenticate every packet;
  • Treat all APIs as if external and public – accessible by hostile actors from outside your domain;
  • Transaction security: transaction verification, continuous session validation and security – different transaction types may carry dynamically changing digital risk and trust levels as you move through the business process logic;
  • Data security – at rest and in transit – confidentiality and integrity checking using crypto, user privacy and consent, data loss prevention, embedded data security policies and enforced policy execution;
  • Assurance framework – auditing, event logging, reporting, forensics;
  • Smart threat detection – machine learning and adaptive logic.
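
To make the contextual authorisation point above concrete, here is the simplified policy sketch referred to in the list; the factors, weights and thresholds are invented for illustration and are not a prescribed risk model.

```python
# Illustrative contextual authorisation: trust is calculated per request from
# dynamic factors, and the required level rises with data sensitivity.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RequestContext:
    role: str                # e.g. "valuer", "lender", "guest"
    device_managed: bool     # is the endpoint under corporate management?
    connection: str          # e.g. "corporate-vpn", "public-wifi"
    country: str
    data_sensitivity: str    # "public", "confidential" or "restricted"
    timestamp: datetime

def trust_score(ctx: RequestContext) -> float:
    """Never implied or absolute: trust is recalculated for every request."""
    score = 0.0
    score += 0.4 if ctx.role in {"valuer", "lender"} else 0.1
    score += 0.2 if ctx.device_managed else 0.0
    score += 0.2 if ctx.connection == "corporate-vpn" else 0.05
    score += 0.1 if 7 <= ctx.timestamp.hour <= 19 else 0.0   # business hours
    score += 0.1 if ctx.country == "AU" else 0.0
    return score

def authorise(ctx: RequestContext) -> bool:
    """More sensitive data demands a higher, dynamically calculated trust level."""
    threshold = {"public": 0.3, "confidential": 0.6, "restricted": 0.8}[ctx.data_sensitivity]
    return trust_score(ctx) >= threshold
```

The same request can be allowed at 10 a.m. from a managed device on the corporate network and refused at midnight from an unmanaged device abroad: trust is dynamic, not static.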

The list above suggests a holistic approach to architecture, not simply a collection of controls. The two major considerations are the mobility and changing context of both users and data. We shall need to move towards standard protocols that provide a next-generation, token-based authorisation service (such as OAuth). Such a service will issue both ‘bearer tokens’ (accepted at face value) and ‘bound tokens’ (linked to the user through some provable secret). A short sketch of that distinction follows below.
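
As a purely illustrative sketch of that distinction (loosely echoing the certificate-bound token idea of OAuth’s RFC 8705, with the token parsing and certificate plumbing simplified away), consider:

```python
# Bearer vs bound tokens. The claim names mirror RFC 8705's certificate-bound
# tokens, but the surrounding plumbing here is an illustrative assumption.
import base64
import hashlib

def accept_bearer_token(claims: dict) -> bool:
    """A bearer token is accepted at face value: whoever presents it is believed."""
    return claims.get("active", False)

def accept_bound_token(claims: dict, client_cert_der: bytes) -> bool:
    """A bound token also demands proof of possession: the presenter must hold
    the certificate (or key) the token was bound to when it was issued."""
    expected = claims.get("cnf", {}).get("x5t#S256")
    presented = base64.urlsafe_b64encode(
        hashlib.sha256(client_cert_der).digest()).rstrip(b"=").decode()
    return claims.get("active", False) and expected == presented
```

Bound tokens are what make ‘treat all APIs as if external and public’ survivable: a stolen token is useless without the secret it is bound to.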

You need a vendor-agnostic architecture framework to do all this. You need SABSA.

The Attributer
