TL Consulting Group


Building a Secure & Scalable Microservices Authorisation Architecture with Kubernetes

Microservices architecture has become the go-to choice for modern companies, offering unparalleled flexibility and scalability, especially in cloud-native environments. However, this architectural shift introduces new and unique security challenges, with authorisation standing out as a paramount concern.


Cloud-Native

Navigating the Future of Software Development

The world of software development is rapidly changing. To stay competitive, organisations need to not only keep up with the changes but also strategically adopt methods that improve agility, security, and dependability. The emergence of cloud computing, microservices, and containers has given rise to an innovative approach to creating and deploying software in a cloud-native way. Cloud-native applications are designed to be scalable, resilient, and secure, and they are often delivered through DevOps or DevSecOps methodologies. The markets for cloud-native development, platform engineering, and DevSecOps are all witnessing substantial growth, fuelled by the growing demand for streamlined software development practices and heightened security protocols. This article explores how the intersection of cloud-native development, platform engineering, and DevSecOps is reshaping the landscape of software development.

Cloud Native Development: Building for the Future

Cloud-native development represents a significant transformation in the approach to designing and deploying software. It revolves around crafting applications specifically tailored for cloud environments. These applications are usually constructed from microservices: compact, self-contained units that collaborate to provide the application's features. This architectural approach gives cloud-native applications superior scalability and resilience compared to conventional monolithic applications.

Key Benefits of Cloud Native Development:

Platform Engineering: The Glue that Holds It Together

Platform engineering is the bridge between development and operations. It is about providing the tools and infrastructure that developers need to build, test, and deploy their applications seamlessly. Think of it as an internal developer platform, offering a standardised environment for building and running software.

Why Platform Engineering Matters:

DevSecOps: Weaving Security into the Fabric

DevSecOps extends the DevOps philosophy by integrating security into every phase of the software development lifecycle. It shifts security from being an afterthought to a proactive and continuous process (a minimal pipeline sketch appears at the end of this article).

The Importance of DevSecOps:

Embarking on the Cloud Native, Platform Engineering, and DevSecOps Odyssey

While there are various avenues for implementing cloud-native, platform engineering, and DevSecOps practices, the optimal approach hinges on an organisation's unique requirements. Nevertheless, some overarching steps that organisations can consider include:

In summary, cloud-native development, platform engineering, and DevSecOps are not mere buzzwords; they are strategic mandates for organisations aiming to flourish in the digital era. These practices pave the way for heightened agility, cost-effectiveness, security, and reliability in software development.

Conclusion: As market intelligence attests, the adoption of these practices is not decelerating; it is gaining momentum. Organisations that wholeheartedly embrace cloud-native development, invest in platform engineering, and prioritise DevSecOps will be ideally positioned to navigate the challenges and seize the opportunities of tomorrow. The moment to embark on this transformative journey is now, ensuring that your software development processes are not just future-ready but also primed to deliver value at an unprecedented velocity and with unwavering security.
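To make the idea of weaving security into every phase more concrete, below is a minimal, hedged sketch of a CI pipeline in which a vulnerability scan gates the build alongside the tests. It assumes GitHub Actions as the CI system, a Dockerised service, a `make test` target and the Trivy scanner available on the runner; none of these tool choices come from the article and any equivalents can be substituted.

```yaml
# Hypothetical DevSecOps pipeline sketch: build, test and security scan run together,
# so security is a continuous gate rather than an afterthought.
name: devsecops-pipeline
on: [push]

jobs:
  build-test-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Build the service image (image name is a placeholder)
      - name: Build image
        run: docker build -t example-service:${{ github.sha }} .

      # Unit tests live in the same pipeline as the security checks ("shift left")
      - name: Run unit tests
        run: make test   # assumes the repository provides a `make test` target

      # Fail the pipeline on high/critical vulnerabilities (assumes Trivy is installed on the runner)
      - name: Scan image for vulnerabilities
        run: trivy image --exit-code 1 --severity HIGH,CRITICAL example-service:${{ github.sha }}
```

The same shape carries over to other CI systems; the point is that the security gate sits on the everyday path to production rather than in a separate late-stage review.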


Cloud-Native, DevSecOps

Top Cloud Plays in 2023: Unlocking Innovation and Agility

Cloud computing has been around since the early 2000s, and the technology landscape continues to evolve rapidly while adoption keeps increasing (around 20% CAGR), offering unprecedented opportunities for innovation and digital transformation. The meaning of digital transformation is also changing: cloud decision makers now view it as more than a "lift and shift"; instead they see vast opportunity within cloud ecosystems to help reinforce their long-term success. As businesses increasingly embrace cloud, certain cloud plays have emerged as key drivers of success, underpinned by companies including Microsoft, AWS, Google Cloud and VMware, all of which have developed very strong technology ecosystems that have moved beyond the manual and costly data centre model. In this blog, we will explore the top cloud plays, from our perspective, that organisations should consider to unlock their full potential in 2023.

Multi-Cloud and Hybrid Cloud Strategies

Multi-cloud and hybrid cloud strategies have gained significant traction in 2023. Organisations are leveraging multiple cloud providers and combining public and private cloud environments to achieve greater flexibility, scalability, and resilience from their investment. Multi-cloud and hybrid cloud approaches allow businesses to choose the best services from different providers while maintaining control over critical data and applications. This strategy helps mitigate vendor lock-in by leveraging Kubernetes container orchestration (including AKS, EKS, GKE and VMware Tanzu), optimise costs, and tailor cloud deployments to specific business requirements and use cases.

Cloud-Native Application Development

Cloud-native application development is a transformative cloud play that enables organisations to build and deploy applications, through optimised DevSecOps practices, specifically designed for advanced cloud environments. This model leverages containerisation, CI/CD, microservices architecture, and orchestration platforms, again emphasising Kubernetes as a strong cloud-native foundational play. Cloud-native applications are designed to be highly scalable, resilient, and agile, allowing organisations to rapidly adapt to changing business needs. By embracing cloud-native development, businesses can accelerate time-to-market, improve scalability, and enhance developer productivity by embedding strong Developer Experience (DevEx) practices.

Serverless Computing

Serverless computing is a game-changer for businesses seeking to build applications without worrying about server management. With serverless computing, developers can focus solely on writing code while the cloud provider handles infrastructure provisioning and scaling; examples include the Microsoft Azure serverless platform and AWS Lambda (a minimal sketch appears at the end of this article). This cloud play offers automatic scaling, cost optimisation, and event-driven architectures, allowing businesses to build highly scalable and cost-effective applications. Serverless computing simplifies development efforts, reduces operational overhead, and enables companies to quickly respond to changing application workloads.

Cloud Security and Compliance

Cloud security and compliance are critical cloud plays that organisations cannot afford to overlook in 2023, particularly after recent data breaches at Optus and Medicare. Leveraging security as a foundational element of your cloud-native journey is crucial for ensuring the protection, integrity, and compliance of your applications and data. Cloud providers offer robust security frameworks, encryption services, identity and access management solutions, and compliance certifications. By leveraging these cloud security products and practices, businesses can enhance their data protection, safeguard customer information, and ensure regulatory compliance. Strong security and compliance measures build trust, mitigate risks, and protect organisations from potential data breaches.

Data Analytics and Machine Learning

Data analytics and machine learning (ML) are powerful cloud plays that drive data-driven decision-making and unlock actionable insights. Cloud providers offer advanced analytics and ML services that enable businesses to leverage their data effectively. By harnessing cloud-based data analytics and ML capabilities, businesses can gain valuable insights, predict trends, automate processes, and enhance customer experiences. These cloud plays empower organisations to extract value from their data, optimise operations, and drive innovation while providing an enhanced customer experience.

As the evolution of cloud-native, multi-cloud and hybrid cloud strategies accelerates, strategically adopting the above drivers helps enable innovation, agility, and business growth. Multi-cloud and hybrid cloud strategies provide enhanced security and flexibility, while cloud-native application development empowers rapid application deployment and a better developer experience (DevEx), leveraging DevSecOps and automation practices. These are critical initiatives to consider if you are looking to advance your technology ecosystem and migrate and/or port workloads for optimum flexibility and return on investment (ROI). It is evident that the traditional "lift and shift" strategy does not provide this level of value: without the above "on-demand cloud plays", inefficient cloud resource management and unexpected expenses lead to increased OPEX and TCO. By embracing these top cloud plays, businesses investing in innovation can develop and deploy applications that scale seamlessly on cloud, adapt to changing customer demands, reduce TCO and OPEX, accelerate time-to-market, and maintain high availability and security, future-proofing themselves in this competitive digital landscape. For more information about Cloud, Cloud-Native, Data Analytics and more, visit our services page.
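As a concrete illustration of the serverless play described above, here is a minimal, hedged sketch of an AWS SAM template for a single HTTP-triggered function; the function and path names are placeholders, and Azure Functions or another provider's equivalent could be used instead.

```yaml
# Hypothetical AWS SAM template: the provider manages servers, scaling and patching;
# the team only ships the function code referenced by the handler.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Illustrative serverless API (names are placeholders)

Resources:
  OrdersFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler          # assumes app.py exposes handler(event, context)
      Runtime: python3.12
      MemorySize: 256
      Timeout: 10
      Events:
        ApiTrigger:                 # event-driven: invoked and scaled per request
          Type: Api
          Properties:
            Path: /orders
            Method: get
```

Deployed with the SAM or CloudFormation tooling, a stack like this scales from zero to peak traffic without capacity planning, which is precisely the operational overhead the serverless play removes.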

Top Cloud Plays in 2023: Unlocking Innovation and Agility Read More »

Cloud-Native, Data & AI, DevSecOps

Unlocking The Potential of Tanzu Application Platform

Cloud-native application architecture targets building and running software applications that take full advantage of the flexibility, scalability, and resilience of cloud computing by following the twelve factors and a microservices architecture, with self-service agile infrastructure offering an API-based, collaborative, and self-healing system. Cloud-native encompasses the various tools and techniques used by software developers today to build applications for the public cloud. Kubernetes is the de-facto standard for container orchestration when building cloud-native applications, and it is undoubtedly changing the way enterprises manage their infrastructure and application deployments. However, at the core there is still a clean separation of concerns between developers and operators. VMware's new Tanzu Application Platform (TAP, a multi-cloud, portable Kubernetes PaaS under the Tanzu portfolio) addresses some of these fundamental developer and operations collaboration issues and provides an effortless path to application deployments in a secure, modular, scalable, and portable Kubernetes environment.

What is Tanzu Application Platform (TAP)?

"A superior multi-cloud developer experience on Kubernetes. VMware Tanzu Application Platform is a modular, application-aware platform that provides a rich set of developer tooling and a prepared path to production to build and deploy software quickly and securely on any compliant public cloud or on-premises Kubernetes cluster." (VMware)

Tanzu Application Platform simplifies workflows

Tanzu Application Platform simplifies workflows in both the inner loop and the outer loop of cloud-native application development and deployment on Kubernetes. A typical inner loop consists of developers writing code in their local IDE (integrated development environment), testing and debugging the application, pushing and pulling code from a source code repository, deploying to a development or staging environment, and then making additional code changes based on continuous feedback. The outer loop consists of the steps to deploy the application to a non-production or production environment and support it over time. For a cloud-native platform, the outer loop includes activities such as building container images, adding container security (i.e. vulnerability scanning, trust and signing) and configuring continuous integration (CI) and continuous delivery (CD) pipelines. TAP creates an abstraction layer above the underlying Kubernetes, focusing on portability and reproducibility and avoiding lock-in where possible. Underneath, TAP provides strong support with all the tools required to build and deploy applications, in the form of Accelerators and Supply Chain Choreography. TAP can be installed and managed on most managed Kubernetes offerings on the market, such as AKS (Azure), EKS (AWS) and GKE (Google Cloud), as well as any other unmanaged, conformant Kubernetes cluster. Developers can even install it on a local Minikube instance. TAP also supports an out-of-the-box workflow for DevSecOps based on best-of-breed open-source tools, with strong support for customising these workflows with the enterprise-grade or commercial tools of choice.
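For illustration, the sketch below shows the kind of Workload manifest a developer typically hands to a TAP supply chain: point it at a Git repository and the platform takes care of building, scanning and deploying. The repository URL and names are placeholders, and the exact API version and fields should be verified against the TAP release in use.

```yaml
# Hedged sketch of a TAP Workload (verify apiVersion/fields for your TAP version).
apiVersion: carto.run/v1alpha1
kind: Workload
metadata:
  name: example-web-app                        # placeholder name
  labels:
    apps.tanzu.vmware.com/workload-type: web   # selects the out-of-the-box web supply chain
    app.kubernetes.io/part-of: example-web-app
spec:
  source:
    git:
      url: https://github.com/example-org/example-web-app   # placeholder repository
      ref:
        branch: main
```

From here, the supply chain choreographer handles the outer-loop steps listed above: building the container image, running vulnerability scanning and signing, and delivering the result to the target cluster.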
TL Consulting brings its consulting and engineering personnel to application modernisation adoption and implementation by providing a range of services. If you need assistance with your containers/Kubernetes adoption, please contact us via our Kubernetes consulting services page.


Cloud-Native, DevSecOps, Uncategorised

Pressure on teams to modernise applications

As many organisations move towards a cloud-native approach, the need to modernise applications using new platforms and products is inevitable. But are the expectations on teams too much? With agile delivery being the norm, teams are empowered to experiment, align capacity to continuously learn, and are encouraged to fail fast. That said, there is increasing pressure on teams to cut corners and adapt to tools and engineering standards as they deliver. In TL Consulting's opinion, this is when most teams fail to adopt Kubernetes and other modern technology correctly. Issues begin to appear right through the build pipeline, most commonly with security, multi-cloud integration, compliance, governance, and reliability.

Embedding modern engineering standards

Organisations often opt for a lift-and-shift approach to reduce OPEX and/or CAPEX. However, the underlying code is often not mature enough to be decoupled correctly and housed within a container. This requires considerable rework and creates an anti-pattern for software engineering teams. Instead, to move away from the traditional three-tier architecture and implement new technical stacks, new development principles for cloud applications such as the Twelve-Factor App need to be embraced. Other levels of DevSecOps automation and infrastructure as code need to become the engineering standard too.

The Twelve Factor App

The Twelve-Factor App is a methodology providing a set of principles for enterprise engineering teams. As with microservices architecture, teams can leverage the similarities of these principles to embed engineering strategies (a minimal configuration sketch appears below). This does require highly skilled engineers to create models that can be adopted and reused by development teams.

Engineering support

With these types of expectations placed on immature development teams, the pressure and demand on resources impact performance and quality. From our experience, even Big 4 banks require assistance to modernise applications and seek external support from platforms and products to modernise their application portfolio, e.g. VMware Tanzu. VMware Tanzu is an abstraction layer on top of Kubernetes platforms which enables enterprises to streamline operations across different cloud infrastructures. Tanzu provides ease of management, portability, resilience, and efficient use of cloud resources. It is important to note that to successfully implement the likes of Tanzu's suite of products, an organisation needs to establish a DevSecOps culture and mature governance models.

Embracing DevSecOps

TL Consulting has found many organisations need guidance when embedding a culture shift towards DevSecOps. Teams must have a security-first mindset. The norm therefore should not be limited to security testing, such as Static Application Security Testing (SAST) and Dynamic Application Security Testing (DAST), but should instead focus on securing applications by design and automating security practices and policies across the SDLC. After all, the goal is to standardise teams' daily activities and build secure software into cloud-native engineering workflows.

Infrastructure-as-code (IaC)

As IT infrastructure has evolved, leveraging IaC can be invigorating for teams. Engineers can spin up fully provisioned environments that scale, are secure and are cost-effective. However, if DevSecOps and infrastructure automation orchestration are not aligned, CI/CD pipelines and cloud costs will be difficult to control.
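To ground the Twelve-Factor point referenced above, here is a minimal, hedged sketch of its configuration principle (factor III) on Kubernetes: the same container image is promoted across environments and only the injected configuration changes. All names and values are illustrative placeholders, not taken from the article.

```yaml
# Factor III ("Config"): configuration lives in the environment, not in the image.
apiVersion: v1
kind: ConfigMap
metadata:
  name: payments-config            # placeholder name
data:
  DATABASE_HOST: payments-db.internal
  FEATURE_FLAGS: "new-checkout=true"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: payments-api               # placeholder name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: payments-api
  template:
    metadata:
      labels:
        app: payments-api
    spec:
      containers:
        - name: payments-api
          image: registry.example.com/payments-api:1.4.2   # placeholder image tag
          envFrom:
            - configMapRef:
                name: payments-config   # swap the ConfigMap per environment; the image stays the same
```

Managed as infrastructure as code, manifests like these are what allow engineers to spin up consistent, fully provisioned environments rather than hand-crafting them.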
To achieve these sustainable processes and practices, implementing a DevSecOps culture with mature governance models will help keep cloud costs optimised.

Conclusion

Providing teams with capacity and implementing modern technology platforms will not, on their own, overcome the engineering challenges faced when modernising applications. Modernising applications requires an established DevSecOps culture, robust governance models and highly skilled teams. Additionally, each team needs to understand the application(s) under their control to determine what needs to be automated. For example:
- the purpose of the application and the customer experience
- architecture and design of the application and its dependencies
- application workflows and data privacy policies
- compliance with government-managed data (if applicable)
- business security policies and procedures
- cloud security policies and procedures which impact the application
- the application infrastructure employed

The modern platforms, products and tools therefore become enablers to optimise cloud-native adoption, not solutions in themselves. This is where onsite education, guidance and support from experts, and subscription models like A Cloud Guru, can be highly beneficial for leaders and engineers. If you are facing challenges implementing DevSecOps or adopting modern technology platforms such as Kubernetes, contact us.


DevSecOps

Road to a Cloud Native Journey

Author: Ravi Cheetirala, Technical Architect (Cloud & DevSecOps) at TL Consulting

"Cloud native" is the new buzzword in modern application development, and an evolving application build pattern. The technology is relatively new to the market; thus our understanding of the architecture is still immature and keeps changing over time with technological advancements in cloud and containers. Understanding the cloud-native approach and strategy helps build a shared understanding among developers, engineers, and technology leaders so that teams can collaborate more effectively.

The Need for Cloud Application Modernisation

In today's IT landscape, 70-80% of C-level executives report that their IT budgets are spent on managing legacy applications and infrastructure, and legacy systems consume almost 76% of IT spend. Despite this large investment in legacy applications, most businesses fail to see their digital transformation plans through to a satisfactory outcome. On the other hand, the constantly changing digital behaviours of consumers, and the evolution of viable, reduced-OPEX, self-sustaining infrastructure models better suited to today's pace of technological change, are the primary drivers pushing application modernisation up the CIO/CTO's list of priorities. According to a study conducted by Google, public cloud adoption alone can reduce IT overheads by 36-40% when migrating from traditional IT frameworks. Application modernisation can help reduce this further: it frees up the IT budget to make space for innovation and exploring new opportunities for business value. Lastly, this digital transformation brings greater agility, flexibility, and transparency while opening operations up to the benefits of modern technologies like AI, DevSecOps, intelligent automation, IoT, etc.

Kickstart to the Cloud Native Journey

Beyond the upfront investments after creating buy-in, application modernisation entails several considerations for CIOs and, more importantly, a game plan to manage the massive amount of change that comes with such a large-scale transformation. However, moving away from the sunk costs of legacy IT can help enterprises take on a new trajectory of profitability and value. Here are four essential steps to a successful application modernisation roadmap.

Assessment of the legacy system landscape: The first and crucial step of the application modernisation journey is an assessment of the legacy systems: identify the business-critical systems, applications, and business processes. High-value assets that need to be modernised as a priority can form the first tier of the legacy application modernisation process. Next, start with business value and technical impact assessments. The outcome of these assessments will drive the journey further down the roadmap.

Pick your anchor applications: Once the assessment is complete and business services are identified, teams must shortlist their modernisation options from their legacy application suite. This list will enable a more targeted implementation plan. Following this, an implementation framework needs to be developed and applied, which will help you create a modernisation schedule. The assessment should also help determine the scope of the project, the team, the technologies, and the skills required.

Define the success criteria: Different application transformation approaches carry different costs and risks.
For instance, refactoring a legacy application can cost much more than rebuilding it using a new technical stack. Organisations often fail to determine the target outcomes effectively. It is therefore very important to measure the change, costs and risks involved, along with the return on investment and the features we aim to improve, and to set new benchmarks for agility and resilience while bringing an enhanced security and risk-management strategy into the portfolio.

Structure of the target operating model: The traditional operating structure of network engineers, system administrators, and database engineers is no longer fit to support the modern digital transformation landscape, so the organisation must realign its operating model accordingly, alongside an upskilling/reskilling path. In the end, applications are ultimately maintained and supported by people, and your end-state operating model must account for ownership of microservices, who will configure and manage the production environment, and so on.

Benefits of Cloud Native applications

Drives Innovation: With a new cloud-native environment, it is easy to drive digital transformation and adopt new-age technologies like AI/ML and automation-driven insights, as these are readily available in most cloud environments and come with easy integration into applications.

Ship Faster: In today's world, time to market is key to the success of any business. With DevOps and CI/CD capabilities, it is entirely possible to deploy changes very frequently (multiple times a day), whereas it can take months to deploy a change in traditional software development. Using DevOps, we can transform the software delivery pipeline with build automation, test automation, and deployment automation.

Optimised Costs: Containers manage and secure applications independently of the infrastructure that supports them. Most organisations use Kubernetes, an open-source platform that has become the standard for managing containerised resources in the cloud, to manage large volumes of containers. Cloud-native applications use containers, so they benefit fully from containerisation. Alongside Kubernetes there is a host of powerful cloud-native tools, and this, together with an open-source model, drives down costs. Enhanced cloud-native capabilities such as serverless let you run dynamic workloads and pay per use for compute time in milliseconds. Standardisation of infrastructure and tooling further helps to reduce cost.

Improved Reliability: Achieving high fault tolerance is hard and expensive with traditional applications. With modern cloud-native approaches like microservices architecture and Kubernetes in the cloud, you can more easily build applications to be fault tolerant, with resiliency, autoscaling and self-healing built in (a brief sketch follows below). Because of this design, even when failures happen you can easily isolate their impact so they do not take down the entire application. Instead of servers and monolithic applications, cloud-native microservices help you achieve higher uptime and thus further improve the user experience.

Foundational elements of Cloud Native applications: In general, cloud native applications are designed
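To make the self-healing and autoscaling behaviour described under "Improved Reliability" concrete, here is a minimal, hedged Kubernetes sketch: a liveness probe lets the platform restart unhealthy replicas, and a HorizontalPodAutoscaler adds or removes replicas with load. Names, paths and thresholds are illustrative placeholders.

```yaml
# Illustrative only: self-healing via a liveness probe, autoscaling via an HPA.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service             # placeholder name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: orders-service
  template:
    metadata:
      labels:
        app: orders-service
    spec:
      containers:
        - name: orders-service
          image: registry.example.com/orders-service:2.0   # placeholder image
          ports:
            - containerPort: 8080
          livenessProbe:           # failed checks cause the container to be restarted
            httpGet:
              path: /healthz       # assumes the app exposes a health endpoint
              port: 8080
            initialDelaySeconds: 10
            periodSeconds: 15
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: orders-service
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: orders-service
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # scale out when average CPU exceeds 70%
```

Because failures are contained at the replica level, the rest of the application keeps serving traffic while the platform replaces the unhealthy pods.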


DevSecOps

How to modernise legacy applications

Hosting applications on the cloud is a strategic objective for most organisations. There are many benefits to modernising legacy applications and implementing enablers such as automated deployments, auto-scaling and containerised architectures, including lower running costs and better performance. However, there is a perception that many legacy systems and commercial off-the-shelf (COTS) applications cannot be modernised. Instead, organisations opt for a "lift and shift" approach which not only requires a significant amount of rework and refactoring but also does not deliver the benefits of the cloud.

Consider an alternative to lift and shift

While a "lift and shift" approach is an affordable option, there are often additional costs, and these costs are generally not in the initial estimates. When estimating costs, the overall vision of the application and its lifecycle needs to be considered, as does the total cost of ownership after deployment. When these factors are included, the cost will often be more than first expected. But higher cost is not the only factor to consider: a lift-and-shift approach often does not deliver the benefits of moving to the cloud, such as performance improvements and deployment efficiencies. As an alternative, monolithic applications can benefit from modern architectures such as Kubernetes without rearchitecting the solution, an option that few organisations consider or have the skills to accomplish. This provides the same benefits as a "lift and shift" while at the same time providing a model that enables a relatively mature cloud-native application.

A white paper and case study

In the following sections, we explore key findings from a recent application modernisation service provided to a NSW Government agency. In this white paper we describe how we successfully migrated a legacy Oracle SOA application stack to containerised infrastructure. We explore common challenges, the solution design, the implementation and the business benefits.

Common Challenges in Modernising Monolithic Applications

A main difference between monolithic and microservices architectures, apart from the obvious scalability, flexibility and agility benefits achieved with microservices, is that monolithic applications are built of layers and components that are tightly coupled. Putting all these layers and components in one Docker container does not at first sight seem like a viable option: such an approach appears to add an external shell on top of the existing layers, further complicating the build process. Also, from a scalability perspective, what if the consumption of the different components is not uniform? In other words, only a few of the components would need to be replicated, rather than replicating complete layers. It would be a complete waste of infrastructure resources to replicate all the components when only a few are in high demand.

Solution Design Stage

Firstly, the engineering teams needed to assess the feasibility of decoupling the application components and explore different architecture design options. Secondly, we evaluated data segregation based on the needs of the Docker containers. The next step of the design stage considers the different deployment models, highlighting their respective advantages and disadvantages; depending on the infrastructure, there are different options to choose from. Another aspect that may need consideration is stateful versus stateless components.
With technologies like Docker and Kubernetes, running stateless workloads is easier than running stateful ones (a brief sketch contrasting the two appears at the end of this article). The solution design stage is important for setting up the core foundation of the modernised application. Without this assessment, key issues with the code, technology and/or architecture will not be identified; in turn, the application will inherit the technical debt and the project will not achieve its expected ROI. We often hear from other clients that TCO has risen due to poor analysis of an application's current state.

Implementation Stage

During the implementation stage there were many considerations to address. We needed robust continuous integration and continuous delivery pipelines to ensure stage gates were controlled and governed. This approach gave the teams the transparency that was unfortunately lacking within the existing technology stack. Infrastructure as code, cost-benefit analysis, team skill levels and workflows were among the other considerations, risks and issues to overcome. Figure 1 shows a simplified version of the solution pipelines and technology stack.

The design implemented for our client needed to address three critical issues. The first was a manual activity requiring an engineer to switch a malfunctioning active node to a standby node. The second was overcoming the substantial costs of the previous lift-and-shift implementation: the cost of provisioning and maintaining the different environments for the platform exceeded that of running it on VMs. The last main issue was scalability; adding another node group to the platform to handle extra load was an onerous process which required extensive planning prior to implementation. It is important to note that the infrastructure components and workloads were compliant with the mandated government policies and the government data centre models.

Outcomes and Business Benefits

Our client realised the immediate benefit of implementing an engineering model that leveraged an infrastructure-as-code pipeline and Kubernetes. Automated build/test/deploy pipelines, self-healing, auto-scaling and zero-downtime gradual deployments were just a few of the benefits that helped our client move towards a cloud-native approach.

A Cloud-Native Partner

While most internal engineers know the business and product well enough to perform a "lift and shift", modernising legacy applications effectively requires specialised DevOps knowledge. TL Consulting can provide this expertise, allowing your team to get as close as possible to cloud-native models when migrating your legacy systems to the cloud. If you want to find out more, please review our application modernisation services page or contact us below.
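To make the stateful-versus-stateless distinction referenced earlier in this article concrete, here is a hedged sketch of how a stateful component is typically expressed on Kubernetes. Unlike a stateless Deployment, a StatefulSet gives each replica a stable identity and its own persistent volume, which is exactly the extra machinery that makes stateful workloads harder to containerise. All names and sizes are placeholders, not drawn from the case study.

```yaml
# Illustrative StatefulSet: stable identity plus per-replica persistent storage.
# A matching headless Service named "soa-db" is assumed but not shown.
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: soa-db                     # hypothetical stateful component
spec:
  serviceName: soa-db              # headless service provides stable network identity
  replicas: 1
  selector:
    matchLabels:
      app: soa-db
  template:
    metadata:
      labels:
        app: soa-db
    spec:
      containers:
        - name: soa-db
          image: registry.example.com/soa-db:12.2   # placeholder image
          volumeMounts:
            - name: data
              mountPath: /var/lib/data
  volumeClaimTemplates:            # each replica gets its own PersistentVolumeClaim
    - metadata:
        name: data
      spec:
        accessModes: ["ReadWriteOnce"]
        resources:
          requests:
            storage: 20Gi
```

Stateless components, by contrast, can run as plain Deployments with no persistent volume claims, which is why they replicate and reschedule freely and are the natural first candidates when decoupling a monolith.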


Uncategorised