TL Consulting Group


Building a Robust Data Governance Framework in 2023

In today's data-driven world, with accelerating advancements in Artificial Intelligence (AI) and advanced analytics, organisations have an important responsibility to ensure the data they collect, store, and analyse is underpinned by a strong data governance framework. Embedding the right data governance framework enables an organisation's data strategy; it requires dedicated planning and strategic direction from business and technical stakeholders, and should be driven from the top down rather than the bottom up. To achieve this, organisations should focus on defining their information and data lifecycle management, data relationships and classification, data privacy, data quality, and data integrity to become more competitive and resilient. The key challenge for organisations is to embed data standardisation, data security, and compliance horizontally across the enterprise, thereby eliminating silos with their own disparate ways of working. In addition, organisations should align their data governance framework with their data lifecycle, business strategy, and goals, enabling a more agile approach that accommodates current and future needs.

Data Governance Framework Best Practices

As organisations collect more and more data points, it is important to define the right standards, policies, controls, accountability, and ownership (roles and responsibilities). A data governance framework ensures the organisation abides by these standards while keeping collected and stored data secure, with a focus on maintaining data integrity and data quality. Ultimately, the data consumed by end users should enable informed, data-driven decisions. Constant re-evaluation is recommended to keep the governance program modern and able to cater for the latest advancements in data and technology.

Prior to defining a data governance framework, a comprehensive data discovery should be performed across the business landscape to create a unified view. This aids in establishing data governance across the following areas (a sketch of a catalog entry follows at the end of this section):

- Data cataloguing of data relationships, data quality, and data lineage
- Data classification and sourcing
- Metadata definition (technical and enterprise metadata)
- Data compliance, security, and privacy
- Data analytics and engineering
- Data storage and sharing

A high-level data governance framework should be aligned with the organisation's data and information management lifecycle. The framework definition should be evaluated from a people, process, and technology/tooling perspective, considering data stewardship, efficiencies, and data security and access controls, alongside standardised processes governing the technology and tools that facilitate the production, consumption, and processing of the organisation's data.
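To ground the cataloguing and classification areas listed above, here is a minimal sketch of what a catalog entry for a governed data asset might capture. The field names and classification levels are illustrative assumptions, not a standard schema.

```python
# Illustrative data catalog entry for a governed data asset (hypothetical schema).
from dataclasses import dataclass, field
from enum import Enum

class Classification(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"
    RESTRICTED = "restricted"  # e.g. Personal Information (PI)

@dataclass
class CatalogEntry:
    name: str                      # business-friendly asset name
    owner: str                     # accountable data owner
    steward: str                   # day-to-day data steward
    classification: Classification
    source_system: str             # where the data originates
    lineage: list[str] = field(default_factory=list)  # upstream datasets
    quality_score: float = 1.0     # latest measured quality (0.0 to 1.0)

customers = CatalogEntry(
    name="customer_profiles",
    owner="Head of Marketing",
    steward="Data Platform Team",
    classification=Classification.RESTRICTED,
    source_system="CRM",
    lineage=["crm.raw_contacts", "web.signup_events"],
    quality_score=0.97,
)
print(f"{customers.name}: {customers.classification.value}, owned by {customers.owner}")
```

Capturing ownership, classification, and lineage in one place is what makes the privacy and quality controls discussed below enforceable rather than aspirational.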
The following sections highlight a few key areas the data governance framework should address.

Alignment to the Organisation's Cloud Strategy

When uplifting the data governance program, another important consideration for organisations building technology solutions on cloud is to define an integrated data governance architecture across their environments, whether hybrid or multi-cloud. Alignment to the cloud strategy can help in the following areas:

- Improve data quality through better management and tooling for data cleansing and enrichment
- Build a holistic, unified view of the organisation's data through discovery and benchmarking
- Gain higher visibility into data lineage, tracking data end-to-end from source to target
- Build more effective data catalogs, so the business can search for and access the right data when needed
- Proactively review, monitor, and measure data to preserve data consistency and integrity

For example, Microsoft offers data governance tooling on Azure (such as Microsoft Purview) with capabilities to help manage data throughout its lifecycle and track data flows end-to-end, ensuring the right people have access to reliable, accurate data whenever they need it.

Data Privacy & Compliance

As organisations continue building insights and implementing advanced analytics to learn more about their customers and create more tailored experiences, protecting sensitive data attributes, including Personal Information (PI), should be at the heart of the organisation's data security and privacy practices as part of the data governance framework. With the rise of cyber-attacks and data breaches, organisations should consider implementing data obfuscation techniques to mask or encrypt PI source data, especially across non-production environments, where access controls are typically weaker than in production and the internal threat can be just as high as external cyber threats. Applying data obfuscation ensures PI attributes are de-sensitised prior to their use in development, testing, and data analytics (a sketch of one masking technique appears at the end of this article). In addition, organisations should review data controls and access policies more frequently than ever. Understanding who has access to the underlying data sources and platforms helps organisations maintain a good risk posture, and this access should be assessed against the data governance framework across all environments, whether on-premises or on cloud.

Augmented Analytics & Machine Learning

Without advanced analytics, data loses much of its usability and power. Advanced analytics combines machine learning and artificial intelligence to help teams make data-driven decisions based on in-depth insights. Advanced analytics tools greatly streamline the data analysis process and provide a competitive edge, uncovering patterns and insights that manual analysis may overlook. With the introduction of publicly accessible machine learning models and services such as OpenAI's ChatGPT, how do organisations ensure the data that is collected, analysed, and presented is accurate and of high quality? Depending on the data models and training algorithms used, these insights can be deeply flawed, so it is important for organisations to embed the right data governance policies around the use of externally sourced models, including the collection, use, and analysis of the data points involved. One automated way to measure data quality is sketched below.
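To make the data-quality concern above concrete, here is a minimal sketch of automated validation checks that a governance team might embed before data is used for analytics or model training. The rules, field names, and thresholds are illustrative assumptions, not part of any specific product.

```python
# Minimal data-quality gate: illustrative rules only (hypothetical field names).
from datetime import datetime

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in a single record."""
    issues = []
    # Completeness: mandatory attributes must be present and non-empty.
    for field in ("customer_id", "email", "created_at"):
        if not record.get(field):
            issues.append(f"missing mandatory field: {field}")
    # Validity: timestamps must parse and must not sit in the future.
    raw = record.get("created_at")
    if raw:
        try:
            if datetime.fromisoformat(raw) > datetime.now():
                issues.append("created_at is in the future")
        except ValueError:
            issues.append("created_at is not a valid ISO-8601 timestamp")
    return issues

def quality_score(records: list[dict]) -> float:
    """Fraction of records that pass every rule; a simple integrity metric."""
    clean = sum(1 for r in records if not validate_record(r))
    return clean / len(records) if records else 1.0

if __name__ == "__main__":
    sample = [
        {"customer_id": "C1", "email": "a@example.com", "created_at": "2023-01-05T10:00:00"},
        {"customer_id": "", "email": "b@example.com", "created_at": "not-a-date"},
    ]
    print(f"quality score: {quality_score(sample):.0%}")  # 50%
```

A gate like this can sit in an ingestion pipeline so that quality is measured continuously rather than audited after the fact.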
A few roles that data governance plays in the world of augmented analytics, machine learning, and AI include:

- Providing guidance on what data is collected and how it is used to train and validate models that generate advanced analytics
- Standardising the data science lifecycle and the algorithms applied to generate insights, along with data cleansing and enrichment exercises
- Defining best practices and policies for introducing new data models, with measures to fine-tune and train models to increase accuracy
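Returning to the data obfuscation techniques discussed in the Data Privacy & Compliance section, the sketch below shows one common approach: deterministic pseudonymisation of PI attributes with a keyed hash before data leaves production. The key handling and field list are simplifying assumptions; a real implementation would source the key from a secrets manager.

```python
# Deterministic masking of PI attributes with HMAC-SHA256 (illustrative sketch).
import hmac
import hashlib

# Assumption: in practice this key comes from a secrets manager, never source code.
MASKING_KEY = b"replace-with-managed-secret"
PI_FIELDS = {"name", "email", "phone"}  # hypothetical attribute list

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    digest = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """De-sensitise PI fields while leaving analytical attributes intact."""
    return {k: (mask_value(v) if k in PI_FIELDS else v) for k, v in record.items()}

if __name__ == "__main__":
    prod_row = {"name": "Jane Citizen", "email": "jane@example.com", "segment": "retail"}
    print(mask_record(prod_row))
```

Because the tokens are deterministic, the same input always yields the same token, so referential integrity across tables is preserved in non-production copies, which is usually what development and testing teams need.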


Cloud-Native, Data & AI

What is Cloud Transformation? 

In today's world, the cloud is the first option for most organisations to run their workloads, unless they have a compelling reason, such as compliance or security concerns, to deploy on-premises. Most organisations that manage workloads in their own data centres are looking for an opportunity to move to the cloud for the numerous benefits that cloud service providers offer. As reported by Forbes, Gartner recently increased its prior forecasts of worldwide end-user spending on public cloud services, anticipating a 23.1% jump this year followed by a more than 16% increase in 2022, up from $270 billion in 2020 to just under $400 billion.

While the acceleration of cloud transformation continues, most business data still resides on-premises. Consequently, hybrid solutions that were once downplayed by virtualisation have emerged as not only practical but likely a preferred approach. We have moved past the "cloud-first" era to a time when clouds are becoming omnipresent.

There are numerous benefits to using cloud services. Some of the key benefits are discussed below.

Pay per use: Switching from on-premises IT infrastructure to remote cloud infrastructure provided by a third-party cloud provider allows businesses to make potentially significant savings in their IT expenditure.

Disaster recovery: Cloud computing makes disaster recovery much easier than it might otherwise be, because critical data is stored off-site in third-party data centres, making it easier to retrieve in the event of unscheduled downtime.

Scalability: As your business grows, so do your infrastructure needs. Alternatively, you may have had to scale down your operation, and with it your compute and storage needs. Cloud computing provides easy scalability, allowing you to scale up and down as your circumstances change.

Less maintenance: By adopting cloud, businesses can free up resources (both financial and human) for deployment in other areas. This allows them to focus on their customer base rather than managing and maintaining their own IT resources.

Security: Data security is one of the key aspects to consider when migrating to the cloud. Cloud providers go to great lengths to ensure data is kept secure; they are tasked with protecting data from threats and unauthorised access, and they do this effectively using robust encryption.

Because of these benefits and many more, many businesses are starting the journey to move or transform their applications and workloads to the cloud. This process of migrating or transforming applications and workloads is called "cloud transformation".

What is cloud transformation?

Cloud transformation is the process of migrating or transforming your work to the cloud, including migration of apps, software programs, desktops, data, or an entire infrastructure, in alignment with the business objectives of the organisation. The first step in the transformation is a comprehensive assessment of whether cloud computing suits the organisation's long-term business strategy. Cloud transformation is popular because, among many other benefits, it increases the efficiency of sharing and storing data, accelerates time-to-market, enhances organisational flexibility and scalability, and centralises network security.
Overall, it hugely changes the way a business operates.

How to Approach Cloud Transformation?

As stated above, cloud transformation is the enablement of a complete business transformation. To achieve this, organisations focus on cloud strategy, migration, management and optimisation, data and analytics, and cloud security to become more competitive and resilient. There are various ways to approach the transformation, and you should choose the option that best suits your organisation and its goals. The following considerations will help you select the right transformation approach:

- Understanding the organisation's long-term goals and environment
- Security and regulatory considerations
- Building a cloud transformation strategy and roadmap
- Choosing the right cloud and approach
- Defining a robust governance model

Layers of Cloud Transformation

All or any of the component layers below may change as part of a migration to the cloud.

Application layer: The core layer where your application is hosted and runs. Also known as the compute layer, it runs the application code that performs business operations. Along with the application code base, it contains the dependencies and software packages required to run your application.

Data layer: Consists of the data processed by the application layer. This layer maintains the state of your application. Storage (files, databases, state management tools) is its key component.

Network layer: Consists of network components such as LANs, routers, load balancers, firewalls, and VPNs. It is responsible for segregating components and ensuring restrictions are applied between them as needed.

Security layer: Although described as a separate layer, it is part of each layer above. For example, when migrating the application layer, we do not just migrate it; we also put proper security in place, with firewall rules ensuring only the required traffic is allowed to and from the application. The same applies to the data and network layers.

Types of Cloud Transformation

The main types of cloud transformation are listed and discussed below; a code sketch of their per-layer impact follows this section.

- Lift & shift (re-hosting)
- Re-platform
- Re-factor (re-architect)
- Develop in cloud

Lift & Shift (Re-hosting)

This approach lifts the application from on-premises and deploys it to the cloud as-is. It is one of the quickest ways to move an application from on-premises to the cloud, but it does not utilise cloud-native features. Applications that have no on-premises dependencies and low business impact are ideal candidates. It is a way to start your cloud journey with smaller applications and then progress to bigger ones.

Application layer – No change
Data layer – No change
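As a rough, discussion-level illustration of how these transformation types differ by layer, the sketch below encodes the typical degree of change per layer. The mapping is a simplification for illustration, not a formal framework; real migrations vary case by case.

```python
# Which layers change under each cloud transformation type (illustrative mapping).
from enum import Enum

class Change(Enum):
    NONE = "no change"
    MODIFY = "modified"
    REBUILD = "rebuilt for cloud"

# Assumption: a coarse mapping of transformation type to per-layer change.
TRANSFORMATION_TYPES = {
    "lift-and-shift":   {"application": Change.NONE,    "data": Change.NONE,    "network": Change.MODIFY},
    "re-platform":      {"application": Change.MODIFY,  "data": Change.MODIFY,  "network": Change.MODIFY},
    "re-factor":        {"application": Change.REBUILD, "data": Change.MODIFY,  "network": Change.MODIFY},
    "develop-in-cloud": {"application": Change.REBUILD, "data": Change.REBUILD, "network": Change.REBUILD},
}

def describe(transformation: str) -> None:
    """Print the per-layer impact of a given transformation type."""
    for layer, change in TRANSFORMATION_TYPES[transformation].items():
        print(f"{transformation}: {layer} layer -> {change.value}")

if __name__ == "__main__":
    describe("lift-and-shift")
```

Note that even in a lift and shift, the network layer usually needs some rework, since on-premises constructs such as LANs and firewalls map onto virtual networks and security groups in the cloud.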


Cloud-Native, DevSecOps

Pressure on teams to modernise applications

As many organisations move towards a cloud-native approach, the need to modernise applications using new platforms and products is inevitable. But are the expectations on teams too much? With agile delivery the norm, teams are empowered to experiment, align capacity to continuously learn, and are encouraged to fail fast. That said, there is increasing pressure on teams to cut corners and adapt to tools and engineering standards as they deliver. In TL Consulting's opinion, this is when most teams fail to adopt Kubernetes and other modern technology correctly. Issues begin to appear right through the build pipeline, most commonly with security, multi-cloud integration, compliance, governance, and reliability.

Embedding modern engineering standards

Organisations often opt for a lift-and-shift approach to reduce OPEX and/or CAPEX. However, the underlying code is often not mature enough to be decoupled correctly and housed within a container. This requires considerable rework and creates an anti-pattern for software engineering teams. Instead, to move from the traditional 3-tier architecture and implement new technical stacks, teams need to embrace new development principles for cloud applications, such as the Twelve-Factor App. Other levels of DevSecOps automation and infrastructure as code need to become the engineering standard too.

The Twelve-Factor App

The Twelve-Factor App is a methodology providing a set of principles for enterprise engineering teams. As with microservices architecture, teams can leverage these principles to embed engineering strategies. This does require highly skilled engineers to create models that can be adopted and reused by development teams.

Engineering support

With these expectations placed on immature development teams, the pressure and demand on resources impact performance and quality. From our experience, even Big 4 banks require assistance to modernise applications and seek external support from platforms and products, such as VMware Tanzu, to modernise their application portfolios. VMware Tanzu is an abstraction layer on top of Kubernetes platforms that enables enterprises to streamline operations across different cloud infrastructures. Tanzu provides ease of management, portability, resilience, and efficient use of cloud resources. It is important to note that to succeed with the likes of Tanzu's suite of products, an organisation needs to establish a DevSecOps culture and mature governance models.

Embracing DevSecOps

TL Consulting has found many organisations need guidance when embedding a culture shift towards DevSecOps. Teams must have a security-first mindset. The norm should not be limited to security testing, such as Static Application Security Testing (SAST) and Dynamic Application Security Testing (DAST), but should instead focus on securing applications by design and automating security practices and policies across the SDLC. After all, the goal is to standardise teams' daily activities and build secure software into cloud-native engineering workflows.

Infrastructure as code (IaC)

As IT infrastructure has evolved, leveraging IaC can be invigorating for teams. Engineers can spin up fully provisioned environments that scale, are secure, and are cost-effective. However, if DevSecOps and infrastructure automation orchestration are not aligned, CI/CD pipelines and cloud costs will be difficult to control.
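As a small illustration of IaC, the sketch below uses Pulumi's Python SDK to declare a private, versioned object storage bucket as code. Pulumi and AWS are our choices here purely for illustration; the same idea applies to other IaC tools and clouds. The sketch assumes a configured Pulumi project with the pulumi_aws provider and AWS credentials.

```python
# Minimal IaC sketch with Pulumi's Python SDK: a private, versioned S3 bucket.
# Illustrative only; assumes a configured Pulumi project and AWS credentials.
import pulumi
import pulumi_aws as aws

# Declaring infrastructure as code makes the environment reviewable,
# repeatable, and easy to tear down, which helps keep cloud costs in check.
artifact_bucket = aws.s3.Bucket(
    "app-artifacts",
    acl="private",
    versioning=aws.s3.BucketVersioningArgs(enabled=True),
    tags={"owner": "platform-team", "environment": "dev"},
)

pulumi.export("bucket_name", artifact_bucket.id)
```

Because the definition lives in version control, security review of infrastructure changes can run through the same pull-request workflow as application code, which is exactly the alignment between DevSecOps and infrastructure automation described above.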
To achieve these sustainable processes and practices, implementing a DevSecOps culture with mature governance models will help keep cloud costs optimised.

Conclusion

Providing teams with capacity and implementing modern technology platforms will not, on their own, overcome the engineering challenges faced when modernising applications. Modernising applications requires an established DevSecOps culture, robust governance models, and highly skilled teams. Additionally, each team needs to understand the application(s) under its control to determine what needs to be automated. For example:

- the purpose of the application and customer experience
- architecture and design of the application and its dependencies
- application workflows and data privacy policies
- compliance with government-managed data (if applicable)
- business security policies and procedures
- cloud security policies and procedures which impact the application
- application infrastructure employed

The modern platforms, products, and tools therefore become enablers to optimise cloud-native adoption, not solutions in themselves. This is where onsite education, guidance, and support from experts, and subscription models like A Cloud Guru, can be highly beneficial for leaders and engineers. If you are facing challenges implementing DevSecOps or adopting modern technology platforms such as Kubernetes, contact us.


DevSecOps

Reasons to Move, and Reasons Not to Move, to the Public Cloud

Public cloud adoption is more popular now than ever. Companies across all industries are modernizing their environments to support remote work, lower costs, and improve reliability. In fact, Gartner predicts global public cloud end-user spending will increase by 23% in 2021. Despite this momentum, it's important to realize the public cloud isn't an ideal fit for every organization. Many companies rushed into the cloud during the pandemic without fully understanding the implications. Now issues are surfacing, and some businesses are reconsidering their migration altogether. This post explores the pros and cons of moving to the public cloud. Keep reading to learn whether the public cloud makes sense for your business.

What Is the Public Cloud?

The public cloud is a framework that lets you access on-demand computing services and infrastructure through a third-party provider. In a public cloud environment, you share the same hardware, software, and network services as other companies or tenants. It differs from a private cloud environment, where your company receives access to private, hosted infrastructure and services. To illustrate, it's like staying in a hotel versus renting a private cottage on Airbnb. A few of the top public cloud providers on the market are Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, Alibaba Cloud, and IBM Cloud. Public cloud services can refer to infrastructure as a service (IaaS), software as a service (SaaS), and platform as a service (PaaS) models.

Top Reasons for Public Cloud Adoption

Companies have a variety of reasons for migrating to the public cloud. Here are a few of them.

Replacing the Data Center and Lowering Computing Costs

Enterprises are increasingly moving away from data centers; by one widely cited prediction, 80% of enterprises will have shut down their traditional data centers by 2025. Companies with aging data centers can avoid retrofitting facilities or building new ones by migrating to the public cloud and leveraging hosted infrastructure instead. This avoids costly builds and minimizes operational expenses.

Achieving Rapid Scalability

The public cloud enables rapid scalability. You can significantly increase storage and compute power through the public cloud at a fraction of the cost of expanding your existing infrastructure. The public cloud is particularly useful for growing startups that need to accommodate massive usage increases. It's also ideal for organizations that experience seasonal spikes in sales. For example, an e-commerce provider might use the public cloud when ramping up production and sales around the holidays. By the same token, the public cloud provides flexibility to easily scale back down during lulls.

Accessing Managed Services

The service provider manages the underlying hardware and software in a public cloud deployment, and typically also provides security, monitoring, maintenance, and upgrades. This enables you to take a hands-off approach to managing infrastructure. Your IT team can focus on other business needs, with the expectation that the public cloud provider will keep your services up and running within the scope of the service-level agreement (SLA).

Reducing IT Burden

Right now, there's a widespread IT staffing shortage.
The issue is particularly bad in the data center industry, where 50% of data center owners and operators are having difficulty finding qualified candidates for open jobs. If your company is having a hard time finding qualified IT workers, you may want to consider outsourcing operations to the public cloud. This can free your IT workers from grunt work and enable them to take on more valuable projects. At the same time, your IT team can still manage its public cloud environment. For example, they can still perform data governance and identity and access management (IAM); they just won't have to worry about maintaining or upgrading any hardware or software.

Strengthening Business Continuity and Disaster Recovery (BC/DR)

Another reason companies migrate to the cloud is to improve their BC/DR posture. Business continuity involves establishing a plan to deal with unexpected challenges like service outages. Disaster recovery is about restoring network access following an issue like a cyberattack or natural disaster. Companies often rely on the public cloud to establish BC and DR across two or more geographically separate locations. Running a BC/DR strategy through the cloud is much more efficient, as it prevents you from having to maintain a fully functioning recovery site 24/7, drastically reducing costs. At the same time, using the public cloud can provide full operational BC/DR availability, and with it the peace of mind of knowing you can keep your business running when an emergency strikes.

Why Companies Avoid the Public Cloud

Without a doubt, the public cloud offers several exciting advantages for businesses. But there are also a few major drawbacks to consider. Here are some of the top reasons why companies might avoid the public cloud.

Higher Costs

Companies often expect instant cost savings when migrating to the cloud. In reality, cloud services can sometimes be more expensive than on-premises data centers, at least at first. Oftentimes, companies fail to achieve true cost savings until they learn how to take full advantage of the public cloud, which can take months or years. It's important to carefully break down cloud migration costs and ROI before moving to the public cloud to get an accurate understanding of the move's short-, medium-, and long-term financial impact (a toy cost comparison follows at the end of this article). In some cases, companies find they fare better with their existing setups.

Data Ownership Concerns

Right now, there's an ongoing debate about who owns data in the public cloud. Some cloud providers attempt to retain ownership of some or all of the data they store. As such, many business leaders fear storing data in the public cloud, and some simply can't risk it. Instead, they choose to avoid the issue by using their own dedicated infrastructure. It's a good idea to talk with your team before migrating to the public cloud and conduct a security and privacy review.
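To make the cost-breakdown advice above concrete, here is a toy comparison of fixed on-premises capacity versus pay-per-use cloud pricing under a seasonal load. Every number is invented for illustration; a real TCO model would also include staffing, licensing, egress, and migration costs.

```python
# Toy cost comparison: fixed on-prem capacity vs pay-per-use cloud.
# All figures are hypothetical and for illustration only.

MONTHLY_DEMAND = [40, 40, 45, 50, 55, 60, 60, 65, 70, 90, 140, 180]  # compute units

ONPREM_CAPACITY = 180          # must be provisioned for the December peak
ONPREM_COST_PER_UNIT = 8.0     # amortized monthly cost per provisioned unit
CLOUD_COST_PER_UNIT = 11.0     # higher unit price, but billed on actual usage

onprem_annual = ONPREM_CAPACITY * ONPREM_COST_PER_UNIT * 12
cloud_annual = sum(MONTHLY_DEMAND) * CLOUD_COST_PER_UNIT

print(f"on-prem (fixed peak capacity): ${onprem_annual:,.0f}")  # $17,280
print(f"cloud (pay per use):           ${cloud_annual:,.0f}")   # $9,845
# With this spiky demand profile the cloud wins; flatten the demand curve
# (a steady 150 units/month) and the fixed on-prem capacity becomes cheaper.
```

The point of the exercise is the shape of the demand curve: pay-per-use rewards spiky, seasonal workloads, while steady, predictable workloads can favor existing fixed infrastructure.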


Uncategorised

The need for adoption

Embrace DevSecOps

Author: Ravi Cheetirala, Technical Architect (Cloud & DevSecOps) at TL Consulting

DevOps is a widely adopted cultural norm in modern software development. It enables enterprises to bring development teams, operations teams, and tools under a single streamlined process, and its automation capabilities help organisations deliver software much faster by reducing costs and release cycle times. However, in many cases security is not prioritised as part of CI/CD practices, and so the move to DevSecOps has not been adopted. While DevOps has been a successful methodology, one key roadblock is that it does not place much emphasis on security and governance, as its core focus is agility and faster time to market. A recent survey by GitLab (one of the popular DevOps vendors) found that more than 70% of organisations have not included security in their DevOps model. With the rise of cyber-attacks, most incidents occur by exploiting vulnerabilities in software, which indicates a compelling need to rearchitect the existing DevOps model into DevSecOps by adding additional layers of security and governance.

Market Insights on DevSecOps Adoption

GitLab's survey, conducted in the fall of 2021, offers insights into DevOps and security, including the various drivers for adopting DevSecOps. These findings demonstrate that improved security is a top priority for DevSecOps enablement.

Why Do We Need DevSecOps?

The market insights above show that more than 50% of organisations chose security as their primary driver for adoption. This is because conventional security measures are not good enough to cope with the latest technology innovations, hence the pressing need for DevSecOps adoption and stronger security measures.

What Is DevSecOps?

DevSecOps is an extension of DevOps that adds measures across security and governance layers, such as security testing, observability, and governance. Just like DevOps, the goal of DevSecOps is to deliver trusted and secured software faster.

Security Adoption Barriers in DevOps

Developers are focused on acceleration, not security – With DevOps adoption, developers deliver software faster. However, they tend to ignore security best practices. Risks include using unsolicited third-party or open-source software downloaded from the internet without much scrutiny or consent.

Conflicting interests between teams – Development teams usually rely on other teams for security and vulnerability testing, which is often planned as a separate phase of the project. The delivered software might pose multiple security threats and vulnerabilities, and security analysts are assigned to review and address these issues afterwards. This creates a knowledge gap between teams and can end in delivering compromised software.

Cloud and container security challenges – The wide adoption of containers and public cloud environments undoubtedly brings exceptional productivity at low cost, with an innovation lens for the organisation, but it also brings new security risks and challenges. For instance, containers are operating-system agnostic and can run applications anywhere, but the lack of visibility into containers makes it difficult to scan them for vulnerabilities.
Lack of skills and knowledge of security – There are fundamental knowledge gaps around security frameworks, as most security standards are industry specific. This acts as a barrier to achieving a higher degree of efficiency with DevOps.

The pitfall of DevOps' nature – The core nature of DevOps is team collaboration, and this interconnection involves sharing privileged information. Teams share account credentials, tokens, and SSH keys; systems such as applications, containers, and microservices also share passwords and tokens. This gives attackers an opportunity to disrupt operations and steal information.

How to Implement DevSecOps?

Embed security in the pipelines – Implement security in the DevOps or CI/CD pipelines as an additional level of integration, including DAST, SAST, vulnerability, and image scanning tools, to identify and resolve code vulnerabilities as soon as they appear (a runnable sketch follows at the end of this article).

Identify compliance requirements at the design stage – Understand the organisation's security framework and compare it with the industry's security guidelines during the early stages of design. This gap analysis helps in assessing the right tools to choose for automation.

Shift-left security – Embed security in the early stages of the development cycle. As we move through the phases of the development process, security is carried along instead of being bolted on at the end. This leads to better outcomes and fewer challenges; shift-left is a preventive approach rather than a reactive one.

Automate as much as possible – The cornerstone of DevOps is automation. Use those capabilities to automate security and governance by integrating the right tools into CI/CD pipelines. DevSecOps tooling should run fully automated, without manual intervention.

Validate cloud and container security standards – As a best practice, evaluate cloud security standards against organisational and industry security frameworks and identify gaps in the early stages. This ensures early detection of threats and organisational alignment.

Create awareness and education – Clearly delineate roles and responsibilities, create awareness of security best practices, and provide education on industry security frameworks, establishing safe-coding guidelines through a security lens. Adopting security tooling is not always the best solution on its own; it can be ineffective if teams do not know how to use it.

Establish a governance model – Creating a governance model is a vital part of implementing DevSecOps to get the maximum outcome. Adopt observability and governance tools that create transparency within teams to identify and address security and other application-related issues reported at all levels.

How does DevSecOps fit in the organisational GRC framework? GRC (Governance, Risk management and Compliance) and DevSecOps use various skills, tools, and processes.
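As a minimal sketch of the "embed security in the pipelines" step above, the script below runs two real, freely available scanners for a Python code base: Bandit for SAST and pip-audit for dependency vulnerabilities, failing the build if either reports findings. The tool choice and paths are illustrative assumptions; the same pattern extends to DAST and image scanners.

```python
# Minimal CI security gate: run SAST and dependency scanning, fail on findings.
# Assumes `bandit` and `pip-audit` are installed (e.g. pip install bandit pip-audit)
# and that the application code lives under ./src with a requirements.txt.
import subprocess
import sys

CHECKS = [
    ("SAST (bandit)", ["bandit", "-r", "src", "-q"]),
    ("dependency audit (pip-audit)", ["pip-audit", "-r", "requirements.txt"]),
]

def main() -> int:
    failed = False
    for name, cmd in CHECKS:
        print(f"running {name} ...")
        result = subprocess.run(cmd)
        if result.returncode != 0:  # both tools exit non-zero on findings
            print(f"FAIL: {name} reported issues")
            failed = True
    return 1 if failed else 0

if __name__ == "__main__":
    sys.exit(main())  # a non-zero exit fails the CI job, blocking the release
```

Wired into the pipeline, a gate like this turns the shift-left principle into an enforced control rather than a guideline: vulnerabilities surface on every commit instead of in a late, separate security phase.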


DevSecOps