
Ethical Generative AI
At the crossroads of innovation and responsibility

Tijana Nikolic
Mar 12, 2024

Generative AI is reshaping business operations and customer engagement with its autonomous capabilities. However, to quote Uncle Ben from Spiderman: “With great power comes great responsibility.”

Managing generative AI is challenging, as generative AI models now outperform humans in some areas, such as profiling for national security purposes. Sometimes, anti-principles (statements of what a system should not do) explain most clearly why ethics must be enforced, so it is important to understand the following challenges:

  • Generative AI can assist in managing information overload by helping extract and generate meaningful insights from large volumes of data but, at the same time, information overload can dilute precise messaging.
  • A lack of domain-specific knowledge or context leads to inaccurate information and contextual errors in addition to bias and subjectivity.
  • There may be limited human resources to oversee training and regulate output, due to a lack of experienced personnel.
  • Stale data may be used in training.
  • Elite and/or not always ethically sourced data may be used for training.
  • There may be a lack of resilience in execution.
  • Scalability and cost tradeoffs may cause organizations to consider a shortcut.

Although complex, these challenges can be alleviated at a technical level. Monitoring, for example, helps ensure the robustness and observability of model behavior. Additionally, since generative AI exposes businesses to new risks, there is a need for well-thought-through governance, guardrails, and methods such as the following:

  • Model benchmarking
  • Model hallucination detection
  • Self-debugging
  • Guardrails.ai and RAIL specs
  • Auditing LLMs with LLMs
  • Detecting LLM-generated content
  • Differential privacy and homomorphic encryption
  • EBM (Explainable Boosting Machine)
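
To make one item in this list concrete, below is a minimal sketch of differential privacy using the Laplace mechanism: noise calibrated to the query's sensitivity and a chosen privacy budget (epsilon) is added before a statistic is released. The query, sensitivity, and epsilon values are illustrative only.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return a differentially private estimate of a numeric query result.

    Noise is drawn from a Laplace distribution with scale = sensitivity / epsilon,
    the standard calibration for epsilon-differential privacy.
    """
    return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: release how many records match a sensitive attribute.
# Counting queries have sensitivity 1: one person more or less changes the count by at most 1.
true_count = 1_234                                             # illustrative value
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
print(f"True count: {true_count}, privately released count: {private_count:.0f}")
```

A smaller epsilon means stronger privacy but noisier released statistics; choosing it is a governance decision, not just a technical one.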

It is crucial that generative AI design takes care of the following aspects of ethical AI:

  • Ensuring ethical and legal compliance – Generative AI models can produce outputs that may be biased, discriminatory, or infringe on privacy rights.
  • Mitigating risk – Generative AI models can produce unexpected and unintended outputs that can cause harm or damage to individuals or organizations.
  • Improving model accuracy and explainability – Generative AI models can be complex and difficult to interpret, leading to inaccuracies in their outputs. Governance and guardrails can improve the accuracy of the model by ensuring it is trained on appropriate data and its outputs are validated by human experts.
  • Ethical generative AI approaches need to differ based on the purpose and impact of the solution: diagnosing and treating life-threatening diseases should have a much more rigorous governance model than using generative AI to suggest marketing content based on products. Even the upcoming EU AI Act prescribes a risk-based approach, classifying AI systems into minimal-risk, limited-risk, high-risk, and unacceptable-risk categories.
  • AIs must be designed to say “no,” a principle called “Humble AI.”
  • Ethical data sourcing is particularly important with generative AI, because the resulting model can supplant the efforts of the very humans whose work was used for training without their explicit consent.
  • Inclusiveness of AI: most AIs today are English-language only or, at best, use English as a first language.

USING SYNTHETIC DATA FOR REGULATORY COMPLIANCE

Försäkringskassan, the Swedish authority responsible for social insurance benefits, faced a challenge in handling vast amounts of data containing personally identifiable information (PII), including medical records and symptoms, while adhering to GDPR regulations. It needed a way to test applications and systems with relevant data without compromising client privacy. Collaborating with Försäkringskassan, Sogeti delivered a scalable generative AI microservice, using generative adversarial network (GAN) models to alleviate this risk.

This solution involved feeding real data samples into the GAN model, which learned the data’s characteristics. The output was synthetic data closely mirroring the original dataset in statistical similarity and distribution, while not containing any PII. This allowed the data to be used for training AI models, text classification, chatbot Q&A, and document generation.
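
As an illustration of the "statistical similarity" requirement, the sketch below compares a real numeric column with a synthetic one using a two-sample Kolmogorov–Smirnov test. It is not the GAN itself, nor Sogeti's actual validation pipeline; the column, distributions, and threshold are invented for demonstration.

```python
import numpy as np
from scipy.stats import ks_2samp

def column_similarity_report(real: np.ndarray, synthetic: np.ndarray, alpha: float = 0.05) -> dict:
    """Compare a real and a synthetic numeric column with a two-sample KS test.

    A high p-value means the two distributions are statistically indistinguishable,
    i.e. the synthetic column mirrors the real one.
    """
    stat, p_value = ks_2samp(real, synthetic)
    return {
        "real_mean": float(np.mean(real)),
        "synthetic_mean": float(np.mean(synthetic)),
        "ks_statistic": float(stat),
        "p_value": float(p_value),
        "distributions_match": bool(p_value > alpha),
    }

# Illustrative data: a skewed "days of sick leave" column and a synthetic imitation of it.
rng = np.random.default_rng(42)
real_col = rng.gamma(shape=2.0, scale=10.0, size=5_000)
synthetic_col = rng.gamma(shape=2.0, scale=10.0, size=5_000)   # stand-in for GAN output
print(column_similarity_report(real_col, synthetic_col))
```

In practice such checks run per column and per pairwise correlation, alongside privacy checks confirming that no real record leaks into the synthetic output.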

The implementation of this synthetic data solution marked a significant achievement. It provided Försäkringskassan with realistic and useful data for software testing and AI model improvement, ensuring compliance with legal requirements. Moreover, this innovation allowed for efficient scaling of data, benefiting model development and testing.

Försäkringskassan’s commitment to protecting personal data and embracing innovative technologies not only ensured regulatory compliance but also propelled it to the forefront of digital solutions in Sweden. Through this initiative, Försäkringskassan contributed significantly to the realization of the Social Insurance Agency’s vision of a society where individuals can feel secure even when life takes unexpected turns.

MARKET TRENDS

The market for trustworthy generative AI is flourishing, driven by these key trends.

  1. Regulatory compliance: Increasing government regulations demand rigorous testing and transparency.
  2. User awareness: Growing awareness among users regarding the importance of trustworthy and ethical AI systems.
  3. Operationalization of ethical principles: Specialized consulting to guide AI developers in creating ethical risk mitigations on a technical level.

RESPONSIBLE USE OF GENERATIVE AI

Ethical considerations are at the heart of these groundbreaking achievements. The responsible use of generative AI ensures that while we delve into the boundless possibilities of artificial intelligence, we do so with respect for privacy and security. Ethical generative AI, exemplified by Försäkringskassan’s initiative, paves the way for a future where innovation and integrity coexist in harmony.

“ETHICAL GENERATIVE AI IS THE ART OF NURTURING MACHINES TO MIRROR NOT ONLY OUR INTELLECT BUT THE VERY ESSENCE OF OUR NOBLEST INTENTIONS AND TIMELESS VALUES.”

INNOVATION TAKEAWAYS

TRANSPARENCY AND ACCOUNTABILITY

Generative AI systems should be designed with transparency in mind. Developers and organizations should be open about the technology’s capabilities, limitations, and potential biases. Clear documentation and disclosure of the data sources, training methods, and algorithms used are essential.

BIAS MITIGATION

Generative AI models often inherit biases present in their training data. It’s crucial to actively work on identifying and mitigating these biases to ensure that AI-generated content does not perpetuate or amplify harmful stereotypes or discrimination.

USER CONSENT AND CONTROL

Users should have the ability to control and consent to the use of generative AI in their interactions. This includes clear opt-in/opt-out mechanisms. Respect for user preferences and privacy and data protection principles should also be upheld.

Interesting read?

Capgemini’s Innovation publication, Data-powered Innovation Review | Wave 7 features 16 such fascinating articles, crafted by leading experts from Capgemini, and partners like Aible, the Green Software Foundation, and Fivetran. Discover groundbreaking advancements in data-powered innovation, explore the broader applications of AI beyond language models, and learn how data and AI can contribute to creating a more sustainable planet and society.  Find all previous Waves here.

Authors

Tijana Nikolic

AI Lead, Netherlands, Sogeti, Capgemini
Tijana is the AI Lead in the Sogeti Netherlands AI CoE team with a diverse background in biology, marketing, and IT. Her vision is to bring innovative solutions to the market with a strong emphasis on privacy, quality, ethics, and sustainability, while enabling growth and curiosity of team members.

    Accelerating business transformation with our AWS Generative AI Competency

    Genevieve Chamard
    11 Mar 2024

    I am proud to announce our achievement of the AWS Generative AI Competency. This accomplishment marks a pivotal milestone on our decade-long journey with AWS.

    Notably, we recently announced the signing of a multi-year strategic collaboration agreement with AWS, designed to accelerate the adoption of generative AI solutions and technologies amongst organizations of all sizes.

    Alongside AWS, at the heart of our generative AI journey is the determination to help clients realize the full value of GenAI at scale, transforming their businesses to stay focused on a digital and sustainable economy. By leveraging Capgemini's existing network of AWS Centers of Excellence, we can help clients realize value at speed by accelerating the deployment of innovative industry-specific and functional use cases. These use cases are made possible by AWS technologies such as Amazon Bedrock, which provides access to a range of secure, high-performing foundation models, including Amazon Titan.

    What distinguishes Capgemini in the field of Generative AI?

    In our recent Capgemini Research Institute report ‘Harnessing the value of Generative AI’,  74% of executives agreed the overall benefits of generative AI outweigh the associated risks. Our report guides organizations to focus on the following key areas to accelerate their generative AI journeys amid a rapidly evolving application landscape:

    • Integrate genAI into the organization’s strategy and operations
    • Drive a human-centered approach to scaling genAI
    • Focus on sustainable development
    • Build trust and responsibility in the AI systems
    • Establish guidelines around usage

    Our expertise in building the foundations for generative AI across all these key areas empowers us to push boundaries for you, tackling even the most complex AI challenges. Through our AWS Centers of Excellence and an ongoing commitment to excellence, we keep our teams current with the latest AWS training resources, hold the highest number of AWS generative AI badges, and draw on a ten-year partnership that lets us work seamlessly, end to end, as one integrated team. This means we can help organizations across all industries achieve the best possible outcomes.

    The unique advantages of AWS Generative AI

    AWS is renowned for its robust cloud technologies, setting a high bar for innovation and performance in the technology landscape. But what sets AWS Generative AI technologies specifically apart from the competition is their unparalleled scalability, security, and flexibility.

    AWS provides a comprehensive suite of tools and services that empower developers to build, train, and deploy machine learning models more efficiently than ever before. AWS Generative AI technologies stand out particularly in the areas of:

    • Extensive service suite: AWS offers a wide range of AI services, including Amazon SageMaker for building, training, and deploying machine learning models at scale, and AWS Lambda for running code without provisioning or managing servers, facilitating rapid deployment of AI applications.
    • Unparalleled scalability and reliability: AWS’s infrastructure supports scalable AI applications, making it suitable for projects ranging from small-scale experiments to large-scale enterprise deployments.
    • Customization at its core: AWS has been expanding its generative AI offerings, providing tools and services that allow for the easy integration of generative AI functionalities into applications, emphasizing customization and control.
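
As a concrete (and deliberately minimal) illustration of the Bedrock service mentioned above, the sketch below invokes a hosted foundation model through boto3. The model ID, request payload, and response fields follow the Amazon Titan text format and may differ for other models or SDK versions, so treat them as assumptions rather than a reference implementation.

```python
import json
import boto3

# The "bedrock-runtime" client is the entry point for invoking hosted foundation models.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def generate_text(prompt: str, max_tokens: int = 256) -> str:
    """Send a prompt to an Amazon Titan text model on Bedrock and return the generated text."""
    body = json.dumps({
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": max_tokens, "temperature": 0.5},
    })
    response = bedrock.invoke_model(
        modelId="amazon.titan-text-express-v1",   # illustrative model ID
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    payload = json.loads(response["body"].read())
    return payload["results"][0]["outputText"]

print(generate_text("Summarize the benefits of generative AI for retail in two sentences."))
```

The same pattern scales from experiments to production because the model is consumed as a managed service; swapping to another foundation model is largely a matter of changing the model ID and payload schema.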

    Step into the future of AI with our proven expertise, and the leading technology of AWS

    Together, we can help you explore how to:

    • Integrate GenAI seamlessly into your operations. Streamline processes and personalize experiences.
    • Drive sustainable innovation. Develop responsible, ethical AI solutions that can benefit both your business and the environment.
    • Scale your AI journey with confidence. Leverage our industry-leading expertise and reputation. Find out more about our journey with AWS.

    I invite you to connect with me today to explore the full potential of Generative AI.

    Author

    Genevieve Chamard

    Global AWS Partnership Executive
    An expert in partnership strategy at a global level with 13 years of innovation and strategy consulting. Teaming up with partners and startups, I translate the latest, bleeding-edge technologies into solutions that create new captivating customer experiences, intelligent operations and automated processes. I specialize in: – Global partnership strategy and management – Go-to-market and growth strategy – Industry vertical solution build – Pilot definition and management – Emerging technology and start-up curation

      Next gen railway operations intelligence

      Samir Mouhli
      Mar 11, 2024

      What if digital solutions could improve railway operational performance and passenger services?

      Introduction: network fragility and reputational risk  

      For as long as there has been a rail sector, service disruption (for example, delayed and canceled trains) has been a major cause of customer upset and significant reputational damage. We have all experienced the frustration of a delayed or canceled train – and some of us have gone on to resent the rail operator for it.  

      Sadly, such disruptions are common. This is because the railway network is an integrated and fragile system: one delay can have a domino effect across the network, and a single incident can lead to a full day of disturbed traffic and many upset passengers. Addressing these issues requires an improved traffic management system.

      The first requirement is to provide accurate, tailored information to the customer. Punctuality is a key driver of business performance, both for passengers and for freight. Your customer may accept a delay, but it is becoming increasingly unacceptable not to know why there is a delay, or to provide this customer with erroneous or incomplete information.

      What are the main challenges to overcome?

      In pursuit of better traffic management (and customer experience), one of the main challenges is to precisely manage network problems when (and preferably before) they occur.

      This entails anticipation, alert management and resolution. However, the systems we have today are not built around such problem solving. In the past, the main role of operators was to trace routes and timetables – but today, we must rethink this approach. In particular, we must take into account the digitization of the operational process, moving from only managing disturbances once they occur to a more proactive approach of prediction, consideration, prevention and anticipation. This is because, as an old saying goes, 'prevention is better than cure' – complex problems are easier to solve if they don't happen in the first place.

      However, it is unrealistic to assume that all problems can be prevented, at least with the technology we have today. Take traffic jams as an example – these can be created if routes are poorly planned. Operators who make this mistake often suffer for months (as do their customers). Even today, with all our advancements in technology, it is still very difficult to maintain a robust, achievable timetable. So, in the inevitable case of a disruption, operators must be able to quickly perform short re-routing and re-planning. This translates into an understanding of the situation of the network, with a real-time tracking system.

      Being able to adapt the transport plan effectively to cope with eventualities requires a clear understanding of real time needs – but also of future requirements. As alluded to previously, the ability to anticipate needs in advance is highly desirable, particularly for complex problems. For example, the risk of disruption increases when the network is forced to work in a degraded mode (e.g., a signal failure). It also increases during periods of exceptional demand – for example, an event like a football match must be planned for and anticipated, otherwise insufficient trains will be available and passengers left dissatisfied.

      Planning ahead: technology to optimize traffic management

      Such forethought is becoming more realistic. Indeed, with systems becoming increasingly intelligent, it is possible to track and anticipate the arrival of passengers. This is a major aid in organizing future transport plans. As networks are interdependent, big data and data analytics can help to solve problems – for example, by correlating and analyzing data in real-time from several network assets – turning this data into relevant insights.

      For instance, an issue with a train in Strasbourg can delay another train in Marseille, which could lead to a total disruption of the transport plan. This is an extremely complex problem that requires significant human resources to predict or solve (assuming that it hasn’t escalated so far as to be unsolvable). Anticipating traffic impacts and preparing solutions is, therefore, difficult. As a solution, AI technologies can help us evaluate all possible future scenarios and combinations – predicting anticipated issues and offering potential solutions to the operator.

      Decisions on train priority could be made based on the number of passengers inconvenienced. Trains can be deployed to meet demand, based on detailed individual data on passengers’ scheduled movements. This approach is based on global decision support and the implementation of automation to manage workload and make complex decisions. Anticipating connections between two trains may also be needed, to reduce the impact of delays on passengers.

      Back to our signal failure example: some of these digital solutions can help with the management of degraded modes, for example, using a digital twin of the network to simulate the effects of the degraded mode, or to check in advance whether the network is robust enough to absorb extra traffic.
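
To illustrate the digital-twin idea on a toy scale, the sketch below propagates a single delay through a small graph of train connections and buffer times, showing how far the domino effect reaches. The network, buffer minutes, and delay values are entirely invented for demonstration.

```python
from collections import deque

# Toy dependency graph: train -> downstream trains that wait for it, with buffer minutes
# at the interchange (e.g. TGV-01 feeds a connection to IC-12 with a 10-minute buffer).
connections = {
    "TGV-01": [("IC-12", 10), ("RE-07", 5)],
    "IC-12":  [("RE-33", 8)],
    "RE-07":  [],
    "RE-33":  [],
}

def propagate_delay(initial_train: str, initial_delay: int) -> dict:
    """Simulate the knock-on (domino) effect of one delayed train.

    A downstream train inherits whatever part of the incoming delay
    its buffer time cannot absorb.
    """
    delays = {initial_train: initial_delay}
    queue = deque([initial_train])
    while queue:
        train = queue.popleft()
        for downstream, buffer_minutes in connections.get(train, []):
            knock_on = max(0, delays[train] - buffer_minutes)
            if knock_on > delays.get(downstream, 0):
                delays[downstream] = knock_on
                queue.append(downstream)
    return delays

# A 25-minute signal-failure delay on TGV-01 ripples through the network.
print(propagate_delay("TGV-01", 25))
# -> {'TGV-01': 25, 'IC-12': 15, 'RE-07': 20, 'RE-33': 7}
```

A real digital twin adds track capacity, rolling stock, crew rosters and passenger loads, but the principle is the same: simulate before the disruption hits, then choose the re-routing that minimizes passenger impact.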

      Growing initiatives for a centralized network control

      Today, operators are putting a number of these technology solutions into practice, particularly through the introduction of greater automation in Operation Control Centers (OCCs). Thanks to this automation, the geographical remit of each of these centers has increased.

      For example, in France, the current objective is to replace 1500 legacy signaling centers with 16 Rail Operating Centers (ROCs) that use new digital communication systems. Here, AI, IoT and data analytics could improve the system further, for instance, through video image analysis and macro data, which allow better supervision. As a result, the passenger load could be monitored on the network, to automatically adapt the transport plan in near real time.

      Some barriers to digital transformation

      Although we have made some progress with automation, the integration of all these new technologies is easier said than done. A minor change can disrupt everything.

      For instance, introducing new modern trains without modernizing the underlying infrastructure won’t improve the system’s overall performance. For another example, adding driverless trains offers increased operational flexibility, but this kind of change requires both physical assets to be upgraded and, potentially, a complete update of the transport map. Nevertheless, metro systems are moving in this direction.

      Therefore, successful digital transformation requires us to take a holistic view of all the parts of the railway network, one that considers integration.

      Back to our command center example. Further improving the efficiency of these centers is partly an integration challenge. Within such a facility, all supervisory equipment may be provided by different specialized companies, not all of which are compatible with each other. Instead of having all functions contained within a single system, this requires the operator to move from one isolated application to another. As a result, operations are nowhere near as effective as they could be. We know this can be done better; we propose an ecosystem approach…

      Conclusion: go further with partner ecosystems

      To deliver more flexible deployments and provide efficient solutions tailored specifically to operator needs, operators should choose a broad and highly experienced ecosystem integrator that understands their goals and challenges.

      As a leading global Intelligent Rail transformation partner, Capgemini has worked for years alongside rail operators across the world as an integrator, supporting their transformation to next-gen traffic management. We can provide your organization with the software ‘bricks’ that will help you to build a robust, failure resistant and user-centered transport system. Want to find out how we can help you?…

      Meet our expert

      Samir Mouhli

      Railway Signaling Consultant, Capgemini Engineering
      Samir MOUHLI is a signaling engineer who specializes in the CBTC (Communications-based train control) system. With nearly 22 years of experience, Samir has supported various major railway players with Integration, Validation & Verification, T&C (Test and Commissioning) and Interface Management. He is passionate about how digital technology and improved user experiences can help companies to enhance the customer rail journey.

        Generative AI is only as good as the data you feed it
        Your data is your competitive advantage

        Taylor Brown
        5th March 2024

        Generative AI is the pinnacle of data science. It will boost profits, reduce costs, and help you expand into new markets. To take full advantage of generative AI’s capabilities, train your models on all your data.

        The world is being transformed by AI-assisted medicine, education, scientific research, law, and more. Today, researchers at the University of Toronto use generative AI to model proteins that don’t exist in nature; pharmaceutical giant Bayer now uses generative AI to accelerate the process of drug discovery; and education provider Khan Academy has developed an AI chatbot/tutor, Khanmigo, to personalize learning. And with each passing day, the list of AI use cases across all industries only continues to grow.

        According to the Capgemini Research Institute, nearly all (96 percent) of executives cite generative AI as a hot topic of discussion in their respective boardrooms. Generative AI is not just used as an aid to surface information the way a search engine does; with generative AI, organizations can combine their proprietary data with foundation models that have been pre-trained on a broad base of public data to create a sustainable competitive advantage.

        Generative AI then becomes the most knowledgeable entity within your organization.

        However, as with all analytics, generative AI is only as good as its data. To fully leverage AI, an organization needs a solid data foundation and organizational norms that facilitate responsible and effective use of data.

        Data readiness for generative AI depends on two key elements:

        1. The ability to move and integrate data from databases, applications, and other sources in an automated, reliable, cost-effective, and secure manner
        2. Knowing, protecting, and accessing data through data governance

        Automated data pipeline platforms, like Fivetran, allow enterprises to capture all of their data, irrespective of the source platform. These automated tools reduce the friction and overhead required to maintain the flow of data to continuously train generative AI applications.
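
For illustration, the sketch below shows the bare-bones incremental-sync pattern that such pipeline platforms automate at scale (real connectors such as Fivetran's also handle schema drift, retries, and hundreds of source types). The tables, cursor column, and data are invented for demonstration.

```python
import sqlite3

SCHEMA = "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)"

def incremental_sync(source: sqlite3.Connection, dest: sqlite3.Connection, last_cursor: str) -> str:
    """Copy only rows changed since the last sync, keyed on an updated_at cursor column."""
    rows = source.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ? ORDER BY updated_at",
        (last_cursor,),
    ).fetchall()
    dest.executemany(
        "INSERT INTO customers (id, name, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, updated_at = excluded.updated_at",
        rows,
    )
    dest.commit()
    return rows[-1][2] if rows else last_cursor   # persist this cursor for the next run

# Illustrative run: two rows changed since the stored cursor are upserted into the destination.
src, dst = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
src.execute(SCHEMA)
dst.execute(SCHEMA)
src.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "Alice", "2024-03-01T09:00:00"),
    (2, "Bob",   "2024-03-05T14:30:00"),
])
new_cursor = incremental_sync(src, dst, last_cursor="2024-03-01T00:00:00")
print(new_cursor, dst.execute("SELECT COUNT(*) FROM customers").fetchone()[0])
```

The value of a managed platform is that this loop, and everything that goes wrong with it in production, is someone else's automated problem, leaving teams free to work on models rather than plumbing.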

        OPERATIONALIZING GENERATIVE AI

        To operationalize generative AI effectively, organizations must establish a solid foundation of automated, reliable, and well-governed data operations. Generative AI requires a modern and scalable data infrastructure that can continuously integrate and centralize data from a variety of sources, including both structured and semi-structured data.

        However, as businesses start to operationalize generative AI, they may encounter a number of challenges.

        • Data quality and preparation: Generative AI models are only as good as the data they are trained on. It is important to ensure that the data is high-quality, clean, and well-organized. This includes identifying any potential biases in the data that may distort the outputs of any model trained on it.
        • Security and governance: Security and governance in the context of generative AI concern masking sensitive information, controlling data residency, controlling and monitoring access, and being able to track the provenance and lineage of data models.
        • User experience: It is important to design user interfaces for your model that make it easy for people to interact with your models.
        • Scalability: It is important to choose a generative AI platform that can scale to meet your needs at a reasonable cost.

        Generative AI models are trained on massive datasets of text, code, images, or other media. Foundation models, which are off-the-shelf generative AI models that are pre-trained on large volumes of (usually public) data, may be specialized by industry or use case. Choosing the right foundation model can have a significant impact on performance and capabilities. For example, a foundation model that specializes in code generation will do so in a more comprehensive and informative way than a model that is trained on a general dataset of text. Other specialties of foundation models may include sentiment analysis, geospatial analysis, image generation, audio generation, and so on.

        While you can easily make use of pre-trained, publicly available AI models, your data is a unique asset that differentiates your organization from the competition. To make the most of it, you must additionally supply foundation models with your business’s unique context.
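
A minimal sketch of what "supplying your business's unique context" can look like in practice is shown below: internal documents are ranked against the user's question and prepended to the prompt before it reaches a foundation model (retrieval-augmented generation). The toy embedding, documents, and prompt format are placeholders for whatever embedding model and LLM an organization actually uses.

```python
import numpy as np

def toy_embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy stand-in for a real embedding model: hash words into a fixed-size vector."""
    vec = np.zeros(dim)
    for word in text.lower().split():
        vec[hash(word) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def retrieve_context(question: str, documents: list[str], top_k: int = 2) -> str:
    """Rank internal documents by cosine similarity to the question and keep the best ones."""
    q = toy_embed(question)
    ranked = sorted(documents, key=lambda d: -float(np.dot(q, toy_embed(d))))
    return "\n".join(ranked[:top_k])

def build_prompt(question: str, documents: list[str]) -> str:
    """Prepend retrieved company context to the user question before calling a foundation model."""
    context = retrieve_context(question, documents)
    return f"Answer using only this company context:\n{context}\n\nQuestion: {question}"

internal_docs = [
    "Building A lease runs until 2027 with a break clause in 2025.",
    "Our refund policy allows returns within 30 days of purchase.",
    "The Zurich office supports 450 employees across three floors.",
]
print(build_prompt("When does the lease on Building A end?", internal_docs))
```

The foundation model stays generic; the competitive advantage lives in the quality, freshness, and governance of the documents fed into that context window.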

        With access to your organization’s accumulated data, a properly tuned generative AI model can become the most knowledgeable member of your organization, assisting with analytics, customer assistance, sales and marketing, software engineering, and even product ideation.

        The Fivetran product team leverages generative AI and natural language processing technologies to develop Fivetran Lite Connectors in a fraction of the time of Fivetran’s standard connectors, while ensuring the same high quality, data integrity, and security customers expect from Fivetran.

        In addition, several notable organizations have already found practical ways to use generative AI. Global commercial real estate and investment management company JLL recently rolled out a proprietary large language model that employees access through a natural language interface, quickly answering questions about topics such as an office building’s leasing terms. Similarly, the motor club in the US, AAA, now uses generative AI to help agents quickly answer questions from customers. Of the 100 tech companies profiled in the Forbes Cloud 100, more than half use generative AI.

        According to Carrie Tharp, VP Strategic Industries, Google Cloud, “Generative AI opens up a new avenue, allowing people to think differently about how business works. Whereas AI and ML were more about productivity and efficiency – doing things smarter and faster than before – now it’s about ‘I can do it completely differently than before.’”

        Until enterprises get the data right, the nirvana of asking generative AI app-specific and contextual organizational questions in a “Siri-like” way will remain elusive. Get the data right, and it opens up possibilities for all analytics workloads, including generative AI and LLMs.

        To make full use of an ever-expanding roster of powerful foundation models, you must first ensure the integrity, accessibility and governance of your own data. Your journey into generative AI and the innovation and change it can bring will be fueled by high-quality, usable, trusted data built on automated, self-healing pipelines.

        “GENERATIVE AI APPLICATIONS ARE ONLY AS GOOD AS THE DATA THAT POWERS THEM.”

        INNOVATION TAKEAWAYS

        OPERATIONALIZE GENERATIVE AI

        Operationalization begins with centralizing data and modernizing the data stack to include all available data.

        AUTOMATED DATA ACCESS

        By automating data pipelines, enterprises can focus on improving data models and algorithms to accelerate the efficacy and ROI of investing in a generative AI application.

        CREATE AN UNFAIR ADVANTAGE

        Generative AI trained on your data will provide insights and guidance driven by your data, creating a unique competitive advantage that cannot be replicated.

        Interesting read? Capgemini’s Innovation publication, Data-powered Innovation Review | Wave 7 features 16 such fascinating articles, crafted by leading experts from Capgemini, and partners like Aible, the Green Software Foundation, and Fivetran. Discover groundbreaking advancements in data-powered innovation, explore the broader applications of AI beyond language models, and learn how data and AI can contribute to creating a more sustainable planet and society.  Find all previous Waves here.

        Taylor Brown

        COO and Co-founder, Fivetran
        As COO and co-founder, Taylor has helped build Fivetran, the industry leader in data integration, from an idea to a rapidly growing global business valued at more than $5.6 billion. He believes that magic happens when you can build a simple yet powerful product that is truly innovative and helps users solve a hard problem.

          Building GenAI applications for business growth – actions behind the scenes

          Manas K. Deb & Jennifer L. Marchand
          01 Mar 2024

          Over the last few years, we have been witnessing a strong adoption of artificial intelligence and machine learning (AI/ML) across industries with a wide variety of applications. Use cases range from cost reduction via automation to the generation of additional business via the introduction of AI-infused products and services. The launch of the generative AI (GenAI) application ChatGPT by OpenAI in November 2022 only accelerated AI adoption. At present, many of the tech giants including the leading cloud platform vendors like Google, Microsoft, and Amazon have strong GenAI offerings along with those from many smaller vendors and open-source platforms.

          In short, GenAI is an AI discipline in which foundation models (FMs) are trained on vast amounts of multimodal data (text, image, audio, and video, amounting to terabytes of data and trillions of parameters). Given appropriate user prompts as input, FMs can generate a wide variety of multimodal synthesized outputs. Large language models (LLMs) are a subclass of FMs specializing in text. An added benefit of GenAI is its highly superior natural language processing (NLP) capability, in many cases using multimodal input/output, making it an unprecedented technology for human-computer interfaces. This is one of the key reasons for the heightened interest in GenAI.

          GenAI, with applicability in virtually all industries, can significantly improve many of the day-to-day operations of a business as well as help launch new business capabilities. While some GenAI-based autonomous products like certain types of text, image, and audio/video processing are emerging, many of the enterprise-grade usage scenarios that are currently in focus involve GenAI-based digital assistants to humans.

          These assistants, in the form of chatbots and copilots, can help:

          • Respond to open-ended questions in a more human-like manner
          • Improve overall customer experience
          • Detect features and anomalies in images and transactions
          • Help with code writing and testing
          • Expand work automation
          • Improve a wide range of document processing
          • Make cognitive and semantic content searches more efficient and effective
          • Provide advanced analytics to assess what-if scenarios
          • Assist in creative content generation.

          Typical metrics for business growth are revenue increase and healthy profitability. Productivity, innovation, and time-to-market are the key enablers of business growth. Depending on the situation, the discipline of GenAI can positively impact some or all of these enablers. A recent McKinsey study [1] estimates that GenAI-enhanced productivity and innovation could add between $2.6 and $4.4 trillion to the global economy annually and identified that around 75% of the value delivered would fall under four use case categories:

          • Customer operations
          • Marketing and sales
          • Software engineering
          • R&D

          An early 2023 Capgemini Research Institute report [2] that explored a wide variety of industry use cases and surveyed nearly a thousand executives shows the broad applicability of GenAI and high ROI expectations from GenAI adoption. Of course, to realize significant business growth benefits, GenAI-based applications need to be functionally completed using additional application components besides the GenAI piece and need to be scalable, reliable, and integrated with other enterprise systems as necessary.

          Example: A GenAI-enhanced multimodal and omnichannel B2C commerce application

          Figure 1. Modular and component-based architecture for “Casey” – A GenAI-powered virtual retail assistant

          We, at Capgemini, recently developed a virtual retail assistant named “Casey” to accept orders and drive the order-to-cash process for partner stores (see Figure 1). Casey is voice-activated and GenAI-enabled, powered by Capgemini solution accelerator components,[3] Google Cloud (GCP), and Soul Machines. For the end-to-end application, we layered a ‘digital human’ with conversational AI and cloud-native headless commerce APIs,[3] all pre-integrated for conversational commerce kiosks. It serves as a store-in-store order kiosk, allowing partner stores to maximize their channel reach with minimal investment. Casey is a business growth enabler: it opens a new revenue channel in which it is easy to market innovative offers and whose cost does not grow rapidly with business volume (i.e., it is highly productive), and the solution construction allows for fast time-to-market implementation. Casey’s solution architecture is modular, which has enabled us to use it as a basis for many other digital channel use cases in a variety of industries, for example, grocery, general retail, call centers, telco, and automotive.
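
To give a feel for what such a modular, component-based architecture implies in code, here is a highly simplified sketch of a voice-to-order flow: speech input, a conversational AI layer, and a headless commerce API sit behind separate interfaces so each can be swapped independently. All class names, intents, and responses are invented for illustration and are not Capgemini's actual components.

```python
from dataclasses import dataclass, field

@dataclass
class Order:
    items: list[str] = field(default_factory=list)

class SpeechToText:
    def transcribe(self, audio_chunk: bytes) -> str:
        return "add two lattes to my order"          # stand-in for a real STT service

class ConversationalAI:
    def interpret(self, utterance: str) -> dict:
        # A real implementation would call an LLM / dialogue engine here.
        return {"intent": "add_item", "items": ["latte", "latte"]}

class CommerceAPI:
    def add_items(self, order: Order, items: list[str]) -> Order:
        order.items.extend(items)
        return order

class VirtualAssistant:
    """Thin orchestrator: each capability sits behind its own module, so any one of them
    can be replaced (a different STT vendor, a different commerce backend) without
    touching the rest of the flow."""
    def __init__(self):
        self.stt, self.nlu, self.commerce = SpeechToText(), ConversationalAI(), CommerceAPI()

    def handle(self, audio_chunk: bytes, order: Order) -> Order:
        utterance = self.stt.transcribe(audio_chunk)
        action = self.nlu.interpret(utterance)
        if action["intent"] == "add_item":
            order = self.commerce.add_items(order, action["items"])
        return order

print(VirtualAssistant().handle(b"...", Order()).items)   # ['latte', 'latte']
```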

          As this example illustrates, to build GenAI-powered applications that cover full customer journeys and thus yield tangible business value, we need either to combine several other application components and technologies, or to integrate the GenAI parts into otherwise functionally complete existing applications, where suitable ones are available.

          Creating enterprise-grade GenAI-based apps: Key considerations

          To build a GenAI-based enterprise-grade application delivering substantial business growth, we need to consider:

          • Opportunity formulation. Identification of the right business-relevant opportunities with realistic ROI projections is a critical success factor (CSF) for eventual success with GenAI-based applications. Especially as companies embark on GenAI adoption, it can reduce the risk of failure if GenAI is used to augment existing activities and processes. For example, the addition of GenAI into an existing customer churn prediction algorithm could process unstructured data like call recordings from customer interactions and customer reviews, capture additional insights like ‘sentiment’, specific store or product issues, competition strengths and weaknesses, possible new product bundles, and suggest appropriate ‘white glove’ treatments to reduce churn. As another example, GenAI could assist in a customer’s product exploration by improving existing user interfaces with visuals and helpful hints and by simplifying the actual purchase action by making the supporting processes more transparent.
          • Solution design. One of the first considerations in drafting a GenAI-based solution strategy is to recognize that GenAI-powered interactions with customers or end users can produce actions that may not follow strict workflows, i.e., the complete application needs to have the flexibility to appropriately react to more free-flowing human-GenAI conversations. If the solution is built from scratch such flexibilities can be developed from the ground up which, of course, means a larger development burden. Cloud-first development and use of pre-built components (such as Capgemini’s Digital Cloud Platform [3]) can significantly reduce this burden. If the GenAI components are incorporated in an existing solution then the existing solution most likely will have to be refactored for proper integration of the new and the old including changing/upgrading some of the functionalities of the older components, for example, from batch processing to real-time response, etc. The choice of the appropriate GenAI tool/platform and the availability of data required for the proper functioning of the solution are also key considerations.
          • Customer/employee experience and data orchestration. The value of GenAI in chatbots (and copilots) is the level of personalization and context an unscripted conversation can provide to a customer or employee. To retain this value, an enterprise must think through how to orchestrate various interaction points (or digital teammates) for consistency, and how to share interaction and customer data so that the next conversation at a different interaction point can pick up where that customer left off last time. These chatbots are also a tool to empower employees to assist customers more broadly: where an employee previously had to rely on what she knew at that moment, she now has access to comprehensive and granular data on demand. Enterprises must also consider an orchestration layer to connect the various GenAI initiatives and data.
          • Scale-out. GenAI is still an emerging technology; hence, it is advisable to start small, prove concrete business value, and then scale out to realize the target business benefits. However, in GenAI use cases where technical feasibility has already been proven elsewhere and a realistic business case for the solution is deemed positive, it can be worth the time and effort to create solution architectures with possible scale-out in mind. Such architectures would consider solution performance under production workload, availability and disaster recovery, security and data privacy, identity and access management, error handling, development and run cost optimization, and sustainable development practices. In the scale-out phase, a cloud-based solution approach is often superior and should be duly considered. Some of the GenAI-specific considerations are enterprise data foundations and trust (solid source of truth for customers, vendors, products, promotions, knowledge base, etc.), LLM selection, LLM lifecycle management, prompt version control across environment tiers, UX design for free-flowing conversations, balancing intent-based and generative-based interactions, incorporation of human-in-the-loop, response feedback loop, cost monitoring and optimization, technical debt management, and responsible AI governance.
          • Measure and improve. Adequate measurement of solution performance is essential to understanding the current maturity of the solution and possible future enhancements; thus, measurement mechanisms should be built into the solution as first-class citizens. High-level KPIs from traditional solutions can be reused in GenAI-powered solutions, for example, reduction in churn rate, increase in revenue per customer, and efficiency in anomaly detection. However, it is also insightful to add metrics related to model and system quality and to the performance of the GenAI components (see, for example, the summary of relevant metrics in [4] and the minimal measurement sketch after this list), which could include response error rate, the range of input over which response accuracy stays acceptable, system latency, throughput, and run cost.
          • Learn and grow. Capturing and sharing experiences as the solutions are developed and rolled out – and learning from them – is extremely valuable for fast-developing technologies like GenAI. Some design documentation, decisions taken along with the rationales, and stakeholder and end-user feedback are good ways to capture experiences from which lessons learned can be derived. This process would help in improving the solution over time as well as increase the organizational maturity to take on higher value (and potentially higher complexity) GenAI-based projects down the line. Over time, defining a robust set of build patterns across use cases would be helpful for asset reuse, solution management, and acceleration of new use case implementations.
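
As referenced in the “Measure and improve” item above, here is a minimal sketch of building measurement in as a first-class citizen: every model call is wrapped to record latency, error status, and run cost, and a KPI summary is computed from those records. The metric definitions and example values are illustrative, not a standard.

```python
import time
from dataclasses import dataclass

@dataclass
class InteractionRecord:
    latency_s: float      # wall-clock time of the model call
    error: bool           # failed call, guardrail rejection, user thumbs-down, etc.
    cost_usd: float       # per-call run cost

def track(fn, *args, cost_usd: float = 0.0):
    """Wrap a GenAI call, recording latency, error status, and cost for later KPI reporting."""
    start = time.perf_counter()
    try:
        result, error = fn(*args), False
    except Exception:                        # broad on purpose: any failure counts as an error here
        result, error = None, True
    return result, InteractionRecord(time.perf_counter() - start, error, cost_usd)

def kpi_summary(records: list) -> dict:
    n = len(records)
    total_latency = sum(r.latency_s for r in records)
    return {
        "requests": n,
        "error_rate": sum(r.error for r in records) / n,
        "avg_latency_s": total_latency / n,
        "throughput_rps": n / max(total_latency, 1e-9),   # sequential approximation
        "total_cost_usd": sum(r.cost_usd for r in records),
    }

def failing_call(prompt: str) -> str:
    raise RuntimeError("model timeout")      # simulate a failed generation

_, rec_ok = track(lambda p: f"answer to {p}", "question A", cost_usd=0.002)
_, rec_bad = track(failing_call, "question B", cost_usd=0.002)
print(kpi_summary([rec_ok, rec_bad]))
```

In production these records would flow into the same observability stack as the business KPIs, so model quality and business value can be correlated over time.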

          Concluding remarks:

          Done right, GenAI has tremendous power to push most enterprises forward with healthier business growth and higher market competitiveness. As a productivity enabler, GenAI is expected to accelerate automation by ten years, with nearly half of current tasks automated by the end of this decade.[1] Not to be left behind, enterprises should focus both on identifying which GenAI-powered applications are the most valuable for them and on acquiring, either in-house or via partners, adequate skills to understand the ‘what’ and the ‘how’ of GenAI. In the early stages of GenAI maturity, spot solutions can bring quick wins; as maturity grows, incorporating GenAI across broader enterprise value chains should be considered to reach higher benefit goals – and this will take some foundational investment in data, UX strategy, integration strategy, and building a GenAI platform.

          References:

          [1] https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-economic-potential-of-generative-ai-the-next-productivity-frontier

          [2] https://www.capgemini.com/insights/research-library/generative-ai-in-organizations/

          [3] https://www.capgemini.com/us-en/solutions/digital-cloud-platform-for-retail/

          [4] https://cloud.google.com/transform/kpis-for-gen-ai-why-measuring-your-new-ai-is-essential-to-its-success

          Author

          Jennifer Marchand

          Enterprise Architect Director and GCP CoE Leader, Capgemini/Americas
          Jennifer leads the Google Cloud COE for Capgemini Americas, with a focus on solutions and investments for the CPRS, TMT, and MALS MUs, and supporting pre-sales across all MUs. She has been with Capgemini for 18 years focusing on cloud transformation since 2015. She works closely with accounts to bring solutions to our clients around GenAI, AI/ML on VertexAI and Cortex, Data Estate Modernization on Big Query, SAP on Google Cloud, Application Modernization & Edge, and Call Center Transformation and Conversational AI. She leverages the broader Capgemini ecosystem across AIE, Invent, ER&D, I&D, C&CA, and CIS to shape cloud and transformation programs focusing on business outcomes.

            New AI Compute Paradigm: The Language Processing Unit (LPU)

            Dheeren Vélu
            Feb 27, 2024

            Could NVIDIA’s AI and GPU dominance be at risk?

            Have you heard about #LPUs, or Language Processing Units, yet? This new kid on the block promises 10x the speed, 90 percent lower latency, and a fraction of the energy consumption of Nvidia GPUs. What does this mean for the future of #ai and #genAI?

            I explore this massive shift in my latest article. Discover how Groq could redefine AI hardware efficiency and challenge the current giant.

            Meet the author

            Dheeren Vélu

            Head of Innovation, AIE Australia  |  Web3 & NFT Stream Lead, Capgemini Metaverse Lab
            Dheeren Velu is an award winning leader in emerging technology, innovation, and digital transformation and is committed to helping organisations thrive in today’s era of fast-paced disruptive technological change. He is an Innovation expert & Web3 Strategist, with a deep background in implementing large scale AI and Cognitive solutions in his previous roles. His current area of focus is Web3 and its intersection with Metaverse and is working on bringing to life innovative concepts and business models that are underpinned by the decentralised capabilities like Smart Contracts, Tokens and NFT techniques.

              Why Manufacturing Facilities need Software Defined Networks

              Vijay Anand
              Feb 26, 2024

              Picture the factory of the future. Thousands of devices – production machinery, motors, pumps, sensors, cameras – all connected, collecting detailed data, and communicating seamlessly.

              This reliable data flow means clever algorithms can monitor the entire facility, and make subtle adjustments to optimize efficiency, energy use, and productivity. A machine fault is detected, an alert goes out and an engineer is booked. A new IoT software patch is released, and all the devices are updated without anyone noticing. A data intensive maintenance system is deployed, and the IT team increases the Wi-Fi bandwidth to ensure it works seamlessly.

              This is all made possible by several innovations, one of which is Software Defined Networking, or SDN.

              SDN involves digitizing all moving parts of the network so that the entire network can be joined up and controlled through a single user interface. Changes and updates can be made consistently across the network, at the push of a button.

              With SDN, the entire factory and its IIoT (Industrial IoT) network can be dynamically managed and configured using a single SDN controller, handling all the elements that make a network work and adapting without users even noticing: network monitoring, packet forwarding, networking devices status checks, load balancing, queue management, scheduling and quality of experience (QoE) awareness. SDNs also have far fewer problems than networks made of physical switches and gateways, reducing time spent on troubleshooting and reconfiguring. That creates a network that is flexible, scalable, efficient, secure, and resilient.
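
As a toy illustration of this centralized, programmable control, the sketch below evaluates every traffic flow against a single policy table, the kind of high-level rule set an SDN controller would translate into forwarding rules and push to every switch and gateway. The traffic classes, priorities, and paths are invented for demonstration.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    src: str
    dst: str
    traffic_class: str      # e.g. "plc-control", "camera-video", "firmware-update"

# One centrally managed policy table: the SDN controller derives forwarding rules from it
# and pushes them to every switch and gateway, so nothing is configured box by box.
POLICY = {
    "plc-control":     {"priority": 7, "max_latency_ms": 10,   "path": "wired"},
    "camera-video":    {"priority": 4, "max_latency_ms": 100,  "path": "wifi"},
    "firmware-update": {"priority": 1, "max_latency_ms": None, "path": "any"},
}

def decide(flow: Flow) -> dict:
    """Controller-side decision: classify the flow and return the forwarding rule to install."""
    rule = POLICY.get(flow.traffic_class, {"priority": 0, "max_latency_ms": None, "path": "any"})
    return {"match": (flow.src, flow.dst, flow.traffic_class), **rule}

print(decide(Flow("press-line-plc", "edge-gw-3", "plc-control")))
# {'match': ('press-line-plc', 'edge-gw-3', 'plc-control'), 'priority': 7, 'max_latency_ms': 10, 'path': 'wired'}
```

Changing a QoS policy then means editing one table in the controller rather than reconfiguring hundreds of physical switches, which is where the operational savings come from.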

              Is manufacturing ready for SDN?

              SDN has traditionally been associated with telecoms companies which benefit enormously from flexible networks that can be quickly configured to meet changing customer needs. But SDNs are growing in popularity in industry too. Intel is upgrading its chipmaking facilities to SDN. IDC’s 2023 Future of Connectedness Sentiment found that 41% of manufacturers cite the flexibility to change bandwidth capacity in near real-time as a top reason for SDN investment.

              That said, SDN is still relatively new in manufacturing. There is a need for a greater understanding of its benefits and challenges, so manufacturers can go in with their eyes open – ensuring that the transition is seamless and able to deliver what the manufacturer wants.


              Fig. 1:  SDN based Network design for manufacturing plants

              What does a software defined network look like?

              An SDN is a cost-effective way to create reliable, seamless network connectivity with any combination of available communications links, whether Wireless (Wi-Fi/5G) or Wired (Ethernet/LAN) (as shown in Fig 2).

              A growing number of connected industrial IoT (IIoT) devices are being installed throughout the manufacturing plant. In an SDN, edge gateways are installed around the plant to provide wireless connectivity to these IIoT devices. IoT-enabled devices, like routers, switches, firewalls, and storage devices, help to forward data through the edge gateway efficiently.

              Within the factory, an edge cloud is established – a localized virtual hosting space which manages the devices and gateways. Within that sits an SDN controller, which manages thousands of IIoT devices, establishes and maintains network connectivity to each, and automatically manages data routing between them, ensuring continuous low latency connectivity, regardless of the location of the machines or connection type.

              Finally, an SDN includes a network of intelligent industrial gateways to optimize traffic and cost, while providing end-to-end encrypted data transport to corporate data centers.


              Fig. 2:  Seamless Connectivity between Wi-Fi and Ethernet, based on SDN (images from Internet source)

              What are the benefits of SDN for manufacturing facilities?

              SDNs offer various benefits to factory-based networks. These include:

              • Simplifies network management, by separating the control plane (network intelligence and data-forwarding decisions) from the data plane (data collection), as shown in Fig. 3
              • Allows the routing of newly added devices to be automatically configured
              • Provides a programmable centralized control and management system for the factory network, using high-level policies, all without changing existing factory network architecture
              • Centralized management facilitates the optimization and configuration of the factory network in an efficient and automated manner
              • Offers interoperability/seamless connectivity through interfacing with different wired and wireless technologies
              • Enables the dynamic management of smart devices
              • Facilitates the real time feed of data, processed at the edge, for quick, automatic decision-making
              • Provides higher data traffic optimization/rerouting of packets based on edge processing, along with fault tolerance during production
              • Enables faster delivery of data packets
              • Reduces OPEX and CAPEX
              • Improves scalability

              Fig. 3:  Control and Data Plane separation, based on SDN (images/icons are from Internet source)

              An SDN automatically recognizes and prioritizes data traffic flow, avoids network congestion, and provides seamless handoff between wired (LAN) and wireless (Wi-Fi), which ensures manufacturing operations continue, even in the event of an unexpected link outage. The result is persistent, low-latency connections designed to support real-time collaboration throughout the entire manufacturing operation – from the factory to sales, planning, distribution, and customer care.

              The SDN can also be customized to suit the manufacturing plant. Product designers can create private networks to perform tasks, like monitoring and controlling machinery, and offer customized policies to govern the factory’s network’s traffic. That gives it great flexibility to meet user needs and adapt as those needs change.

              Fig.4:  Seamless Switching / Connectivity based on SDN

              Conclusion

              SDNs enable fast and persistent data transfer between various industrial devices to handle mission-critical applications. That boosts operational efficiency, increases productivity, and reduces the risk of downtime from connectivity issues.

              How Capgemini can help

              SDNs in manufacturing are relatively new and unproven. Capgemini brings years of experience working on SDNs in the telecoms industry, where we have built a detailed understanding of the technologies involved and the challenges of integration, along with connections to an ecosystem of key technology players that must be brought together to deliver an effective SDN. In addition, we offer a deep understanding of manufacturing networks, including Wi-Fi, gateways, and IoT, which are all part of our longstanding DNA. Together, this makes us an ideal partner on your manufacturing SDN journey.

              Each new generation of mobile technology has delivered more: More data. More devices. More efficiency. But it’s time to broaden our view of network technology – focusing not just on what it brings today, but what more we can build with it tomorrow.

              Meet our expert

              Vijay Anand

              Senior Director, Technology, and Chief IoT Architect, Capgemini Engineering
              Vijay plays a strategic leadership role in building connected IoT solutions in many market segments, including consumer and industrial IoT. He has over 25 years of experience and has published 19 research papers, including IEEE award-winning articles. He is currently pursuing a Ph.D. at the Crescent Institute of Science and Technology, India.

                    ]]>
                    https://www.capgemini.com/ch-en/insights/expert-perspectives/why-manufacturing-facilities-need-software-defined-networks/feed/ 0
                    Capgemini creates a pathway to innovation across the workforce https://www.capgemini.com/ch-en/insights/expert-perspectives/capgemini-creates-a-pathway-to-innovation-across-the-workforce/ https://www.capgemini.com/ch-en/insights/expert-perspectives/capgemini-creates-a-pathway-to-innovation-across-the-workforce/#respond Tue, 27 Feb 2024 11:34:42 +0000 https://www.capgemini.com/ch-en/?p=532284&preview=true&preview_id=532284

                    Capgemini creates a pathway to innovation across the workforce

                    Preeti Chopra
                    Feb 23, 2024

                    Capgemini’s award-winning “Innovation Awareness Week” initiative is driving an inclusive culture of learning through innovation across all levels of its organization.

                    Innovation is the lifeblood that breathes excitement, relevance, and change across every area of business.

Simply defined, an innovation culture is the deliberate cultivation and support of an environment that encourages and nurtures the generation, development, and implementation of new ideas, processes, products, or services. It involves creating conditions that stimulate creativity, experimentation, and collaboration among individuals and organizations, leading to innovative solutions and advancements.

                    In a world grappling with environmental challenges, sustainable innovation leadership has emerged as a pivotal approach to addressing pressing global issues while fostering growth. Integrating innovation into creative sustainable strategies is becoming increasingly necessary for businesses, organizations, and governments worldwide.

                    Innovation is everyone’s superpower

                    According to the Capgemini Research Institute, open innovation is crucial to navigating current and future business challenges, with 71% of organizations planning to increase investment in the next two years.

                    But to do this effectively, organizations need to start spreading the message that innovation is everyone’s superpower – that innovation is the secret sauce that can unlock the recipe for creativity, serendipity, and success.

                    Educating the workforce on ways to harness the innovative power of emerging technologies in a safe, controlled environment ensures they are ready for the future and whatever else the market might throw at them. It also enables people to share their learning experiences across every level of the organization.

                    Fostering a culture of innovation drives positive change

                    But how can an organization make this a reality across its business?

                    At Capgemini, we recently organized “Innovation Awareness Week” – a series of webinars designed to ignite the creative spark and empower our people to thrive in today’s ever-changing market.

                    Led by expert speakers, thought leaders, experienced innovators, and young professionals from across Capgemini and externally, these webinars inspired and equipped our people with the necessary strategies to foster a culture of innovation.

The webinars covered a range of topics, including the role of storytelling in bridging the gap between people and technology, the importance of understanding the business reality, and delivering real value for the client. They also emphasized that innovation can be the outcome of evolution, and that a growth mindset requires people to be constantly challenged.

                    Award-winning, innovative HR

                    “Innovation Awareness Week” has helped Capgemini revolutionize how we approach challenges, envision possibilities, and drive innovation within the organization. This is enabling our people to be more proactive and innovative in helping our clients to transform their businesses and bring about positive change.

And it’s for these reasons that we’re extremely proud that Innovation Awareness Week has recently won a 2023 HRO Today Award in the “Best-in-Class: Innovation” category. With a record 175 nominations this year, the award confirms Capgemini’s place among an elite group of exceptional HR teams.

                    Initiatives such as Innovation Awareness Week strengthen our intellectual capital by encouraging teams to bring our mission to life and drive results for our customers.

                    To learn how Capgemini’s Intelligent Learning Operations can help you deliver a personalized, connected, and continuous learning journey, contact: preeti.chopra@capgemini.com

                    Meet our expert

                      ]]>
                      https://www.capgemini.com/ch-en/insights/expert-perspectives/capgemini-creates-a-pathway-to-innovation-across-the-workforce/feed/ 0
                      How data, AI, and intelligent technologies are transforming electricity grids https://www.capgemini.com/ch-en/insights/expert-perspectives/how-data-ai-and-intelligent-technologies-are-transforming-electricity-grids/ https://www.capgemini.com/ch-en/insights/expert-perspectives/how-data-ai-and-intelligent-technologies-are-transforming-electricity-grids/#respond Mon, 26 Feb 2024 09:43:56 +0000 https://www.capgemini.com/ch-en/?p=532321&preview=true&preview_id=532321

                      How data, AI, and intelligent technologies are transforming electricity grids

                      Hariharan Krishnamurthy
                      Feb 26, 2024

                      For decades, grids have transported electricity from power stations to where it is needed. Fuel is burned, turbines spin, electrons move along transmission lines to substations, then onto homes and businesses, powering everything from lightbulbs to industrial machinery.

                      For decades, this worked well. Energy demand was predictable, and utilities burned enough coal and gas to meet it – a bit less on long summer days, a bit more on short winter ones, or when the national team reaches the cup final.

                      But things have changed, and this way of doing things is no longer sufficient. Reliable coal and gas furnaces are out, and intermittent solar and wind are in. Demand is changing rapidly and unevenly, thanks to the electrification of transport, heating, and industrial processes, as well as to extreme weather that makes hot days hotter and cold days colder.

All of this transforms the grid from a means of carrying energy from power stations to consumers into a complex, dynamic marketplace for energy. Aging infrastructure – that was not designed for decentralized energy – doesn’t help matters.

                      Reinventing the grid for a decentralized, decarbonized world

                      Utilities face several challenges all at once.

                      They need to grow capacity to meet surging electricity demand – our own estimates suggest 125 million kilometers of transmission and distribution lines are needed in the next 30 years (up from 80 million km today), at a cost of $7trn per year[1]. To deliver that efficiently, they need to become much better at predicting short- and long-term demand, so they can make intelligent decisions.

                      They need to rethink the grid around decentralized and intermittent renewable energy inputs, from huge wind farms to dispersed rooftop solar. In doing so, they will need to deal with a complicated mix of new and aging assets for generation, transmission & distribution, and storage.

                      Increasingly, they also must offer consumer services, as end users now expect more detailed data on energy usage and billing, and tools to sell surplus energy back to the grid.

                      With the increasing digitization and interconnectedness of the electric grid, cybersecurity has become a paramount concern. The grid’s vulnerability to cyber threats poses significant challenges, as malicious actors target critical infrastructure, aiming to disrupt operations, cause widespread outages, and compromise the reliability and safety of the energy system.

                      To compound matters, utilities are faced with the challenge of navigating a complex and evolving regulatory environment. New policies, mandates, and standards are being introduced to address emerging issues such as environmental sustainability, grid modernization, and cybersecurity.

                      All of this is leading utilities to ask new questions: How can we make the most of decentralized energy sources? How can we optimize aging assets? How can we accurately predict supply and demand in this more complex world? How can we improve the customer experience? And how can we keep the energy system secure?

                      To navigate these challenges successfully, utilities must define a vision and design a clear, tailored roadmap step by step, with clear added value and risk reduction regarding the constraints they will face. This strategic approach is critical, not only for adapting to the evolving energy landscape, but also for ensuring the resilience and efficiency of the grid in the face of rapid technological and environmental changes.

                      To create the grid of the future – and so answer all these questions – we need to do more with data and AI.

                      Making intelligent decisions

                      The heart of this transformation is about using data to generate situational awareness of energy infrastructure, so utilities can make intelligent decisions.

                      Take EV ownership. The transition to EVs will place massive new electricity demands on the grid. But when, and where, and how fast? Will everyone charge slowly overnight, or quickly at lunchtime? Can people be incentivized to charge at quiet times? We need to know these things to decide how much more distribution capacity to build or upgrade, and when.

                      That needs clever models to predict the future, and such models need a wide range of data. Utilities may already have some of that, such as electricity usage amongst existing EV drivers. But other data – such as EV sales projections, the percentage of people with driveways, public charging infrastructure plans – will need to be sourced from elsewhere. That is something new for utilities.
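As a purely illustrative sketch of how such data might come together, the toy model below combines invented internal and external features (EV sales projections, the share of homes with driveways, planned public chargers, and current metered EV load) to predict future charging demand. The data is synthetic and the feature names are assumptions made for the example; a real model would be built on carefully sourced and validated datasets.

    # Toy sketch only: synthetic data and invented feature names, not a production model.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(42)
    n = 500  # synthetic "substation-month" records

    features = pd.DataFrame({
        "ev_sales_projection": rng.uniform(50, 500, n),    # external: forecast EVs in the area
        "share_with_driveway": rng.uniform(0.2, 0.9, n),   # external: home-charging potential
        "public_chargers": rng.integers(0, 40, n),         # external: planned infrastructure
        "current_ev_load_kwh": rng.uniform(100, 5000, n),  # internal: metered usage today
    })
    # Synthetic target: future peak charging load (kWh), with noise.
    target = (
        2.5 * features["ev_sales_projection"]
        + 800 * (1 - features["share_with_driveway"])      # fewer driveways -> more peak-time public charging
        + 0.8 * features["current_ev_load_kwh"]
        + rng.normal(0, 200, n)
    )

    model = GradientBoostingRegressor().fit(features, target)
    new_area = pd.DataFrame([{
        "ev_sales_projection": 320, "share_with_driveway": 0.4,
        "public_chargers": 12, "current_ev_load_kwh": 1500,
    }])
    print(f"Predicted future peak charging load: {model.predict(new_area)[0]:.0f} kWh")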

Perhaps the most critical models will be those for balancing renewable energy. By combining accurate weather models (e.g., how much the sun will shine and the wind will blow) with a digital twin of your generation infrastructure, you can predict how much renewable energy will be fed in on a given day. By building precise energy demand models – e.g., using smart meter data, historical usage data, and weather data – you can predict how much energy is needed on that same day.

The gap between the minimum predicted renewable production and the maximum predicted energy demand tells you how much fossil fuel you need to burn to ensure the lights stay on. The more accurate the models, the less you need to burn.
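As a worked mini-example of that balancing rule (with invented numbers, in MWh), the gap between the most pessimistic renewable forecast and the highest demand forecast is what dispatchable generation must cover:

    def dispatchable_generation_needed(min_renewable_forecast, max_demand_forecast):
        """Energy that must come from dispatchable (e.g. fossil) sources to keep the lights on."""
        return max(0, max_demand_forecast - min_renewable_forecast)

    # Wide, uncertain forecasts leave a big gap to cover with fossil generation...
    print(dispatchable_generation_needed(min_renewable_forecast=300, max_demand_forecast=900))  # 600
    # ...while tighter, more accurate forecasts shrink it.
    print(dispatchable_generation_needed(min_renewable_forecast=550, max_demand_forecast=820))  # 270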

These are two of many examples of how data will transform the grid. Others include models that nudge consumers to use electricity at different times, models that predict sudden energy demand spikes, and predictive models for asset health and vegetation management (asset failures can cost millions of dollars in downtime, so are best avoided).

                      Technology drives grid transformation

                      All this will require wholesale transformation that builds intelligence into energy systems, turning utilities into companies that routinely use high-quality data to develop and deploy models – from load balancing to infrastructure planning, to predictive asset maintenance.

                      That will need people, processes, and technology infrastructure.

                      It will need people who can combine high-quality data from multiple sources to build highly predictive models and generate actionable insights for the company and its customers. This is not just about modeling, but also making smart decisions about data, such as where to focus limited resources, what data sources to acquire and use, and whether to build AI tools in the cloud or at the edge.

                      It will need processes to gather data. That will mean changes to your own data sources – e.g., by deploying smart meters to gather data, and adding connected assets (smart new ones or retrofitting aging ones with practical sensors) to monitor performance and build a cohesive model of the grid. It will need new relationships to secure data from third party sources – from weather companies to EV sales analysts, to government electric heating installation programs.

And utility companies will need to build the IT infrastructure backbone that securely collects data from these many sources and transports it into a shared cloud platform. It will need tools to aggregate disparate data into consistent formats that can be used to build new models and feed existing ones. And it will need to deliver those insights – via purpose-built digital interfaces – to the people who need to act on them, whether network planners, asset maintenance engineers, or energy users.
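As one minimal sketch of what that aggregation layer might do, the example below normalizes two invented feeds (a smart-meter reading and a third-party weather observation) into a single consistent record format before loading them into a shared platform. The field names and input formats are assumptions for illustration only.

    # Minimal sketch: harmonize two disparate feeds into one consistent schema.
    # The input formats and field names below are invented for illustration.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class GridReading:
        source: str
        timestamp: datetime
        metric: str
        value: float
        unit: str

    def from_smart_meter(raw):
        # e.g. {"meter_id": "M-42", "ts": 1710500000, "kwh": 3.2}
        return GridReading(
            source=f"smart_meter:{raw['meter_id']}",
            timestamp=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
            metric="consumption",
            value=float(raw["kwh"]),
            unit="kWh",
        )

    def from_weather_feed(raw):
        # e.g. {"station": "ZRH-01", "time": "2024-03-15T10:00:00+00:00", "wind_ms": 7.4}
        return GridReading(
            source=f"weather:{raw['station']}",
            timestamp=datetime.fromisoformat(raw["time"]),
            metric="wind_speed",
            value=float(raw["wind_ms"]),
            unit="m/s",
        )

    readings = [
        from_smart_meter({"meter_id": "M-42", "ts": 1710500000, "kwh": 3.2}),
        from_weather_feed({"station": "ZRH-01", "time": "2024-03-15T10:00:00+00:00", "wind_ms": 7.4}),
    ]
    for r in readings:
        print(r)

Once every source lands in the same record format, the same pipelines can feed load-balancing, planning, and asset-health models without bespoke handling for each feed.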

                      Conclusion

                      Better technology for data collection and model-building (both AI and classical), will be critical to transforming the grid into one that is fit for the future. Technology is often thought of as an enabler of change, but in this case, that is thinking too small. Technology is the driver of change. It is the only way to create a smart grid that will deliver the decentralized, decarbonized energy system we need.


                      [1] World Energy Markets Observatory 2023, Capgemini.

                      Meet our expert

                      Hariharan Krishnamurthy

                      Vice President, Global Head of Energy Transition & Utilities
                        ]]>
                        https://www.capgemini.com/ch-en/insights/expert-perspectives/how-data-ai-and-intelligent-technologies-are-transforming-electricity-grids/feed/ 0
                        ADOPTING ZERO TRUST FOR PRIVATE 5G https://www.capgemini.com/ch-en/insights/expert-perspectives/adopting-zero-trust-for-private-5g/ https://www.capgemini.com/ch-en/insights/expert-perspectives/adopting-zero-trust-for-private-5g/#respond Thu, 22 Feb 2024 11:13:25 +0000 https://www.capgemini.com/ch-en/?p=531927&preview=true&preview_id=531927

Adopting zero trust for private 5G

                        Aarthi Krishna and Kiran Gurudatt
                        20 Feb 2024

                        For organizations across the world, 5G has the potential to drive digital transformation and unlock new business opportunities, whether that’s connecting factories across vast areas or optimizing new energy sites like wind farms. However, as organizations embrace 5G, they must also address the escalating cybersecurity threats that come with this technological evolution.

                        Traditional security models, which rely on perimeter-based defenses and implicit trust built within the industrial network, are ill-equipped to handle the dynamic and distributed nature of 5G networks. This is where zero trust security principles come into play.

                        Defining zero trust

                        Zero trust assumes that an attacker may already be present within the network, and a constant cycle of validation needs to be in motion to prevent further infiltration and lateral movements. It offers a proactive and adaptive approach to security, emphasizing continuous verification and strict access controls to mitigate risks and ensure the integrity of 5G networks.

For industrial enterprises deploying private 5G networks, a zero-trust approach means that all access to 5G networks should be explicitly authenticated, authorized, and monitored, and access privileges should be continuously reviewed. No access should be granted implicitly or by default.

                        Zero trust for Private 5G Networks

                        Our strategy for implementing zero trust in private 5G networks aligns with the vendor-agnostic Cybersecurity and Infrastructure Security Agency (CISA) Zero Trust Framework, addressing security across five pillars of zero trust. By leveraging the right technology and methodology, Capgemini recommends an approach based on zero trust principles for secure deployment of 5G technology in OT networks.

                        The implementation of robust protection and seamless operations needs to cover all five key pillars:

• Identity: Define and enforce granular access control policies for industrial users, allowing specific users to perform specific tasks on a specific asset. These policies should consider contextual requirements such as the time and location of a user’s request; a minimal policy-evaluation sketch follows this list.
                        • Devices: Validate the end devices and ensure that the trust level of the devices is assessed. Policies need to be enforced to allow the segmentation of devices based on 5G network-specific identities e.g. Subscription Permanent Identifier (SUPI)/Subscription Concealed Identifier (SUCI). Policies should support device identity matching and context-based segmentation that allows the grouping of devices based on device type, cellular identities, location, along with the quality of service (QoS), latency, bandwidth, redirection, etc.
                        • Networks: Zero trust requires a clear separation of communication flows for network control and application/service tasks. For organizations using private 5G networks, a secure communication channel should be established when communicating with different locations, in addition to enabling secure remote access so that only those with the correct authentication and authorization credentials are allowed on the network.
                        • Applications and Workloads: The OT environment hosts various applications that are used for different purposes, such as data collection, process monitoring, and product creation. Access to these applications over a 5G network should be governed by access control policies that enforce application control, thereby minimizing the attack surface at ingress and egress. Policies should also be able to detect workload vulnerabilities and misconfigurations and enable application control based on operator-specific/standard slices.
                        • Data: Data protection should include measures to protect sensitive industrial information and prevent data loss. Data protection must be supported for both data at rest and data in motion, taking into account data classification and file types. Data flowing from one industrial site to another and to remote users must be inspected for data leakage, and measures must be taken to restrict access to unmanaged devices and unknown users.
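To illustrate what “no implicit access” can look like in practice, the sketch below evaluates an access request against an explicit, deny-by-default policy that combines identity, device trust, and context (site and time of day). It is a conceptual example only: the attribute names, device groups, trust scores, and policy values are hypothetical, and the code is not taken from any specific product or from the CISA framework itself.

    # Conceptual, deny-by-default access-decision sketch. All attribute names,
    # device groups, and policy values are hypothetical.
    from dataclasses import dataclass
    from datetime import time

    @dataclass
    class AccessRequest:
        user: str
        role: str
        device_group: str        # e.g. derived from device type plus 5G identity (SUPI/SUCI)
        device_trust_score: float
        site: str
        local_time: time
        action: str
        asset: str

    # Each policy explicitly states who may do what, on which asset, from where and when.
    POLICIES = [
        {
            "role": "maintenance_engineer",
            "action": "read_telemetry",
            "asset": "robot-cell-3",
            "allowed_device_groups": {"managed_tablet", "engineering_laptop"},
            "min_device_trust": 0.7,
            "allowed_sites": {"plant-zurich"},
            "time_window": (time(6, 0), time(22, 0)),
        },
    ]

    def is_allowed(req):
        """Grant access only if an explicit policy matches; everything else is denied."""
        for p in POLICIES:
            if (req.role == p["role"]
                    and req.action == p["action"]
                    and req.asset == p["asset"]
                    and req.device_group in p["allowed_device_groups"]
                    and req.device_trust_score >= p["min_device_trust"]
                    and req.site in p["allowed_sites"]
                    and p["time_window"][0] <= req.local_time <= p["time_window"][1]):
                return True
        return False  # no implicit or default access

    request = AccessRequest(
        user="a.meier", role="maintenance_engineer", device_group="managed_tablet",
        device_trust_score=0.85, site="plant-zurich", local_time=time(10, 30),
        action="read_telemetry", asset="robot-cell-3",
    )
    print(is_allowed(request))  # True; change any attribute and access is denied

Any request that does not match an explicit policy, whether because of an unmanaged device, an unusual location, or an out-of-hours attempt, is simply denied.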

                        Key industrial 5G security use cases aligned with the zero-trust framework include:

• Network segmentation: Improve digital perimeter resilience by enforcing micro-segmentation and grouping devices based on device type (CCTV, mobile), vendor, location, QoS, or 5G cellular identities (SUPI/SUCI); a simple grouping sketch follows this list.
                        • Secure remote access: Enforce zero trust-based access controls and offer secure remote access to industrial environments deploying 5G assets for internal and third-party users.
• Policy enforcement for slices: Apply security policies per network slice, or group of slices, assigned to various applications based on their slice ID, thus preventing unauthorized data transfer and blocking malicious activities inside the industrial environment.
• Security monitoring using a 5G SOC: Security monitoring of 5G-powered industrial devices (sensors, robots, cameras, drones, end-user phones, laptops) in a 5G security operations center (SOC) that offers centralized visibility along with features such as incident management, vulnerability management, and compliance management.
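The grouping step behind network segmentation can be sketched very simply. The example below clusters invented device records into micro-segments by device type and location; in practice, vendor, QoS class, slice ID, or SUPI/SUCI-derived identities could be used as well. All device identities and segment names are made up for illustration.

    # Illustrative micro-segmentation sketch; device records and segment names are invented.
    from collections import defaultdict

    devices = [
        {"supi": "imsi-228011000000101", "type": "cctv",   "location": "hall-a"},
        {"supi": "imsi-228011000000102", "type": "agv",    "location": "hall-a"},
        {"supi": "imsi-228011000000103", "type": "cctv",   "location": "hall-b"},
        {"supi": "imsi-228011000000104", "type": "sensor", "location": "hall-b"},
    ]

    def segment_key(device):
        # Segment by device type and location; vendor, QoS, or slice ID could be added.
        return f"{device['type']}@{device['location']}"

    segments = defaultdict(list)
    for d in devices:
        segments[segment_key(d)].append(d["supi"])

    for name, members in segments.items():
        print(name, "->", members)

Traffic between the resulting segments can then be denied by default and only allowed where an explicit policy exists.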

                        Conclusion

Implementing zero trust for private 5G networks involves several essential steps: defining the key assets to be protected (applications, devices, data, etc.), documenting the traffic flows over the 5G network, defining fine-grained policies that determine access to resources, and putting logging and monitoring in place to provide insight into network activity. Effectively implementing zero trust across all these levels can greatly enhance the security posture of OT networks that leverage 5G technology.

                        By embracing zero trust principles and integrating them into the fabric of 5G networks, organizations can mitigate risks, protect sensitive data, and ensure the integrity of their networks in an increasingly interconnected and dynamic digital world.

                        You can learn more about our approach and our partners by joining us at Mobile World Congress in Barcelona between 26–29 February 2024.

                        Author

                        Aarthi Krishna

                        Global Head, Intelligent Industry Security, Capgemini

                        Kiran Gurudatt

                        Director, Cybersecurity, Capgemini
                          ]]>
                          https://www.capgemini.com/ch-en/insights/expert-perspectives/adopting-zero-trust-for-private-5g/feed/ 0