A Guide to Publishing Salesforce Second Generation Managed Packages using Azure CI/CD

In the modern era of software development, the ability to deliver high-quality applications swiftly and reliably is more important than ever. As businesses increasingly embrace cloud technologies and agile methodologies, the need for robust deployment processes has become a top priority. Continuous Integration and Continuous Deployment (CI/CD) pipelines are central to this transformation, enabling development teams to automate their build, test, and deployment processes efficiently.

Azure DevOps, with its comprehensive suite of tools, stands out as one of the leading solutions for managing these pipelines effectively. When it comes to Salesforce development, particularly with the adoption of second-generation managed packages (2GP), the benefits of a well-designed CI/CD pipeline become even more apparent. By implementing a CI/CD pipeline using Azure DevOps, you can streamline your development processes, reduce manual errors, and ensure a smoother deployment experience.

So, are you ready to streamline the process of publishing second-generation managed packages on Salesforce, achieve greater efficiency, reduce manual errors, and deliver higher-quality applications with confidence? Let’s embark on a journey to unlock the powerful capabilities of Azure DevOps and second-generation managed packages for your development needs.

Read our blog to learn how to set up an Azure CI/CD pipeline specifically for publishing second-generation managed packages in Salesforce. We will delve into the steps required to integrate Azure DevOps into your development workflow, showcasing how to automate the various stages of your Salesforce application lifecycle—from initial code commits to final production releases.

Prerequisites

Before we dive into the details of setting up an Azure CI/CD pipeline for Salesforce second-generation managed packages, there are a few prerequisites that need to be in place. Here’s what you’ll need to get started:

  1. Salesforce CLI: Ensure that the Salesforce CLI is installed on your local machine for seamless interaction with Salesforce. You can install it by following the instructions in the Salesforce CLI setup guide (or via npm, as shown after this list).
  2. VS Code Editor: Use Visual Studio Code as your primary development environment for writing and editing code.
  3. Salesforce Extension Pack: Install the Salesforce Extension Pack for enhanced functionality in VS Code, including tools specifically designed for Salesforce development.
  4. Azure DevOps Account: Set up an Azure DevOps account to manage your code repository and implement CI/CD pipelines effectively. If you don’t have an account, you can sign up for free at https://dev.azure.com/.
  5. Dev Hub Org: This will be your central management hub for overseeing and accessing all your managed packages.
  6. Development Org: This serves as your primary workspace for building, customizing, and testing Salesforce applications prior to deployment.
  7. Namespace Org: A Namespace Org is essential for ensuring that your custom components and applications have a unique global identity. This environment allows you to register a namespace, preventing potential conflicts with other Salesforce components.
    • If you do not currently have an organization with a registered namespace, create a Developer Edition org that is distinct from your Dev Hub or scratch orgs. If you already possess an org with a registered namespace, you’re all set!
    • Within the Developer Edition org, follow the steps to create and register your unique namespace.
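For reference, recent versions of the Salesforce CLI can also be installed via npm (assuming Node.js is available); this is the same command the pipeline itself uses later:

    npm install @salesforce/cli --global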

Before you can create second-generation managed packages in Salesforce, you need to link a namespace to your Dev Hub org. A namespace is a unique identifier that distinguishes your package from others in the Salesforce ecosystem. By linking a namespace to your Dev Hub org, you establish a naming convention for your packages and components, ensuring that they are uniquely identified and can be distributed across different orgs seamlessly.

  1. Log in to your Dev Hub org as the System Administrator or as a user with the Salesforce DX Namespace Registry permissions.
  2. From the App Launcher menu, select Namespace Registries.
  3. Click Link Namespace.
  4. In the window that pops up, log in to the Developer Edition org in which your namespace is registered, using that org’s System Administrator credentials.
  5. To view all the namespaces linked to the Namespace Registry, select the All Namespace Registries list view.

Create Salesforce DX Project

A Salesforce DX project has a specific structure and a configuration file (sfdx-project.json) that identifies the directory as a Salesforce DX project.

Run the following command to generate the necessary configuration files and directories:

    sf project generate --name mywork --default-package-dir force-app --manifest

  1. Open the sfdx-project.json file.
  2. Add your registered namespace (see the sample below).
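For reference, a minimal sfdx-project.json with a registered namespace might look like the following sketch ("mynamespace" is a placeholder; your paths and API version may differ):

    {
      "packageDirectories": [
        { "path": "force-app", "default": true }
      ],
      "namespace": "mynamespace",
      "sfdcLoginUrl": "https://login.salesforce.com",
      "sourceApiVersion": "61.0"
    }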

Authorize Dev Hub and Dev Org

  1. Authorize Dev Hub: Run the following command to authorize your Dev Hub:

    sf org login web --set-default-dev-hub --alias DevHub

    Follow the instructions to log in and authorize the Dev Hub.

  2. Authorize Dev Org: Use the following command to authorize your Development Org:

    sf org login web --alias DevOrg

    Follow the prompts to log in and authorize the Development Org.
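To confirm that both orgs were authorized and their aliases saved, you can list the orgs your CLI is connected to:

    sf org list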

Configure Scratch Org Definition

Before creating your scratch org, you might need some pre-configuration, such as enabling specific Salesforce features and org settings. Update your project-scratch-def.json file with these configurations.

  1. Open project-scratch-def.json: This file is located in the config directory of your Salesforce DX project.
  2. Add Necessary Configurations: Adjust the settings based on your project requirements (see the sample after this list). Refer to the Salesforce documentation for additional options you might need.
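A minimal definition file might look like the following sketch (illustrative only; the org name, features, and settings are placeholders to adapt to your project):

    {
      "orgName": "My Company",
      "edition": "Developer",
      "features": ["EnableSetPasswordInApi"],
      "settings": {
        "lightningExperienceSettings": {
          "enableS1DesktopEnabled": true
        }
      }
    }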

Add Metadata to Manifest

Before committing your code to the Azure DevOps repository, you need to ensure that all required metadata is included in your manifest/package.xml file. You can do this manually or use the Salesforce Package.xml Generator Extension for VS Code.

  1. Using Salesforce Package.xml Generator Extension:
    • Install the Salesforce Package.xml Generator extension from the VS Code marketplace.
    • Open the Command Palette (Ctrl+Shift+P) and search for “SFDX Package.xml Generator: Choose Metadata components”.
    • Follow the prompts to generate a package.xml file that includes all the necessary metadata for your project.
  2. Manually Adding Metadata:
    • Open the manifest/package.xml file in your project directory.
    • Add all relevant metadata components that you want to include in your deployment. For example:
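A minimal manifest might look like the following sketch (illustrative only; replace the metadata types and API version with the components your package actually needs):

    <?xml version="1.0" encoding="UTF-8"?>
    <Package xmlns="http://soap.sforce.com/2006/04/metadata">
        <types>
            <members>*</members>
            <name>ApexClass</name>
        </types>
        <types>
            <members>*</members>
            <name>CustomObject</name>
        </types>
        <version>61.0</version>
    </Package>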

Ensure that you include all the metadata components required for your managed package.

Retrieve Source from Org

To ensure your local project is in sync with the Salesforce org, retrieve the source metadata using Salesforce CLI.

  1. Retrieve Metadata: Use the Salesforce CLI to retrieve the metadata specified in your manifest/package.xml file:

    sf project retrieve start --manifest manifest/package.xml --target-org DevOrg

    This command pulls the metadata from your Salesforce org into your local project directory.
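If you want to see what a retrieve will change before running it, recent versions of the Salesforce CLI also offer a preview command (an optional convenience, not a required step):

    sf project retrieve preview --target-org DevOrg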

Generate OpenSSL Certificate

This step is needed to generate a private key (server.key) and a self-signed certificate (server.crt) that will be used for authentication in the CI/CD pipeline.

Follow the steps in the Salesforce documentation to create a private key and self-signed digital certificate.
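The documented flow boils down to a few OpenSSL commands similar to these (a sketch; key size, validity period, and certificate subject are up to you):

    # Generate a 2048-bit RSA private key
    openssl genrsa -out server.key 2048
    # Create a certificate signing request from the key
    openssl req -new -key server.key -out server.csr
    # Self-sign the certificate, valid for one year
    openssl x509 -req -sha256 -days 365 -in server.csr -signkey server.key -out server.crt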

Create Connected App in Salesforce

  1. In your Salesforce org, create a connected app to establish a secure connection between the CI/CD pipeline and Salesforce.
  2. Configure the connected app with the following settings:
    • Connected App Name: Azure CI/CD Pipeline
    • API (Enable OAuth Settings): Enable OAuth Settings
    • Callback URL: https://login.salesforce.com
    • Use digital signatures: Checked — upload the server.crt file generated in the previous step (required for the JWT flow the pipeline uses)
    • Selected OAuth Scopes: Full access (full)
    • Require Secret for Web Server Flow: Checked
    • Require Secret for Refresh Token Flow: Checked
  3. After creating the connected app, note down the Consumer Key, which the CI/CD pipeline uses as the client ID for JWT authentication.

Follow this guide for detailed steps to create a connected app in Salesforce.

Upload Private Key in Azure Secure Files

  1. In Azure DevOps, navigate to your project and select the Library tab.
  2. Create a Secure File named server.key and upload the private key generated earlier.
  3. This secure file will be used in the CI/CD pipeline for authentication purposes.


Creating Azure CI/CD Pipeline

  1. Create an azure-pipelines.yml file at the root of your project directory.
  2. Update this file with the code below. A sample configuration might look like this:
          trigger:
          - main
          
          pool:
            vmImage: 'ubuntu-latest'
          
          variables:
            - template: variables/variables.yaml
          
          jobs:
          - job: SalesforcePackageCreation
            displayName: 'Salesforce Managed Package Creation'
            steps:
            - checkout: self
          
            - task: DownloadSecureFile@1
              name: jwtKey
              inputs:
                secureFile: 'server.key'
          
            - task: UseNode@1
              inputs:
                version: '20.x'
              displayName: 'Install Node.js'
          
            - bash: npm install @salesforce/cli --global
              displayName: 'Install Salesforce CLI'
          
            - script: |
                echo "Installing jq"
                sudo apt-get update
                sudo apt-get install -y jq
              displayName: 'Install jq'
          
            - script: |
                echo "Authenticating to Dev Hub"
                sf org login jwt \
                --username DEV_HUB_Username \
                --jwt-key-file $(jwtKey.secureFilePath) \
                --client-id $(CONSUMER_KEY) \
                --alias $(DEV_HUB_ORG_ALIAS) \
                --set-default-dev-hub
              displayName: 'Authenticate to Dev Hub'
          
            - script: |
                echo "Creating Scratch Org"
                sf org create scratch \
                  --set-default \
                  --target-dev-hub $(DEV_HUB_ORG_ALIAS) \
                  --definition-file config/project-scratch-def.json \
                  --alias $(SCRATCH_ORG_ALIAS) \
                  --duration-days 3
                echo "Scratch Org created"
              displayName: 'Create Scratch Org'
          
            - script: |
                echo "Pushing Source to Scratch Org"
                sf project deploy start --source-dir force-app --target-org $(SCRATCH_ORG_ALIAS)
              displayName: 'Push Source to Scratch Org'
          
            - script: |
                echo "Running Tests on Scratch Org"
                sf apex run test \
                  --target-org $(SCRATCH_ORG_ALIAS) \
                  --synchronous \
                  --code-coverage \
                  --detailed-coverage \
                  --result-format human > result.txt
              displayName: 'Run Apex Tests on Scratch Org'
          
            - script: |
                echo "Deleting Scratch Org"
                sf org delete scratch \
                  --target-org $(SCRATCH_ORG_ALIAS) \
                  --no-prompt
              displayName: 'Delete Scratch Org'
              condition: always()
          
            - script: |
                echo "Creating 2nd Generation Package"
                PACKAGE_LIST=$(sf package list --target-dev-hub $(DEV_HUB_ORG_ALIAS))
                echo "Available packages: $PACKAGE_LIST"
                if echo "$PACKAGE_LIST" | grep -q "$(PACKAGE_NAME)"; then
                  echo "Package $(PACKAGE_NAME) exists. No action required."
                else
                  echo "$(PACKAGE_NAME) does not exist, running sf package create"
                  sf package create \
                    --name $(PACKAGE_NAME) \
                    --path force-app \
                    --package-type Managed \
                    --target-dev-hub $(DEV_HUB_ORG_ALIAS)
                fi
              displayName: 'Create 2nd Generation Package'
          
            - script: |
                echo "Creating package version"
                OUTPUT=$(sf package version create \
                  --package $(PACKAGE_NAME) \
                  --installation-key-bypass \
                  --wait 10 \
                  --target-dev-hub $(DEV_HUB_ORG_ALIAS) \
                  --code-coverage \
                  --definition-file config/project-scratch-def.json \
                  --version-number $(VERSION_NUMBER) \
                  --version-name "$(VERSION_NAME)" \
                  --version-description "$(VERSION_DESCRIPTION)" \
                  --tag "$(TAG)" \
                  --verbose \
                  --json)
          
                if [ $? -ne 0 ]; then
                  echo "Failed to create package version. Command output:"
                  echo "$OUTPUT"
                  exit 1
                fi
                
                PACKAGE_VERSION_ID=$(echo "$OUTPUT" | jq -r '.result.SubscriberPackageVersionId')
                
                if [ -z "$PACKAGE_VERSION_ID" ]; then
                  echo "Package Version ID is null or empty. Command output:"
                  echo "$OUTPUT"
                  exit 1
                fi
                
                echo "Package Version ID: $PACKAGE_VERSION_ID"
                echo "##vso[task.setvariable variable=PACKAGE_VERSION_ID]$PACKAGE_VERSION_ID"
              displayName: 'Create Package Version'
              condition: succeeded()
          
            - script: |
                if [ "$PROMOTE_PACKAGE" = "true" ]; then
                  echo "Promoting Package Version"
                  echo "Publishing Package with Version ID: $(PACKAGE_VERSION_ID)"
                  sf package version promote \
                    --package "$(PACKAGE_VERSION_ID)" \
                    --target-dev-hub $(DEV_HUB_ORG_ALIAS) \
                    --no-prompt
                fi
              displayName: 'Promote Package Version'
              condition: and(succeeded(), eq(variables.PROMOTE_PACKAGE, 'true'))
          
            - task: PublishPipelineArtifact@1
              condition: always()
              inputs:
                targetPath: $(System.DefaultWorkingDirectory)/result.txt
                artifactName: TestResults
                publishLocation: pipeline

            - task: PublishBuildArtifacts@1
              condition: succeeded()
              inputs:
                pathToPublish: $(Build.ArtifactStagingDirectory)
                artifactName: deploy-artifacts
          
                  

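The pipeline above references a variables template at variables/variables.yaml. If you keep non-secret values in source control, a minimal template might look like this sketch (the names mirror the variables used above, and the values are placeholders; note that YAML-defined values generally take precedence over pipeline UI variables of the same name, so define each variable in only one place):

    variables:
      DEV_HUB_ORG_ALIAS: DevHub
      SCRATCH_ORG_ALIAS: ScratchOrg
      PACKAGE_NAME: MyPackage
      VERSION_NUMBER: 1.0.0.NEXT
      PROMOTE_PACKAGE: 'false'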
Step-by-Step Explanation of the YAML Pipeline

In this section, we’ll break down each component of the YAML pipeline and explain the purpose of every step involved.

  1. Trigger

    This defines which branch initiates the pipeline. In our case, the pipeline is triggered by any changes made to the main branch.

  2. Pool

    Here, we specify the virtual machine image that will run the pipeline. For this example, we’re utilizing the latest version of Ubuntu to ensure compatibility and access to updated tools.

  3. Variables

    This section is crucial for managing configuration settings. You can store sensitive information such as API keys and connection strings here, keeping them secure while allowing easy access during pipeline execution.

  4. Jobs

    Jobs are the core components of the pipeline, outlining the sequence of tasks to be executed. In this example, we have a single job called SalesforcePackageCreation.

  5. Steps

    Each step represents an individual action within the job, executing specific tasks like checking out code, running scripts, or deploying to a Salesforce org. Here’s a detailed look at each step:

    • Checkout: This step retrieves the code from the repository to the agent, ensuring the latest version is available for processing.
    • Secure File Download: In this step, we download a secure file containing the private key used for authentication, ensuring a safe connection.
    • Node.js Installation: This step installs Node.js on the agent machine, preparing the environment for subsequent tasks.
    • Salesforce CLI Installation: Here, we install the Salesforce CLI globally on the agent, enabling Salesforce-specific commands.
    • jq Installation: This step installs the jq utility, which is essential for parsing JSON output generated by various scripts.
    • Dev Hub Authentication: We authenticate to the Dev Hub org using the JWT flow, allowing the pipeline to interact securely with Salesforce resources.
    • Scratch Org Creation: This step creates a scratch org based on the specifications provided in the project-scratch-def.json file, providing an isolated environment for development and testing.
    • Source Code Push: The pipeline pushes the source code to the newly created scratch org, ensuring that the latest changes are reflected.
    • Apex Tests Execution: This step runs Apex tests on the scratch org to verify the integrity of the code and ensure everything is functioning as expected.
    • Scratch Org Deletion: After testing, we delete the scratch org to free up resources and stay within scratch org allocation limits.
    • Second-Generation Package Creation: This step checks if a second-generation package exists and creates one if it doesn’t, facilitating streamlined package management.
    • Package Version Creation: Here, we create a new version of the package based on specified parameters, making it ready for deployment.
    • Package Promotion: If the PROMOTE_PACKAGE variable is set to true, this step promotes the package version, making it available for broader use.
    • Publishing Test Results: This step publishes the test results as a pipeline artifact, allowing for easy review and tracking of the testing process.
    • Publishing Deployment Artifacts: Finally, we publish the deployment artifacts as a build artifact, ensuring that all necessary files are available for future reference.

By following this structured approach, we ensure a clear understanding of the pipeline’s operation, making it easier for you to adapt and implement in your own projects.

Commit Code to Azure DevOps Repo

  1. Create a Repository: In Azure DevOps, create a new Git repository to host your Salesforce DX project.
  2. Add Remote and Push Code:
    • Navigate to your local Salesforce DX project directory.
    • Initialize a Git repository if you haven’t already:
    • git init
    • Add the Azure DevOps repository as a remote:
    • git remote add origin https://dev.azure.com/your-organization/your-project/_git/your-repo
    • Stage and commit your code:

      git add .
      git commit -m "Initial commit"

    • Push the code to Azure DevOps (use main, since that is the branch the pipeline triggers on):

      git push -u origin main

Set Up CI/CD Pipeline in Azure DevOps

  1. Create a Pipeline:

    • Go to your Azure DevOps project and navigate to Pipelines.
    • Click Create Pipeline and choose the repository you just pushed your code to.
    • Select the YAML pipeline configuration file created in the previous step (azure-pipelines.yml).
  2. Save Pipeline

Store Variables in Pipeline

To make your CI/CD pipeline flexible and adaptable, store configuration variables in Azure DevOps pipeline variables. This allows you to update values without modifying the pipeline configuration file directly.

  1. Access Pipeline Variables:

    • In your Azure DevOps project, navigate to Pipelines and select your pipeline.
    • Click on Edit to access the pipeline configuration.
    • Go to the Variables tab to define and manage variables for your pipeline.
  2. Add Pipeline Variables:
    • Click New Variable and add your variables.
    • You can define variables for environment-specific configurations, API keys, connection strings, and other sensitive information.
    • Example Variables:
    • Name: DEV_HUB_ORG_ALIAS, Value: DevHub
    • Name: SCRATCH_ORG_ALIAS, Value: ScratchOrg
    • Name: PACKAGE_DESCRIPTION, Value: your package description
    • Name: PACKAGE_NAME, Value: your package name
    • Name: PROMOTE_PACKAGE, Value: true or false (controls the promotion step)
    • Name: TAG, Value: your tag
    • Name: VERSION_DESCRIPTION, Value: your version description
    • Name: VERSION_NAME, Value: your version name
    • Name: VERSION_NUMBER, Value: a version number such as 1.0.0.NEXT

Ensure these variables are used in your pipeline YAML file for greater flexibility.

Run and Monitor Pipeline

  • Run the pipeline.
  • Monitor the pipeline execution in the Azure DevOps portal to ensure that the build and deployment processes are executed successfully.

Conclusion

In conclusion, setting up an Azure CI/CD pipeline for publishing Salesforce second-generation managed packages can significantly enhance your development workflow and deployment processes. By automating the build, test, and deployment stages of your application lifecycle, you can achieve greater efficiency, reduce manual errors, and ensure a smoother deployment experience. Azure DevOps provides a powerful platform for managing your CI/CD pipelines, offering a comprehensive set of tools and features to streamline your development processes. By following the steps outlined in this guide, you can harness the full potential of Azure DevOps and second-generation managed packages to transform your Salesforce development experience.

Start your journey today and experience the transformative impact of CI/CD pipelines in your development projects.

Demystifying Agentforce and Getting Started with AI in Salesforce

In today’s rapidly evolving digital landscape, businesses are constantly seeking innovative ways to enhance efficiency and stay ahead of the curve. Enter Agentforce: a powerful tool within Salesforce that is redefining how enterprise leaders interact with artificial intelligence (AI). Agentforce is primed to impact the entire Salesforce ecosystem, whether you’re a user, administrator, executive, or independent software vendor (ISV), as the technology is now available across the full platform in one of the fastest feature rollouts we have seen from Salesforce. In this comprehensive guide, we will explore the capabilities of Agentforce, bust common myths, highlight real-world applications, and offer strategic insights for implementing this cutting-edge technology.

Unlocking the Potential of Salesforce with Agentforce

Agentforce is more than just a buzzword; it’s a game-changer for Salesforce customers looking to rapidly harness the true potential of AI. By offering tools that create and customize AI agents, Agentforce empowers human agents to work faster and more efficiently. Imagine building pipelines, deflecting cases, and coaching sales teams—all with the aid of advanced natural language processing (NLP) capabilities. Unlike previous Einstein-based applications, Agentforce understands and responds in plain English, eliminating the need for complex JSON parsing.

This intuitive interaction allows Agentforce to learn from context, continuously optimizing its responses to become more efficient. Developers also appreciate its accessibility, with costs starting at just $2 per conversation (volume discounts apply). This affordability means that enterprises of all sizes can experiment with AI-driven applications, ensuring a tailored fit for their unique business needs.

Separating Agentforce Hype from Reality

Since its introduction at Dreamforce, Agentforce has generated considerable excitement. However, as with any new technology, it’s important to parse through and understand what is truly available today, and what is on the roadmap for the future.

For ISVs and Salesforce developers, while Agentforce is integrated into Salesforce’s core offerings, some packaging challenges remain. For instance, support for certain features is not yet available, and topics, while mandatory, aren’t packageable until early 2025.

For ISVs, as of December 2024, the following will be available to be 2GP packaged:

  1. Agent Topics
  2. Agent Actions built with
    • Apex
    • Flows
    • APIs

Agents and Prompt Template features will become packageable during Q1 2025. Despite some features not being readily available, the core functionalities of Agentforce are robust and ready to revolutionize your business processes.

Real-World Applications of Agentforce

Agentforce is flexible, adaptable, and works well across a number of business functions. Your adoption of Agentforce should begin with understanding the use cases you envision it solving for you. Running an ideation process will enable you to understand your pain points and uncover opportunities to put Agentforce into action.

To assist in thinking about how your business or application can begin to experiment with Agentforce, let’s dive into some real-world examples that showcase its game-changing potential.

  • Reinventing Marketing and Sales

    Sales representatives often find themselves bogged down by routine tasks that detract from their primary objective: identifying and closing business. Agentforce has the potential to act as an invaluable assistant, automating lead tracking, email follow-ups, and sentiment analysis. Imagine a system where action plans are automatically created post-meetings, seamlessly updating opportunities and notifying account executives of necessary follow-ups. Additionally, marketing teams benefit from automated content creation, allowing professionals to focus on crafting impactful strategies.

  • Elevating Customer Service

    In a world where customer experience is paramount, Agentforce offers AI-driven solutions to enhance service delivery. By leveraging natural language processing, Agentforce allows customer service agents to respond to inquiries more efficiently. From product returns to case deflection and sentiment analysis on social media, Agentforce streamlines processes, reduces wait times, and ultimately elevates customer satisfaction.

  • Optimizing Financial Services

    For financial institutions, delivering quick and efficient service is key to enhancing the customer experience. Imagine a bank where customers can effortlessly open accounts, report lost cards or update personal details—without long wait times. Agentforce empowers financial service providers with AI tools that streamline these processes, minimizing friction and maximizing customer satisfaction.

  • Transforming Healthcare Administration

    In healthcare, Agentforce can identify fraudulent claims, ensuring accurate and timely claim processing. By automating data analysis and leveraging AI insights, healthcare providers can focus on delivering exceptional care while maintaining operational integrity.

  • Enhancing Platform Management

    For administrators, managing complex systems and processes can be daunting. Agentforce simplifies this task by automating post-installation steps and providing valuable insights into an organization’s security posture. Whether it’s generating sales pitches, answering customer queries, or creating marketing campaigns, Agentforce equips admins with the tools they need to drive efficiency and innovation.

The Strategic Benefits of Adopting Agentforce

Agentforce is more than a tool; it’s a strategic ally that empowers organizations to unlock new levels of productivity and efficiency. By automating mundane tasks, sales and service representatives can concentrate on high-impact initiatives. Error-prone processes become relics of the past as Agentforce optimizes resources and reduces operational costs. Enterprises already utilizing CRM data, Slack, Flows, Apex, and GenAI prompts will find Agentforce seamlessly integrates with existing workflows, driving intelligent automation.

In addition to streamlining internal processes, Agentforce offers lucrative monetization opportunities for partners. Standalone agent apps can be sold as premium services, and extension packages offer flexibility in pricing models. This versatility ensures that businesses of all sizes can leverage Agentforce’s capabilities to suit their unique needs.

Why Choose CodeScience as Your Agentforce Partner

At CodeScience, a Bridgenext company, we’re incredibly proud to be a launch partner for Agentforce and a key member of the Partner network, leading the way in helping our clients develop groundbreaking Agentforce innovations. As an early stage partner in the Agentforce journey, we’re already collaborating with Salesforce to bring agents online.

With our extensive experience and deep knowledge of the Salesforce ecosystem as a certified Salesforce Partner, we have a proven track record of delivering customized Salesforce solutions tailored to meet the unique needs of each client, ultimately driving sustainable business success and fostering trust in their operations. We have also worked directly with Salesforce to build, enhance, and deliver a number of its core offerings. As we apply this expertise to Agentforce, our focus remains on ensuring that our clients achieve their strategic goals in an ever-evolving digital landscape.

Conclusion

Agentforce represents a paradigm shift in how enterprises interact with AI, offering unprecedented capabilities for enhancing efficiency and driving strategic objectives. By leveraging its advanced features, businesses can streamline operations, optimize resources, and deliver exceptional customer experiences. For C-level executives, ISVs, and Salesforce customers, Agentforce is the key to unlocking new levels of innovation and success. If you’re ready to harness the power of AI, now is the time to explore Agentforce and discover the boundless possibilities it offers for your organization.

Transportation and Logistics State of the Industry

The transportation and logistics industry has experienced more change in the last few years than in the previous ten years combined. Since the pandemic started, supply chains have been stressed. There are new hiccups making headlines every day, and whether it’s trucking, rail, ocean, or third-party logistics — providers are scrambling to keep up. This disruption requires companies to adjust quickly and ensure their processes can keep up with these sudden changes.

There are many challenges facing the T&L industry, one of the largest being the rise in customer expectations. The growth of e-commerce and the “Amazon effect” have put the customer experience front and center, encouraging companies to think outside the box. To strengthen the customer experience, T&L firms have enhanced their systems with technologies like GPS and in-cab ELDs, enabling real-time visibility and allowing customers to get more information on their orders. While this is a great resource for customers and saves companies from an increase in calls to the help line, it also creates challenges as customers demand more and more. A world-class customer experience requires close collaboration across all parties in the supply chain in an increasingly connected, technology-enabled environment.

Workforce Challenges

Like many industries, T&L is also facing workforce challenges, one of the largest being the ongoing driver shortage. According to the American Trucking Associations, the industry’s current shortage of over 80,000 drivers could increase to over 160,000 by the end of 2030 (McNally). This shortage, driven by an aging driver workforce, driver pay, quality of life, and regulatory hurdles, will require the industry to recruit nearly 1 million new drivers by the end of the decade to replace capacity and support growth. Implementing the proper Salesforce technology helps combat this challenge by enhancing recruiting and onboarding strategies, providing your drivers with the skills and support they need, and delivering consistent employee feedback to increase retention.

Boom & Bust Cycle Continues

The freight transportation industry typically experiences a complete business cycle every three to four years, with the typical freight recession lasting around ten months. During good times like the present, when demand spikes, capacity is tight, and freight rates remain elevated, the industry typically responds by over-expanding. Transportation providers invest in capacity such as trucks, trailers, and containers, while new entrants also join the industry. These investments are capital intensive and take time to come online, but eventually capacity will catch up with demand and rates will moderate, creating a more competitive environment for providers. Will there be another record number of trucking bankruptcies like we saw in 2019? It’s too early to tell, but one thing is sure: providers need to make investments that position them for success in both boom and bust cycles.

Where Salesforce comes in

As an established, industry-leading platform, Salesforce has been proven to tackle some of the biggest headaches in the T&L industry while also providing a platform to manage a comprehensive portfolio of business processes that companies can incorporate to elevate performance. Salesforce might be top of mind for traditional CRM capabilities like lead conversion, pipeline management, and Customer 360, but the platform has also been successfully deployed across transportation and logistics providers for much broader use cases. What’s possible on the Salesforce platform to enhance performance within transportation and logistics organizations is virtually unlimited.

Want to Learn more?

Finding the right solution can be overwhelming. Bridgenext is here to help! Let’s connect to learn more about how Salesforce can help you achieve success.

References

McNally, Sean. “ATA Chief Economist Pegs Driver Shortage at Historic High.” American Trucking Associations, https://www.trucking.org/news-insights/ata-chief-economist-pegs-driver-shortage-historic-high.

Pardot Engagement Dashboard Customization with Progressive Disclosure

Have you ever created a dashboard with so many data visualizations that users were forced to scroll (and scroll) down the screen to find the information they’re looking for? We’ve all been there. Sometimes you need to fit a LOT of information into a single dashboard. But adding a LOT of information is overwhelming to users. Enter the principle of progressive disclosure.

What is Progressive Disclosure?

At its essence, the design principle of progressive disclosure ensures that users are viewing only the information they need to see. One of my favorite real-world examples of progressive disclosure involves kitchen cabinets. How would you feel if you walked into your kitchen and all the cabinet doors were open? It’s overwhelming right? If you need to get a cup out of the cabinet, do you open all the doors, or only the one where the cups are stored?

Now imagine you visit a dashboard to answer a specific question, but you’re presented with an overwhelming number of data visualizations. How long would it take for your eyes to scan the screen to find what you are looking for? Wouldn’t it be better to open the dashboard and be presented with the “right” information you need? In a nutshell, that’s the principle of progressive disclosure.

Pardot Engagement Dashboard

The B2B Marketing Analytics App in Einstein Analytics comes with several useful dashboards to allow users to analyze the effectiveness of their Pardot marketing assets. These dashboards are a great first step in analyzing your marketing assets and we highly recommend users begin their analytics journey with these dashboards and then customize and create new visualizations based on unique business processes and needs.

[Screenshot: Pardot Engagement Dashboard]

The Engagement Dashboard helps to answer questions related to list emails, email templates, forms, and landing pages. Users can use the dashboard to answer questions like:

  • What emails generate the most clicks and opens?
  • Are there campaigns that are performing better or worse than others?
  • How are our teams performing in various regions (using Tag filters)?

The Engagement Dashboard displays key metrics at the top, followed by details about individual Pardot assets, including List Email Engagement, Email Template Engagement, Forms Engagement, and Landing Page Engagement. One challenge we’ve seen is that sometimes users want to drill down into details on assets located further down the screen, like Forms and Landing Pages. To analyze the data, they select their filters, scroll down to view results, scroll up to adjust filters, scroll down to view results (repeat as needed). This scrolling can be time consuming.

Instead, why not apply progressive disclosure, to allow users to only view the information they need to see? In the screenshot below, you can see the same out-of-the-box dashboard, but with a few enhancements. The new ‘Open’ and ‘Close’ buttons allow users to only view the sections of the dashboards they wish to see – collapsing unnecessary sections like an accordion. Below, users can choose to open only the Landing Page Engagement section leaving the rest of the sections collapsed so that the most important information they need is easily accessible.

[Screenshot: Engagement Dashboard with Open and Close buttons]

Progressive Disclosure: Step-by-Step

Maybe one day in the future, Einstein Analytics will incorporate new functionality to create more user-friendly navigation on dashboards. But for now, we can use a combination of Links and Pages to create the same effect.

To start, you will need to create multiple Pages for each section you wish to open/close. For this example, we have created the following: Open All, List Email, Email Template, Form, Landing Page, Close All

[Screenshot: dashboard Pages list]

Next, drag the Link widget onto the appropriate section on the dashboard and create ‘Open’ and ‘Close’ buttons. Below are the widget properties for a link that controls the view for List Emails. When a user clicks on the ‘Open’ link, they will be taken to the List Email page which is configured to only show List Emails. You can control how the button looks using the Widget Style section for the Widget properties. For example, adding a border and increasing the value of the Border Radius to 16 will make the link look more like a button.

[Screenshot: Link widget properties]

For the ‘Close’ buttons, the widget properties will link to the Close All page. You can make more pages if needed, depending on the type of views that work best for your organization. These pages are just recommended starting points.

There you have it: progressive disclosure for the Pardot Engagement Dashboard. Obviously, there are additional customizations you can make. This is just a simple example of how you can take an out-of-the-box dashboard and improve the UI/UX so that it meets the needs of your individual users.

Tableau vs Einstein Analytics: Which Tool is Best for You?

The importance of data and analytics for modern organizations is no longer up for debate. Every day, organizations capture and store increasing volumes of data. That raw data then needs to be transformed into meaningful analytical insights that can be used to make data-driven business decisions. There is no shortage of analytical tools to choose from.

You have likely heard about Tableau and Einstein Analytics – two powerful analytical products currently offered by Salesforce. And you may be asking yourself, “What’s the difference? How do I choose between these two products?” While a final decision involves a thoughtful analysis, we have put together a straightforward guide to help guide you along the path toward a final decision.

How do organizations use Tableau?

First, let’s take a high-level overview of these two products, starting with Tableau. Tableau is a powerful analytics platform that excels at driving fast adoption at scale across all skill sets in order to deliver business value. It’s an intuitive product that allows organizations to integrate data from a variety of cloud-based or on-premise data sources to generate powerful data visualizations.

How do organizations use Einstein Analytics?

Einstein Analytics is also a powerful analytical tool that allows organizations to integrate and transform data from a variety of cloud-based sources, including a live integration with Salesforce. Using these integrations, users can create and view powerful visual analytics and predictive insights, as well as take action on data from within the tool. Einstein Analytics is native to the Salesforce CRM platform, so it can inherit the Salesforce security model and also comes with a variety of industry and role-specific apps and templates to jump-start the data visualization and predictive analytics process.

Which tool is best for my organization?

Tableau and Einstein Analytics seem to serve primarily the same purposes, and there is a lot of overlap in functionality. So, how do you know which one is “best” for your organization? While there may be specific features that play into your decision (like Einstein Analytics’ use of AI to guide data preparation or Tableau’s robust mapping capabilities), we recommend asking a series of questions:

  1. First, “Where do your users spend their time and perform their work?” Do they live and breathe in Salesforce? If so, Einstein Analytics will be the default choice for the best user experience. But if your users do not work in Salesforce, Tableau will likely provide the best user experience.
  2. “Where does the data live?” If all of the data you need to analyze exists in Salesforce, then Einstein Analytics is likely the best tool. If the majority of data exists in Salesforce, but you have a way to bring the data into Einstein Analytics via a connector or custom integration, then Einstein Analytics may also be your default tool.

[Figure: analytics decision tree]

You can see that in deciding between Tableau and Einstein Analytics, the primary issue is not where the data comes from, but where the user lives. After all, what use are analytical insights if your users are reluctant to log into a tool to view them?

What about using BOTH Tableau and Einstein Analytics?

For some organizations, Tableau and Einstein Analytics can be complementary purchases. Einstein Analytics provides predictive and machine learning capabilities that can be integrated with your Tableau dashboards. Also, some organizations benefit from allowing their Salesforce users to view interactive dashboards in Einstein Analytics where they can take action on records, while users in other departments want access to analytics that is not powered by Salesforce data. In these situations, it makes sense to strategically acquire licenses for both products to meet the needs of your users and allow them to work most effectively.

Want to learn more? Let’s connect.

Uncovering Insights from Salesforce EDA Model with Einstein Analytics

Salesforce’s Educational Data Model (“EDA”) is an innovative and flexible architecture that gives educational institutions from K-12 to higher education a way to capture a 360 degree view of their students. With EDA, institutions can easily view the relationships that students have with faculty, staff, academic advisors, family members, and even memberships with academic departments and student organizations. It’s truly an innovative data architecture built on the Salesforce platform designed to facilitate a Connected Campus.

While EDA makes it easy to view details about the student lifecycle within Salesforce, it doesn’t always support the robust analytical needs that many institutions have come to expect. This is because the strength of the EDA architecture is designed to allow users to view and take action on student data within Salesforce – not to facilitate a multi-dimensional view for reporting and analytics.

So what’s a university to do when it needs to allow users to perform a complex analysis of retention and graduation rates? The answer lies in transforming the EDA data into an analytic database that supports multi-dimensional queries using Einstein Analytics.

Einstein Analytics for Educational Business Intelligence

Einstein Analytics is uniquely designed to generate insights from the EDA model. Not only can Einstein Analytics transform EDA data into datasets that are useful for answering questions about retention and institutional effectiveness; but because Einstein Analytics is closely integrated with Salesforce, users can take action directly on records from within Einstein Analytics itself.

For example, if an academic advisor discovers that a cohort of students is at-risk for withdrawing from the university, she can instantly take action – sending a customized mass email, auto-assigning tasks, or even engaging directly with students to resolve the issue. All of this happens with minimal clicks within a customized dashboard view.

Moving from EDA to an Analytical Architecture – Identifying the Key Questions

It’s clear that Einstein Analytics can take data from EDA and transform it into datasets that are suitable for conducting analytical investigations. But what should those datasets look like? What EDA objects and fields should be used, and how should the data be structured? Viewing the EDA Entity Relationship Diagram below, it’s clear that institutions can capture data related to their students on a variety of dimensions — academic enrollments, case management, relationships, and organizational memberships.

[Figure: EDA entity relationship diagram]

Source: s3-us-west-2.amazonaws.com/sfdo-docs/eda_entity_relationship_diagram.pdf

However, not all of the EDA data is useful to an institution’s analytical inquiries. It’s not necessary to import the entire EDA model into Einstein Analytics. Rather, institutions should take a more strategic approach. The key is to decide what data is needed to answer the questions each institution is asking. In making the leap to an analytics-focused solution, the important question to ask is “What problem is my institution trying to solve?” Answering this question allows institutions to design the appropriate data visualizations as well as identify the underlying data that will support them.

Below are just a few examples that institutions can use in beginning discussions of how Einstein Analytics can solve problems using the EDA model:

  • What courses have the highest failure or withdrawal rates?
  • What period of time before the semester begins is a student unlikely to enroll in classes?
  • Is there an association between class registration date and success in that class?
  • What factors contribute to a student receiving a D, F, or W grade in a class?
  • What recruitment efforts will have the most impact in generating student applications?
  • Which courses should be offered next year / semester and how many seats are required?

After an institution determines its list of questions, the next step is to identify what kind of data is needed to provide answers and where that data is stored. Additional discovery will be conducted to determine the quality of that data and whether there are any data gaps that need to be addressed so that the institution can use the best data to answer its key questions.

Next Steps – Designing Visualizations and Dataset Creation

After data has been identified, the institution can get to the exciting task of designing data visualizations that will allow users to have a conversation with their data. (It’s important to note that this design phase may also elicit additional data discovery). In the next blog post of this series, we will walk through the process of drafting a key question related to student retention, how to identify the appropriate data in the EDA model, and discuss how to transform the data into datasets that provide a flexible multi-dimensional format for complex data analysis.

The Art of the Possible: Manufacturing Chatbots

The Role of Einstein Bots in Manufacturing

In part three of the series “The Art of the Possible: Manufacturing,” let’s explore use cases for chatbots and AI in manufacturing.

The manufacturing industry is well-positioned to take advantage of AI-enabled chatbot functionality. According to a recent Salesforce “State of Service” report, 24% of manufacturing customer service professionals currently use AI chatbots, with 33% planning to use them in the next 18 months. That amounts to a projected growth rate of 137% in AI chatbot use in the manufacturing industry over the next 18 months.

[Figure: chatbot adoption statistics for manufacturing]

When you think about chatbots, the first thing that comes to mind is likely an online customer service agent that engages with customers. While this is an excellent application for chatbots, there are many more ways that manufacturing can make use of this evolving technology. Because a key goal in manufacturing is to produce and deliver a quality product with minimal delay, the ability to keep employees up-to-date on all stages of the production supply chain is imperative. Employees, just like consumers, expect to have access to the information they need in real-time. Chatbots are able to respond to this need by providing answers to employee inquiries at any time of the day from any location. But first, what exactly are chatbots?

What are Salesforce Einstein Bots?

A chatbot is a generic term for an application powered by Artificial Intelligence (AI) that simulates a real-time conversation in the user’s natural language. The Salesforce Einstein Bot functionality harnesses the power of AI to interact with users on virtually any channel. With Einstein Bots, it’s possible to provide users with the answers they need quickly, using Natural Language Processing (NLP) combined with any type of connected data, including CRM and ERP systems. Einstein Bots can be deployed on a traditional online web chat platform, SMS, Facebook Messenger, Apple Business Chat, and more. With chatbots, businesses can expect to reduce in-person call volume and average call handle time while simultaneously increasing first contact resolution, agent productivity, employee morale, and customer satisfaction. In addition to Einstein Bots, Salesforce also offers Einstein Voice functionality that allows customers to interact with their data through a voice-powered app.

While the applications for AI-enabled chatbots are numerous, we have highlighted several use cases below specific to the manufacturing industry. These applications address the industry-wide challenges we encounter with our clients related to budgetary constraints, maintaining visibility into end-to-end processes, and keeping up with consumer and vendor expectations.

Supply and Inventory Status

Factory managers need to stay up-to-date on the status of parts and materials inventory. This allows them to know the right time to submit a request to order additional supplies. Traditionally, a manager might open a report on a desktop application in their office to determine (for example) how many types of widgets can be manufactured based on existing supply inventory. Instead of opening a static report on a computer, the manager could simply use a chatbot on a mobile device and type or speak a question directly. The chatbot will respond with the number of widget types that could be manufactured based on existing inventory.

Safety and Maintenance Inquiries

In a manufacturing plant, accidents are not only damaging to employee health and productivity, but can also cause tremendous financial loss. Thus, it’s important to be able to monitor and assess the safety of each factory’s work environment in real-time. Again, managers and maintenance crews can check and log safety issues in traditional reports. However with chatbots, they can ask specifically about facility metrics and make immediate adjustments to building temperatures, gauges, and equipment as necessary while they are on the factory floor. This ensures that managers can monitor and respond to issues in the most efficient way possible.

Internal Human Resources

Every company is expected to manage HR activities, including recruitment and people management for existing employees. The manufacturing industry is no different. AI-enabled chatbots allow HR professionals to gather instant information about employee performance, progress reviews, and attendance from any location. Chatbots are also effective during the recruitment process as a way for prospective employees to schedule interviews, engage in screening exercises, and even have questions answered.

End-to-End Visibility

Because of the complexities of modern agile supply chains, AI-enabled chatbots can be a valuable asset in providing end-to-end visibility. Where a manager might struggle to interpret reports to predict inventory levels or set delivery dates, a chatbot can be trained to interpret data from multiple sources and provide clear and actionable answers. Managers and buyers can also ask chatbots directly about order status and get information about shipments from their mobile devices.

Recalls and Customer Interactions

Recalls are costly not only to manufacturing operations, but also in terms of responding to consumer inquiries. Chatbots give consumers a streamlined way to contact manufacturers directly and receive answers about recalls without straining the resources of live agents: chatbots handle the routine questions, freeing live agents to respond to more complex inquiries.

As the applications for chatbots in the manufacturing industry continue to grow, so do their benefits: meeting increasing demand for self-service, providing instant responses to complex questions, reducing human error, and improving ROI for manufacturing efforts as a whole. AI-enabled chatbots are yet another tool that will eventually become standard for all manufacturers as they strive for greater operational efficiency to meet performance goals.

]]>
Predicting Outcomes with Salesforce Einstein Analytics and Einstein Discovery  https://www.bridgenext.com/blog/predicting-outcomes-with-salesforce-einstein-analytics-and-einstein-discovery/ Thu, 31 Jan 2019 17:00:41 +0000 https://www.bridgenext.com/blog/predicting-outcomes-with-salesforce-einstein-analytics-and-einstein-discovery/ Predicting Outcomes with Salesforce Einstein Analytics and Einstein Discovery Predictions play a key role in our decision-making processes. With predictive analytics, businesses can take their knowledge of their customers to the next level and anticipate wants and needs. Companies that approach decision-making based on data trends and analysis tend to treat information as a defined asset compared to companies with other approaches.]]>

Ancient civilizations were very interested in predictions. If you read Greek and Roman mythologies, you may remember there were people called oracles who were blessed with the power of prophecy. People offered sacrifices to the gods so that the gods could speak through these divine individuals. Kings would make decisions based on what they were told by oracles. Oracles made predictions and people listened to them.

A lot has changed since then, but predictions still play a key role in our decision-making processes.

Today, we live in an era of predictions that shape the evolution of our civilization, right down to our day-to-day business activities.

“Life is an Uber ride, you have to decide what your destination is. Predict what’s next and then have the flexibility to evolve.” – Marc Benioff

What’s a prediction?

Wikipedia points out that a prediction can also be referred to as a forecast. In other words, a prediction is a future-looking result or sign based on historical data. These days, enterprises are using AI and machine learning to generate predictions and use them in business flows. Basically, predictive analytics is used to fill in the missing pieces of the data puzzle by building more understanding of the data that is available.

Why are predictions important when making business decisions?

With predictive analytics, businesses can take their knowledge of their customers to the next level and anticipate wants and needs. Companies that base decision-making on data trends and analysis treat information as a strategic asset. They use analytics to identify business opportunities more easily, predict future trends, and generate more revenue from their data.

What is important to consider when thinking about predictions?

We mentioned that predictions are calculated based on historical data. It’s important that this historical data is a true representation of what actually happened.

Let’s say you want to predict the estimated travel time from point A to point B. You rely on a dataset, which tells you that on Mondays and Fridays the trip took 20 minutes, on Tuesdays 30 minutes, and on Wednesdays and Thursdays 25 minutes. So, you might think it’d be safe to say that if you travel over the weekend the trip would take around 25 minutes. This might be totally wrong. Over the weekend, the same route is not congested and it could actually take less than 10 minutes. Similarly, it’s important to know the business context when predicting the outcome of a variable.

What if the data doesn’t look good? How do you fix it?

First off, get to know your existing data. Einstein Analytics (EA) can help uncover key data problems. EA allows you to augment (join) Salesforce objects (and external data as well) so you can get a better understanding of the many relationships among data objects. EA faceting makes your job easier by drilling down and across so that you don’t have to worry about specific roll-up fields. You can perform a thorough analysis by looking at those dimensions or attributes that would help predict the outcome of a variable.

Why should you use Salesforce Einstein Analytics and Einstein Discovery?

If you are a Salesforce user, the Einstein platform is your best bet for enabling predictions. Salesforce’s self-service analytics app includes an ETL (extract, transform, load) tool that lets you read not only from Salesforce objects but also from many other platforms, such as AWS, Azure, and Google Cloud, and even from CSV files, using simple connectors. Einstein Discovery (ED) then consumes data from a dataset created by EA through a seamless integration on the same platform. Finally, predictions coming out of Einstein Discovery can easily be embedded in Salesforce Lightning pages, so you don’t have to navigate to other apps or websites.


How can you make the best use of EA + ED licenses (EA Plus)?

Einstein Analytics is a collection of tools that embeds predictive and analytical charts in the Sales, Service, and Marketing clouds. Einstein Discovery is a platform that turns that data into actionable insights through stories. Together, these tools are designed to bring artificial intelligence to business executives while exposing the underlying data models to more advanced users like data scientists.


What is this thing about stories? Maximize or minimize?

A story includes what happened, why it happened, and predictions and recommendations for a given outcome variable. A model is the portion of a story that includes:

  • Explanations (why it happened)
  • Predictions (what could happen)
  • Recommendations (what to do to improve a predicted outcome)

Every story includes a desired goal of maximizing or minimizing an outcome. For example, a goal could be to maximize the margin or to minimize the per-unit cost. You can deploy multiple models to the same goal. You would take this approach to segment your data into different models (one per segment) that target the same goal.


What is the writeback feature, and how can it help Salesforce users? Why does integrating it in Salesforce make sense, and what is the advantage of Lightning integration?

Writeback is a feature whereby the explanations, recommendations, and predictions generated from a story in ED can be applied to a new record being created in Salesforce. Imagine how useful it would be for a sales rep to see these statistics for, let’s say, an opportunity they created, and then be able to predict its outcome. The feature shows all of this information in a separate section within the Lightning page layout of a Salesforce record. To top this, these are real-time numbers, calculated against the story created in Einstein Discovery, and the rep can see them as soon as they create the record. These predictions give key insights to the rep and help them understand how that record may play out in the future, offering assistance and recommendations to help achieve the desired goal: closing deals.

Remember that the story works on the principle of setting a goal for maximizing or minimizing an outcome.

Consider an example where you need to predict the outcome of an Opportunity. Wouldn’t it be great if you could predict the percentage chance of winning it? To get analysis and predictions around this, a story can be created in ED with the Opportunity ‘Is Won’ field set to true as the outcome. Remember that a story normally maximizes or minimizes a numeric variable, but the outcome can also be configured as a Boolean value such as true or false (as in this case). This story can then be deployed to Salesforce, and the results can be viewed on each Opportunity record as it is created or updated. The snapshot below shows the writeback feature embedded in a Salesforce record detail page.

[Snapshot: writeback predictions embedded in a Salesforce record detail page]

Marc Benioff quote source: www.linkedin.com/pulse/salesforce-evolution-stephen-cummins/

]]>
Einstein Analytics Dataset Creation Leveraging Analytics External Data API https://www.bridgenext.com/blog/einstein-analytics-dataset-creation-leveraging-analytics-external-data-api/ Thu, 15 Mar 2018 15:29:41 +0000 https://www.bridgenext.com/blog/einstein-analytics-dataset-creation-leveraging-analytics-external-data-api/ Einstein Analytics Dataset Creation Leveraging Analytics External Data APIData is the core of any organization and an accurate data set is imperative. This blog gives insight into creating Einstein analytics data sets leveraging analytics external data API.]]>

Granularity

It’s a word we hear all the time with Einstein Analytics. Key decision makers can perform granular analysis of data through Einstein Analytics’ easy-to-use dashboards to derive actionable insights.

In some scenarios, data is fetched from multiple sources to broaden the scope of analysis. In this blog, we discuss a reliable method for combining data sources that share many-to-many relationships while preserving granularity. We will delve into how you can use the Salesforce Analytics External Data API to avoid missing rows, or duplicating them, in your datasets.

Use case: analyzing sales data

Suppose your organization uses Opportunity Schedules and Opportunity Splits to maintain the sales data for the products your company markets. Your sales management needs reports or dashboards to track the number of bookings each sales representative has made. If you have Einstein Analytics licenses, you can use the custom reporting and dashboards provided by this tool to represent the data.

To view this data, you have a few options: create a dataset in the Einstein Analytics platform in which the Opportunity Schedules and Opportunity Splits are combined; keep the data as separate datasets and use Salesforce Analytics Query Language (SAQL) to join them at runtime; or join the two existing datasets together into one combined dataset.

It is important to understand that if you use these datasets separately, the joining logic runs while the dashboard loads. This reduces dashboard performance, since loading time increases with high data volumes.

Augmenting them in the dataflow to create one single dataset runs into limitations when the two objects have a many-to-many relationship. When such datasets are augmented, Einstein Analytics only provides the option of joining with ‘LookupSingle’ or ‘LookupMulti’.

Neither joining option guarantees a true representation of the data in the final dataset: with the single-type join, some records may get skipped, and with the multi-type join, the measures are summed across however many matches are found.

The solution: Apex

One way to resolve the above-mentioned problems and overcome these analytics limitations is by using Apex. The joining logic can be written in Apex code within the Salesforce platform. The combined data can then be sent to Einstein Analytics using the Analytics External Data REST API, and a final combined dataset can be created there. A scheduler ensures this logic is executed regularly so the data is sent over to Einstein Analytics systematically, as in the sketch below.
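
As a minimal sketch, assuming hypothetical class names, the scheduler can be a standard Schedulable Apex class that simply launches the batch job described next:

  // Hypothetical scheduler sketch: kicks off the dataset-refresh batch.
  global class ScheduleSplitDatasetScheduler implements Schedulable {
      global void execute(SchedulableContext sc) {
          // Batch size of 200 is an assumption; tune it to your data volumes.
          Database.executeBatch(new ScheduleSplitDatasetBatch(), 200);
      }
  }

It can then be scheduled with a cron expression, for example: System.schedule('Nightly EA dataset refresh', '0 0 2 * * ?', new ScheduleSplitDatasetScheduler());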

To accomplish this, a batch class is written in the Salesforce instance using Apex, which fetches the Opportunity Schedules across the needed time period. The corresponding Opportunity IDs can then be used to fetch the related Split records and loop over the two together. The data is combined to form comma-separated value (CSV) rows. The idea here is to create a CSV dataset that can be sent over to Analytics in an iterative fashion for the creation of the dataset, as the sketch below illustrates.
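
To make the combining step concrete, here is a simplified sketch of the row-building loop. The object and field choices (OpportunityLineItemSchedule, OpportunitySplit, and the columns emitted) are illustrative assumptions; your org’s data model may differ:

  // Sketch: pair each schedule row with its related split rows and emit CSV lines.
  // Assumes 'schedules' was queried with the OpportunityLineItem.OpportunityId relationship
  // field, and 'splitsByOppId' maps each Opportunity Id to its OpportunitySplit records.
  String csvRows = '';
  for (OpportunityLineItemSchedule sched : schedules) {
      Id oppId = sched.OpportunityLineItem.OpportunityId;
      if (!splitsByOppId.containsKey(oppId)) continue;
      for (OpportunitySplit split : splitsByOppId.get(oppId)) {
          csvRows += String.join(new List<String>{
              String.valueOf(oppId),
              String.valueOf(sched.ScheduleDate),
              String.valueOf(sched.Revenue),
              String.valueOf(split.SplitOwnerId),
              String.valueOf(split.SplitPercentage)
          }, ',') + '\n';
      }
  }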

Using InsightsExternalData Object for Creating Datasets in Einstein Analytics

The batch class approach

Dataset creation needs metadata, typically in JSON format. This JSON describes the structure of the incoming data and is uploaded to Salesforce to be used as a reference for creating the actual dataset.

To accomplish this, you insert an ‘InsightsExternalData’ record after assigning properties such as the dataset name and the format of the data source, with ‘Action’ initially set to ‘None’ and the ‘MetadataJson’ property assigned the metadata described above.
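
A minimal sketch of this first step, assuming an illustrative dataset alias and a trimmed-down metadata JSON, might look like this:

  // Sketch: build the metadata JSON and insert the InsightsExternalData header record.
  // The alias 'Sales_Schedule_Splits' and the field list are illustrative assumptions.
  String metadataJson =
      '{"fileFormat": {"charsetName": "UTF-8", "fieldsDelimitedBy": ",", "numberOfLinesToIgnore": 1},' +
      ' "objects": [{"connector": "CSV",' +
      '   "fullyQualifiedName": "Sales_Schedule_Splits", "name": "Sales_Schedule_Splits",' +
      '   "label": "Sales Schedule Splits",' +
      '   "fields": [' +
      '     {"fullyQualifiedName": "OpportunityId", "name": "OpportunityId", "label": "Opportunity Id", "type": "Text"},' +
      '     {"fullyQualifiedName": "Revenue", "name": "Revenue", "label": "Revenue", "type": "Numeric", "precision": 18, "scale": 2, "defaultValue": "0"}' +
      '   ]}]}';

  InsightsExternalData header = new InsightsExternalData();
  header.Format        = 'Csv';
  header.EdgemartAlias = 'Sales_Schedule_Splits';  // API name of the dataset to create
  header.Operation     = 'Overwrite';              // replace the dataset contents on each run
  header.Action        = 'None';                   // data parts are uploaded before processing
  header.MetadataJson  = Blob.valueOf(metadataJson);
  insert header;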

The start, execute, and finish methods together complete the analytics dataset creation. The start method fetches the data. The execute method then builds the delimited header and data strings, with columns drawn from both the Schedules and Splits objects, in batches. As the data is created for each batch, it is incrementally sent to Einstein Analytics using the ‘InsightsExternalDataPart’ object. The finish method then flags the ‘InsightsExternalData’ record for processing, creating the dataset in Einstein Analytics. A sketch of both steps follows.
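
Continuing the sketch (variable names are hypothetical; each data part can hold up to 10 MB), the execute and finish methods might upload and process the data like this:

  // In execute(): append this batch's CSV chunk as a new data part.
  // 'partNumber' is a counter carried across batches via Database.Stateful.
  InsightsExternalDataPart part = new InsightsExternalDataPart();
  part.InsightsExternalDataId = headerId;     // Id of the InsightsExternalData record inserted earlier
  part.PartNumber             = partNumber++; // parts are numbered sequentially, starting at 1
  part.DataFile               = Blob.valueOf(csvChunk);
  insert part;

  // In finish(): flip the Action flag so Einstein Analytics processes the uploaded parts.
  InsightsExternalData header = [SELECT Id, Action FROM InsightsExternalData WHERE Id = :headerId];
  header.Action = 'Process';
  update header;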

Solution benefits

Organizations can achieve various benefits from this solution, including:

  1. Overcoming the limitation of the augment operation in Einstein Analytics when joining objects with many-to-many relationships
  2. Flexibility to shape the joined data to business needs, since only the columns that are needed have to be included
  3. Easy maintenance, since new columns can be added by altering the Apex code
  4. Scheduled, repeatable updates at the required frequency or time interval
  5. A single dataset, which eases the effort needed on the Analytics side; developers no longer need to write SAQL code
  6. Improved dashboard performance, since no SAQL has to run over high data volumes at load time
  7. No additional code in the dataflow, which saves time as the flow executes faster

Data is at the core of any organization, so an accurate dataset is imperative. Even cumbersome dataset joins can be simplified with Einstein Analytics, whether you combine the data at runtime with SAQL or build a single dataset through the Analytics External Data API.

If your organization needs help using Einstein Analytics, contact us today!

]]>
Quick Tips to Get Started with Salesforce Quote-to-Cash https://www.bridgenext.com/blog/quick-tips-to-get-started-with-salesforce-quote-to-cash/ Thu, 08 Jun 2017 14:35:00 +0000 https://www.bridgenext.com/blog/quick-tips-to-get-started-with-salesforce-quote-to-cash/ Quick Tips to Get Started with Salesforce Quote-to-CashLooking to streamline your configure-price-quote (CPQ) process? Salesforce Quote-to-Cash may be right for you. Here are a few insider tips to get started.]]>

It’s a fact of life that closing deals can be complicated. And if your company needs to manage a large product catalog, apply automatic discounts, or keep track of a quote’s signature status, you may have some even bigger issues.

A great Configure-Price-Quote (CPQ) tool can be invaluable in these scenarios. While there are many tools in the market, Salesforce Quote-to-Cash (formerly SteelBrick) has some impressive features that kick traditional CPQ up a notch.

But with so much functionality available, knowing where to start can be tricky.

Fortunately, there are easy ways to dip your toes into the Configure, Price and Quote elements of Salesforce Quote-to-Cash, so you can get your team quickly up and running and start closing those complicated deals effortlessly.

Configure: Using the Product Configurator to Create Product Bundles

Bundling may be one of the most important elements of Salesforce Quote-to-Cash and possibly the easiest to start using.

For businesses that sell in bundles – specific combinations of product or service options, offered as preset package deals or as client-requested customizations – determining the appropriate product configuration options, and then pricing those options accurately, can be a challenge.

The Product Configurator within Salesforce Quote-to-Cash makes bundling a snap. Think of it as a Visualforce version of your product catalog, integrating seamlessly with your standard Salesforce objects to allow products to be combined in valid ways and priced automatically.

To use the Product Configurator to create a bundle:

  1. Choose the products and options to include in the bundle. Products can be filtered by criteria including Product Name, Product Family, and Product Code. If the product you select has been configured as a bundle, you can choose which options to include or exclude.
  2. Select Edit Lines to modify quantities and update the descriptions as needed.
  3. Group products together and discount them either across the board or on a line-by-line basis.


The Salesforce Quote-to-Cash Product Configurator simplifies bundling.

Price: Discounting Made Simple

For many sales reps, “price” and “discounts” go hand in hand, so discounting is the natural next area on which to focus. Salesforce Quote-to-Cash makes it easy for reps to discount products in multiple ways, including the ability to:

  • Apply discounts automatically or manually
  • Discount by percentage or dollar amount
  • Set discounts on individual line items or at the quote level
  • Discount a quote to a specific amount


Discounts can easily be added to individual line items.

For example, say a customer only has a budget of $75,000 for their order this month, but their current subtotal sits at roughly $78,000.

Simply set the Target Customer Amount to $75,000, and discounts are automatically spread proportionally across the products to match the invoice total to the targeted amount.


Spread out discounting proportionally by setting the Target Customer Amount.
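
To see roughly how the proportional spread works: the gap between the $78,000 subtotal and the $75,000 target is $3,000, so each line item picks up an additional discount of about $3,000 ÷ $78,000 ≈ 3.85%. A $10,000 line item drops to roughly $9,615, a $20,000 line item to roughly $19,231, and the quote total lands on the target.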

Even better, the built-in calculator means sales reps don’t need to double-check math, and preset pricing controls limit how discounts are applied.

On the other hand, reps may not want to have to manually input discounts for every sale. The Discount Schedule tool allows you to establish and apply rules that trigger discounts by volume, terms, sales rep authorization levels, or other criteria.

For example, say your company sells subscription products that come with special, predetermined introductory offers. Simply select the quantity or subscription term length, and the discount is automatically calculated.

Quote: Combining the Product Configurator & Discounts to Optimize Quotes

Finally, the ability to create gorgeous quotes may be where Salesforce Quote-to-Cash shines brightest.

Instead of using the standard Quote object, Salesforce Quote-to-Cash comes with its own custom Quote, Quote Terms and Quote Document objects (along with others). The ability to combine these custom objects allows reps to create dynamic, accurate quotes every time, with far less effort.

For example:

  • Automatically update fields using data in Salesforce. If you’ve ever had to manually update fields in a Word-document-based quote, chances are you’ve at least once accidentally sent out a quote with the wrong company, contact name or price. Salesforce Quote-to-Cash eliminates this risk by pulling existing data into the quote.


Prepopulated fields make search-and-replace errors a thing of the past.

  • Update totals based on discounts applied. The Display Quote Line adjusts the pricing information displayed for selected products based on specified criteria. In the example below, had a manual discount been applied to any of the products shown, the Additional Discount Column would be displayed.
  • Select static or dynamic terms and conditions. Terms and Conditions can be configured to be permanent or customizable. Reps can modify terms as needed without affecting other quote records. Quote terms can also be configured to display differently for different products.
  • Enhance quotes with images and templates. Quote PDFs can be configured to show product images and virtually anything else that can be coded via HTML. Quote templates can also be preselected to avoid sales rep mistakes.
  • Sign quotes digitally. DocuSign or EchoSign allows customers to sign quotes digitally, and the status of signatures can be tracked directly in Salesforce.

Salesforce Quote-to-Cash: Unexpected Flexibility

In truth, these features are just the tip of the iceberg. Salesforce Quote-to-Cash can be used for order management, automated renewals, guided selling, forecasting, and quite a lot more.

But there’s no need to do it all at once. Start slowly with the basic CPQ functionalities we’ve outlined here, and introduce new features as your team gains experience and confidence. Before you know it, even your most complex deals will be a breeze to close.

Bonus Read: “3 Unexpected Advantages of Salesforce Quote-to-Cash”

Can Salesforce Quote-to-Cash help you automate your CPQ process? Contact a Bridgenext consultant today!

]]>