Data Cloud Consultant - Data Cloud Setup and Administration (12%)

Package an S3 Data Stream Process

- Navigate to "Setup" in your development org. - Search for "Package Manager" in Quick Find. - Click "New" to create a new package - Add a name, select the language, add an optional description, and then click "Save". - After the package is created, on the Components tab, click "Add". - From the Component Type dropdown menu, select "Data Stream Definition". - Select the correct data stream and then click "Add to Package". - From the same screen in Package Manager, click "Upload". - From the Package Details screen in Package Manager, add in a "Version Name" and "Version Number". - You can also fill out optional fields, like password protection. - When done, click "Upload".

Google Cloud Storage Connector Use Cases

- A Data Cloud segment of any customers who have browsed running shoes on the website in the past seven days but have not yet purchased anything. You may want to activate that audience for personalized omni-channel messaging that encourages them to complete the online purchase. - A Data Cloud segment of customers who have browsed hiking gear more than three times in the past X days in the San Francisco area. You can target that segment with an in-store promotion for hiking gear to drive them to convert in-store.

Metadata API Supported Data Cloud Items

- AWS Data Streams - Ingestion API Data Stream - Mobile and Web Data Streams - Data Lakes - Data Models

Ingestion Patterns

- Batch - Near Real-Time - Real-Time

Data Cloud Functionality for Standard Salesforce Package

- Calculated Insights: Calculated Insights are Data Cloud definitions and calculations that aid in segmentation. The SQL components can be included in a standard package. - S3 Data Streams: S3 data streams with associated mapping can be packaged for both standard and custom data models. - Ingestion API Data Streams: Ingestion API data streams with associated mapping can be packaged for both standard and custom data models. - Data Models: Custom data models can be included in a standard package.

Data Cloud Functionality for Data Kit Package

- Commerce and CRM Data Streams: Both CRM and Commerce Cloud data streams can be packaged within a data kit. - Data Model: If a data stream is added to a data kit, the data models that it's mapped to are automatically added and auto-populated to prepare for segmentation.

Data Cloud Permission Sets

- Data Cloud Admin - Data Cloud User

Available Data Cloud Record Components for Profile Explorer Record

- Data Cloud Detail Panel - Data Cloud Highlights Panel - Data Cloud Profile Related Records - Data Cloud Profile Engagements - Data Cloud Profile Insights

Segmentation and Activation Add-On License Permission Sets

- Data Cloud for Marketing Admin - Data Cloud for Marketing Data Aware Specialist - Data Cloud for Marketing Manager - Data Cloud for Marketing Specialist

New (Separate) Home Org for Data Cloud Considerations

- Data Cloud is designed to operate at a multi-tenant level, supporting multiple connected orgs. - You probably need to build custom LWCs in your data org to provide those users with Data Cloud data views. - Data Cloud can connect to multiple Salesforce core orgs, but only one Marketing Cloud Enterprise account (EID).

Data Cloud Processes Considerations

- Data Cloud processes are schedule-based. - Processes can be triggered even when not required. - No near real-time processing. - Long waiting time between processes.

Available Data Cloud Flow Actions

- Data Ingestion for CRM data stream. - Data Ingestion for S3 data stream. - Publish Calculated Insight. - Trigger Identity Resolution Job. - Publish Segments, materialize segments, and activate.

Data Cloud Objects Supported in Lightning Report Builder

- Data Stream - Segment - Activation Target - Identity Resolution

Existing Data Org for Data Cloud Considerations

- Data org migration and object model refactoring require Data Cloud rework or reimplementation. - Data Cloud still requires API access to sObjects from within this org because it replicates that data to the Data Lake. - A single Data Cloud instance can connect to multiple Salesforce Core orgs. - Data Cloud still separately persists ingested data "off core."

Data Cloud Org Provision Options

- Existing data org - New, separate home org

Org Types

- Home - External - Sandbox

Data Cloud Objects Supported in Flow Builder

- Identity Resolution - Calculated Insights - Data Streams - Segments - Activations

Data Kits and Packages Best Practices

- If you get an error that your package can't be installed, confirm your environment setup. You can't install a standard package in the same org you created it in.
- If you need to upgrade or make updates to a package regularly, create a managed package version. To reflect the latest updates, the new package version needs to be installed in the destination org.
- CRM and Commerce data kits can be packaged and reused multiple times in an org and can be mapped to multiple orgs.
- S3 and Ingestion API data streams packaged with standard packaging can only be packaged once, for use in one org.
- Calculated insights should be packaged and used in an org that already has a corresponding data model established and mapped. Without the required data models, installed calculated insights cannot be deployed.
- When new data model relationships are added, make sure to include the relationships in the package for the new version.
- Only Developer Edition orgs can create managed packages. In order to create a managed package, the user needs to first claim a namespace in the Package Manager UI.

Create Autolaunched Flow for Data Cloud Processes

- In Data Cloud, select 'Setup'.
- In the Quick Find box, enter Flows, and select 'Flows'.
- Click 'New Flow'.
- Select 'Autolaunched Flow (No Trigger)', and then click 'Create'.
- On the canvas, click 'Add Element', and then click 'Action'.
- In the left pane, click 'Data Cloud', and select a Data Cloud action.
- Enter the label, input values, and then click 'Done'.
- On the canvas, click 'Add Element', and then click 'Pause'. (When a flow pauses, it waits for one or more resume events. Resume the flow when a specified platform event is published.)
- Enter the label and wait configuration label.
- Go to the Resume Event tab and select 'A Platform Event Message is Received'.
- From the Platform Event dropdown, select an event, and then click 'Done'.
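
An autolaunched flow like the one above can also be started programmatically through the standard REST Invocable Actions endpoint. A minimal sketch in Python, assuming the requests library and an already-obtained OAuth access token; the instance URL, token, and flow API name (Data_Cloud_Orchestration) are placeholders for your own values:

    import requests

    INSTANCE_URL = "https://yourdomain.my.salesforce.com"  # placeholder: your My Domain URL
    ACCESS_TOKEN = "00D...token"                            # placeholder: OAuth access token
    FLOW_API_NAME = "Data_Cloud_Orchestration"              # placeholder: the flow's API name

    resp = requests.post(
        f"{INSTANCE_URL}/services/data/v59.0/actions/custom/flow/{FLOW_API_NAME}",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json={"inputs": [{}]},  # supply input variables here if the flow defines any
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # each element reports isSuccess and any flow output values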

Install a Data Kit Process

- Log in to the environment you wish to upload the Data Kit to.
- Open the Package Install URL in a new window.
- Complete the data stream deployment:
- Navigate to the "Data Streams" tab and create a data stream.
- Select "Salesforce CRM" and click "Next".
- Select "Data Kits" and choose your created Data Kit.
- Click "Next" to review and edit data fields.
- Select the Data Stream Definition from the package.
- Review details and click "Deploy".

Packaging Data Kits Benefits

- Makes recreating the data models between environments unnecessary - Streamlines the package creation process - Manages the metadata that needs to be packaged

Auditing and Ongoing Maintenance Activities

- Monitor usage entitlements - View identity resolution processing history - View record modification fields - View and monitor setup changes with Setup Audit Trail - View and monitor login history - Monitor account status on Salesforce Trust
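
Several of these activities can also be checked from the API. A minimal sketch, assuming the simple_salesforce Python library and valid admin credentials (username, password, and security token are placeholders); SetupAuditTrail and LoginHistory are the standard objects behind Setup Audit Trail and login history:

    from simple_salesforce import Salesforce

    sf = Salesforce(username="admin@example.com",  # placeholder credentials
                    password="password",
                    security_token="token")

    # Setup changes recorded by Setup Audit Trail over the last 7 days
    audit = sf.query(
        "SELECT Action, Section, CreatedDate, CreatedBy.Name "
        "FROM SetupAuditTrail WHERE CreatedDate = LAST_N_DAYS:7 "
        "ORDER BY CreatedDate DESC"
    )

    # Recent logins, including failed attempts
    logins = sf.query(
        "SELECT UserId, LoginTime, Status, SourceIp "
        "FROM LoginHistory WHERE LoginTime = LAST_N_DAYS:7"
    )

    for rec in audit["records"]:
        print(rec["CreatedDate"], rec["Section"], rec["Action"])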

New (Separate) Home Org for Data Cloud Scenarios

- Multiple Salesforce customer orgs exist. - Highly complex enterprise architecture exists. - The Data Cloud administration users are different from the Salesforce admin users. - The existing data org is highly customized.

Package a Calculated Insight Process

- Navigate to "Setup" in your development org. - Search for "Package Manager" in Quick Find. - Click "New" to create a new package - Add a name, select the language, add an optional description, and then click "Save". - After the package is created, on the Components tab, click "Add". - From the Component Type dropdown menu, select "Calculated Insight Object Definition". - Select your component and then click "Add to Package". - From the Package Details screen in Package Manager, add in a "Version Name" and "Version Number". - You can also fill out optional fields, like password protection. - When done, click "Upload". - Use the Installation URL to install the package in another org after it is uploaded.

Data Cloud Implementation Step: Select Data Bundles and Business Units

This is the fifth step in the Data Cloud Implementation steps:
Data Bundles:
- Click "Manage" to select data bundles to import into Data Cloud.
- You see three data bundle sources: Email, MobileConnect, and MobilePush. These sources are auto-selected for import, but only select the data bundles of the channels you currently use.
- Click "Start" to confirm the bundles for import.
Business Units:
- Click "Manage" to select which of your available Marketing Cloud business units to activate.
- Use the arrows to add or remove business units in the Selected Business Units field.
- Click "Save".

Identity Resolution Rulesets

These are a combination of match and reconciliation rules used to combine source records to resolve identity and create unified profiles.

Data Cloud Connector: Google Cloud Storage

This Data Cloud connector allows you to connect Google Analytics and Google BigQuery with Data Cloud, which can then be used for data ingestion. This allows you to enrich Data Cloud profiles with Google Analytics data.
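
Ingestion through this connector reads files from a Google Cloud Storage bucket, so the data (for example, exported Google Analytics or BigQuery extracts) has to be staged there first. A minimal sketch, assuming the google-cloud-storage Python library and application-default credentials; the bucket name, object path, and local file are placeholders:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("acme-datacloud-gcs")                  # bucket entered in Data Cloud Setup (placeholder)
    blob = bucket.blob("analytics/sessions_2024-05-01.csv")       # path the data stream points at (placeholder)
    blob.upload_from_filename("exports/sessions_2024-05-01.csv")  # local export file (placeholder)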

Data Cloud Connector: Salesforce CRM

This Data Cloud connector enables access to the Salesforce CRM data, including but not limited to Sales Cloud, Service Cloud, B2B Commerce, and Loyalty Management. It supports connections to the following org types: Home org, External orgs, and Sandbox orgs.

Data Cloud Connector: Amazon S3

This Data Cloud connector lets you ingest data from S3 buckets as well as activate data to S3.

Data Cloud Org Provision Option: New (Separate) Home Org

This Data Cloud org provision option refers to deploying a brand-new Salesforce org solely for housing Data Cloud.

Data Cloud Org Provision Option: Existing Data Org

This Data Cloud org provision option refers to provisioning Data Cloud inside of a Salesforce org currently used by a business.

Data Cloud Record Component for Profile Explorer Record: Data Cloud Detail Panel

This Data Cloud record component for a Profile Explorer record displays record details for a Data Cloud record.

Org Type: External

This org type refers to any Salesforce org other than the one where Data Cloud is installed. Customers may connect to any production external org, including other orgs where Data Cloud may be installed.

Org Type: Home

This Org type refers to where Data Cloud is installed. If the customer is using this org for Sales Cloud or Service Cloud or Loyalty Management, they may use the Salesforce CRM connector to ingest CRM data from within this org.

Package Type: Unmanaged

This Package Type: - Commonly used for one-time migration of metadata. - All components and attributes are editable. - Not upgradeable; versioning is not supported. - The developer has no control over the components once installed.

Package Type: Managed

This Package Type: - Typically used by ISVs on AppExchange to sell and distribute their apps. - Protects intellectual property of developer/ISV. - Is upgradeable and supports versioning. - To support upgrades, certain destructive changes are not allowed. - Contains a distinguishing namespace.

Salesforce CRM Connector Supported Connection: One-to-Many

This Salesforce CRM Connector supported connection refers to a single CRM instance connected to more than one Data Cloud instance. This is a good fit where data from a single CRM org needs to be segregated by region or sub-brand, or where data access restrictions implemented in CRM at the data (record) level must be maintained. Another example is adherence to the development lifecycle; in this case, development and production environments have to be separated.

Salesforce CRM Connector Supported Connection: One-to-One

This Salesforce CRM Connector supported connection refers to a single CRM instance connected to a single Data Cloud instance. A typical example of this would be a Production to Production environment connection. Another example is the connection to the home org when Data Cloud and Loyalty Management are deployed in the same instance.

Salesforce CRM Connector Supported Connection: Many-to-One

This Salesforce CRM Connector supported connection refers to more than one CRM instance being connected to a single Data Cloud instance. This configuration applies to scenarios where brand data is aggregated from multiple CRM instances into a single, consolidated view of the entire customer base within a single instance.

Auditing and Ongoing Maintenance Activity: Monitor Usage Entitlement

This auditing and ongoing maintenance activity involves monitoring your account for activities that impact your contract. These include: Unified profiles, segment publishes, and engagement events or records.

Auditing and Ongoing Maintenance Activity: View Identity Resolution Processing History

This auditing and ongoing maintenance activity involves visiting the Identity Resolution page to check the processing history and consolidation rate over time for your resolution rules.

Workflow Orchestration

This enables Data Cloud Admins to define more granular, connected workflows with more flexible execution schedules. You define workflows and chain together Data Cloud processes, such as ingestion (Salesforce CRM and Amazon S3 data streams), segments, and activation. You run processes in a sequence as needed instead of waiting to run them at a scheduled time. For example, you chain the processes to refresh calculated insights or run segmentation when data ingestion is completed. You define workflows to monitor orchestration flow runs or check their progress. Use Flow Builder to orchestrate Data Cloud workflow processes. With Flow Builder, you build complex enterprise-scale automation with automated triggers, reusable building blocks, and prebuilt solutions.
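
When an orchestration flow pauses on "A Platform Event Message is Received" (see the autolaunched flow process above), publishing that platform event resumes it. A minimal sketch using the standard sObject REST endpoint, assuming a custom platform event named Ingestion_Complete__e with a text field Stream_Name__c; the event, field, instance URL, and token are all placeholders:

    import requests

    INSTANCE_URL = "https://yourdomain.my.salesforce.com"  # placeholder
    ACCESS_TOKEN = "00D...token"                            # placeholder

    resp = requests.post(
        f"{INSTANCE_URL}/services/data/v59.0/sobjects/Ingestion_Complete__e/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json={"Stream_Name__c": "S3_Orders_Stream"},  # placeholder payload field
        timeout=30,
    )
    resp.raise_for_status()  # a 201 response means the event was published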

Ingestion Pattern: Real-Time

This ingestion pattern is used by the Web & Mobile Connector and processes engagement data every 2 minutes.

Ingestion Pattern: Near Real-Time

This ingestion pattern is often used by APIs and processes small micro-batches of records every 15 minutes.
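
A micro-batch like this is typically pushed over HTTPS to the Ingestion API's streaming endpoint. A minimal sketch in Python with the requests library; the tenant endpoint, Data Cloud access token, connector API name (web_orders), object name (order_event), and field names are all placeholders, and the exact path should be confirmed against your Ingestion API connector's configuration:

    import requests

    TENANT_ENDPOINT = "https://your-tenant.c360a.salesforce.com"  # placeholder tenant-specific endpoint
    ACCESS_TOKEN = "eyJ...data_cloud_token"                        # placeholder Data Cloud token

    records = [  # placeholder records matching the connector's schema
        {"order_id": "1001", "email": "pat@example.com", "amount": 59.90},
        {"order_id": "1002", "email": "sam@example.com", "amount": 24.50},
    ]

    resp = requests.post(
        f"{TENANT_ENDPOINT}/api/v1/ingest/sources/web_orders/order_event",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json={"data": records},
        timeout=30,
    )
    resp.raise_for_status()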

Ingestion Pattern: Batch

This ingestion pattern is used by the CRM Connector and Marketing Cloud. It updates data on an hourly basis.

Calculated Insight Object (CIO)

This is a data model object created after a calculated insight is processed. It can be viewed in Data Explorer.

Data Stream

This is a data source brought into Data Cloud, for example, a Marketing Cloud customer data extension. It can be based on batch or real-time data.

Data Model Objects (DMOs)

This is a grouping of data (made up of attributes) created from data streams, insights, and other sources. They can be standard or can be custom, based on business need. Common standard ones include sales orders, account, party identification, email engagement, and so on. They can be virtual (a view into a data lake object) or physical (an actual grouping of data found in Data Cloud).

Installed Package

This is a managed or unmanaged package that has been uploaded into an org from another org or from AppExchange. It is a collection of components and applications that are made available to other organizations through AppExchange or in another instance of Salesforce.

Data Kit

This is a package created within Data Cloud. It allows you to streamline the package creation and installation process. Data Cloud objects, such as metadata, relationships, and other components, can be wrapped together with a few easy clicks.

Data Lake Object (DLO)

This is a storage container within the data lake for data ingested through data streams into Data Cloud.

Data Explorer

This is a tool in Data Cloud that allows users to view and validate the data from a Data Model Object (DMO), Data Lake Object (DLO), or Calculated Insight Object (CIO).

Calculated Insights

This is a tool in Data Cloud that queries, transforms, and creates complex calculations based on stored data.
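
A Calculated Insight is defined with ANSI SQL over data model objects, pairing at least one aggregated measure with at least one dimension. A minimal sketch of that shape, held here as a Python string; the DMO and field API names (SalesOrder__dlm, SoldToCustomerId__c, TotalAmount__c) are illustrative assumptions, not the exact names in any given org:

    # Illustrative Calculated Insight SQL: lifetime spend (measure) per customer (dimension).
    CALCULATED_INSIGHT_SQL = """
    SELECT
        SalesOrder__dlm.SoldToCustomerId__c AS customer_id__c,    -- dimension
        SUM(SalesOrder__dlm.TotalAmount__c) AS lifetime_spend__c  -- measure
    FROM SalesOrder__dlm
    GROUP BY customer_id__c
    """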

Package Manager

This is a tool in Salesforce Setup that packages components.

Profile Explorer

This is a tool within Data Cloud in which a user can view Unified Individual profiles.

Data Model

This is a way to organize and standardize data elements, and to add and edit data relationships. This is also a tab within Data Cloud where a user can create custom data model objects and view existing data model objects.

Metadata

This is data that describes other data. In Data Cloud, this relates to the fields, configurations, and code that make up your environment. It can be imported into other instances of Salesforce, modified in the product interface or edited via the API.

Data Cloud Implementation Step: Update Your Admin User Process

This is the first step in the Data Cloud Implementation steps:
- Log in to your Data Cloud instance with the link provided in your admin email.
- Reset your password when prompted on-screen.
- Navigate to Setup from the dropdown.
- Type 'users' into the Quick Find field.
- Click "Users".
- From the user screen, click your username.
- From your User page, under Permission Set Assignments, click "Edit Assignments".
- Select the "Data Cloud Admin" permission set and click the "Add" arrow icon.
- Click "Save".

Data Cloud Implementation Step: Connect to Marketing Cloud

This is the fourth step in the Data Cloud Implementation steps:
- Navigate to the Setup gear and click "Data Cloud Setup".
- Under Configuration, click "Marketing Cloud".
- To connect your Marketing Cloud account, click "Manage".
- Enter your Marketing Cloud admin username and password.

Data Mapping

This is the process of associating data lake objects (DLOs) to data model objects (DMOs) after data has been ingested into Data Cloud. Only mapped fields and objects with relationships can be used for segmentation and activation.

Data Cloud Implementation Step: Provision Data Platform Process

This is the second step in the Data Cloud Implementation steps:
- Navigate back to the Setup gear.
- Click "Data Cloud Setup".
- Click the "Get Started" button.
- Wait until you see the green success message.

Data Cloud Implementation Step: Connect to Sales or Service Cloud

This is the sixth step in the Data Cloud Implementation steps:
- Under the Setup gear, select "Data Cloud Setup".
- Under Configuration, select "Salesforce CRM".
- Click "Connect".
- You have the option to click "Connect" for the Salesforce org where Data Cloud is provisioned. You can also click "Connect" next to Connect Another Org.
- Enter your user credentials to establish the connection with Data Cloud.

Data Cloud Permission Set: Data Cloud Admin

Users with this Data Cloud permission set can access all functionality within Data Cloud, including mapping data to the data model and creating data streams, identity resolution rulesets, and calculated insights. To manage and assign users in Setup and access Data Cloud Setup, you must have this permission set and have a Salesforce administrator role that grants access to Salesforce Setup. If you have access to Salesforce Setup, you can set up the application, and access Salesforce Sales and Service clouds and other integrated Salesforce systems.

Data Cloud Permission Set: Data Cloud User

Users with this Data Cloud permission set can view Data Cloud features.

Segmentation and Activation Add-On License Permission Set: Data Cloud for Marketing Data Aware Specialist

Users with this Segmentation and Activation Add-On License Permission Set can map data to the data model and create data streams, identity resolution rulesets, and calculated insights.

Segmentation and Activation Add-On License Permission Set: Data Cloud for Marketing Admin

Users with this Segmentation and Activation Add-On License Permission set can manage day-to-day configuration needs, support, maintenance, and improvement and perform regular internal system audits. To manage and assign users in Setup and access Data Cloud Setup, you must be a Data Cloud Admin and have a Salesforce administrator role that grants access to Salesforce Setup. If you have access to Salesforce Setup, you can set up the application, and access Salesforce Sales and Service clouds and other integrated Salesforce systems.

Segmentation and Activation Add-On License Permission Set: Data Cloud for Marketing Specialist

Users with this Segmentation and Activation Add-On License permission set can create segments.

Segmentation and Activation Add-On License Permission Set: Data Cloud for Marketing Manager

Users with this Segmentation and Activation Add-On License permission set can manage an overall segmentation strategy, including creating activation targets and activations.

Data Cloud for Marketing Data Aware Specialist Features Access

Full Access:
- Data Space Data Addition
- Data Streams
- Datashares
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Data Explorer
- Profile Explorer
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
View Only:
- Segments
- Activation and Activation Targets
Access Denied:
- Data Cloud Setup
- Data Space Management

Data Cloud User Features Access

View Only:
- Data Streams
- Datashares
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Data Explorer
- Profile Explorer
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
Access Denied:
- Data Cloud Setup
- Data Space Management
- Data Space Data Addition
- Segments
- Activation and Activation Targets

Create a Data Kit Process

- Navigate to Data Cloud Setup.
- Under "Tools", click "Data Kits".
- Click "New".
- Give your Data Kit a Name (required) and a Description (optional) and click "Save".
- To add Data Streams, click the "Add" button in the Data Streams section.
- Provide a "Bundle Name" for your Data Streams.
- Select the Data Streams you wish to add to the package and click "Next".
- All related objects and relationships with the data stream are included in the bundle automatically.
- Click "Save".
- Navigate to Salesforce Setup.
- Navigate to "Package Manager".
- Click "New".
- Add a "Package Name", choose a "Language", provide a "Description" (optional), and click "Save".
- From the "Components" tab, click "Add".
- Select "Data Kit Definition" from the Component Type dropdown menu.
- Click "Add to Package".
- Click "Upload".
- Add "Version Name" and "Version Number".
- Click "Upload".

Install Calculated Insights from a Package Process

- Navigate to the "Calculated Insights" tab in Data Cloud then click "New". - Select "Create from Package" and then "Next". - Choose from your installed packages and then click "Next".

Create an S3 Data Stream from a Package Process

- Navigate to the "Data Streams" tab in Data Cloud then click "New". - Select "Installed Packages" from the Other Sources section and click "Next". - Select the correct data stream package and select the data stream you want to install, then click "Next". - Review the information carefully and add formula fields, as needed by click the "New Formula Field" button. - Enter a name for your data stream, the required S3 information and S3 authentication details, and a schedule frequency. - Click "Deploy".

Salesforce CRM Connector Supported Connections

- One-to-One - One-to-Many - Many-to-One

Amazon S3 Connector Considerations

- Rather than having an administrator configure the connector within Data Cloud's setup, these connections for data ingestion are configured individually at the data stream level. - A single Data Cloud account can connect to multiple S3 buckets if needed. - These connections can be made by any users with access to create a Data Source, such as a Data Aware Specialist. - Connection information must be provided each time a new Data Stream is created.
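
Because the connection details live on the data stream, the ongoing work is mostly about landing files in the right bucket and prefix. A minimal sketch of staging a CSV for an S3 data stream, assuming the boto3 Python library and AWS credentials configured in the environment; the bucket, key, and local file names are placeholders:

    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        Filename="exports/customers_2024-05-01.csv",  # local file to stage (placeholder)
        Bucket="acme-datacloud-ingest",               # bucket configured on the data stream (placeholder)
        Key="customers/customers_2024-05-01.csv",     # folder/prefix the data stream reads (placeholder)
    )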

Workflow Orchestration Use Cases

- Reward customers who are impacted by a technical failure by depositing loyalty points to their accounts and sending an email. Kick off Calculated Insights once data is ingested and, once Calculated Insights has completed, kick off segmentation and activation. - Kick off the Identity Resolution job only after data ingestion for one or more data streams is complete. Kick off segmentation and activation only once the Identity Resolution job is complete.

Data Cloud Connectors

- Salesforce CRM - Amazon S3 - Google Cloud Storage

Pre-Built Data Cloud Connectors

- Salesforce Clouds, such as CRM, Marketing Cloud, B2C Commerce, and Marketing Cloud Personalization - External sources, such as external file storage (Google Cloud Storage, Amazon S3). - API and mobile connectors, such as Ingestion API, Web, and Mobile SDK.

Data Cloud Initial Setup Process Overview

- Set up your Data Cloud account. - Configure additional users by creating profiles. - Set up connectors to connect data sources.

Data Cloud Data Considerations

- Sharing rules and other data restrictions in Core CRM DO NOT apply to data stored in Data Cloud. - Data Cloud data is not stored as sObjects, but in the Data Lake outside of Core CRM.

Existing Data Org for Data Cloud Scenarios

- The customer has a single line of business. - Customer data is housed in a single Salesforce org. - Primary use cases require OOTB Data Cloud LWCs and search capabilities for service agents.

Data Cloud Analytics Consideration

- The data that lives in Data Cloud itself is in the data lake, so at this point it's not possible to use standard Salesforce reports and dashboards in CRM directly on top of it. - To create a report on a Data Cloud object, you need to configure a custom report type.

Package Types

- Unmanaged - Managed

Unmanaged Packages Considerations

- Can't be upgraded, but all the contents are editable after installation. - To replace installed contents with a new version, uninstall and then reinstall the newer unmanaged package. - Can be created in any Salesforce edition (Enterprise, Unlimited, Performance, and Developer). - If using namespaces, avoid creating unmanaged packages in the same Developer Edition org.

Managed Packages Considerations

- Comes with a namespace that makes the content unique to the developer. - Supports versioning and push upgrades. - Should be used if you plan to list on AppExchange.

Set Up Data Cloud Consideration

A user must have the Data Cloud Admin or Data Cloud for Marketing Admin permission set in order to set up Data Cloud.

Google Cloud Connector Implementation Steps

Create a Connection:
- Click Data Cloud Setup.
- Click Google Cloud Storage in the left navigation.
- Enter the bucket and connection details.
- Click "Save".
Create a Data Stream:
- Select the connection.
- Select the file that needs to be ingested.
- Check all the file header and primary key details.
- Click "Save".
Monitor:
- Go to the Data Streams tab and check the status.

Data Cloud for Marketing Admin Features Access

Full Access:
- Data Cloud Setup (If the user is also a Salesforce admin)
- Data Space Management
- Data Space Data Addition
- Data Streams
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Data Explorer
- Profile Explorer
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
- Segments
- Activation and Activation Targets
View Only:
- Datashares

Data Cloud Admin Features Access

Full Access:
- Data Cloud Setup (If the user is also a Salesforce admin)
- Data Space Management
- Data Space Data Addition
- Data Streams
- Datashares
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Data Explorer
- Profile Explorer
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
Access Denied:
- Segments
- Activation and Activation Targets

Data Cloud for Marketing Specialist Features Access

Full Access:
- Segments
View Only:
- Data Streams
- Datashares
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
- Activation and Activation Targets
Access Denied:
- Data Cloud Setup
- Data Space Management
- Data Space Data Addition
- Data Explorer
- Profile Explorer

Data Cloud for Marketing Manager Features Access

Full Access:
- Segments
- Activation and Activation Targets
View Only:
- Data Streams
- Datashares
- Data Lake Objects
- Data Transforms
- Data Model
- Identity Resolution
- Calculated Insights
- Einstein Studio Bring Your Own Model
- Data Action and Data Action Targets
Access Denied:
- Data Cloud Setup
- Data Space Management
- Data Space Data Addition
- Data Explorer
- Profile Explorer

Data Cloud Implementation Steps

Step 1: Update your admin user.
Step 2: Provision Data Cloud.
Step 3: Create profiles, users, and add permission sets.
Step 4: Connect to Marketing Cloud.
Step 5: Select appropriate data bundles and business units in Marketing Cloud.
Step 6: Connect to Sales or Service Cloud.
Step 7: Prepare for ongoing tasks and maintenance.

Data Cloud Implementation Step: Add Users and Permission Sets Process

This is the third step in the Data Cloud Implementation steps:
Create User:
- In Setup, type 'users' into the Quick Find field.
- Click "Users".
- From the User screen, click "New User".
- Fill out the required information for your user:
  - General Information: Last Name, Alias (auto-generated), Username (must be in email format), Nickname, Role (select or keep as None Specified), User License (Salesforce), Profile (select Standard User or a custom profile), Email Encoding (defaults to Unicode UTF-8).
  - Locale Settings: Time Zone, Locale, Language.
  - Approver Settings: Receive Approval Request Emails (select from displayed options).
- Review your work and, once done, click "Save".
Permission Sets:
- From the user screen, under "Permission Set Assignments", click "Edit Assignments".
- Assign the appropriate permission set to that user.
- Click "Save".
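
Permission set assignment can also be scripted when many users need the same access. A minimal sketch with the simple_salesforce Python library, assuming valid admin credentials and that the permission set's label in your org is "Data Cloud Admin"; the usernames are placeholders:

    from simple_salesforce import Salesforce

    sf = Salesforce(username="admin@example.com", password="password",
                    security_token="token")  # placeholder credentials

    user = sf.query(
        "SELECT Id FROM User WHERE Username = 'new.user@example.com'"
    )["records"][0]
    perm = sf.query(
        "SELECT Id FROM PermissionSet WHERE Label = 'Data Cloud Admin'"
    )["records"][0]

    # Assigning the permission set is just creating a PermissionSetAssignment record.
    sf.PermissionSetAssignment.create({
        "AssigneeId": user["Id"],
        "PermissionSetId": perm["Id"],
    })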

Metadata API

The main purpose of this is to move metadata between Salesforce orgs during the development process. Use this to deploy, retrieve, create, update, or delete customization information, such as custom object definitions and page layouts. It doesn't work directly with business data.
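
A typical use is retrieving or deploying packaged Data Cloud components with a package.xml manifest. A minimal sketch that writes such a manifest from Python; the member name S3_Orders_Stream is a placeholder, and the metadata type name DataStreamDefinition is an assumption based on the "Data Stream Definition" component type described above:

    # Write a package.xml manifest for a Metadata API retrieve/deploy.
    PACKAGE_XML = """<?xml version="1.0" encoding="UTF-8"?>
    <Package xmlns="http://soap.sforce.com/2006/04/metadata">
        <types>
            <members>S3_Orders_Stream</members>
            <name>DataStreamDefinition</name>
        </types>
        <version>59.0</version>
    </Package>
    """

    with open("package.xml", "w", encoding="utf-8") as f:
        f.write(PACKAGE_XML)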

Functional Domain

This refers to Salesforce's public cloud infrastructure where your instance is located. This information is helpful to provide when troubleshooting any potential issues with support.

Identity Resolution

This refers to the process of identity management by means of matching and reconciling data about people into a comprehensive view called unified profiles. This is powered by rulesets and creates unified and link objects.

Data Deletion Request

This request deletes the specified Individual entity and any entities where a relationship has been defined between that entity's identifying attribute and the Individual ID attribute.

Restriction of Processing Requests

This request restricts all data processing for the specified Individual and Unified Individual profiles within 24 hours.

Data Access and Export Requests

This request triggers the export of all data stored within Data Cloud for the specified Individual profiles. This export is published as a CSV file to your defined Amazon S3 bucket within 15 days.
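
Once the export lands in the bucket, it can be picked up and reviewed programmatically. A minimal sketch, assuming the boto3 Python library and AWS credentials; the bucket name and object key are placeholders for your configured export location:

    import csv
    import io

    import boto3

    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="acme-dsr-exports",           # placeholder bucket
                        Key="exports/individual_12345.csv")  # placeholder export file
    text = obj["Body"].read().decode("utf-8")
    for row in csv.DictReader(io.StringIO(text)):
        print(row)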

