Data Architect
What makes Skinny tables fast? Choose three answers. A . They do not include soft-deleted records B . They avoid resource-intensive joins C . Their tables are kept in sync with their source tables when the source tables are modified D . They can contain fields from other objects E . They support a maximum of 100 columns
A B C
NTO uses Salesforce to manage relationships and track sales opportunities. It has 10 million customers and 100 million opportunities. The CEO has been complaining that the dashboards take 10 minutes to run and sometimes fail to load, throwing a time-out error. Which three options should help improve the dashboard performance? Choose 3 answers: A . Use selective queries to reduce the amount of data being returned. B . De-normalize the data by reducing the number of joins. C . Remove widgets from the dashboard to reduce the number of graphics loaded. D . Run the dashboard for the CEO and send it via email. E . Reduce the amount of data queried by archiving unused opportunity records.
A B E
Universal Containers (UC) is implementing a formal, cross-business-unit data governance program. As part of the program, UC will implement a team to make decisions on enterprise-wide data governance. Which two roles are appropriate as members of this team? Choose 2 answers A . Analytics/BI Owners B . Data Domain Stewards C . Salesforce Administrators D . Operational Data Users
A . Analytics/BI Owners B . Data Domain Stewards
Northern Trail Outfitters (NTO) has multiple Salesforce orgs based on geographical regions (AMER, EMEA, APAC). NTO products are created in the AMER org and need to be created in the EMEA and APAC orgs after the products are approved. Which two features should a data architect recommend to share records between Salesforce orgs? Choose 2. A . Change Data Capture (CDC) B . Salesforce Connect. C . Federated Search D . Salesforce to Salesforce
A . Change Data Capture (CDC) D . Salesforce to Salesforce
UC has a large volume of orders coming in from its online portal. Historically, all orders are assigned to a generic user. Which two measures should a data architect recommend to avoid any performance issues while working with a large number of order records? Choose 2 answers: A . Clear the role field in the generic user record. B . Salesforce handles the assignment of orders automatically and there is no performance impact. C . Create a role at the top of the role hierarchy and assign the role to the generic user. D . Create a pool of generic users and distribute the assignment of orders to the pool of users.
A . Clear the role field in the generic user record. C . Create a role at top of role hierarchy and assign the role to the generic user.
Which two aspects of data does an Enterprise data governance program aim to improve? A . Data integrity B . Data distribution C . Data usability D . Data modeling
A . Data integrity C . Data usability
Universal Containers has a public website with several forms that create Lead records in Salesforce using the REST API. When designing these forms, which two techniques will help maintain a high level of data quality? A . Do client-side validation of phone number and email field formats. B . Prefer picklist form fields over free text fields, where possible. C . Ensure the website visitor is browsing using an HTTPS connection. D . Use cookies to track when visitors submit multiple forms.
A . Do client-side validation of phone number and email field formats. B . Prefer picklist form fields over free text fields, where possible.
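As an illustration of options A and B, the kind of client-side check these answers describe can be sketched in Python before the form data is POSTed to the REST API. The field names, regex patterns, and picklist values here are hypothetical stand-ins, not taken from the question:

```python
import re

# Illustrative patterns for option A's format validation (assumptions, not Salesforce rules).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
PHONE_RE = re.compile(r"^\+?[0-9][0-9\-\s()]{6,19}$")

# Option B: constrain free text to a picklist of known values (hypothetical list).
LEAD_SOURCES = {"Web", "Phone Inquiry", "Partner Referral", "Other"}

def validate_lead_form(form):
    """Return a list of field-level errors; an empty list means the Lead is clean."""
    errors = []
    if not EMAIL_RE.match(form.get("Email", "")):
        errors.append("Email: invalid format")
    if not PHONE_RE.match(form.get("Phone", "")):
        errors.append("Phone: invalid format")
    if form.get("LeadSource") not in LEAD_SOURCES:
        errors.append("LeadSource: not a valid picklist value")
    return errors
```

Running these checks in the browser (or web tier) keeps malformed phone/email values and free-text noise out of the Lead records the REST API creates.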
Universal Containers (UC) owns a complex Salesforce org with many Apex classes, triggers, and automated processes that modify records. UC has identified that, in its current development state, it runs the risk of encountering race conditions on the same record. What should a data architect recommend to guarantee that records are not being updated at the same time? A . Embed the keywords FOR UPDATE after SOQL statements. B . Disable classes or triggers that have the potential to update the same record. C . Migrate programmatic logic to processes and flows. D . Refactor or optimize classes and triggers for maximum CPU performance.
A . Embed the keywords FOR UPDATE after SOQL statements.
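The record-locking idea behind SOQL's FOR UPDATE, serializing writers so two transactions cannot do a read-modify-write on the same record concurrently, can be sketched with an in-process lock. This is a Python analogy for the concept, not Apex:

```python
import threading

class RecordStore:
    """Toy record whose updates take a lock, the way SELECT ... FOR UPDATE
    pins a row until the transaction completes."""
    def __init__(self):
        self.amount = 0
        self._lock = threading.Lock()

    def add(self, delta):
        with self._lock:  # analogue of locking the record FOR UPDATE
            current = self.amount          # read
            self.amount = current + delta  # modify-write, safe from races

def run(store, n_threads=8, per_thread=1000):
    """Hammer the record from several threads; the lock prevents lost updates."""
    threads = [
        threading.Thread(target=lambda: [store.add(1) for _ in range(per_thread)])
        for _ in range(n_threads)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return store.amount
```

Without the lock, two writers could both read the same `current` value and one increment would be lost, which is exactly the race condition the question describes.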
Universal Containers is experiencing frequent and persistent group membership locking issues that severely restrict its ability to manage manual and automated updates at the same time. What should a data architect do to resolve the issue? A . Enable granular locking B . Enable parallel sharing rule calculation. C . Enable defer sharing calculation D . Enable implicit sharing
A . Enable granular locking
Each contact may attend multiple conferences and each conference may be related to multiple contacts. How should a data architect model the relationship between the Contact and Conference objects? A . Implement a ContactConference junction object with master-detail relationships to both Contact and Conference__c. B . Create a master-detail relationship field on the Contact object. C . Create a master-detail relationship field on the Conference object. D . Create a lookup relationship field on the Contact object.
A . Implement a ContactConference junction object with master-detail relationships to both Contact and Conference__c.
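The many-to-many shape behind option A can be illustrated outside of Salesforce as a junction of (contact, conference) pairs, where each pair is one row of the junction object. This Python model is purely illustrative, not Salesforce metadata:

```python
class ContactConferenceJunction:
    """Junction 'object' modeled as a set of (contact_id, conference_id) pairs;
    each pair is one junction record linking a Contact to a Conference__c."""
    def __init__(self):
        self.pairs = set()

    def attend(self, contact_id, conference_id):
        self.pairs.add((contact_id, conference_id))

    def conferences_for(self, contact_id):
        """All conferences a given contact attends (one side of the M:N)."""
        return {conf for c, conf in self.pairs if c == contact_id}

    def contacts_for(self, conference_id):
        """All contacts related to a given conference (the other side)."""
        return {c for c, conf in self.pairs if conf == conference_id}
```

A single lookup or master-detail field (options B, C, D) can only hold one parent per record, which is why only the junction object supports this shape.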
DreamHouse Realty has a data model as shown in the image. The Project object has a private sharing model, and it has Roll-Up Summary fields to calculate the number of resources assigned to the project, total hours for the project, and the number of work items associated with the project. There will be a large number of time entry records to be loaded regularly from an external system into Salesforce. What should the Architect consider in this situation? A . Load all data after deferring sharing calculations. B . Calculate summary values instead of Roll-Up by using workflow. C . Calculate summary values instead of Roll-Up by using triggers. D . Load all data using external IDs to link to parent records.
A . Load all data after deferring sharing calculations.
A company has 12 million records, and a nightly integration queries these records. Which two areas should a Data Architect investigate during troubleshooting if queries are timing out? (Choose two.) A . Make sure the query doesn't contain NULL in any filter criteria. B . Create a formula field instead of having multiple filter criteria. C . Create custom indexes on the fields used in the filter criteria. D . Modify the integration users' profile to have View All Data.
A . Make sure the query doesn't contain NULL in any filter criteria. C . Create custom indexes on the fields used in the filter criteria.
A data architect has been tasked with optimizing a data stewardship engagement for a Salesforce instance. Which three areas of Salesforce should the architect review before proposing any design recommendation? Choose 3 answers A . Review the metadata xml files for redundant fields to consolidate. B . Determine if any integration points create records in Salesforce. C . Run key reports to determine what fields should be required. D . Export the setup audit trail to review what fields are being used. E . Review the sharing model to determine impact on duplicate records.
A . Review the metadata xml files for redundant fields to consolidate. C . Run key reports to determine what fields should be required. E . Review the sharing model to determine impact on duplicate records.
Universal Containers (UC) is using Salesforce Sales & Service Cloud for B2C sales and customer service, but they are experiencing a lot of duplicate customers in the system. What are two recommended approaches for UC to avoid duplicate data and increase the level of data quality? A . Use Duplicate Management. B . Use an Enterprise Service Bus. C . Use Data.com Clean D . Use a data warehouse.
A . Use Duplicate Management. C . Use Data.com Clean
NTO has been using Salesforce for sales and service for 10 years. For the past 2 years, the marketing group has noticed a rise from 0 to 35% in returned mail when sending mail using the contact information stored in Salesforce. Which solution should the data architect use to reduce the amount of returned mail? A . Use a 3rd-party data source to update contact information in Salesforce. B . Email all customers and ask them to verify their information and to call NTO if their address is incorrect. C . Delete contacts when the mail is returned to save postal costs to NTO. D . Have the sales team call all existing customers and ask them to verify the contact details.
A . Use a 3rd party data source to update contact information in salesforce.
Universal Containers (UC) wants to ensure their data on 100,000 Accounts pertaining mostly to US-based companies is enriched and cleansed on an ongoing basis. UC is looking for a solution that allows easy monitoring of key data quality metrics. What should be the recommended solution to meet this requirement? A . Use a declarative approach by installing and configuring Data.com Clean to monitor Account data quality. B . Implement Batch Apex that calls out a third-party data quality API in order to monitor Account data quality. C . Use declarative approach by installing and configuring Data.com Prospector to monitor Account data quality. D . Implement an Apex Trigger on Account that queries a third-party data quality API to monitor Account data quality.
A . Use a declarative approach by installing and configuring Data.com Clean to monitor Account data quality.
Universal Containers requires all customers to provide either a phone number or an email address when registering for an account. What should the data architect use to ensure this requirement is met? A . Validation Rule B . Required Fields C . Apex Class D . Process Builder
A . Validation Rule
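The validation-rule logic behind option A, roughly `ISBLANK(Phone) && ISBLANK(Email)` blocking the save, can be sketched in Python. Field names are illustrative; the point is that the save is rejected only when BOTH identifiers are missing:

```python
def validate_registration(record):
    """Mimic a validation rule like ISBLANK(Phone) && ISBLANK(Email):
    return an error message when both fields are blank, else None."""
    phone = (record.get("Phone") or "").strip()
    email = (record.get("Email") or "").strip()
    if not phone and not email:
        return "Please provide either a phone number or an email address."
    return None  # record passes validation
```

Marking either field "required" at the field level (option B) cannot express this either/or condition, which is why a validation rule fits.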
Universal Containers (UC) has a variety of systems across its technology landscape, including Salesforce, legacy enterprise resource planning (ERP) applications, and homegrown CRM tools. UC has decided that they would like to consolidate all customer, opportunity, and order data into Salesforce as part of its master data management strategy. What are the three key steps that a data architect should take when merging data from multiple systems into Salesforce? (Choose three.) A. Analyze each system's data model and perform gap analysis B. Create new fields to store additional values from all of the systems C. Install a third-party AppExchange tool to handle the merger D. Utilize an ETL tool to migrate, transform, and deduplicate data E. Work with stakeholders to define record and field survivorship rules
A. Analyze each system's data model and perform gap analysis D. Utilize an ETL tool to migrate, transform, and deduplicate data E. Work with stakeholders to define record and field survivorship rules
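Option E's record and field survivorship rules can be illustrated as a deterministic merge. The rule used here, "most recently modified non-blank value wins", and the field names are illustrative assumptions; real programs define survivorship per field with stakeholders:

```python
def merge_records(records):
    """Merge duplicate customer records from multiple systems: for each field,
    keep the non-blank value from the most recently modified source record
    (one common survivorship rule; chosen here for illustration)."""
    merged = {}
    for rec in sorted(records, key=lambda r: r["last_modified"]):
        for field, value in rec.items():
            if field != "last_modified" and value not in (None, ""):
                merged[field] = value  # later (newer) records overwrite earlier ones
    return merged
```

An ETL tool (option D) typically applies rules like this during deduplication so that the surviving Salesforce record carries the best value from each source.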
Which two data management policies does the Data Classification feature allow customers to classify in Salesforce? (Choose two.) A. Data sensitivity policy B. Compliance categorization policy C. Reference data policy D. Data governance policy
A. Data sensitivity policy B. Compliance categorization policy
Universal Containers (UC) is migrating data from a legacy system to Salesforce. UC would like to preserve the following information on records being migrated: 1. Date time stamps for created date and last modified date. 2. Ownership of records belonging to inactive users being migrated to Salesforce. Which two solutions should a data architect recommend to preserve the date timestamps and ownership on records? (Choose two.) A. Enable Update Records with Inactive Owners Permission B. Enable modify all and view all permission C. Enable Set Audit Fields upon Record Creation Permission D. Log a case with Salesforce to allow updating these fields
A. Enable Update Records with Inactive Owners Permission C. Enable Set Audit Fields upon Record Creation Permission
Northern Trail Outfitters (NTO) processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce. The sales team at NTO is using Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization. Which three key considerations are needed prior to a Salesforce Connect implementation? (Choose three.) A. Identify the external tables to sync into external objects. B. Assess whether the external data source is reachable via an OData endpoint. C. Configure a middleware tool to poll external table data. D. Develop an object relationship strategy. E. Create a second system admin user for authentication to the external source.
A. Identify the external tables to sync into external objects. B. Assess whether the external data source is reachable via an OData endpoint. D. Develop an object relationship strategy.
Universal Containers uses Classic Encryption for custom fields and is leveraging the weekly data export for data backups. During validation of the exported data, UC discovered that encrypted field values are still being exported as part of the data export. What should a data architect recommend to make sure decrypted values are exported during data export? A. Set up a custom profile for the data migration user, and assign View Encrypted Data B. Set up a standard profile for the data migration user, and assign View Encrypted Data C. Create another field to copy data from the encrypted field, and use this field in the export D. Leverage an Apex class to decrypt data before exporting it
A. Set up a custom profile for data migration user, and assign View Encrypted Data
Universal Containers (UC) recently migrated 1 billion customer-related records from a legacy datastore to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents are able to support customers directly within the service console. The remaining non-synchronized set of data will need to be accessed by Salesforce at any point in time, but UC management is concerned about storage limitations. What should a data architect recommend to meet these requirements with minimal effort? A. Use Heroku Connect to bidirectionally sync all data between systems B. As needed, make callouts into Heroku Postgres and persist the data in Salesforce C. Virtualize the remaining set of data with Salesforce Connect and external objects D. Migrate the data to big objects and leverage Async SOQL with custom objects
A. Use Heroku Connect to bidirectionally sync all data between systems
An architect has been asked to provide error messages when a future date is detected in a custom Birthdate__c field on the Contact object. The client wants the ability to translate the error messages. What are two approaches the architect should use to achieve this solution? Choose 2 answers A . Implement a third-party validation process with translate functionality. B . Create a trigger on Contact and add an error to the record with a custom label. C . Create a workflow field update to set the standard ErrorMessage field. D . Create a validation rule and translate the error message with Translation Workbench.
B . Create a trigger on Contact and add an error to the record with a custom label. D . Create a validation rule and translate the error message with Translation Workbench.
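Option D's behavior, blocking a future Birthdate__c and localizing the message per user language, can be sketched in Python. The locale keys and message strings below are illustrative stand-ins for Translation Workbench (or custom label) entries:

```python
import datetime

# Stand-in for translated error text keyed by user locale (illustrative values).
MESSAGES = {
    "en_US": "Birthdate cannot be in the future.",
    "fr_FR": "La date de naissance ne peut pas être dans le futur.",
}

def check_birthdate(birthdate, locale="en_US", today=None):
    """Return a localized error when Birthdate__c is after today, else None.
    `today` is injectable so the check is testable."""
    today = today or datetime.date.today()
    if birthdate > today:
        return MESSAGES.get(locale, MESSAGES["en_US"])
    return None
```

In Salesforce, the validation rule would hold the `birthdate > today` condition and Translation Workbench would supply the per-language message, so no code is needed for the declarative path.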
Which two best practices should be followed when using SOSL for searching? A . Use searches against single Objects for greater speed and accuracy. B . Keep searches specific and avoid wildcards where possible. C . Use SOSL option to ignore custom indexes as search fields are pre-indexed. D . Use Find in "ALL FIELDS" for faster searches.
A . Use searches against single Objects for greater speed and accuracy. B . Keep searches specific and avoid wildcards where possible.
Northern Trail Outfitters (NTO) runs its entire business out of an enterprise data warehouse (EDW). NTO's sales team started to use Salesforce after a recent implementation, but currently lacks the data required to advance an opportunity to the next stage. NTO's management has researched Salesforce Connect and would like to use it to virtualize and report on data from the EDW within Salesforce. NTO will be running thousands of reports per day across 10 to 15 external objects. What should a data architect consider before implementing Salesforce Connect for reporting? A . Maximum number of records returned B . OData callout limits per day C . Maximum page size for server-driven paging D . Maximum external objects per org
B . OData callout limits per day
Universal Containers has more than 10 million records in the Order__c object. The query has timed out when running a bulk query. What should be considered to resolve the query timeout? A . Tooling API B . PK Chunking C . Metadata API D . Streaming API
B . PK Chunking
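PK chunking splits one huge query into ranges of record IDs so each batch scans a bounded, indexed slice of the table. The boundary arithmetic can be sketched generically; numeric IDs here stand in for Salesforce's 15/18-character keys, and the real feature is enabled via a Bulk API header rather than hand-built ranges:

```python
def pk_chunks(min_id, max_id, chunk_size):
    """Yield (lower, upper) half-open ID ranges covering [min_id, max_id],
    mirroring how PK chunking adds WHERE Id >= lower AND Id < upper to each
    sub-query so no single query scans the whole table."""
    lower = min_id
    while lower <= max_id:
        upper = min(lower + chunk_size, max_id + 1)
        yield (lower, upper)
        lower = upper
```

Because each sub-query is bounded by the primary key index, every chunk completes quickly regardless of total table size, which is what resolves the timeout.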
Universal Containers has successfully migrated 50 million records into five different objects multiple times in a full copy sandbox. The Integration Engineer wants to re-run the test again a month before it goes live into Production. What is the recommended approach to re-run the test? A . Truncate all 5 objects quickly and re-run the data migration test. B . Refresh the full copy sandbox and re-run the data migration test. C . Hard delete all 5 objects' data and re-run the data migration test. D . Truncate all 5 objects and hard delete before running the migration test.
B . Refresh the full copy sandbox and re-run the data migration test.
Universal Containers is implementing Salesforce lead management. UC procures lead data from multiple sources and would like to make sure the lead data has company profile and location information. Which solution should a data architect recommend to make sure lead data has both profile and location information? A . Ask salespeople to search for and populate company profile and location data B . Run reports to identify records which do not have company profile and location data C . Leverage external data providers to populate company profile and location data D . Export data out of Salesforce and send it to another team to populate company profile and location data
C . Leverage external data providers to populate company profile and location data
As part of addressing General Data Protection Regulation (GDPR) requirements, UC plans to implement a data classification policy for all its internal systems that store customer information, including Salesforce. What should a data architect recommend so that UC can easily classify customer information maintained in Salesforce under both standard and custom objects? A . Use AppExchange products to classify fields based on policy. B . Use data classification metadata fields available in field definitions. C . Create a custom picklist field to capture classification of customer information. D . Build reports for customer information and validate.
B . Use data classification metadata fields available in field definition.
Northern Trail Outfitters (NTO) has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process, NTO will be migrating all of its metadata and data. NTO's data model in the source org has a complex relationship hierarchy with several master-detail and lookup relationships across objects, which should be maintained in the target org. Which three things should a data architect do to maintain the relationship hierarchy during migration? (Choose three.) A. Keep the relationship fields populated with the source record IDs in the import file. B. Create an external ID field for each object in the target org and map source record IDs to this field. C. Replace source record IDs with new record IDs from the target org in the import file. D. Redefine the master-detail relationship fields to lookup relationship fields in the target org. E. Use data loader to export the data from the source org and then import/upsert into the target org in sequential order.
B. Create an external ID field for each object in the target org and map source record IDs to this field. C. Replace source record IDs with new record IDs from the target org in the import file. E. Use data loader to export the data from the source org and then import/upsert into the target org in sequential order.
NTO uses opportunity forecasts for its sales planning and management. Sales users have noticed that their updates to the opportunity amount field are overwritten when PPS updates their opportunities. How should a data architect address this overwriting issue? A . Create a custom field for opportunity amount that sales users update, separating it from the field that PPS updates. B . Create a custom field for opportunity amount that PPS updates, separating it from the field that sales users update. C . Change opportunity amount field access to read only for sales users using field-level security. D . Change the PPS integration to update the opportunity amount field only when the value is NULL
C . Change opportunity amount field access to read only for sales users using field level security.
Northern Trail Outfitters (NTO) has the following systems: Customer master (source of truth for customer information), Service Cloud (customer support), Marketing Cloud (marketing support), and Enterprise data warehouse (business reporting). The customer data is duplicated across all these systems and is not kept in sync. Customers are also complaining that they get repeated marketing emails and have to call in to update their information. NTO is planning to implement a master data management (MDM) solution across the enterprise. Which three data issues will an MDM tool solve? Choose 3 answers A . Data completeness B . Data loss and recovery C . Data duplication D . Data accuracy and quality E . Data standardization
C . Data duplication D . Data accuracy and quality E . Data standardization
Universal Containers (UC) is planning to move away from a legacy CRM to Salesforce. As part of a one-time data migration, UC will need to keep the original date when a contact was created in the legacy system. How should an Architect design the data migration solution to meet this requirement? A . After the data is migrated, perform an update on all records to set the original date in a standard CreatedDate field. B . Create a new field on the Contact object to capture the Created Date. Hide the standard CreatedDate field using Field-Level Security. C . Enable "Set Audit Fields" and assign the permission to the user loading the data for the duration of the migration. D . Write an Apex trigger on the Contact object, before insert event, to set the original value in a standard CreatedDate field.
C . Enable "Set Audit Fields" and assign the permission to the user loading the data for the duration of the migration.
Universal Containers (UC) has implemented Sales Cloud for its entire sales organization. UC has built a custom object called Projects__c that stores customer project details and employee billable hours. The following requirements are needed: A subset of individuals from the finance team will need access to the Projects object for reporting and adjusting employee utilization. The finance users will not need access to any sales objects, but they will need to interact with the custom object. Which license type should a data architect recommend for the finance team that best meets the requirements? A . Service Cloud B . Sales Cloud C . Lightning Platform Starter D . Lightning Platform Plus
C . Lightning Platform Starter
Universal Containers (UC) would like to build a human resources application on Salesforce to manage employee details, payroll, and hiring efforts. To adequately capture and store the relevant data, the application will need to leverage 45 custom objects. In addition to this, UC expects roughly 20,000 API calls into Salesforce from an on-premises application daily. Which license type should a data architect recommend that best fits these requirements? A . Service Cloud B . Lightning Platform Starter C . Lightning Platform Plus D . Lightning External Apps Starter
C . Lightning Platform Plus
Universal Containers (UC) is a business that works directly with individual consumers (B2C). They are moving from a current home-grown CRM system to Salesforce. UC has about one million consumer records. What should the architect recommend for optimal use of Salesforce functionality and also to avoid data loading issues? A . Create a custom object Individual_Consumer__c to load all individual consumers. B . Load all individual consumers as Account records and avoid using the Contact object. C . Load one Account record and one Contact record for each individual consumer. D . Create one Account and load individual consumers as Contacts linked to that one Account
C . Load one Account record and one Contact record for each individual consumer.
Patients: They are individuals who need care. A data architect needs to map the actors to Salesforce objects. What should be the optimal selection by the data architect? A . Patients as Contacts, Payment providers as Accounts, & Doctors as Accounts B . Patients as Person Accounts, Payment providers as Accounts, & Doctors as Contacts C . Patients as Person Accounts, Payment providers as Accounts, & Doctors as Person Accounts D . Patients as Accounts, Payment providers as Accounts, & Doctors as Person Accounts
C . Patients as Person Accounts, Payment providers as Accounts, & Doctors as Person Accounts
Universal Containers (UC) requires 2 years of customer-related cases to be available in Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to the Service agents. UC creates 5 million cases per year. Which two data archiving strategies should a data architect recommend? Choose 2 options: A . Use custom objects for cases older than 2 years and use a nightly batch to move them. B . Sync cases older than 2 years to an external database, and provide Service agents access to the database C . Use Big objects for cases older than 2 years, and use a nightly batch to move them. D . Use Heroku and external objects to display cases older than 2 years and Bulk API to hard delete from Salesforce.
C . Use Big objects for cases older than 2 years, and use nightly batch to move them. D . Use Heroku and external objects to display cases older than 2 years and bulk API to hard delete from Salesforce.
Due to security requirements, Universal Containers needs to capture specific user actions, such as login, logout, file attachment download, package install, etc. What is the recommended approach for defining a solution for this requirement? A . Use a field audit trail to capture field changes. B . Use a custom object and trigger to capture changes. C . Use Event Monitoring to capture these changes. D . Use a third-party AppExchange app to capture changes.
C . Use Event Monitoring to capture these changes.
A customer wishes to migrate 700,000 Account records in a single migration into Salesforce. What is the recommended solution to migrate these records while minimizing migration time? A . Use Salesforce Soap API in parallel mode. B . Use Salesforce Bulk API in serial mode. C . Use Salesforce Bulk API in parallel mode. D . Use Salesforce Soap API in serial mode.
C . Use Salesforce Bulk API in parallel mode.
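The batching-plus-parallelism idea behind option C can be sketched with a thread pool. The `upload` function here is a hypothetical placeholder for the real per-batch Bulk API insert call, and the 10,000-record batch size mirrors the Bulk API's batch limit:

```python
from concurrent.futures import ThreadPoolExecutor

def make_batches(records, batch_size=10000):
    """Split the record list into Bulk-API-sized batches."""
    return [records[i:i + batch_size] for i in range(0, len(records), batch_size)]

def load_parallel(records, upload, workers=5, batch_size=10000):
    """Submit batches concurrently, the way the Bulk API processes batches
    in parallel mode. `upload` stands in for the real per-batch insert."""
    batches = make_batches(records, batch_size)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(upload, batches))
```

Parallel mode minimizes wall-clock time; serial mode (option B) is the fallback when parallel batches contend for the same locks (for example, many children of one parent Account).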
Universal Containers (UC) wants to capture information on how data entities are stored within the different applications and systems used within the company. For that purpose, the architecture team decided to create a data dictionary covering the main business domains within UC. Which two common techniques are used when building a data dictionary to store information on how business entities are defined? A . Use Salesforce Object Query Language. B . Use a data definition language. C . Use an entity relationship diagram. D . Use the Salesforce Metadata API
C . Use an entity relationship diagram. D . Use the Salesforce Metadata API
Sales Orders will not be updated in Salesforce. What should a data architect recommend for maintaining Sales Orders in Salesforce? A . Use custom objects to maintain Sales Orders in Salesforce. B . Use custom big objects to maintain Sales Orders in Salesforce. C . Use external objects to maintain Sales Orders in Salesforce. D . Use the standard Order object to maintain Sales Orders in Salesforce
C . Use external objects to maintain Sales Order in Salesforce.
Universal Containers is using Salesforce for opportunity management and an enterprise resource planning (ERP) system for order management. Sales reps do not have access to the ERP and have no visibility into order status. What solution should a data architect recommend to give the sales team visibility into order status? A . Leverage Canvas to bring the order management UI into a Salesforce tab. B . Build batch jobs to push order line items to Salesforce. C . Leverage Salesforce Connect to bring the order line items from the ERP into Salesforce. D . Build real-time integration to pull order line items into Salesforce when viewing orders.
C . Leverage Salesforce Connect to bring the order line items from the ERP into Salesforce.
Universal Containers (UC) needs to run monthly and yearly reports on opportunities and orders for sales reporting. There are 5 million opportunities and 10 million orders. Sales users are complaining that the reports time out. What is the fastest and most effective way for a data architect to solve the time-out issue? A. Create custom fields on opportunity, copy data from order into those custom fields, and run all reports on the Opportunity object. B. Extract opportunity and order data from Salesforce, and use a third-party reporting tool to run reports outside of Salesforce. C. Create an aggregate custom object that summarizes the monthly and yearly values into the required format for the required reports. D. Create a skinny table in Salesforce, copy order and opportunity fields into the skinny table, and create the required reports on it.
C. Create an aggregate custom object that summarizes the monthly and yearly values into the required format for the required reports.
Universal Containers (UC) requires 2 years of customer related cases to be available on Salesforce for operational reporting. Any cases older than 2 years and up to 7 years need to be available on demand to service agents. UC creates 5 million cases per year. Which two data archiving strategies should a data architect recommend? (Choose two.) A. Use Custom objects for cases older than 2 years and use nightly batch to move them B. Sync cases older than 2 years to an external database, and provide access to service agents to the database C. Use Big objects for cases older than 2 years, and use nightly batch to move them D. Use Heroku and External objects to display cases older than 2 years and Bulk API to hard delete from Salesforce
C. Use Big objects for cases older than 2 years, and use nightly batch to move them D. Use Heroku and External objects to display cases older than 2 years and Bulk API to hard delete from Salesforce
The data architect for UC has written a SOQL query that will return all records from the Task object that have a value in the WhatId field: Select Id, Description, Subject from Task where WhatId != NULL. When the data architect uses the query to select values for a process, a time-out error occurs. What does the data architect need to change to make this query more performant? A . Remove Description from the requested field set. B . Change the query to SOSL. C . ?? D . Add LIMIT 100 to the query. E . Change the where clause to filter by a deterministic defined value.
D . Add limit 100 to the query.
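The performance problem behind this question is that a negative filter such as `WhatId != null` is non-selective: it cannot be driven from an index, so the query optimizer falls back to a full scan of a very large Task table. A minimal Apex sketch of the contrast; the Status value and date range in the second query are illustrative assumptions, not part of the original question:

```apex
// Non-selective: "!= null" cannot use an index, so on a large Task
// table the optimizer resorts to a full table scan and may time out.
List<Task> slow = [SELECT Id, Subject FROM Task WHERE WhatId != null];

// More performant sketch: a deterministic equality/range filter on
// indexed fields lets the optimizer drive the query from an index.
List<Task> fast = [SELECT Id, Subject
                   FROM Task
                   WHERE Status = 'Completed'
                   AND CreatedDate = LAST_N_DAYS:30];
```

Deterministic, positive filters on indexed fields (option E's point) are what make a SOQL query selective on large objects.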
A customer needs a sales model that allows the following: ✑ Opportunities need to be assigned to sales people based on the zip code. ✑ Each sales person can be assigned to multiple zip codes. ✑ Each zip code is assigned to a sales area definition. Sales is aggregated by sales area for reporting. What should a data architect recommend? A . Assign opportunities using list views using zip code. B . Add custom fields in opportunities for zip code and use assignment rules. C . Allow sales users to manually assign opportunity ownership based on zip code. D . Configure territory management feature to support opportunity assignment.
D . Configure territory management feature to support opportunity assignment.
The program generates 100 million records each month. NTO customer support would like to see a summary of a customer's recent transactions and the reward level(s) they have attained. Which solution should the data architect use to provide the information within Salesforce for the customer support agents? A . Create a custom object in Salesforce to capture and store all reward program data. Populate it nightly from the point-of-sale system, and present it on the customer record. B . Capture the reward program data in an external data store and present the 12-month trailing summary in Salesforce using Salesforce Connect and an external object. C . Provide a button so that the agent can quickly open the point-of-sale system displaying the customer history. D . Create a custom big object to capture the reward program data, display it on the contact record, and update it nightly from the point-of-sale system.
D . Create a custom big object to capture the reward program data, display it on the contact record, and update it nightly from the point-of-sale system.
Marketing solution. What should a data architect recommend to help uniquely identify customers across multiple systems? A . Store the Salesforce ID in all the solutions to identify the customer. B . Create a custom object that will serve as a cross-reference for the customer ID. C . Create a customer database and use this ID in all systems. D . Create a custom field as an external ID to maintain the customer ID from the MDM solution.
D . Create a custom field as an external ID to maintain the customer ID from the MDM solution.
How can an architect find information about who created, changed, or deleted certain fields within the past two months? A . Remove "customize application" permissions from everyone else. B . Export the metadata and search it for the fields in question. C . Create a field history report for the fields in question. D . Export the setup audit trail and find the fields in question.
D . Export the setup audit trail and find the fields in question.
Two million Opportunities need to be loaded in different batches into Salesforce using the Bulk API in parallel mode. What should an Architect consider when loading the Opportunity records? A . Use the Name field values to sort batches. B . Order batches by Auto-number field. C . Create indexes on Opportunity object text fields. D . Group batches by the AccountId field.
D . Group batches by the AccountId field.
Universal Containers has a large volume of Contact data going into Salesforce.com. There are 100,000 existing contact records. 200,000 new contacts will be loaded. The Contact object has an external ID field that is unique and must be populated for all existing records. What should the architect recommend to reduce data load processing time? A . Load Contact records together using the Streaming API via the Upsert operation. B . Delete all existing records, and then load all records together via the Insert operation. C . Load all records via the Upsert operation to determine new records vs. existing records. D . Load new records via the Insert operation and existing records via the Update operation.
D . Load new records via the Insert operation and existing records via the Update operation.
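The trade-off behind this answer: an upsert (option C) must run an external ID lookup for every one of the 300,000 records to decide between insert and update, while pre-splitting the file lets the 200,000 known-new records take the cheaper insert path. For reference, an external-ID upsert in Apex looks like the sketch below; `Legacy_Id__c` is an assumed external ID field name, not from the question:

```apex
// Upsert keyed on an external ID: rows whose Legacy_Id__c matches an
// existing Contact are updated, the rest are inserted, in one DML call.
// Each record pays for an index lookup on the external ID field.
List<Contact> contacts = new List<Contact>{
    new Contact(LastName = 'Smith', Legacy_Id__c = 'A-1001'),
    new Contact(LastName = 'Jones', Legacy_Id__c = 'A-1002')
};
upsert contacts Contact.Fields.Legacy_Id__c;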
There is no need for more than 10 custom objects or additional file storage. Which Community Cloud license type should a data architect recommend to meet the portal requirements? A . Customer Community. B . Lightning External Apps Starter. C . Customer Community Plus. D . Partner Community.
D . Partner Community.
Universal Containers wishes to maintain Lead data from Leads even after they are deleted and cleared from the Recycle Bin. What approach should be implemented to achieve this solution? A . Use a Lead standard report and filter on the IsDeleted standard field. B . Use a Converted Lead report to display data on Leads that have been deleted. C . Query Salesforce with the queryAll API method or using the ALL ROWS SOQL keywords. D . Send data to a Data Warehouse and mark Leads as deleted in that system.
D . Send data to a Data Warehouse and mark Leads as deleted in that system.
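Option C references real features: the SOAP API's queryAll() call and the SOQL ALL ROWS keyword both include soft-deleted records. The catch, and the reason option D is the given answer, is that they only see records still in the Recycle Bin; once a Lead is purged from the bin, nothing in Salesforce can return it. A sketch of the Apex form:

```apex
// ALL ROWS includes soft-deleted records, but only while they remain
// in the Recycle Bin; Leads purged from the bin (as this question
// requires surviving) are unrecoverable inside Salesforce.
List<Lead> deletedLeads = [SELECT Id, Name, IsDeleted
                           FROM Lead
                           WHERE IsDeleted = true
                           ALL ROWS];
```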
Universal Containers is integrating a new Opportunity engagement system with Salesforce. According to their Master Data Management strategy, Salesforce is the system of record for Account, Contact, and Opportunity data. However, there does seem to be valuable Opportunity data in the new system that potentially conflicts with what is stored in Salesforce. What is the recommended course of action to appropriately integrate this new system? A . The MDM strategy defines Salesforce as the system of record, so Salesforce Opportunity values prevail in all conflicts. B . A policy should be adopted so that the system whose record was most recently updated should prevail in conflicts. C . The Opportunity engagement system should become the system of record for Opportunity records. D . Stakeholders should be brought together to discuss the appropriate data strategy moving forward.
D . Stakeholders should be brought together to discuss the appropriate data strategy moving forward.
The MDM solution provides de-duplication features, so it acts as the single source of truth. How should a data architect implement storage of the master key within Salesforce? A . Store the master key in Heroku Postgres and use Heroku Connect for synchronization. B . Create a custom object to store the master key with a lookup field to Contact. C . Create an external object to store the master key with a lookup field to Contact. D . Store the master key on the Contact object as an external ID (field for referential imports)
D . Store the master key on the Contact object as an external ID (field for referential imports)
Universal Containers (UC) is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner. What should a data architect do to minimize data load times due to system calculations? A. Enable granular locking to avoid "UNABLE_TO_LOCK_ROW" error B. Load the data through data loader, and turn on parallel processing C. Leverage the Bulk API and concurrent processes with multiple batches D. Enable defer sharing calculations, and suspend sharing rule calculations
D. Enable defer sharing calculations, and suspend sharing rule calculations
Northern Trail Outfitters (NTO) has decided to franchise its brand. Upon implementation, 1,000 franchisees will be able to access NTO's product information and track large customer sales and opportunities through a portal. The franchisees will also be able to run monthly and quarterly sales reports and projections as well as view the reports in dashboards. Which license does NTO need to provide these features to the franchisees? A. Salesforce Sales Cloud License B. Customer Community License C. Lightning Platform License D. Partner Community License
D. Partner Community License
Skinny Table
An org- and object-specific table created on demand for specific fields to speed up access.
- Created and supported by Salesforce (must submit a case to have one created)
- Bypasses the join between standard and custom field tables
- Kept in sync with the master table
- Does not include soft-deleted records
- Available for custom objects and the Account, Contact, Opportunity, Lead, and Case objects
Index
A way to provide quick access to data.
- Standard index: fields automatically indexed
- Custom index: created on demand per org
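One practical way to get a custom index without filing a support case is to mark a custom field as an External ID (or unique), which indexes it automatically. A sketch of a query that can then be driven from that index; `Legacy_Id__c` is an assumed field name:

```apex
// Equality filter on an indexed field is selective, so the optimizer
// can use the index instead of scanning the whole Account table.
// Throws if no match exists; a List<Account> binding avoids that.
Account acct = [SELECT Id, Name
                FROM Account
                WHERE Legacy_Id__c = 'A-1001'
                LIMIT 1];
```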