Data Architecture Practice Exam 1


The sales reps of Cosmic Innovation often contact the wrong person while trying to reach a prospect. Millions of lead records were obtained from an untrusted source recently. Which of the following explains this situation? Choose 2 answers. A. Some of the new lead records contain inaccurate information B. Some of the new lead records contain outdated information C. Some of the new lead records contain invalid information D. Some of the new lead records contain incomplete contact information

A & B It's possible that some of the new lead records contain inaccurate or outdated contact information, which would explain why sales reps keep reaching the wrong person. Incomplete or invalid information would be more likely to prevent contact altogether than to connect a rep with the wrong person.

The data architect of Cosmic Grocery is developing a data stewardship engagement plan for improving data quality in SF. Which of the following are valid considerations pertaining to an effective data stewardship engagement? Choose 3 answers. A. Stewardship should include assigning the roles each team member and each departmental group has to play in improving customer data. B. Data must always be internally sourced for an effective data stewardship engagement C. Duplicates should be eliminated and SF features should be utilized to flag or block the creation of duplicates D. A reporting plan should be developed to track all incoming and edited data and determine any gaps in data quality E. A plan should be designed to contact customers in order to determine the accuracy of SF data.

A, C, & D A data stewardship engagement includes assigning the roles that each team member and each departmental group will play in improving customer data, which can include 'Data Steward' and 'Business Intelligence Manager'. Data can be sourced internally or externally. Data should be examined to identify duplicates, which should be eliminated using an appropriate de-duplication tool. To check the accuracy of records, they can be compared against a high-quality, complete third-party data source. Reports and dashboards can be created to view gaps in data quality. https://a.sfdcstatic.com/content/dam/www/ocms-backup/assets/pdf/misc/data_Governance_Stewardship_ebook.pdf

Cosmic Service Solutions would like to capture any changes made to fifteen custom fields on the Account object. The captured changes should include who made the change, when the change was made, the old field value, and the new field value. Which of the following should a data architect recommend in order to meet this requirement? Choose 2 answers. A. Field History Tracking B. Apex Trigger and Custom Object C. Visualforce page D. Workflow Rule

A & B Field History Tracking can be enabled for specific fields of an object. Reports can be created based on the field history. The field history data is also displayed in the History related list of an object. The data is retained for up to 18 months through the org, and up to 24 months via the API. The required information can also be tracked by creating an Apex trigger that uses the 'after insert' and 'after update' events and stores the field values in a custom object. Reports can then be created based on the records of the custom object. A Visualforce page or workflow rule cannot be used to track field history. https://help.salesforce.com/articleView?id=tracking_field_history.htm&type=5
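
For illustration only, the trigger sketch below shows one way the Apex option might log changes; the custom object Account_Field_History__c, its fields, and the tracked field names are assumptions rather than part of the original answer.

    trigger AccountFieldHistory on Account (after update) {
        // Assumed list of tracked custom fields; the real list would name the fifteen fields
        List<String> trackedFields = new List<String>{ 'Region__c', 'Tier__c', 'SLA_Level__c' };
        List<Account_Field_History__c> entries = new List<Account_Field_History__c>();
        for (Account acc : Trigger.new) {
            Account oldAcc = Trigger.oldMap.get(acc.Id);
            for (String fieldName : trackedFields) {
                if (acc.get(fieldName) != oldAcc.get(fieldName)) {
                    entries.add(new Account_Field_History__c(
                        Account__c = acc.Id,
                        Field_Name__c = fieldName,
                        Old_Value__c = String.valueOf(oldAcc.get(fieldName)),
                        New_Value__c = String.valueOf(acc.get(fieldName)),
                        Changed_By__c = UserInfo.getUserId()
                    ));
                }
            }
        }
        if (!entries.isEmpty()) {
            insert entries; // CreatedDate and CreatedById on these records capture when and by whom
        }
    }

Field History Tracking achieves the same result declaratively and supports tracking up to 20 fields per object, which comfortably covers the fifteen custom fields in this scenario.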

The sales director of Cosmic Solutions has recently noticed a decline in the number of opportunities that move through the sales pipeline. He wants to make sure that opportunity records are regularly updated by the sales reps. What must be implemented to allow the sales director to use a single view in Lightning Experience to monitor multiple opportunities that have not been updated by a sales rep for more than 21 days? Choose 2 answers. A. A Lightning Experience report that displays opportunities with 'Last Modified Date' less than 21 DAYS AGO B. A workflow rule that sends an email alert if 'Last Modified Date' < TODAY - 21 C. A list view that displays opportunities with 'Last Modified Date' less than 21 DAYS AGO D. A process created in Process Builder that sends an email alert if 'Last Modified Date' < TODAY - 21

A & C A Lightning Experience report or list view would provide a single view of opportunity records that have not been updated within the last 21 days, which can be used by the sales director to monitor the opportunities. It can utilize the filter 'Last Modified Date less than 21 DAYS AGO'. A workflow rule or process can trigger an email alert, but it does not provide a single view of records. https://help.salesforce.com/articleView?id=reports_builder_filtering.htm&language=en_us&r=https%3A%2F%2Fwww.google.com%2F&type=5 https://help.salesforce.com/articleView?id=customviews_lex.htm&type=5

Cosmic Innovation uses Salesforce to store accounts and contacts. All the users have switched to the Lightning Experience user interface. The account hierarchy feature is used extensively to relate companies to their subsidiaries. Sales representatives often create duplicate contacts with different primary accounts. Which of the following can be used to ensure that there is only a single contact for a particular person across all the subsidiaries of a company? Choose 2 answers. A. Third-party application or tool B. Lightning Data C. 'Potential Duplicates' component D. Workflow Rule

A & C A third-party application or tool can be used to merge contacts across accounts, i.e., contacts with different primary accounts. In Salesforce Classic, the 'Merge Contacts' tool can only be used to merge contacts that have the same primary account. In Lightning Experience, it is possible to merge contacts that have different primary accounts using the 'Potential Duplicates' component. The component can show duplicates and allow users to merge duplicate records. Lightning Data is used to update and import records. A workflow rule cannot be used to merge contact records. https://help.salesforce.com/articleView?id=contacts_merge_classic.htm&type=5

Cosmic Service Solutions would like to automatically archive opportunity records that are older than one year. Users do not require access to these records once they are archived. Which of the following can be used to meet this requirement? Choose 2 answers. A. Third-Party ETL Tool B. Workbench C. Data Loader D. Data Export Wizard

A & C Since users do not require access to archived records, they can be deleted. A third-party ETL tool or Data Loader can be used to schedule jobs for exporting and deleting records automatically. The Data Export Wizard in the Salesforce UI cannot be used to delete records automatically. Workbench cannot be used to schedule jobs for export and deletion. https://help.salesforce.com/articleView?id=command_line_intro.htm&type=5

The sales agents of Cosmic Electronics require industry information while reaching out to prospects for discussing sales deals. However, the 'Industry Information' field often does not contain any value, which is resulting in a lot of lost opportunities. Which of the following indicate a possible data quality issue that is causing this problem? Choose 2 answers. A. Users do not possess sufficient information for creating or updating lead records. B. Users are adding information that is hard to understand. C. Users are entering incomplete data while creating or updating lead records. D. Users have not updated the lead records for a while.

A & C The most likely issue is that users are entering incomplete data while creating or updating lead records. It's possible that they are not adding information to the 'Industry Information' field, probably because they do not have sufficient information. The field would still contain a value if the records had not been updated for a while. It would also contain a value if the users were adding information that is hard to understand. https://trailhead.salesforce.com/en/content/learn/modules/data_quality/data_quality_assess_your_data https://a.sfdcstatic.com/content/dam/www/ocms-backup/assets/pdf/misc/data_Governance_Stewardship_ebook.pdf

Cosmic Financial Services uses Salesforce to manage accounts and contacts. Users regularly create and update records manually, but the data entered is often incomplete and inaccurate. The director of the company would like to implement a new process for improving and monitoring data quality. What should a data architect recommend for the requirement? Choose 2 answers. A. Download and install an AppExchange solution for monitoring data quality metrics B. Implement a process that requires updating data using the Data Import wizard C. Create an Apex trigger that sends a callout to a third-party service for data assessment and updates D. Use Lightning Data for updating records

A & D An AppExchange application can be utilized to monitor data quality. Lightning Data can be used to update records by comparing the Salesforce data values with those of the data service provider. A Lightning Data package available on the AppExchange can be installed. Using an Apex trigger that sends a callout to a third-party service would require programmatic development and the purchase of a license for the third-party service. The Data Import Wizard can be used to update leads, contacts, and accounts; however, without an external data source to validate the data, it is not a complete solution for improving data quality. https://appexchange.salesforce.com/appxListingDetail?listingId=a0N3000000B5ilaEAB https://appexchange.salesforce.com/appxStore?type=Data

Cosmic Repair Solutions has grown its global customer base from a few thousand to more than a million customers over the last few years. The 'Account' object is used to store information about customers in Salesforce. The org used to work perfectly until the number of account records reached one million. As a result, batch jobs and triggers, especially those with complex queries based on the Account object, are now failing. Which of the following measures can be taken for resolution? Choose 2 answers. A. Identify query filters that use non-indexed custom fields and request custom indexing from Salesforce Support. B. Bulkify Apex triggers and batch jobs to improve query performance. C. Minimize the number of objects that are related to Account by deleting look-up and master-detail relationship fields. D. Make existing queries selective by adding default indexed fields to the filter.

A & D SOQL queries, especially on objects with large data volumes, must be selective for optimal query performance. A selective query includes indexed fields in the query filter to reduce the number of rows returned. Some fields are indexed by default, such as primary keys (Id, Name, and OwnerId), foreign keys (lookup or master-detail fields), audit dates (such as SystemModStamp), and custom fields marked as External ID or Unique. Custom fields that are not indexed by default can be indexed by contacting Salesforce Customer Support. Bulkified Apex code ensures optimal execution time and governor limit usage, but it does not by itself improve query performance. Minimizing the number of objects related to Account is also not the solution, as it neither fixes the account data volume issue nor improves query performance on Account. https://help.salesforce.com/articleView?id=000006007&language=en_US%C2%A0&r=https%3A%2F%2Fwww.google.com%2F&type=1
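
To make the contrast concrete, here is a hedged sketch of a non-selective filter and a selective rewrite; Status__c and Legacy_Id__c are assumed custom fields, with Legacy_Id__c marked as an External ID and therefore indexed by default.

    // Non-selective: Status__c is an unindexed custom field on a multi-million-row table
    List<Account> risky = [SELECT Id, Name FROM Account WHERE Status__c = 'Active'];

    // Selective: the filter uses an External ID field and an audit date, both indexed
    DateTime cutoff = DateTime.now().addDays(-30);
    List<Account> selective = [SELECT Id, Name
                               FROM Account
                               WHERE Legacy_Id__c = 'A-000123'
                               AND CreatedDate >= :cutoff];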

Cosmic Service Solutions would like to migrate several million account and contact records from a legacy system to Salesforce. The legacy system is known for bad data quality and duplicate records. Which of the following are important considerations for minimizing data duplication in Salesforce? Choose 2 answers. A. Salesforce matching and duplicate rules should be utilized. B. Validation rules should be created to prevent data duplication. C. Workflow rules should be used to prevent the creation of duplicate records. D. The records should be cleaned before they are imported.

A & D The account and contact records should be cleaned before they are imported into Salesforce. Matching and duplicate rules should be utilized to identify duplicate records and also block users from creating them. Validation rules and workflow rules cannot be used to prevent data duplication. https://help.salesforce.com/articleView?id=duplicate_prevention_map_of_tasks.htm&type=5

Cosmic Grocery is a company that would like to implement a master data management solution. The company uses several systems, including Salesforce and an ERP system, which should be integrated for consolidating data in a central hub. A master data management strategy will need to be defined to determine the system of record (SOR) and the single source of truth (SSOT). Which of the following are valid considerations related to this? Choose 2 answers. A. The system of record is the authoritative data source for a given record B. There can be multiple systems of record for a given piece of information C. The master data management system cannot act as the single source of truth D. The single source of truth refers to the place where a record can be sourced from.

A & D The system of record is the authoritative data source for a given record, data attribute or piece of information. There can be only one system of record for a given piece of information. The single source of truth refers to the place where a record can be sourced from, which is generally the master data management system or hub where the data from the different enterprise systems are consolidated.

Cosmic Solutions uses a custom object called 'Performance Review' to store the performance reviews of employees who work for the company. A system administrator has created a report that displays all the performance reviews created within a specific time period. The report uses a filter based on a custom date field. However, the report takes a long time to run due to the large number of records. Which steps can the system administrator take to improve the performance of the report? Choose 2 answers. A. Ask Salesforce Support to add a custom index to the custom date field. B. Ask Salesforce Support to create a skinny table for all fields defined on the custom object. C. Ask Salesforce Support to add a custom index to the fields added to the report. D. Ask Salesforce Support to create a skinny table for the fields used in the report.

A & D To improve report performance, a skinny table can be created for the fields added to the report, and a custom index can be added to the custom date field that is used to filter the report. A skinny table can improve the performance of a report because it stores the required fields in a single database table, avoiding the join between the table that holds standard fields and the table that holds custom fields. https://developer.salesforce.com/blogs/engineering/2013/03/long-and-short-term-approaches-for-tuning-force-com-performance.html https://developer.salesforce.com/blogs/engineering/2015/06/know-thy-salesforce-field-indexes-fast-reports-list-views-soql.html

Cosmic Enterprises would like to capture how the different data and business entities are defined with the applications created in Salesforce, including information about the sales, support, and lead lifecycles. The data architect of the company has suggested the creation of a data dictionary using the Metadata API for this requirement. Which of the following metadata types should be retrieved using Metadata API and included in the data dictionary? Choose 3 answers. A. Business Process B. CustomObject C. Layout D. Profile E. Report

A, B, & C The following metadata types can be included in the data dictionary: 1) BusinessProcess - This allows tracking the sales, support, and lead lifecycles, and displays different picklist values for users based on their profile 2) CustomObject - This represents metadata information about a custom object 3) Layout - This represents metadata associated with a page layout. Profile represents a user profile, which defines a user's permissions to perform different functions within Salesforce. Report represents a custom report. These metadata types cannot be used to capture how different data and business entities are defined in Salesforce applications. https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_types_list.htm

The data architect of Cosmic Service Solutions has been asked to document the data architecture of the various enterprise systems used by the company. While documenting the Salesforce data architecture, which of the following metadata types can be utilized? Choose 3 answers. A. RecordType B. CustomField C. CustomObject D. Flow E. ApexTrigger

A, B, & C The following metadata types can be used for this requirement: 1) RecordType - This represents the metadata associated with a record type of an object. 2) CustomField - This represents the metadata associated with a custom field of an object. 3) CustomObject - This represents the metadata associated with a custom object. ApexTrigger represents metadata associated with an Apex trigger, while Flow represents metadata associated with a flow. Unlike RecordType, CustomField, and CustomObject, these metadata types do not represent the data model within a Salesforce org. https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_types_list.htm

The data architect of Cosmic Solutions is designing a plan for regular archiving and purging of the records of certain custom objects such as 'Performance Review' and 'Inventory'. Which of the following are important considerations for designing an effective archiving strategy? Choose 3 answers. A. It would be necessary to consider whether the archived data would be required for reporting. B. Any regulatory restrictions that would influence the archiving strategy would need to be considered. C. For the purpose of reporting, the information can be aggregated into summary records. D. The number of history retention policies that are defined on each custom object would need to be considered for archiving records. E. It would be necessary to remove permissions from the profiles of users who can currently access the custom objects.

A, B, & C The following would be some of the important considerations for creating an effective archiving strategy: 1) For the purpose of reporting, the information can be aggregated into summary records. Aggregation can be achieved by creating records of a custom object that contain a summary of the data in the source records, i.e., records of 'Performance Review' and 'Inventory' in this case 2) It would be necessary to consider whether the archived data would be required for reporting. Users would require access to the archived records to view any reports based on them. 3) Any regulatory restrictions that would influence the archiving strategy would need to be considered. If certain data should not be archived due to those restrictions, that should be taken into account. It would not be necessary to remove permissions from profiles, since users would still require access to the new summary records. It would also not be necessary to consider the number of history retention policies, since only one history retention policy can be defined for a particular object.
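
As one hedged illustration of the aggregation idea, the snippet below rolls old 'Performance Review' records up into one summary record per calendar year before archiving; Performance_Review__c, Performance_Review_Summary__c, and their fields are assumed names, not part of the original scenario.

    // Summarize reviews older than two years into one summary record per calendar year
    DateTime cutoff = DateTime.now().addYears(-2);
    List<Performance_Review_Summary__c> summaries = new List<Performance_Review_Summary__c>();
    for (AggregateResult ar : [SELECT CALENDAR_YEAR(CreatedDate) yr,
                                      COUNT(Id) total,
                                      AVG(Score__c) avgScore
                               FROM Performance_Review__c
                               WHERE CreatedDate < :cutoff
                               GROUP BY CALENDAR_YEAR(CreatedDate)]) {
        summaries.add(new Performance_Review_Summary__c(
            Year__c = (Integer) ar.get('yr'),
            Review_Count__c = (Integer) ar.get('total'),
            Average_Score__c = (Decimal) ar.get('avgScore')
        ));
    }
    insert summaries; // the detailed source records can then be exported and purged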

A developer would like to make SOQL queries used in certain Apex classes selective to improve their performance due to a large number of records retrieved by them. Which of the following would make a SOQL query non-selective? Choose 3 answers. A. Filter condition with a non-deterministic formula field B. Filter condition with leading "%" wildcards C. Filter condition with a composite index D. Negative filter operators such as NOT LIKE and EXCLUDES E. Comparison operators paired with a number or percent field

A, B, & D The following can make a SOQL query non-selective: 1) Negative filter operators used in filter conditions (!=, NOT LIKE, and EXCLUDES) 2) Comparison operators paired with text fields (<, >, <=, and >=) 3) Leading "%" wildcards (LIKE '%string%') 4) References to non-deterministic formula fields, such as cross-object formula fields. Comparison operators paired with a number or percent field do not make a query non-selective. A filter condition with a composite index that uses the AND operator does not necessarily make a SOQL query non-selective either; to remain selective, each individual filter should target fewer than twice the index selectivity threshold, and the intersection of the filters should target fewer than the threshold. https://help.salesforce.com/articleView?id=000325257&language=en_US&type=1&mode=1 http://resources.docs.salesforce.com/rel1/doc/en-us/static/pdf/salesforce_query_search_optimization_developer_cheatsheet.pdf
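
A short sketch of these patterns with assumed filter values: the first query combines a leading wildcard with a negative operator and would be treated as non-selective, while the rewrite uses positive, index-friendly conditions.

    // Non-selective: leading "%" wildcard plus a negative operator defeat index use
    List<Account> slow = [SELECT Id FROM Account
                          WHERE Name LIKE '%Cosmic%'
                          AND Industry != 'Retail'];

    // More selective rewrite: no leading wildcard, positive operators only
    List<Account> fast = [SELECT Id FROM Account
                          WHERE Name LIKE 'Cosmic%'
                          AND Industry IN ('Technology', 'Manufacturing')];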

Cosmic Service Solutions has a Salesforce org with thousands of users who access it concurrently. There are more than 50 million account records and 20 million contact records in Salesforce. More than 200 GB of storage space has been used to store data. Due to large data volumes, users have been experiencing long search times and record saving times. Which of the following approaches can be utilized to mitigate such issues? Choose 3 answers. A. Ask Salesforce Support to create skinny tables for frequently used fields. B. Ask Salesforce Support to add indexes to custom fields that are often used for searches. C. Utilize Apex triggers to delete accounts and contacts and store them in custom objects. D. Use reporting snapshots for aggregating and storing large data volumes. E. Move old data to an external platform for archiving on a regular basis.

A, B, & E Skinny tables and custom indexes can increase Salesforce performance and reduce search and record saving times. Moving old data to an external platform for archiving can improve performance by reducing the total number of records. Using Apex triggers to delete accounts and contacts and store them in custom objects is not a good solution since the data would still exist in Salesforce and users would have to refer to two objects to find records of an object. Reporting snapshots are used to report on data at regular intervals; they cannot be used to improve search and record saving times. https://www.salesforce.org/two-steps-to-optimize-your-data-model-and-avoid-crm-performance-degradation/

The system administrator of Cosmic Enterprises has been asked to load 1,000 users, 20 million accounts, 30 million contacts, and 50 million opportunities into Salesforce. A number of sharing rules need to be created to define the users' access to the records of these objects. Which of the following are important considerations for minimizing the loading time? Choose 3 answers. A. Calculation of sharing rules should be deferred until data load is complete. B. Sharing rules should be deployed after loading all the data. C. Custom fields should be indexed before loading the data. D. Account records should be loaded before users, contacts, and opportunities. E. The users should be loaded before other records.

A, B, & E The following order should be followed for loading the data: 1) Users 2) Accounts 3) Contacts 4) Opportunities Once the users and other records have been loaded, the sharing rules can be deployed. However, sharing calculations should be deferred until all the data has been loaded. All the sharing rules can be created at once and then recalculated. Indexing custom fields prior to a data load does not improve loading performance. https://developer.salesforce.com/blogs/engineering/2013/04/extreme-force-com-data-loading-part-2-load-into-a-lean-salesforce-configuration.html

The data architect of Cosmic Innovation has been asked to optimize the data stewardship engagement for a Salesforce org by reviewing key areas of Salesforce. Which of the following areas should be reviewed by the data architect? Choose 3 answers. A. Redundant metadata associated with custom objects and fields B. Integrations with other enterprise systems C. Organization-wide default sharing settings D. Required fields and validation rules E. Setup changes made by users

A, C, & D Some of the key areas that would be important to analyze for optimizing a data stewardship engagement include the following: 1) Redundant metadata - Any redundant metadata can have an adverse impact on the overall quality of data. For example, if extraneous or irrelevant fields have been defined on an object, and users have been entering information in these fields, then it affects the overall quality of the records of the object. 2) Sharing model - When the organization-wide default sharing setting is 'Private', users are more likely to create duplicate records. If a user can't see a record that is owned by another user, they are likely to create a duplicate version of it under the assumption that the record does not exist in Salesforce. 3) Required fields and validation rules - The fields that are required for data entry and the data that should be validated prior to saving records are necessary to consider when evaluating a data stewardship engagement. Integrations and setup changes are not necessary to review for optimizing a data stewardship engagement.

Cosmic Footwear recently started using a community that allows customers to place sales orders. Every week the sales employees of the company receive more than 100,000 sales orders from around the world. This number is expected to increase to 500,000 as the company expands its business in Asia and Europe in the next quarter. Customers should be able to view sales orders placed in the last year and be able to access reports based on the same. Sales managers should be able to view reports based on all the current and historical data. The sales director of the company is concerned about data storage consumption, and has asked the data architect to ensure that only the required data are stored in Salesforce at a given time. What should the data architect recommend? Choose 1 answer. A. A custom object should be used to store sales orders, and old data should be moved off-platform to a data warehouse every year. B. A big object should be used to store sales orders, and old data should be permanently deleted every year. C. An external object should be used to store sales orders, and old data should be moved off-platform to a data warehouse every year. D. A Heroku Postgres database should be used to store sales orders, and an external object should be used to allow customers to access them.

A. A custom object should be used to store sales orders, and old data should be moved off-platform to a data warehouse every year. To meet this requirement, a custom object should be used to store sales orders. Another option that the company can consider is using the standard Order object, but out of the available options, using a custom object to store sales orders is the most suitable. Records that are older than one year can be moved off-platform to a data warehouse every year. A suitable BI/Analytics tool can be used to access reports based on the data in the data warehouse. An external object is used to access data stored in an external system, but there is no external system in this scenario. Heroku Postgres should not be used as sales orders should be stored in Salesforce. A big object could be used to store sales orders, but big objects currently do not support reporting, and old sales orders should not be permanently deleted as sales managers require access to all the current and historical data for reporting. https://developer.salesforce.com/blogs/2015/03/salesforce-backup-and-restore-essentials-part-1-backup-overview-api-options-and-performance.html

Cosmic Wears is a small-scale company that has been using a legacy CRM system for managing accounts, contacts, and opportunities. The owner of the company would like to switch to Salesforce to manage these records, but the original date on which each account record was created in the legacy system should be maintained in Salesforce. What approach should be suggested by the company's data architect? Choose 1 answer. A. Assign the 'Set Audit Fields upon Record Creation' permission to allow a user to set the original 'Created Date' while migrating the data. B. Assign the 'Set System Fields upon Record Creation' permission to allow a user to set the original 'Created Date' while migrating the data. C. Assign the 'Set Date Fields upon Record Creation' permission to allow a user to set the original 'Created Date' while migrating the data. D. Assign the 'Set Created Date Field upon Record Creation' permission to allow a user to set the original 'Created Date' while migrating the data.

A. Assign the 'Set Audit Fields upon Record Creation' permission to allow a user to set the original 'Created Date' while migrating the data. The 'Set Audit Fields upon Record Creation' permission can be granted through a custom profile or permission set to allow a user to set the original 'Created Date' while migrating the account records. https://help.salesforce.com/articleView?id=000334139&type=1&mode=1

Cosmic Supermarket is a global company that will soon integrate multiple information systems, including Salesforce, by utilizing a master data management solution. A data architect has been hired to provide assistance in its implementation. What approach should the data architect use to determine which system should be the system of record? Choose 1 answer. A. Data flows should be reviewed with business users to determine the system of record. B. The system that updates a record should always be the system of record. C. The last system that receives an update should be the system of record. D. The MDM system should always be the system of record.

A. Data flows should be reviewed with business users to determine the system of record. In order to determine the system of record for different objects and fields, the flow of data from one system to another should be reviewed by meeting with stakeholders. The system that updates a record or receives an update shouldn't always be the system of record. The MDM system is typically used as the source of truth that contains the master records, or in other words, the most accurate and complete versions of the records.

Cosmic Enterprises would like to store information about documents that are sent to customers. The Salesforce Architect of the company has suggested the creation of a custom object called 'Customer Documentation' for this requirement. The Sales Director would like to ensure that a sales user specifies the name of the account while creating a customer documentation record. A customer documentation record can be owned by a user other than the owner of the parent account record. Which type of relationship can be created between the Account object and the Customer Documentation object for this requirement? Choose 1 answer. A. Required Lookup Relationship B. Master-Detail Relationship C. Hierarchical Relationship D. Self Relationship

A. Required Lookup Relationship This requirement can be met by creating a lookup relationship field on the 'Customer Documentation' object. A lookup relationship field can be used to link two objects together. The field can be made 'Required' to ensure that a sales user always specifies an account while creating a Customer Documentation record. A master-detail relationship should not be used for this requirement since the owner of a customer documentation record can be different from the owner of its parent account record; when a master-detail relationship is defined, the 'Owner' field on the detail records is not available and is automatically set to the owner of the master record. A hierarchical relationship is a special lookup relationship that is available only for the User object; it lets users use a lookup field to associate one user with another user that does not directly or indirectly refer to itself. A self relationship is used to link an object with itself using a lookup relationship field. https://help.salesforce.com/articleView?id=overview_of_custom_object_relationships.htm&type=5

Cosmic Enterprises would like to have two custom objects called 'Policy' and 'Documentation'. A policy or documentation record can be related to multiple account records. Each account must have a policy and a documentation specified on it, and it should not be possible to delete the parent policy or documentation record when it is related to at least one account record. Also, the owner of an account record can be different from the owner of a policy or documentation. In order to meet this requirement by relating the Account object to the two custom objects, which type of relationship can be utilized? Choose 1 answer. A. Required Lookup Relationship B. Master-Detail Relationship C. Indirect Lookup Relationship D. Optional Lookup Relationship

A. Required Lookup Relationship Two required lookup relationships can be created for this requirement to ensure that each account is related to a policy and a documentation. When a user tries to delete a policy or documentation that is related to at least one account record, Salesforce can prevent the deletion of the parent record. Although an optional lookup relationship can also prevent the deletion of the parent record, it should not be used for the requirement since policy and documentation are required for saving each account record. When a master-detail relationship is used, the owner of the child records is the same as the owner of the parent record, which is why it should not be used in this case. An indirect lookup relationship is used to link a child external object to a parent standard or custom object. https://help.salesforce.com/articleView?id=overview_of_custom_object_relationships.htm&type=5

Cosmic Tools stores sales orders in an external system. There are 20 million sales orders, and the company receives one million new sales orders every month. The sales director has asked the system administrator of the company to display all the related sales orders on each customer's account record. Which solution should a data architect recommend for this requirement? Choose 1 answer. A. Use Salesforce Connect to define an external object, and create a lookup relationship field on the object B. Use Data Loader to import sales orders into a custom object, and create a lookup relationship field on the object. C. Use Apex and Visualforce to design a custom integration that displays sales orders on each account record D. Use an AppExchange application to retrieve and display related sales orders on each account record

A. Use Salesforce Connect to define an external object, and create a lookup relationship field on the object External objects can be used to display external data in Salesforce, and Salesforce Connect can be set up for this purpose. The external data is not stored in Salesforce, but users can access it when required. Storing the data in Salesforce is not a good solution due to the large number of sales orders, which is why Data Loader should not be used for this requirement. Using an AppExchange application is unnecessary since Salesforce Connect can be used. https://help.salesforce.com/articleView?id=platform_connect_about.htm&type=5

Cosmic Repair Solutions has been using an enterprise system other than Salesforce for managing accounts and cases. The company has recently switched to Salesforce, and the system administrator is trying to import 20 million account records and 50 million related case records into Salesforce using the Bulk API. However, while querying the parent account IDs to import the related case records, he is experiencing long delays. What should a data architect suggest to resolve this issue? Choose 1 answer. A. Use the external IDs obtained from the enterprise system to query account records and import related case records. B. Define and populate an indexed text field on the account records and use it to import related case records. C. Use batch Apex to update the parent account IDs after importing the related case records. D. Define and populate an auto-number field on the account records and use it to import related case records.

A. Use the external IDs obtained from the enterprise system to query account records and import related case records. An External ID field can be defined on the Account object, and the IDs in the enterprise system can be used to populate this field on the account records that have been imported into Salesforce. The field can then be used to import the related case records, which should reduce the query time. The data types that can be marked as an External ID are Text, Number, and Email. Using an indexed text field or an auto-number field for importing related case records would not affect the performance of the migration. Case records should not be imported without specifying their original parent account, so updating parent account IDs after importing the related case records is not a good solution. https://help.salesforce.com/articleView?id=000320964&type=1&mode=1 https://help.salesforce.com/articleView?id=000324871&type=1&mode=1
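
A minimal sketch of the pattern, assuming a custom External ID field named Legacy_Account_Id__c on Account; in Data Loader the equivalent is mapping the case's parent account column to that external ID field during the import.

    // Relate a new Case to its parent Account using the legacy system's ID.
    // Legacy_Account_Id__c is an assumed custom External ID field on Account;
    // no query for the Salesforce Account Id is needed.
    Case c = new Case(
        Subject = 'Imported from legacy system',
        Account = new Account(Legacy_Account_Id__c = 'ERP-000123')
    );
    insert c;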

Cosmic Electronics is considering the implementation of a master data management solution to integrate data and resolve any duplicates and discrepancies between Salesforce and other enterprise systems used by the company. Which of the following are the most important factors to consider for the implementation? Choose 2 answers. A. Existence of an on-premise system that requires integration B. Number of systems that need to be integrated C. System of record for different types of data D. Location of systems that need to be integrated

B & C Determining the system of record for different types of data and the number of systems that need to be integrated are two of the most important factors that would need to be considered for the MDM implementation. Whether a system is cloud-based or on-premise is not as important as defining the system of record for a certain type of data and identifying the number of systems from which data need to be obtained and consolidated in the master hub. The location of a particular system is also not as important as the other factors, as a system can be located anywhere and does not affect the master data management solution.

Cosmic Foods & Goods uses a Salesforce org that has more than 500,000 account records. Sales users of the company have identified various accounts with very similar names, phone numbers, or email addresses. Which of the following solutions can be utilized by a system administrator in order to resolve this issue and prevent the creation of more account records with similar attributes? Choose 2 answers. A. Use a data integration rule to identify duplicate records. B. Use duplicate and matching rules to prevent the creation of additional duplicate records. C. Use a native tool to merge similar account records. D. Use a custom application to manage duplicates.

B & C Duplicate and matching rules can be utilized to identify duplicate account records and prevent the creation of more duplicate records. Existing duplicates can be merged by using a native tool that can be accessed from the Accounts tab. A data integration rule is configured to automatically match records to current information in a third-party data service; it is not used to identify duplicates. Using a custom application is not necessary for this use case. https://trailhead.salesforce.com/en/content/learn/modules/data_quality/data_quality_getting_started https://help.salesforce.com/articleView?id=duplicate_rules_map_of_reference.htm&type=5

Cosmic Solutions wants to build a Salesforce Community to enable its customers to submit inquiries and log cases. The community manager wants to avoid dealing with incomplete and/or inaccurate case information. One of the validation requirements is to make sure that the 'Store Branch' field is populated when the 'Case Type' is 'Product Return'. Which two validation solutions can be implemented if page responsiveness is a concern? Choose 2 answers. A. Create a validation rule that is triggered when 'Store Branch' is blank and 'Case Type' is 'Product Return'. B. Implement client-side validation in a Lightning component using JavaScript. C. Make 'Case Type' a controlling picklist field and 'Store Branch' the dependent picklist field. D. Implement server-side validation using an Apex trigger.

B & C If page responsiveness is a concern, validation must be performed on the client-side. A Lightning component is able to perform client-side validation using JavaScript. It can also leverage a dependent picklist field which dynamically shows appropriate values depending on the selected value in a controlling picklist field. When the picklist fields in Lightning components are loaded by Salesforce, the values are automatically generated based on the configured field dependency matrix. Validation rules and Apex triggers are both executed on the server-side. If implemented, this can negatively impact page responsiveness. https://developer.salesforce.com/docs/atlas.en-us.lightning.meta/lightning/validation.htm https://help.salesforce.com/articleView?id=fields_defining_field_dependencies.htm&type=5

The sales director of Cosmic Cosmetics is considering the use of Lightning Data services to cleanse account data automatically. What must be considered when doing this? Choose 2 answers. A. Data.com Clean is the platform recommendation for this use case. B. Some AppExchange apps automatically cleanse data and remove the need for scheduling cleanse jobs C. AppExchange apps are limited to daily scheduled jobs for data cleanse activities. D. An AppExchange application such as D&B Optimizer is the recommendation for this use case.

B & D Data.com Prospector and Clean are scheduled for retirement on July 31, 2020, and licenses are no longer available for sale or renewal. Therefore, an architect must consider an AppExchange solution based on Lightning Data to cover this use case, such as (but not limited to) D&B Optimizer. The frequency of data updates when using a Lightning Data service is determined by the data service provider. An AppExchange app, such as one based on Lightning Data, may be used to cleanse data automatically. A Lightning Data solution can be used for real-time enrichment of data, which removes the need for scheduling data cleansing jobs. https://appexchange.salesforce.com/appxStore?type=Data https://help.salesforce.com/articleView?id=000317679&type=1&mode=1

The IT manager of Cosmic Solutions would like to build a data dictionary that defines how different data entities are stored in various enterprise systems including Salesforce. The data dictionary should cover all the main applications and business domains used within the company. Which of the following can be utilized for building a data dictionary that includes the data entities in Salesforce? Choose 2 answers. A. SOQL B. Metadata API C. SOAP API D. Entity Relationship Diagram

B & D Metadata API can be used to retrieve customization information about the data model in Salesforce, such as definitions of custom objects and relationship fields. An Entity Relationship Diagram (ERD) makes use of different symbols and connectors to visualize the major data entities within a particular system and the inter-relationships among these entities. SOQL refers to the Salesforce Object Query Language, which is used to search an organization's Salesforce data for specific information. It cannot be used to retrieve information about the data model. SOAP API is used to create, retrieve, update or delete records. It cannot be used to obtain information about the data model. https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_intro.htm

The data steward of Cosmic Enterprises is working on the development of a data governance plan for the records in Salesforce. He has already assessed the state of the data in Salesforce. Which of the following elements should be included in the governance plan? Choose 3 answers. A. Data Review and Segmentation B. Data Quality Standards C. Quality Control Process D. Data Definitions E. Data Quality Measurement

B, C, & D The following elements or sections should be included in a data governance plan: 1) Data Definitions 2) Data Quality Standards 3) Quality Control Process 4) Roles & Ownership 5) Security & Permissions Data quality measurement and data review and segmentation would be included in data assessment, but in this case, the data steward has already assessed the state of the data. https://a.sfdcstatic.com/content/dam/www/ocms-backup/assets/pdf/misc/data_Governance_Stewardship_ebook.pdf

Cosmic Sporting Goods uses multiple enterprises systems, including Sf and an ERP system. The IT director of the company would like to implement a master data management solution, which should include consolidating data from the different systems, defining the system of record for different types of records, and defining a single version of truth. The IT manager has asked the data architect of the company to provide recommendations related to the definition of system of record (SOR) and source of truth (SOT) for the MDM implementation. Which of the following are valid statements pertaining to the same? Choose 3 answers.

B, C, & D The purpose of defining a single source of truth is to ensure that a single data element, such as a record, can only be mastered in a single system. When a record is updated in the source of truth location, the update is propagated to all the systems that contain the record. The single source of truth system contains records that are authentic, accurate, relevant, and referable. A system of record is defined as the authoritative data source for a certain data element such as a record. A system of record is useful for resolving conflicts when multiple enterprise systems disagree about information in a certain record.

A developer of Cosmic Solutions is using SOSL in an Apex class to search multiple objects and fields for a specific term and return the result. Which of the following are best practices that should be recommended by a data architect when using SOSL for searches in Salesforce? Choose 3 answers. A. Records that are owned by other users should be targeted for faster searches. B. Search terms should use exact phrases and be as selective as possible. C. Search scope should be limited by targeting specific objects. D. Records within a division should be excluded in SOSL searches. E. Searches should be performed in specific fields, such as name fields, instead of all fields.

B, C, & E In order to ensure faster SOSL searches, exact phrases should be used, and the search terms should be as selective as possible. The search scope can be limited by targeting specific objects, records owned by the searcher, and records within a division, whenever applicable. Searches should also target specific fields, such as name fields, rather than all fields. http://resources.docs.salesforce.com/rel1/doc/en-us/static/pdf/salesforce_query_search_optimization_developer_cheatsheet.pdf
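
The following hedged sketch applies these practices in Apex: an exact phrase, name fields only, an explicit object list, and a row limit.

    // Exact phrase, restricted to name fields and two objects, with a row limit
    List<List<SObject>> results = [FIND 'Cosmic Repair Solutions' IN NAME FIELDS
                                   RETURNING Account(Id, Name),
                                             Contact(Id, Name, Email)
                                   LIMIT 50];
    List<Account> accounts = (List<Account>) results[0];
    List<Contact> contacts = (List<Contact>) results[1];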

Cosmic Innovation is considering the implementation of an MDM solution for the regular consolidation of account records from three different enterprise systems including Salesforce. Which of the following are valid techniques for selecting the surviving fields from the three systems to ensure that the most accurate and complete master records exist in the MDM system? Choose 3 answers. A. Selecting the most regularly updated fields B. Selecting the most recently updated fields C. Selecting the fields that were not updated D. Selecting fields from the most trusted source E. Selecting the most complete fields

B, D, & E Some of the techniques for selecting surviving fields from different source systems include: 1) Selecting the fields from the most trusted data source, i.e., selecting fields from a record in a system that is known to be accurate 2) Selecting the most complete fields, i.e., fields with more complete details 3) Selecting the most recently updated fields 4) Selecting the fields with the least ambiguous values 5) Selecting the fields which are not null

The sales users of Cosmic Innovation often create reports to view key information pertaining to records such as accounts and opportunities. There are more than 10 million accounts and 20 million opportunities in Salesforce. The reports created by users often use multiple filters and filter logic. A lot of users have been complaining that the reports are taking a long time to run or timing out. What measures should a data architect recommend in order to improve the performance of the reports? Choose 3 answers. A. Filter the reports to show all accounts or opportunities. B. Use indexed fields and External ID fields to filter the reports. C. Use only standard fields to filter the reports. D. Use the AND filter logic instead of OR to filter the reports. E. Make sure that the Recycle Bin doesn't contain any records.

B, D, & E To improve the performance of reports, one can filter on indexed fields, standard fields that are optimized for search, such as Id and Name, and External ID fields. Using only standard fields to filter the reports does not improve report performance. Instead of filtering a report to show all the records, filters such as 'My Accounts' can be utilized. Because soft-deleted records in the Recycle Bin still reside in the underlying tables and can slow report queries, keeping the Recycle Bin empty reduces the number of rows that must be scanned, which improves report performance. The AND filter logic should be used instead of OR to make the filters as selective as possible. https://developer.salesforce.com/docs/atlas.en-us.salesforce_reportperformance_cheatsheet.meta/salesforce_reportperformance_cheatsheet/reportperformance_cheatsheet.htm

Cosmic Enterprises stores sales orders in an external system. Salesforce is used to store accounts and contacts. An external object has been defined in Salesforce to display sales orders. A sales manager of the company would like to view all the accounts in San Francisco with all the related sales orders. What should a data architect suggest to meet this requirement? Choose 1 answer. A. Create a Visualforce page that shows accounts and related sales orders. B. Create a report in Salesforce that combines records of the Account object and the external object. C. Create a report in the external system that combines records of the Account object and the external object. D. Install an AppExchange solution that combines data from the Account object and the external object.

B. Create a report in Salesforce that combines records of the Account object and the external object. Salesforce supports reports based on external objects, so a report can be created in Salesforce to meet this requirement. It is not necessary to pull data from the external system using a Visualforce page or creating a report in the external system since an external object has already been defined in Salesforce. Installing an AppExchange application is also unnecessary to meet this requirement. https://help.salesforce.com/articleView?id=platform_connect_considerations_reports.htm&type=5

The sales managers of Cosmic Solutions require the ability to use their mobile device to view a dashboard with key sales-related data that allows adding ad-hoc filters on the go. Which of the following should a data architect recommend for this requirement? Choose 1 answer. A. Sales Activity Dashboard B. Einstein Analytics Dashboard C. Lightning Experience Dashboard D. Third-Party Business Intelligence Tool

B. Einstein Analytics Dashboard An Einstein Analytics dashboard can be utilized for this requirement. Analytics licenses can be used to add ad-hoc filters. Ad hoc queries can be created for a data set. An Analytics dashboard is populated with the data based on the defined queries. Analytics dashboards are available in both Salesforce Classic and Lightning Experience. Users can download the Einstein Analytics mobile app to access dashboards. Ad hoc queries are not possible when using a Lightning Experience dashboard. The Sales Activity dashboard available in the AppExchange shows information about sales reps' activity. A third-party business intelligence tool is not necessary for this requirement. https://help.salesforce.com/articleView?id=bi_query_data_to_know_business.htm&type=5 https://help.salesforce.com/articleView?id=bi_mobile_resources.htm&type=5

Cosmic Supermarket is a company with thousands of employees who work in more than a hundred stores that are located in different states of the country. The IT director of the company would like to establish certain data governance processes for the maintenance of data quality in Salesforce. The executive management will establish policies and standards for meeting the data governance goal. The decisions made by the management will flow down the organizational hierarchy via supervisory channels. Which data governance model should be recommended by a data architect for this requirement? Choose 1 answer. A. Silo-in governance B. Top-down governance C. Bottom-up governance D. Center-out governance

B. Top-down governance When the top-down governance model is utilized, executive decisions flow down through the organizational hierarchy via management channels. In the bottom-up model, governance-related decisions, such as defining the naming standards for records, are made by end users as part of their everyday responsibilities. In the center-out model, experts define the controls and protocols for establishing the data governance processes. In the silo-in governance model, representatives from multiple groups are brought together to collectively agree on the data governance policies and procedures.

The data architect of Cosmic Foods & Goods has been given the responsibility of designing a data governance plan for assessing, improving, and maintaining the quality of records in Salesforce. The architect is required to define a data governance team, which should consist of people who make sure that the plan is working as intended, data quality is improving across the board, and the company's users are getting maximum value from the records. Which of the following roles should be included in the team? Choose 2 answers. A. Support Operations Manager B. IT Assistant C. Data Steward D. Analytics Manager

C & D An Analytics Manager would be responsible for translating data into meaningful information using reports and dashboards, which would be useful for determining whether users are getting maximum value from the Salesforce records. A Data Steward would be responsible for utilizing the data governance processes to ensure that the governance plan is working as intended and that data quality is improving. A Support Operations Manager and an IT Assistant would be end users of data who would not be responsible for data governance. https://a.sfdcstatic.com/content/dam/www/ocms-backup/assets/pdf/misc/data_Governance_Stewardship_ebook.pdf https://www.salesforce.com/video/1779731/

Cosmic Service Solutions uses an on-premise system to generate regular reports containing information about repairs sent by the field service agents of the company. The system generates 100 reports every 15 minutes based on data sent by field service agents using a custom web application from customer sites throughout the country. There are more than 35 million records in the on-premise system that represent these reports. The support director of the company is considering the use of Salesforce for storage. If the records in the on-premise system also need to be generated in Salesforce, which of the following limitations should be considered? Choose 2 answers. A. Apex Governor Limitations B. Metadata Limitations C. API Request Limitations D. Data Storage Limitations

C & D Data storage limitations would need to be considered due to the large number of records that need to be stored in Salesforce. API request limitations would also need to be considered because regular API calls would be required to create the records that represent field service agents' reports in Salesforce; at 100 reports every 15 minutes, roughly 9,600 new records would be created per day, so the daily API request allocation must accommodate the integration. Metadata limitations and Apex governor limitations would not be important to consider due to the absence of metadata and programmatic requirements. https://developer.salesforce.com/docs/atlas.en-us.salesforce_app_limits_cheatsheet.meta/salesforce_app_limits_cheatsheet/salesforce_app_limits_overview.htm

The system administrator of Cosmic Service Solutions has been asked to create a report that displays certain types of accounts and related cases in Salesforce. There are more than 50 million accounts, and on average, each account has 5 related cases. Which of the following approaches can be utilized to ensure that the performance of the report does not degrade? Choose 2 answers. A. Reduce the number of master-detail relationships in the application B. Ensure that there are no more than 1,000,000 accounts in the system C. Ensure that filters are applied to the report to selectively retrieve data D. Ensure that appropriate indexes have been created

C & D Indexes can improve SOQL query performance. Since queries are used by reports to retrieve data, indexes can also improve report performance. Applying filters to reports ensures that data is selectively retrieved, which also improves report performance. Reducing more than 50 million accounts to just 1 million is not a good solution. Reducing or removing master-detail relationships on an object does not have any effect on report performance.

Cosmic Service Solutions would like to implement a data governance plan to monitor, maintain and improve the quality of data in Salesforce. Which of the following can be used to enforce the plan? Choose 3 answers. A. Matching rules to update Salesforce data based on crowd-sourced data B. Apex triggers to validate the format of fields C. Workflow rules to automatically update key information D. Validation rules to validate information and enforce data entry by users E. Weekly dashboard that displays key data quality metrics

C, D, & E A weekly dashboard that displays key data quality metrics, such as missing information, duplicate records, inaccurate records, and incomplete records, can be used by the data steward to monitor data quality in Salesforce. Validation rules can be created on objects to validate information and enforce data entry. Workflow rules can automatically update fields based on custom criteria when a record is created or edited. Validation rules can be used to validate the format of fields using the REGEX function, so it is not necessary to use Apex triggers. Matching rules are used to define how records should be matched for identifying duplicates. https://appexchange.salesforce.com/appxListingDetail?listingId=a0N300000016cshEAA
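
As a rough sketch of the REGEX approach in validation rule formula syntax (the field name Billing_Zip_Code__c is illustrative and not part of the scenario), a rule that rejects malformed ZIP codes could look like this:

    AND(
        NOT( ISBLANK( Billing_Zip_Code__c ) ),
        NOT( REGEX( Billing_Zip_Code__c , "\\d{5}(-\\d{4})?" ) )
    )

The save is blocked and the rule's error message is shown whenever the formula evaluates to true.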

Cosmic Service Solutions has several Salesforce instances for different regions where the company would like to install a custom application that will be used for the management of internal service records. The application must use custom metadata configuration. The data architect of the company has suggested the use of custom metadata types for this use case. Which of the following are valid reasons for using custom metadata types? Choose 3 answers. A. Apex code can create, read, update, and delete custom metadata records. B. SOAP API can be used to create, edit, and delete custom metadata records. C. Different configurations can be assigned to different regions using mapping. D. Custom metadata types can be deployed using packages. E. Values of a custom metadata type can be referenced in a formula field.

C, D, & E Custom metadata types can be deployed using managed packages, unmanaged packages, or managed package extensions. Mappings can be used to create associations between different objects, such as a custom metadata type that assigns different internal support options to particular regions in a country. Custom metadata records are created, edited, and deleted through Metadata API, not the SOAP (data) API. Apex code can read, create, and update custom metadata records, but it cannot be used to delete them. It is possible to reference the values of a custom metadata type in a formula field. https://help.salesforce.com/articleView?id=custommetadatatypes_overview.htm&type=5
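
For illustration only (the type and field names Support_Config__mdt, Region__c, and Support_Tier__c are assumptions, not part of the scenario), custom metadata records can be read in Apex with SOQL, and individual values can also be referenced declaratively, for example in a formula field via $CustomMetadata:

    // Reading custom metadata records in Apex; Apex cannot delete them.
    List<Support_Config__mdt> configs = [
        SELECT MasterLabel, Region__c, Support_Tier__c
        FROM Support_Config__mdt
        WHERE Region__c = 'EMEA'
    ];
    // In a formula field: $CustomMetadata.Support_Config__mdt.EMEA_Default.Support_Tier__c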

Cosmic Repair Services would like to store information about repair jobs and repair sites. There are almost 300 repair sites in different states throughout the country. The company is currently using an internal database to store information about repairs conducted at these repair sites. The repair sites are fixed company locations where repairs are performed by technicians. More than a million new repair jobs are created every month. Other than the name of the repair site and the state where it is located, no other information about a repair site needs to be stored for a particular repair job. Which of the following should be used to store the information in Salesforce? Choose 1 answer. A. A custom object with a multi-select picklist field B. A custom object with a picklist field C. A custom object with dependent and controlling picklist fields D. Two custom objects and a master-detail relationship field

C. A custom object with dependent and controlling picklist fields The best way to meet this requirement is to use a custom object to store repair jobs and define two custom picklist fields on the object to allow users to specify the state and repair site for each repair job. 'State' can be the controlling field, while 'Repair Site' can be the dependent field. The repair sites that are available for selection would depend on the state selected by the user. Since up to 1,000 values can be defined for a picklist field, the fixed set of roughly 300 repair site names fits comfortably in a single picklist. Creating another custom object with a relationship field is unnecessary since the number of repair sites is fixed and no additional information about a repair site needs to be stored. https://help.salesforce.com/articleView?id=fields_defining_field_dependencies.htm&type=5 https://help.salesforce.com/articleView?id=picklist_limitations.htm&type=5

The sales users of Cosmic Enterprises attach various kinds of documents to account records in Salesforce. The sales director of the company would like to allow users to specify details related to the attachment using certain custom fields. What should an architect recommend for this requirement? Choose 1 answer. A. Create a Visualforce page that allows storing details about related attachments. B. Create custom fields on the Account object to store information about related attachments. C. Create a new custom object to store attachments and relate it to the Account object. D. Use the standard Attachment object and create custom fields on it.

C. Create a new custom object to store attachments and relate it to the Account object. For this requirement, a new custom object can be created to store attachments. Custom fields, including a relationship field to the Account object, can be created on this object to store the details of each attachment. It is not possible to create custom fields on the standard Attachment object, and storing the details of every attachment as fields on the Account record would not scale, so it is better to keep each attachment's details on its own record. https://help.salesforce.com/articleView?id=dev_objectedit.htm&type=5 https://help.salesforce.com/articleView?id=overview_of_custom_object_relationships.htm&type=5

Cosmic Enterprises uses the Account object in Salesforce to store information about B2B and B2C customers. Sales orders are managed in an external system that supports REST API. Tens of thousands of account records are added to Salesforce every month. Information about the creation of a sales order is sent to the external system when an opportunity is won. The company generally receives more than 500,000 sales orders each month. The sales director of the company would like to make it possible for sales representatives to view sales orders when required in Salesforce. He would prefer not paying extra for storage capacity in Salesforce. What should an architect recommend for this requirement? Choose 1 answer. A. Create a big object to store sales orders in Salesforce. B. Create a custom object to store sales orders in Salesforce. C. Create an external object to allow 'View' access to sales orders. D. Use a data warehouse to store existing and new sales orders.

C. Create an external object to allow 'View' access to sales orders. Since there is a large number of sales orders, Salesforce Connect can be set up, and an external object can be created to allow users to view sales orders when they require access. An external data source based on OData (Open Data Protocol), a REST-based protocol for data integration, can be created for the external system. Although a big object can store and manage massive amounts of data on the Salesforce platform, its default allocation is one million records, and additional capacity requires a paid add-on license; the sales director would prefer not to pay extra for storage in Salesforce. Using a data warehouse is unnecessary for this requirement since the records are already stored in an external system. https://help.salesforce.com/articleView?id=external_object_define.htm&type=5 https://help.salesforce.com/articleView?err=1&id=ext_data_sync_database.htm&type=5
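
As a minimal sketch (the object and field names Sales_Order__x and Order_Total__c are assumptions), once Salesforce Connect and the OData external data source are configured, the external object can be queried like any other sObject even though the rows remain in the external system:

    // External objects use the __x suffix; rows are fetched on demand
    // from the external system and do not consume Salesforce data storage.
    List<Sales_Order__x> orders = [
        SELECT ExternalId, Order_Total__c
        FROM Sales_Order__x
        LIMIT 10
    ];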

Cosmic Circle Goods uses a Salesforce org with two custom objects called 'Sold Product' and 'Product Part'. The 'Sold Product' object stores information about each product sold to a customer and is related to the 'Account' object through a master-detail relationship. The 'Product Part' object stores information about the details of each part associated with a particular product, such as the name of the part and the cost of producing the part. The 'Product Part' object is the child of the 'Sold Product' object in a master-detail relationship. The sales director of the company would like to view the total cost of producing all the products that are sold to a customer on the customer's account record. What solution should a Data Architect recommend for this requirement? Choose 1 answer. A. Create a flow to calculate the total cost, and add a button to the Account detail page to allow users to trigger the flow. B. Create an Apex trigger on the Account object to calculate the total cost. C. Create one roll-up summary field on the 'Account' object and another one on the 'Sold Product' object to calculate the total cost. D. Create one roll-up summary field on the 'Account' object to calculate the total cost.

C. Create one roll-up summary field on the 'Account' object and another one on the 'Sold Product' object to calculate the total cost. The simplest way to meet this requirement is to create two roll-up summary fields. A roll-up summary field on the 'Sold Product' object can calculate the total cost of all the parts associated with a particular product. Another roll-up summary field on the 'Account' object can use the first roll-up summary field to calculate the total cost of the parts associated with all the products that are sold to a particular customer. https://help.salesforce.com/articleView?id=fields_about_roll_up_summary_fields.htm&r=https%3A%2F%2Fwww.google.com%2F&type=5 https://help.salesforce.com/articleView?id=fields_defining_summary_fields.htm&type=5

Cosmic Innovations is a company that sells smart home products. It uses a custom object called 'Smart Solution' in Salesforce to store data about smart home products sold by the company. The director of the company is interested in the implementation of a categorization approach for the products. A smart home product can be a 'Device', 'System' or 'Gadget'. What should a Solution Architect recommend for this requirement? Choose 1 answer. A. Define a self-relationship using a lookup relationship field on the 'Smart Solution' object. B. Create a new custom object with a master-detail relationship field. C. Define a new custom picklist field on the 'Smart Solution' object. D. Create three new custom objects to store information about the types of smart home products.

C. Define a new custom picklist field on the 'Smart Solution' object. A custom picklist field can be created on an object to allow users to select a particular value from a list of values on a record of that object. In this case, a custom picklist field can be created to allow selecting the type of smart home product. A self-relationship cannot be used for this requirement since it is used to link an object with itself. Also, it is much simpler to use a custom field instead of defining one or more custom objects with a master-detail relationship field for this requirement. https://help.salesforce.com/articleView?id=fields_creating_picklists.htm&type=5

The legal and regulatory requirements of Cosmic Innovation make it necessary to capture certain user activity in Salesforce, such as file downloads, report exports, Visualforce page loads, etc. Which of the following should a data architect recommend for this use case? Choose 1 answer. A. Setup Audit Trail B. Debug Log C. Event Monitoring D. System Log

C. Event Monitoring Event Monitoring captures user activity and event data, such as logins, logouts, file downloads, report exports, Apex executions, and Visualforce page loads. Debug logs require trace flags to be set and are intended for troubleshooting rather than auditing. Setup Audit Trail only tracks configuration changes made in Setup. There is no Salesforce feature called 'System Log'. https://trailhead.salesforce.com/en/content/learn/modules/event_monitoring/event_monitoring_intro
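
As an illustration (the availability of individual event types depends on the org's Event Monitoring license), the captured events are exposed through the EventLogFile object and can be queried with SOQL, for example to list recent report export logs:

    List<EventLogFile> logs = [
        SELECT EventType, LogDate, LogFileLength
        FROM EventLogFile
        WHERE EventType = 'ReportExport'
        ORDER BY LogDate DESC
        LIMIT 5
    ];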

Cosmic Fitness Solutions has a custom object named 'Equipment__c' which is used to store information about the various types of fitness equipment sold by the company. When a piece of fitness equipment is sold to a customer, it should appear on the customer's account record in Salesforce. The owner of an account should be able to view and edit all the related equipment records. When a customer is inactive for a period of more than 5 years, their account record is deleted. This should also delete all the related equipment records. Which type of relationship can be utilized to meet this requirement? Choose 1 answer. A. Self Relationship B. Lookup Relationship C. Master-Detail Relationship D. Hierarchical Relationship

C. Master-Detail Relationship This requirement can be met by creating a master-detail relationship field on the 'Equipment__c' object. A master-detail relationship links two objects together closely: when the master record is deleted, all the related detail records are automatically deleted. In this case, deleting an account record would also delete all the related equipment records. Furthermore, since detail records do not have their own 'Owner' field and inherit ownership from the master record, the owner of each account record would be able to view and edit the related equipment records. A lookup relationship should not be used for this requirement since it creates a loose relationship between two objects and is typically used when the ownership and security of the child records should not depend on the parent record. A hierarchical relationship is a special lookup relationship that is available only for the User object; it associates one user with another user, as long as the relationship does not refer back to the same user directly or indirectly. A self-relationship links an object to itself using a lookup relationship field. https://help.salesforce.com/articleView?id=overview_of_custom_object_relationships.htm&type=5

Cosmic Solutions has more than 10 million records of a custom object called 'Performance Review'. A developer recently tried to retrieve these records using a Bulk API query for the purpose of extraction, but the query timed out. What should be used to resolve the timeout issue? Choose 1 answer. A. SOAP API B. Third-party data export tool C. PK Chunking D. Metadata API

C. PK Chunking PK Chunking can be used to split the bulk API query to retrieve the records, which would resolve the timeout issue. The 'Sforce-Enable-PKChunking' header can be specified on the job request for the Bulk API query. Although a third party tool that supports automatic chunking can be utilized, it is easier to use the existing tool and PK Chunking to resolve the issue. Other options here cannot be used to resolve a query timeout. https://developer.salesforce.com/blogs/engineering/2015/03/use-pk-chunking-extract-large-data-sets-salesforce.html
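
As a rough illustration (the chunk size shown is only an example; the default is 100,000 records per chunk), PK Chunking is enabled by adding a header when the Bulk API job is created:

    Sforce-Enable-PKChunking: chunkSize=100000

Salesforce then splits the query into separate batches based on record ID ranges, and the result set of each batch is downloaded individually.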

The sales managers of Cosmic Enterprises would like to view information regarding the sales performance of the company, such as the number of closed sales this year and the number of closed sales by industry and country. Based on this information, they would like to take corrective actions where appropriate. Which dashboard available in the AppExchange should be recommended by a data architect for this requirement? Choose 1 answer. A. Sales Activity Dashboard B. Sales Performance Dashboard C. Sales KPI Dashboard D. Salesforce CRM Dashboard

C. Sales KPI Dashboard The Sales KPI Dashboard in the AppExchange shows relevant sales key performance figures and helps sales managers take corrective actions where appropriate. The Sales Activity Dashboard allows analyzing sales reps' activity rates and types against opportunities. Although a Salesforce CRM dashboard for sales managers shows information about how a sales team is doing, it is better to use the Sales KPI dashboard that shows a number of performance metrics. There is no Sales Performance Dashboard in the AppExchange. https://www.salesforce.com/blog/2019/01/sales-management-dashboards.html https://appexchange.salesforce.com/listingDetail?listingId=a0N300000016ZOSEA2

Cosmic Luxiam uses an account management system that contains several million account records. It has recently started using Salesforce, and would like to integrate it with the account management system. A master data management strategy is being defined. To define the system of record for accounts, the existing system is being considered. However, Salesforce contains more than 500,000 account records, many of which conflict with data stored in the existing system. What approach should a data architect recommend to integrate Salesforce with the existing system and prevent data conflicts? Choose 1 answer. A. The system with the most recently updated record should prevail in conflicts. B. The existing system should prevail in all conflicts pertaining to account records. C. Stakeholders should be brought together to discuss the data strategy moving forward. D. Salesforce should be the system of record for all account records.

C. Stakeholders should be brought together to discuss the data strategy moving forward. The correct approach would be to determine the system of record for specific account records. This can be done by bringing together stakeholders and discussing the data strategy moving forward. The data strategy would include specifying the system of record for specific fields and account records in the two systems.

Cosmic Repair Solutions uses a custom object called 'Repair Job' to store repair jobs requested by customers in Salesforce. A custom field has been defined on the object to allow sales agents to specify the 'Job ID' of each repair job. There are more than 10 million repair jobs that are currently stored in Salesforce, and more than 100,000 new repair jobs are created every week. The data architect of the company has been asked to ensure that sales agents can quickly search for a repair job by using its Job ID as the search term in global search. What should the data architect recommend? Choose 1 answer. A. Repair jobs that are older than one year should be archived regularly. B. The 'Job ID' field should be set as a non-unique External ID. C. The 'Job ID' field should be set as a unique External ID. D. Repair jobs should be sent to an external system and exposed in Salesforce.

C. The 'Job ID' field should be set as a unique External ID. The best way to meet this requirement is to set the 'Job ID' field as a unique External ID. A custom field that is marked as unique or as an External ID is automatically indexed by Salesforce, which makes searches faster. The field should not be set as a non-unique External ID since each Job ID is unique, and enforcing uniqueness also prevents duplicate values. Archiving older records would remove them from Salesforce, making them unavailable for search. Sending the records to an external system and exposing them in Salesforce as external object records would require more effort and would not guarantee improved search performance. https://help.salesforce.com/articleView?id=custom_field_attributes.htm&type=5 https://help.salesforce.com/articleView?id=000325247&language=en_US%C2%A0&type=1&mode=1
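
For illustration only (Repair_Job__c and Job_ID__c are assumed API names), the same custom index that speeds up global search also keeps direct lookups selective despite the data volume:

    // With Job_ID__c marked as a unique External ID, this filter is
    // backed by a custom index even with 10+ million Repair_Job__c rows.
    String searchTerm = 'RJ-0012345'; // hypothetical Job ID format
    Repair_Job__c job = [
        SELECT Id, Name
        FROM Repair_Job__c
        WHERE Job_ID__c = :searchTerm
        LIMIT 1
    ];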

Cosmic Cosmetics would like to use an AppExchange application for financial management. The company has 50 million invoice records that are currently stored in an internal database. Five million new invoice records are likely to be generated every month in the future. The AppExchange application will be used to manage these records. Which of the following is an important consideration with regard to its impact on Salesforce performance? Choose 1 answer. A. Data Loader should be used to insert existing invoice records on a regular basis. B. The custom object used for invoice records should be indexed C. A big object would be essential for storing invoice records in Salesforce D. A data archiving strategy should be used to remove old records from the database

D. A data archiving strategy should be used to remove old records from the database Using a sound data archiving strategy would be an important consideration for maintaining Salesforce performance while using the AppExchange application. Due to the large number of invoice records, there would otherwise be a significant impact on the performance of queries, reports, list views, etc. By archiving old records on a regular basis, the impact on performance can be minimized. A big object can store massive amounts of data on the Salesforce platform, but it is not essential; it is typically used to archive data from other objects or to bring large datasets from external systems into Salesforce. Using Data Loader on a regular basis would not address performance; without a good archiving strategy, the growing data volume would still degrade performance. https://www.salesforce.org/ask-an-architect-5-steps-to-an-effective-salesforce-data-management-strategy/

The sales reps of Cosmic Enterprises use Salesforce to manage the sales pipeline. Contracts are stored in an external system to allow collaboration with co-workers who cannot access Salesforce. An external object has been created to allow viewing contracts in Salesforce. A custom field called 'Contract Link' on the Opportunity object is used to select the contract associated with a particular opportunity. The sales director of the company would like to make it mandatory for the sales reps to enter a value in the 'Contract Link' field when the value of the 'Stage' field is set to 'Negotiation'. What solution must be implemented to enforce the validation? Choose 1 answer. A. Create an Apex trigger that displays an error message if 'Contract Link' is empty and 'Stage' is set to 'Negotiation'. B. Mark the field as required in the field definition if 'Contract Link' is empty and 'Stage' is set to 'Negotiation'. C. Mark the field as required on the page layout if 'Contract Link' is empty and 'Stage' is set to 'Negotiation'. D. Create a validation rule that is triggered when 'Contract Link' is empty and 'Stage' is set to 'Negotiation'.

D. Create a validation rule that is triggered when 'Contract Link' is empty and 'Stage' is set to 'Negotiation'. The most appropriate solution is to use a validation rule that is triggered when 'Contract Link' is empty and 'Stage' is set to 'Negotiation'. Validation using an Apex trigger would also work, but is not the recommended solution because it requires programmatic development and adds unnecessary complexity to a simple validation requirement. Marking the field as required on the page layout or in the field definition is not the correct solution because this would make the field required for all the stages. https://help.salesforce.com/articleView?id=fields_about_field_validation.htm&type=5
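
A minimal sketch of such a rule in validation rule formula syntax, assuming the custom field's API name is Contract_Link__c and that it can be checked with ISBLANK:

    AND(
        ISPICKVAL( StageName , "Negotiation" ),
        ISBLANK( Contract_Link__c )
    )

When the formula evaluates to true, the save is blocked and the configured error message is shown to the sales rep.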

Cosmic Harvest has more than 200,000 account records in Salesforce. The data steward of the company has found that many users have been entering inaccurate or incomplete information while creating and updating account records. Which of the following can be used to allow the data steward to monitor data quality by comparing individual records side by side with accurate data? Choose 1 answer. A. Matching Rule B. Data Extension C. Data.com Clean D. D&B Optimizer

D. D&B Optimizer Data.com Clean was formerly an option to keep account data current and complete, but will be retired July 31st, 2020, and is no longer available for sale. D&B Optimizer, an AppExchange app using Lightning Data, is the recommended replacement, which also allows for manual comparison of individual account, contact, and lead records side by side with matched Dun & Bradstreet records. Records can be updated field by field. A matching rule is used to define how records should be matched for identifying duplicates using a duplicate rule. There is no feature called 'Data Extension'. https://help.salesforce.com/articleView?id=000318293&type=1&mode=1

The sales users of Cosmic Enterprises are experiencing a lot of report time-out issues while running reports related to accounts and opportunities. Which of the following should a data architect recommend to replace the reports in order to prevent report time-out issues? Choose 1 answer. A. Visualforce pages based on the reports B. Dashboards created in Einstein Analytics C. Reports and dashboards from the AppExchange D. Dashboards that are scheduled to refresh once a day

D. Dashboards that are scheduled to refresh once a day Dashboards based on reports that are scheduled to refresh once a day can be utilized to allow users to view key sales data. This approach can be used in place of letting users run reports every time they need to access sales data. Other features and approaches are not suitable to prevent report time-out issues. https://help.salesforce.com/articleView?id=dashboards_schedule_refresh.htm&type=5

Cosmic Dry Foods uses multiple instances of Salesforce for different continents where it sells its products, including North America and Europe. Millions of accounts, contacts, and opportunities are stored in these instances. The IT director of the company would like to capture all the master data from multiple instances and store them in a central hub for the purpose of analytics and reference. A single version of the master data should be created in one data store. Which technique should an architect recommend for this use case? Choose 1 answer. A. Data replication B. Data federation C. Data propagation D. Data consolidation

D. Data consolidation Master data can be collated and distributed to different systems in multiple ways. For this requirement, data consolidation should be used since it is the process of capturing master data from multiple sources and integrating them into a central hub. Data can then be replicated to the destination systems. Data federation refers to providing a single virtual view of master data from one or more systems to another system. Data propagation means copying master data from one system to another.

The sales director of Cosmic Enterprises is concerned about data deletions in Salesforce due to a recent incident in the company. On average, the company adds 100 new account records to Salesforce every week. One of the sales users with API access deleted more than 100,000 account records permanently. The data architect has been asked to recommend an appropriate solution that can be utilized to recover from such incidents. The sales director is less concerned about the deletion of records that were created recently, since each user is required to use an Excel spreadsheet to record information about accounts created by them every day and submit it to their supervisor. Each spreadsheet is stored in a separate database for one week. What should the data architect recommend? Choose 1 answer. A. Use a third-party solution that provides data recovery services. B. Use Data Loader to manually export all the account records regularly. C. Install an AppExchange application that can be used to recover deleted account records. D. Schedule a weekly export of all the account records in Salesforce.

D. Schedule a weekly export of all the account records in Salesforce. Backup data can be exported from Salesforce on a weekly or monthly basis by navigating to 'Data Export' in Setup. In this case, account records can be exported weekly in order to recover from any accidental or deliberate deletions by users. Since the sales director is not concerned about recent deletions, and not a lot of account records are created each week, a weekly export of account records would suffice for the company's requirement. Although an AppExchange application or a third-party solution could be used, it is better to use a native solution that's already available in Salesforce. It is better to schedule a weekly export instead of manually exporting records using Data Loader. https://help.salesforce.com/articleView?id=admin_exportdata.htm&type=5

The sales users of Cosmic Solutions regularly delete 'Closed Lost' opportunities. For the purpose of reference and reporting, the sales director of the company would like to maintain opportunity data even after opportunities have been removed from the Recycle Bin. However, he is concerned about data storage limitations due to the growing number of records. What should a data architect recommend to meet this requirement? Choose 1 answer. A. Use a custom object and an Apex trigger to store deleted opportunities in Salesforce B. Send the opportunities to another enterprise system before they are deleted, and flag them as deleted in the target system. C. Create a custom field on the Opportunity object to allow users to specify which opportunities need to be deleted. D. Send the opportunities to a data warehouse before they are deleted, and flag them as deleted in the data warehouse.

D. Send the opportunities to a data warehouse before they are deleted, and flag them as deleted in the data warehouse. In order to meet this requirement, opportunities can be sent to a data warehouse before they are deleted and marked as 'Deleted' there, where they remain available for reference and reporting. Although they could also be sent to another enterprise system, using a data warehouse is a more logical approach since organizational data is typically consolidated in a data warehouse. A custom object should not be used for this requirement due to the data storage limitations of Salesforce. Using a custom field on the Opportunity object to mark opportunities that require deletion would not prevent the deletion of those opportunities.
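
One possible way to capture the data "before they are deleted" (a sketch only, not the only option; the platform event Opportunity_Deleted__e and its field are assumptions) is a before-delete trigger that hands the records to the integration, for example by publishing a platform event that the data warehouse loader consumes:

    trigger OpportunityArchive on Opportunity (before delete) {
        List<Opportunity_Deleted__e> events = new List<Opportunity_Deleted__e>();
        for (Opportunity opp : Trigger.old) {
            if (opp.StageName == 'Closed Lost') {
                // Stage the key of the soon-to-be-deleted opportunity for extraction.
                events.add(new Opportunity_Deleted__e(Record_Id__c = opp.Id));
            }
        }
        if (!events.isEmpty()) {
            EventBus.publish(events);
        }
    }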

Cosmic HR Solutions is a human services agency that is considering an AppExchange application for managing service records. There are currently 50 million records of a custom object that is used for managing services provided to customers by the agency, most of which are archived and do not require additional updates. This number is expected to increase by 5% every month due to the expanding customer base. The CEO of the company is willing to pay for additional storage. What should an architect recommend to ensure scalability while choosing a suitable AppExchange application? Choose 1 answer. A. The application should include a custom setting for service records. B. The application should include reports and dashboards. C. The application should support the use of Heroku and External Objects. D. The application should make use of Big Objects.

D. The application should make use of Big Objects. Because of the large volume of archived records, Big Objects are a viable solution. Although a big object supports only up to 1 million records by default, additional storage capacity is available as an add-on license. Archived records can be inserted into the Big Objects using Queueable Apex and a trigger on the custom object. Including a custom setting would not be a good solution since archived records would not need to be cached, nor would the 10MB allowed storage support the requirement. Writable External Objects are not supported for high-volume external data sources, and reports or dashboards would not have any effect on storage or scalability. https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object.htm https://developer.salesforce.com/docs/atlas.en-us.bigobjects.meta/bigobjects/big_object_example_queueable.htm
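
A hedged sketch of the archiving pattern mentioned above (the big object Service_Record_Archive__b, the custom object fields, and the batch size are all assumptions): a Queueable job can copy archived rows into the big object using Database.insertImmediate.

    public class ArchiveServiceRecords implements Queueable {
        public void execute(QueueableContext ctx) {
            List<Service_Record_Archive__b> rows = new List<Service_Record_Archive__b>();
            // Copy a slice of archived custom-object records into the big object.
            for (Service_Record__c rec : [SELECT Id, Name, Service_Date__c
                                          FROM Service_Record__c
                                          WHERE Archived__c = true
                                          LIMIT 200]) {
                Service_Record_Archive__b row = new Service_Record_Archive__b();
                row.Source_Record__c = rec.Id;       // lookup back to the source record
                row.Service_Name__c  = rec.Name;
                row.Service_Date__c  = rec.Service_Date__c;
                rows.add(row);
            }
            // Big object writes happen outside the normal transaction.
            Database.insertImmediate(rows);
        }
    }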

Cosmic Software Solutions would like to offer different levels of support to customer accounts based on the size of the company. An app configuration needs to be created for different support tiers and account types, which should then be deployed to multiple production orgs used by the company. Which of the following is a valid reason for using custom metadata types instead of custom settings for this use case? Choose 1 answer. A. Custom metadata types can be used in formula fields and validation rules. B. Custom metadata rows can be created or modified using SOAP API. C. Apex code can be used to create and edit records of custom metadata types. D. The records of custom metadata types can be deployed using packages.

D. The records of custom metadata types can be deployed using packages. The records of custom metadata types can be deployed using managed and unmanaged packages. Only the definitions of custom settings can be included in packages, not data. Custom metadata rows can be created or modified using Metadata API. Both custom metadata types and custom settings can be used in formula fields and validation rules. However, formula fields only work for hierarchy custom settings. Apex code can be used to create and edit records of custom metadata types as well as custom settings' data. https://help.salesforce.com/articleView?id=custommetadatatypes_package_install.htm&type=5 https://help.salesforce.com/articleView?id=cs_define.htm&type=5

An administrator of Cosmic Solutions is trying to extract more than 10 million account records from Salesforce using a third-party tool that supports Bulk API. However, the Bulk API query fails to complete. What should a data architect recommend to ensure that the query does not fail? Choose 1 answer. A. Manually export the records in multiple batches of 250,000 records. B. Use Bulk API in serial mode to extract the records. C. Look for an AppExchange application to extract the records. D. Use PK Chunking to split the query into chunks.

D. Use PK Chunking to split the query into chunks. PK Chunking can be used to ensure that the Bulk API query does not fail due to the large number of records. To enable the feature, one can specify the header 'Sforce-Enable-PKChunking' on the job request for the Bulk API query. It automatically splits the query into separate chunks, executes a query for each chunk, and returns the data. This approach is better than manually exporting the records in batches or looking for an AppExchange application. Using Bulk API in serial mode is not likely to ensure that the query does not fail. Serial mode is typically used to prevent lock contention on records. https://developer.salesforce.com/blogs/engineering/2015/03/use-pk-chunking-extract-large-data-sets-salesforce.html https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/async_api_headers_enable_pk_chunking.htm

Cosmic Luxiam currently uses a legacy CRM system to store information such as accounts, contacts, and opportunities. The company would like to switch to Salesforce and keep all the data in sync between the various systems, including the ERP system and the legacy CRM, which will be kept in place until all the functionality has been deployed. It would also like to consolidate all the important data in a central hub and resolve any duplicates. What should a data architect recommend for this requirement? Choose 1 answer. A. Utilize a third-party data integration solution available in the AppExchange. B. Create a custom solution using Apex that allows automatic data synchronization between the systems. C. Use the Data Loader command-line interface to schedule regular record updates in the systems. D. Use a master data management solution and link it with Salesforce, legacy CRM, and the other systems.

D. Use a master data management solution and link it with Salesforce, legacy CRM, and the other systems. The company should look for a master data management (MDM) solution that allows data synchronization between the systems and consolidates master data from the various source systems in a central hub, where duplicates can be resolved. A third-party data integration tool, a custom Apex solution, or the Data Loader command-line interface could move data between systems, but none of these would, by itself, consolidate master data in a central hub or resolve duplicates.

