An architect is planning to load one million Opportunity records into Salesforce using the Bulk API in parallel mode, split across multiple batches. What should be considered when loading the Opportunity records?
A. Create indexes on Opportunity object text fields.
B. Group batches by the AccountId field.
C. Sort batches by Name field values.
D. Order batches by Auto-number field.
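For context on option B: in parallel mode, two Bulk API batches that insert Opportunities referencing the same parent Account can contend for the same Account lock. A minimal sketch (Python used purely for illustration; record dicts and the `build_batches` helper are hypothetical) of grouping records by `AccountId` so each account's opportunities land in a single batch:

```python
from collections import defaultdict

def build_batches(records, batch_size=10000):
    """Group Opportunity records by AccountId, then pack whole groups
    into batches so no two parallel batches touch the same Account.
    Assumes no single account's group exceeds batch_size."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["AccountId"]].append(rec)

    batches, current = [], []
    for account_id, group in groups.items():
        # Start a new batch if adding this group would overflow the limit.
        if current and len(current) + len(group) > batch_size:
            batches.append(current)
            current = []
        current.extend(group)
    if current:
        batches.append(current)
    return batches
```

The point is that grouping happens before batching, so parent-record lock contention between parallel batches is avoided rather than retried.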
Universal Containers (UC) is implementing Salesforce and needs to migrate data from two legacy systems. UC would like to clean and deduplicate the data before migrating to Salesforce.
Which solution should a data architect recommend for a clean migration?
A. Define external IDs for an object, migrate second database to first database, and load into Salesforce.
B. Define duplicate rules in Salesforce, and load data into Salesforce from both databases.
C. Set up a staging database, define external IDs to merge and clean duplicate data, and load into Salesforce.
D. Define external IDs for an object, insert data from one database, and use upsert for the second database.
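The staging-database approach in option C can be pictured as merging both legacy extracts on a shared external ID before anything touches Salesforce. A minimal sketch (Python for illustration only; the `Legacy_Id__c` field name and record shapes are hypothetical):

```python
def merge_sources(source_a, source_b, ext_id_field="Legacy_Id__c"):
    """Merge two legacy extracts in a staging step, keyed on a shared
    external ID. Duplicates collapse to one record; on conflicting
    fields, the later source wins."""
    staged = {}
    for source in (source_a, source_b):
        for rec in source:
            key = rec[ext_id_field]
            merged = staged.get(key, {})
            merged.update(rec)  # second source overwrites overlapping fields
            staged[key] = merged
    return list(staged.values())
```

Because deduplication happens in staging, the final load into Salesforce is a single clean pass rather than a load followed by in-org merge cleanup.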
Universal Containers wants to automatically archive all inactive Account data that is older than 3 years. The information does not need to remain accessible within the application. Which two methods should be recommended to meet this requirement? Choose 2 answers
A. Use the Force.com Workbench to export the data.
B. Schedule a weekly export file from the Salesforce UI.
C. Schedule jobs to export and delete using an ETL tool.
D. Schedule jobs to export and delete using the Data Loader.
Universal Containers (UC) is concerned about the accuracy of their Customer information in Salesforce. They have recently created an enterprise-wide trusted source MDM for Customer data which they have certified to be accurate. UC has over 20 million unique customer records in the trusted source and Salesforce. What should an Architect recommend to ensure the data in Salesforce is identical to the MDM?
A. Extract the Salesforce data into Excel and manually compare this against the trusted source.
B. Load the Trusted Source data into Salesforce and run an Apex Batch job to find difference.
C. Use an AppExchange package for Data Quality to match Salesforce data against the Trusted source.
D. Leave the data in Salesforce alone and assume that it will auto-correct itself over time.
UC has built a B2C ecommerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC currently uses Postgres as the single source of truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can act as the system of record.
Which three considerations should the data architect weigh before implementing this requirement? Choose 3 answers:
A. Consider whether the data is required for sales reports, dashboards, and KPIs.
B. Determine if the data is a driver of key processes implemented within Salesforce.
C. Ensure there is a tight relationship between order data and an enterprise resource planning (ERP) application.
D. Ensure the data is CRM-centric and able to populate standard or custom objects.
E. Select the tool required to replicate the data.
F. Determine whether Heroku Connect is required to replicate the data.
Universal Containers (UC) is transitioning from Classic to Lightning Experience.
What does UC need to do to ensure users have access to its notes and attachments in Lightning Experience?
A. Add the Notes and Attachments related list to page layouts in Lightning Experience.
B. Manually upload Notes in Lightning Experience.
C. Migrate Notes and Attachments to Enhanced Notes and Files using a migration tool.
D. Manually upload Attachments in Lightning Experience.
UC is planning a massive Salesforce implementation with large volumes of data. As part of the org's implementation, several roles, territories, groups, and sharing rules have been configured. The data architect has been tasked with loading all of the required data, including user data, in a timely manner.
What should a data architect do to minimize data load times due to system calculations?
A. Enable defer sharing calculations, and suspend sharing rule calculations
B. Load the data through data loader, and turn on parallel processing.
C. Leverage the Bulk API and concurrent processing with multiple batches
D. Enable granular locking to avoid "UNABLE_TO_LOCK_ROW" errors.
A large retail company has recently chosen Salesforce as its CRM solution. They have the following record counts:
1. 2,500,000 accounts
2. 25,000,000 contacts
When doing an initial performance test, the data architect noticed an extremely slow response for reports and list views.
What should a data architect do to solve the performance issue?
A. Load only the data that users are permitted to access.
B. Add custom indexes on frequently searched Account and Contact object fields.
C. Limit data loading to the 2,000 most recently created records.
D. Create a skinny table to represent the Account and Contact objects.
NTO processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce. The Sales team at NTO is using Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization. Which three considerations are needed prior to a Salesforce Connect implementation?
Choose 3 answers:
A. Create a 2nd system Admin user for authentication to the external source.
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects
D. Assess whether the external data source is reachable via an OData endpoint.
E. Configure a middleware tool to poll external table data
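On option D: an OData service publishes its schema at the `$metadata` path under the service root, so fetching that document is a quick reachability check before committing to a Salesforce Connect design. A minimal sketch (Python for illustration; the service URL and helper names are hypothetical):

```python
import urllib.request

def odata_metadata_url(service_root):
    """An OData service describes its schema at <service root>/$metadata."""
    return service_root.rstrip("/") + "/$metadata"

def is_reachable(service_root, timeout=5):
    """Return True if the OData $metadata document can be fetched.
    (A real assessment would also validate auth, the OData version,
    and which entity sets map cleanly to external objects.)"""
    try:
        with urllib.request.urlopen(odata_metadata_url(service_root),
                                    timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

If no OData endpoint exists, the assessment shifts to whether one can be exposed (for example via middleware) or whether a custom Apex adapter is needed instead.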
Universal Containers (UC) management has identified a total of ten text fields on the Contact object as important to capture any changes made to these fields, such as who made the change, when they made the change, what is the old value, and what is the new value. UC needs to be able to report on these field data changes within Salesforce for the past 3 months. What are two approaches that will meet this requirement? Choose 2 answers
A. Create a workflow to evaluate the rule when a record is created and use field update actions to store previous values for these ten fields in ten new fields.
B. Write an Apex trigger on the Contact object's after insert and after update events, and store the old values in another custom object.
C. Turn on Contact object field history tracking for these ten fields, then create reports on Contact history.
D. Create a Contact report including these ten fields and Salesforce Id, then schedule the report to run once a day and send email to the admin.
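The trigger logic behind option B amounts to diffing old and new field values and writing one history entry per changed tracked field. A minimal sketch of that logic (Python for illustration; in Salesforce this would live in an Apex trigger comparing `Trigger.old` and `Trigger.new`, and the field names below are stand-ins for the ten tracked fields):

```python
from datetime import datetime, timezone

TRACKED_FIELDS = ["Phone", "Email", "Title"]  # stand-ins for the ten fields

def capture_changes(old_rec, new_rec, changed_by, tracked=TRACKED_FIELDS):
    """Mimic what an after-update trigger would store in a custom
    history object: one entry per tracked field whose value changed,
    recording who changed it, when, and the old and new values."""
    now = datetime.now(timezone.utc).isoformat()
    entries = []
    for field in tracked:
        old_val, new_val = old_rec.get(field), new_rec.get(field)
        if old_val != new_val:
            entries.append({
                "Field": field,
                "OldValue": old_val,
                "NewValue": new_val,
                "ChangedBy": changed_by,
                "ChangedAt": now,
            })
    return entries
```

Option C achieves the same result declaratively, with the caveat that standard field history retention limits should be checked against the three-month reporting requirement.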