Large size of data storage (>50GB)

As far as when large data volumes really start to matter, that can vary considerably. Carl’s response is what he calls the “classic architect answer.”
“It depends,” he says. He explains that you can start to see the negative effects of large data volumes with fewer than 1 million records if you haven’t implemented best practices. The telltale signs are slow or even failing performance: “reports are taking a long time to load, you go to the All list view and guess what, nothing comes back, so you just wait and wait and wait.”
So while 10 million is considered a large data volume, the negative effects can start impacting you with far fewer records if you don’t plan for it.
What are the risks associated with large data volumes in Salesforce?

Anil from Schneider Electric explains that large data volumes can affect system performance in ways that range from very slow user experiences to directly impacting customers, and can even bring down the Salesforce org.
“When running reporting on huge data volumes, you are going to be consuming so many resources on the platform, which is going to impact your users who are trying to perform transactions on the platform,” says Anil. “Your users are starting to see the page loading taking much longer than it used to, sometimes minutes and minutes of time.”
“The issues look simple but are actually impacting your platform on a huge scale,” he says. That’s because when your employees are impacted by slowdowns, it ultimately affects your customers. “You’re directly translating that to your customer experience. Let’s say your customer care agent is on a call with a customer, and if you’re taking such a long time to even open up a case, you’re directly impacting your customer, who is probably not going to be willing to wait that long.”
So how do we start to solve some of these issues? What are some best practices for mitigating these risks?

Carl says that “even if you’ve done the right things at a code level, you can run into problems when you start moving data in and out of the system. So the question becomes: what can you do about it?”
Carl cites bypass strategies as a way to skip automations if the data coming into the system “already has the other things that the automations may be doing.”
“For example… for one customer, updating 100,000 cases with batch Apex took 6-7 hours of run time, and that’s because of triggers, lots of logic blocks, process builders, workflows, escalation. Once a bypass strategy was implemented, it took 10 minutes.”
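What a bypass looks like in practice varies, but a common pattern is a flag (for example a hierarchy custom setting or a custom permission) that every trigger, flow, and workflow checks before running its logic, and that an integration toggles around a bulk load. Below is a minimal sketch of the toggling side in Python with simple-salesforce; the Automation_Bypass__c setting, its Bypass_Case_Automation__c field, and the trigger logic that honors them are assumptions for illustration, not details from the interview.

```python
# Minimal sketch, assuming a custom setting Automation_Bypass__c exists and that
# the org's Case triggers/flows check its Bypass_Case_Automation__c flag before running.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

def set_bypass(enabled: bool) -> None:
    """Toggle the assumed org-wide bypass record before/after a bulk load."""
    result = sf.query("SELECT Id FROM Automation_Bypass__c LIMIT 1")
    record_id = result["records"][0]["Id"]  # assumes one org-default record exists
    sf.Automation_Bypass__c.update(record_id, {"Bypass_Case_Automation__c": enabled})

def bulk_update_cases(updates: list[dict]) -> None:
    """Update cases via the Bulk API with automations bypassed, then restore them."""
    set_bypass(True)
    try:
        # Each dict needs an Id plus the fields being changed,
        # e.g. {"Id": "500XXXXXXXXXXXX", "Status": "Closed"}.
        sf.bulk.Case.update(updates)
    finally:
        set_bypass(False)  # always re-enable automations, even if the load fails
```

The same idea works whether the flag lives in a custom setting, custom metadata, or a custom permission assigned to the integration user; what matters is that the automations can check it cheaply and that it is always switched back on.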
So while we talk about large data volumes and working within the constraints of Salesforce, there’s a lot we can do as users to make sure the things we’re adding on top don’t degrade our large data volume experience even further.
What is the best way to avoid hitting Salesforce governor limits when dealing with large data volumes?

There are “in-your-face” limits… and then there are “sneaky” limits.
Susannah explains that there are API limits in Salesforce that can come as a surprise, because while some Salesforce limits are soft, where you can ask for an increase, “a lot of these API limits are strictly enforced for good reason.
“A quick example of how that might play out: if you’re not aware of the different limits that govern each API, and you pick not-the-best API for the job, you can start to run into these 24-hour rolling blocks. If you exceed the number of API calls for a 24-hour rolling period, you have to wait until that next period opens up, and that can mean a pause to your business and ultimately a pause for your customer. So that’s extremely important to be aware of.”
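One practical way to stay ahead of that rolling 24-hour cap is to check the REST API’s limits resource before kicking off a heavy integration job and defer the work when headroom is low. A rough sketch, assuming simple-salesforce and a 20% threshold chosen purely for illustration:

```python
# Rough sketch: check remaining daily API requests before starting a heavy job.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

def daily_api_headroom() -> float:
    """Return the fraction of the rolling 24-hour API request allowance still available."""
    limits = sf.restful("limits/")        # GET /services/data/vXX.X/limits/
    daily = limits["DailyApiRequests"]    # {"Max": ..., "Remaining": ...}
    return daily["Remaining"] / daily["Max"]

if daily_api_headroom() < 0.2:
    # Assumed policy: hold off on non-urgent bulk work below 20% of the allowance.
    raise SystemExit("API headroom too low - defer the bulk job")
```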
But it’s not just the obvious limits that you need to be aware of, says Carl. Sneaky limits can catch you off guard, such as the fact that you can only put 30 million files into Salesforce, or partner limits. “Looking for sneaky limits at the start will save you a lot of headaches down the road.”
Storage space in Salesforce can be pricey. What should be done to keep data volumes low? Do we move data out of Salesforce?

This is another question where the answer is “it depends.” There are many different options depending on the business requirements.
Anil explains that “one aspect is storage space; the other aspect is ensuring that your users are getting the optimal performance from the Salesforce platform they’re using.” Schneider Electric deals with very, very large data volumes in the tens of millions, and Anil says that “it’s a very methodical process you have to put in place; it’s not just purely from a storage standpoint but from an end-to-end journey that you’re trying to address for your agents or your customer.”
“There is some data that we can take off the platform, but at the same time we don’t want to lose access to that data, because it’s a gold mine in today’s world and it’s always important for your agents to refer back to something they have addressed in the past.”
So Schneider Electric’s archiving goals weren’t just about taking data off the platform, but also about asking “what are we going to gain in terms of improving the agent experience, and how do we still supplement the agent experience so they never feel that they lost the data and they still have access to it? The first and foremost thing that is very important is to talk to your business users, understand what it is that they need and what they’ll be using often,” and then remove what can be removed based on that.
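Mechanically, the archive loop that falls out of those conversations usually looks the same: extract the records agents no longer need day to day, store them somewhere they can still be retrieved, and only then delete them from Salesforce. A simplified sketch, assuming a simple-salesforce connection and a local JSON file standing in for whatever external archive store is actually used:

```python
# Simplified archive sketch: copy old closed cases out, verify the copy, then delete.
import json
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com", password="...", security_token="...")

def archive_old_cases(cutoff_date: str, archive_path: str) -> int:
    """Archive cases closed before cutoff_date (YYYY-MM-DD); returns the number archived."""
    soql = (
        "SELECT Id, CaseNumber, Subject, Status, ClosedDate "
        f"FROM Case WHERE IsClosed = TRUE AND ClosedDate < {cutoff_date}T00:00:00Z"
    )
    records = sf.query_all(soql)["records"]
    if not records:
        return 0

    # Write the external copy first; only delete from Salesforce once it is stored.
    with open(archive_path, "w") as f:
        json.dump(records, f)
    with open(archive_path) as f:
        assert len(json.load(f)) == len(records)  # crude verification of the copy

    sf.bulk.Case.delete([{"Id": r["Id"]} for r in records])
    return len(records)

# Example: archive everything closed before 2020 while keeping it retrievable for agents.
# archive_old_cases("2020-01-01", "cases_archive_pre_2020.json")
```

In a real implementation the JSON file would be replaced by durable, searchable storage and the delete would run only after the copy is verified against the source, which is exactly the “never feel that they lost the data” requirement Anil describes.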
When should you archive Salesforce data?

Anil explains that once you start getting into very large data volumes, that’s a trigger point to start thinking about your archiving strategy. While Schneider Electric archives every day, Anil says that there’s no hard-and-fast rule. “It’s going back to… what is the business expectation? What’s the volume of the data growth that you have? That is what is really driving the frequency at which you need to archive the data.”
I like to sum this up as: “Data growth + business requirements = frequency.”
—
If you’re managing Large Data Volumes in Salesforce, we can help. Architects and platform owners at Fortune 500 companies like Schneider Electric, Robert Half, and Heineken rely on Odaseva to back up, archive, and distribute huge volumes of data. If you want to level up your data management, get in touch for a personalized demo.