I found this sorry story about an Excel operational nightmare on the /r/Excel subreddit yesterday:

"A few days back I started working at a new company's finance department. I was shocked to see that all financial reports are built with Excel references to other Excel references to hundreds of manually filled Excel documents. The reference tree goes quite deep and is rather large: the average Excel sheet feeds from about 20 external Excels, and each of those in turn also feeds from 20 external Excels. And so it goes, about 16 layers deep. The issue is that each month we need to update all reports, which means we need to open all documents and click "refresh" or update connections. However, connections don't update unless all linked Excel documents are also opened and updated, and so it goes all the way down to the manual-input Excels. On top of this, my computer crashes if I open any more than 4 Excel sheets at the same time, because the sheets contain dozens of tabs and each tab is a huge table. I am looking for an idea to improve this mess in order to (1) reduce the risk of human error (if I forget to update some of the 600+ links, wrong numbers will be presented to the CEO), (2) reduce the time needed to run this (it now takes almost a week to update everything), and (3) make this change fast, within 1 month."
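The core difficulty the poster describes is an ordering problem: a workbook's links only refresh correctly once every workbook it pulls from has itself been refreshed, all the way down to the manual-input sheets. That is a topological sort over the workbook link graph. As a minimal sketch (the workbook names and the `links` map are hypothetical, and this only computes the order, it does not drive Excel), the refresh sequence could be derived like this:

```python
from collections import defaultdict, deque

def refresh_order(links):
    """Return an order in which to refresh workbooks so that each
    workbook is refreshed only after every workbook it links to.

    `links` maps a workbook name to the list of workbooks it pulls
    data from (its external references).
    """
    # For each workbook, count how many of its sources are not yet refreshed.
    pending = {wb: len(srcs) for wb, srcs in links.items()}
    dependents = defaultdict(list)  # source -> workbooks that link to it
    for wb, srcs in links.items():
        for src in srcs:
            pending.setdefault(src, 0)  # manual-input sheets have no sources
            dependents[src].append(wb)

    # Kahn's algorithm: start from the manual-input workbooks (no sources).
    ready = deque(wb for wb, n in pending.items() if n == 0)
    order = []
    while ready:
        wb = ready.popleft()
        order.append(wb)
        for dep in dependents[wb]:
            pending[dep] -= 1
            if pending[dep] == 0:
                ready.append(dep)

    if len(order) != len(pending):
        raise ValueError("circular workbook references detected")
    return order

# Hypothetical three-layer tree: report.xlsx <- a.xlsx, b.xlsx <- raw.xlsx
links = {
    "report.xlsx": ["a.xlsx", "b.xlsx"],
    "a.xlsx": ["raw.xlsx"],
    "b.xlsx": ["raw.xlsx"],
}
print(refresh_order(links))  # raw.xlsx first, report.xlsx last
```

Even without automating Excel itself, a script like this over the real 600+ link graph would at least turn "did I open everything in the right order?" into a mechanical checklist, which addresses point (1) directly.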
One SpreadServe host deployment and a few days' consulting would fix this problem. The person-week currently spent producing the report each month would be eliminated, as would the errors that come with manual operation.