Thanks to Chad Rothschiller for helping me put this post together.
We’ve spent the last few weeks talking about improvements to the various visualization features in Excel. For the next few posts, I am going to cover some “under the hood” improvements: performance, internal limits, and the like. For this post, I will focus on the work we did to make a 64-bit version of Excel.
64-bit Excel, simply put, is for building really really (really) big workbooks. 64-bit Excel is for those who want to work with large data sets and get past the 32-bit Windows address space constraint.
In short: enormous capacity; humongous workbooks. Now that 64-bit machines and operating systems are common, we wanted to provide a version of Excel that runs natively in that environment and takes advantage of 64-bit strengths. One of the more obvious benefits is the lifting of the addressable-memory constraint we were dealing with in 32-bit Windows. 64-bit applications can simply use more physical memory than ever before.
This is good news for those of you who need to work with really large data sets, for example.
However, there was a lot of work for us to do to ensure that Excel operations scale efficiently with the size of the data, use large amounts of memory efficiently, and avoid running into internal limits. We’ve made specific investments in our 64-bit architecture to optimize our memory consumption while keeping the cell table (and related operations) as fast as possible.
We also spent significant time investigating and optimizing our algorithms for common “Large Data” scenarios like copy/paste and sorting & filtering (more on that in an upcoming post about Excel 2010 performance).
Generally, you can expect to create bigger workbooks, whether that’s due to tons of data in the grid, tons of charts in your workbook, several really huge PivotCaches supporting your PivotTables, and so on. Most features take some amount of memory to use them, so you can essentially have more features and create bigger workbooks.
This may seem obvious, but it is worth stating: because we’re enabling Excel to scale beyond 32-bit Windows memory limits, if you have a really beefy machine, you should be able to do more without bringing your computer to a crawl. Your code and solutions will also run out of memory less often, and will be able to go further in processing your data.
In many of our test cases the 64-bit Excel working set is very similar to the 32-bit Excel working set (64-bit was under 1.2 times 32-bit). Intensive use of the cell table shows around a 1.5-times increase in 64-bit. If you have a case where the 64-bit Excel working set exceeds 2 times that of 32-bit Excel, let us know so we can look into it.
I’ve heard some people comment that 64-bit Excel means that it’ll just run faster than 32-bit Excel. Well, that’s not true as a general rule. We have some tests that run a bit faster on 64-bit Excel, and other tests that run a bit slower on 64-bit Excel. Again, if you find an operation that you think should be faster in 64-bit Excel, let us know and we can look into it.
One test I decided to look into: how much RAM does it take to fill up my columns, A-Z, with data, e.g. using =RAND()? I found that it takes a little more than 2GB, and that scales pretty well as I add additional blocks of 26 columns (e.g. filling A-AZ takes about 4GB, A-BZ takes about 6GB), though there is some overhead of just running Excel that also needs to be added on top of that.
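As a rough sketch of that test, a macro like the following (hypothetical, written here only for illustration — the name and exact approach are my own, not the author's) fills columns A through Z with volatile random numbers; watching Excel's working set in Task Manager before and after running it shows the memory cost of each 26-column block:

```vba
Sub FillColumnsWithRand()
    ' Illustrative sketch: fill columns A:Z of the active sheet with =RAND()
    ' to observe memory growth per 26-column block. In Excel 2010 the grid
    ' is 1,048,576 rows deep, so this creates roughly 27 million formulas
    ' and is deliberately slow and memory-hungry.
    Application.ScreenUpdating = False
    ActiveSheet.Range("A1:Z1048576").Formula = "=RAND()"
    Application.Calculate
    Application.ScreenUpdating = True
End Sub
```

Extending the range to A1:AZ1048576 or A1:BZ1048576 reproduces the scaling described above.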
First of all, workbooks are generally interchangeable between 32-bit and 64-bit. There’s no special flag in the file marking it as a 64-bit workbook. Your average, everyday, smaller-sized workbooks will be fine in both environments. However, with 64-bit Excel you will be able to create workbooks that are too big for 32-bit Excel to open (you’ll hit “out of memory” alerts before the file finishes opening). Note that even before we had a 64-bit version of Excel, I could create a workbook using 32-bit Excel on a beefy machine (say 4GB of RAM) that someone else might have serious performance issues opening with the same 32-bit Excel on a less-beefy machine (say 1GB of RAM).
Running Solutions on 64-Bit Excel
You might need to review your VBA code and make updates in order to get it working with 64-bit Excel. You may also need to have 64-bit versions of any ActiveX controls, COM Add-ins, or XLLs if these are used in your solutions. I’ll have more details in a future post about programmability.
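As a sketch of one common change (not the full guidance — see the future programmability post): in 64-bit VBA, Declare statements for external DLL calls must be marked PtrSafe, and pointer- or handle-sized values should use the new LongPtr type, which is 32 bits in 32-bit Office and 64 bits in 64-bit Office. FindWindow here is just a familiar Windows API call used as an example:

```vba
' 64-bit VBA (VBA7) requires PtrSafe on Declare statements, and pointer- or
' handle-sized return values (like a window handle) should use LongPtr.
#If VBA7 Then
    Declare PtrSafe Function FindWindow Lib "user32" Alias "FindWindowA" _
        (ByVal lpClassName As String, ByVal lpWindowName As String) As LongPtr
#Else
    Declare Function FindWindow Lib "user32" Alias "FindWindowA" _
        (ByVal lpClassName As String, ByVal lpWindowName As String) As Long
#End If
```

The #If VBA7 conditional keeps the same module working in both older and newer versions of Office.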
I appreciate humungous workbooks as a programming challenge for the Excel developers, but in the wild very large workbooks are always problematic and more often than not nightmares.
In no small way Microsoft is to blame for this by having Access in some versions of Office but not others. Databases are the appropriate places for really large datasets, even if most of the values are calculated. However, people are cheap, so many (most?) are unwilling to pay for Access, so they MISUSE Excel as a database and Excel workbooks as data stores.
On balance, if the main raison d'etre for 64-bit Excel is larger capacity, you're not doing sensible spreadsheet users any favors.
Best use that I can think of for 64-bit Excel is creating large Gemini models. This alone is probably worth the price of admission. Having to update macros and add-ins will be a pain for many.
Harlan is EXACTLY right. As a computer science guy I'm jazzed about the work to make Excel hum in 64 bit mode -- as someone who works with Fortune 50 companies that ALREADY abuse Excel this is HORRIBLE news.
Excel 2003 was more than adequate in terms of worksheet space. If you need more than 65K rows you're doing it wrong! Use a database!
As someone who has been tracking Excel Memory Limits since Excel 5 (16 Megabytes!) I welcome this change.
Sure, there will be misguided people using Excel as a database, just as there have always been in the past.
And often a great solution is an Excel front-end to a Jet/Sql DB (don't need Access for that).
But that's no reason to ignore genuine customer needs for huge workbooks full of calculations that are difficult/slow to do in a DB, or for doing pivots on large DB recordsets (it's not difficult to hit the current limits with a pivot), or getting rid of those pesky links to external workbooks, or ...
I look forward to trying out 64-bit Excel and 64-bit VBA (pity about 32-bit VB6 DLLs)
Great change! As machines gain more memory, many users will appreciate the innovation.
As someone who has had to analyze large amounts of data, I welcome this change. I understand the concern people have with huge spreadmarts, but the amount of data I'm being asked to do some "quick" analysis on has only increased over the years. Now days it's common to exceed the old 64K limit. I think that's a good thing since I can focus more on my work and less time on moving my data to Access or summarizing over the details just to get it to fit in the old 64K row limit in Excel. With that being said, I love my databases too, and realize that any production data that will be used on-going by multiple users should not be sitting in a spreadsheet.
I use very large Excel spreadsheets (upwards of 250 MB). Such large spreadsheets are stable in Excel 2003, and very slow but still stable in Excel 2007. Excel 2010 gives me great speed, but a simple copy-and-paste operation invariably causes the spreadsheet to crash (however, copy-and-paste macros seem to work). Also, long macros tend to stop by themselves in the middle. The only way I found of remedying these problems is by significantly reducing the number of columns and breaking the long macros down into shorter ones. If you have any insight into whether these problems can be fixed for the official release, please let me know! My e-mail is email@example.com.
I really wonder where MS is going with Excel. Huge spreadsheets... users don't know what they are doing with spreadsheets now! "a great solution is an Excel front-end to a Jet/Sql DB"... Since when is Excel a great front-end tool? Maybe with a pile of customization to make it bullet proof from prying users.
"250 MB" spreadsheets!! Are people crazy? I can just imagine the fun when someone decides to email that one. Just another example of using the wrong tool for the job. Who needs 250 MB of 'in-memory' data? Use the right tool. Do these people use an open-end wrench set to build a car when a socket set is available?
Great. 100 million rows in Excel. It's just amazing. With this, people's analytical capabilities will increase drastically. And it is for these analytical masteries and mysteries that all reporting tools count compatibility with Excel as a selling point.
If Excel 2010 has any good layout capability to render data from any data source (any database, CSV files, XML files) in a read-only manner and print it on the fly (with adequate runtime events for detail and summary lines where some .NET code can be written), it will do magic for all.
Having been the universal load engine in BI cycles since Excel 2000, it is already the 'universal BI last mile'. If there are good layout capabilities, it will enhance its status from 'universal BI last mile (universal load engine)' to 'universal BI ETL engine'.
But that is a mystery which only God and the MS developers know. And we too will know it in future.
It is also a fact that the lack of layout capabilities in Excel is the reason why reporting tools sell like hot cakes.
I thought that one of the reasons for high-performance clusters and the lack of data mining *in Excel* was that Excel was 32-bit and thus could not do heavy computing, etc. I was also under the impression that with 64-bit Excel (and a 64-bit OS), data mining and large databases would be possible to run in Excel without the need to hook up with other Microsoft solutions, e.g. SQL Server. But all I am seeing now is that you are trying to sell us a stack (the analysis tools and now Gemini all need SQL Server to run) instead of making Excel more useful. While I understand and appreciate the motives, I am disappointed with this.
There was nothing compelling in Excel 2007 and there is nothing compelling in Excel 2010. And what about all the work with compute shaders and other parallel work Microsoft is doing with Windows 7? Shouldn’t this make Excel more powerful, thus reducing the need to link up with SQL Server, etc.?
I hope Apple puts more work in iWork (mainly Numbers) by copying all the best that is left in Excel in their next release as there is very little left that is holding me from jumping ship.
Great post, looking forward to hearing more about Excel 2010 and 64-bit in particular. Personally I welcome any expansion in the amount of data that Excel can use. I accept the argument above about databases, but let's be honest: databases are for storing data. Very few, if any, have the same ability as Excel when it comes to processing data and changing it into something that directors and managers can understand easily and put to use; Excel can, and it's easy. I'm not saying you should use Excel instead of a database, but if you are trying to extract and use just the relevant data in a database to do some analysis, this can easily exceed 65K lines.
Anyway really looking forward to hearing more "under the hood" improvements as to me that's as interesting as the surface improvements.
Something directors and managers can understand involves no more than 1,000 data values possibly arising from counts, sums, averages, minimums and maximums of large amounts of underlying data. Databases most definitely can & do calculate simple summary statistics like these just as quickly and maybe more reliably than Excel (more reliably because database fields, unlike Excel ranges, are strongly typed, so text that merely appears numeric can't be stored in numeric fields in databases).
Anything nontrivial, such as ARIMA time series analysis or econometric forecasting, is more reliably and more quickly done using dedicated statistical analysis software.
Excel is strongest at high-level summarization and decision support. That type of model may require many worksheets performing different analyses with source data and other intermediate calculations, but millions of rows and thousands of columns are NEVER necessary.
It's only people who misuse spreadsheets as other types of software (most often as database or stats analysis) who may believe they need worksheet grids with billions of cells.
Too often Excel is the only tool remotely capable of the tasks business users need to perform, but that's not a good reason to make it an ever larger hammer.
What is the use of a big-ass grid (billions of cells) if Excel is slow at calculating and/or sorting the data? And two, what is the use if you cannot do data mining in Excel without needing SQL Server? For everyday reporting and analysis the current grid is already overkill. What we lack are tools to use these billions of cells to extract useful information from data. The tools are there, but they need other Microsoft systems to work.
The need of managers for large row counts
1. I agree that managers and directors really need / understand hardly 1,000 rows. In that sense Excel 2003's 65K rows were also sufficient, because it too allowed analysis of 2 million rows, which reduced to less than 8,000 rows in a pivot.
2. But if you see the bigger picture, there is an allegation against Excel that it's not scalable (made by competitors, of course, to sell their own tools). Based on that, many companies provide no integration, or inferior integrations, between Excel and their current solutions. Eventually Excel turns out to be just the 'last mile', with the last (and least) value for itself in the overall BI analysis schema. If raising these limits wards off this allegation of 'no / little scalability' and hordes of companies provide state-of-the-art integrations with their solutions, it's good for users and good for Microsoft too. It also gives MS a powerful option in pricing MS Office.
3. It is also a fact that users of competing scalable solutions hardly use their scalable functionality, because 95% of the time operational users need only the latest / relevant / exceptional data, which is hardly a few thousand rows.
4. So I firmly agree that 100 million rows must in no way degrade performance & functionality. Also, 100 million rows may command different technical and commercial configurations.
Bundling various BI components in different solutions
Further, with respect to bundling various BI components in different solutions (like Analysis Services and Reporting Services in SQL Server, Excel Services in SharePoint, development on the .NET platform, etc.), I feel this goes against Microsoft's own interests, because it was WYSIWYG, easy-to-use software (like Windows and MS Office) for SME layman users (who are very large in number) that brought MS to this leadership status. If they expect people shopping for reporting software to buy all of these products, it's just not possible even for enterprise users (not to speak of SME layman users). There is no harm in keeping such components in all those places. But there is a dire need for an interoperable reporting / layout package in MS Office (which differentiates itself from other freeware / commercial office suites / reporting packages). One may term / brand this as a Designer / Layout Designer / Report Designer. Such a package must come with lots of examples that are easy to understand, use, and deploy (real-life examples of, say, invoices, packing slips, delivery challans, etc., with VBA-like code at certain events). This is a dire necessity for MS if it is to sustain the demand pull for its products against other OSes / office suites / databases / reporting tools / ERPs. In that larger perspective this is a kind of cherish-or-perish decision which MS must make / should have made. Even the aging / dying VFP of MS has a good report designer comparable to the stalwarts. Superior integration with MS Excel / Outlook / SharePoint / SQL Server / AD, compatibility with any browsers / databases / file formats, and automation / schedulability must be its key ingredients. It may be a special option in MS Office pricing. Its server deployment may either be stopped or be licensed appropriately, just like Office.
Positioning Excel as Universal ETL engine
If Excel 2010 is portrayed and visualized as the universal BI last mile (I have gathered such impressions), this is nothing new. It has already been the BI last mile since Excel 2000, considering that many reporting tools' outputs ended in an export to Excel. And this is not a major consideration for people planning to migrate to another office suite, or for people upgrading from a lower Excel version, or for people migrating from another office suite / reporting tool to Excel 2010.
One must remember that in long distance travel last mile is least important and is of least value.
Though limited ETL capabilities (Extract as data import / file import / web query, Transform in pivot summaries, and Load in sorting, formatting, charting, grouping, etc.) already existed in raw form in earlier Excel versions, they have never been visualized and used that way. So visualizing Excel 2010 with full-blown ETL capability (with a report designer) along with this 100-million-row capacity may be a strategic opportunity for MS to be aggressive (trying to enhance / upgrade its status) rather than defensive (trying to retain its last-mile king position).
The report designer is 'an extra mile' which, if MS walks it (and all leaders have to walk such an extra mile), has paradigm-changing implications. A report designer of basic capability doing mundane reporting will also do, because few customers of competing tools use anything more than that (just as most people never use Office to its full potential).
Web or not web
True, all applications are going web, and there is nothing wrong with incorporating web capabilities in desktop applications. However, that must not be done at the cost of security, ease of use, flexibility, and speed, which are the forte of desktop applications.
People are driven by passions. All such developments (in the direction of Office/Excel enhancements) can and will fire up Excel users' (even non-Excel users') euphoria, which in the overall analysis will only be beneficial to MS. And it is only such fired-up layman users who will bring MS large sales in large numbers (as happened earlier with Windows 95 / Office 97, the stimulus impact of which is still there in MS's goodwill).
Post sale services like training /implementation / support
Further, I have seen MS partners interested only in selling and not in post-sale hand-holding and optimization. This is serious. Even today most people use hardly 30-40% of Excel. Not many of the partners have invested in training and optimization of MS products. This must be taken care of.
Data mining without need for SQL
In Excel you can mine data even without SQL Server. I don't understand what T. Kwetane means.
I'm going to need to update my Excel add-in products to work with 64-bit Excel (several calls to standard DLLs). When will I be able to get a beta to work with? I signed up for the 2010 beta but haven't heard anything.