Let me start with my conclusion first, and then show you how I arrived there. The figure to the right shows the minimum analog-to-digital conversion (ADC) bit density needed for a given range of log scale. As you can see, if we want to display our data on a 5-log scale, we should have at least a 20-bit ADC. Side note - Bit(eff) means effective bit density, which accounts for the fact that if you put a 20-bit ADC on your instrument, it probably doesn't actually perform at a full 20 bits; there is some noise associated with the ADC that limits its real-world performance. /Side note.
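To make the nominal-versus-effective distinction concrete, here is a minimal sketch of the arithmetic. The effective-bit value below is a placeholder I made up for illustration, not a measurement from any particular instrument; real converters quote this sort of noise-limited figure as "effective number of bits" in their datasheets.

```python
# Rough illustration of nominal vs. effective bit density.
# The effective figure here is hypothetical, chosen only to show how
# ADC noise shrinks the number of usable bins.
nominal_bits = 20
effective_bits = 16.5                  # hypothetical noise-limited value

print(2 ** nominal_bits)               # 1,048,576 nominal bins
print(round(2 ** effective_bits))      # ~92,682 usable bins
```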
So, how did I arrive at this conclusion? First, let me demonstrate that bit density matters with an example. I created a mock data set of 3 Gaussian distributions (n = 1000 data points each), with the means and SDs chosen so that the populations overlap significantly. I then plotted these distributions on 4 histograms with different bin resolutions, ranging from 3-bit to 8-bit. It's important to remember that this is the exact same data set, merely binned differently according to the available resolution. As you can see, the 3 populations are not at all discernible at 3-bit resolution, and it isn't until the 6-bit histogram that you can begin to see the 3 different populations. From this we can appreciate the importance of having sufficient bin density to resolve distributions from one another.
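If you want to try this yourself, here is a minimal sketch of that experiment. The means and SDs below are my own illustrative choices (the original figure's exact values aren't reproduced here); the point is simply that the same 3000 events look very different when re-binned at 3-, 4-, 6-, and 8-bit resolution.

```python
# Sketch of the mock-data experiment: three overlapping Gaussian
# populations, re-binned at different bit depths.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)

# Three overlapping populations, n = 1000 events each (illustrative values).
data = np.concatenate([
    rng.normal(loc=40, scale=12, size=1000),
    rng.normal(loc=55, scale=12, size=1000),
    rng.normal(loc=70, scale=12, size=1000),
])

# Same data set, binned at 3-, 4-, 6-, and 8-bit resolution.
fig, axes = plt.subplots(1, 4, figsize=(16, 3))
for ax, bits in zip(axes, [3, 4, 6, 8]):
    ax.hist(data, bins=2 ** bits)               # 8, 16, 64, 256 bins
    ax.set_title(f"{bits}-bit ({2 ** bits} bins)")
plt.tight_layout()
plt.show()
```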
As an example of a system that might not have enough bin density, consider the following. Here we have a 20-bit ADC yielding over 1 million bins of resolution to spread across a 6-log scale. That may sound sufficient, but when we break it down per decade, we see that in the first decade, where the scale values run from 1 to 10, we would have only about 11 bins of resolution, which would certainly lead to picket fencing and poor resolution of populations in that range. The Effective bins column shows an example where the ADC noise is such that our true bin resolution is much less than the theoretical 20 bits.
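You can check the per-decade breakdown yourself with a few lines of arithmetic. This sketch assumes a purely linear ADC whose full scale is mapped onto the 6-log display (1 to 10^6); depending on whether you count the first decade from 0 or from 1 and how you round, the first decade comes out at roughly 10-11 bins, in line with the figure above. The 16-bit "effective" case is a hypothetical noise-limited example, not a spec from any real instrument.

```python
# Bins per decade for a linear ADC spread over a 6-log (1 to 10^6) display.
edges = [0, 10, 100, 1_000, 10_000, 100_000, 1_000_000]

for bits in (20, 16):            # nominal 20-bit vs. a hypothetical effective 16-bit
    codes = 2 ** bits
    print(f"{bits}-bit ADC ({codes:,} codes over a 6-log scale):")
    for lo, hi in zip(edges, edges[1:]):
        bins = codes * (hi - lo) / edges[-1]   # share of codes landing in this decade
        print(f"  {lo:>9,} - {hi:>9,}: ~{bins:,.0f} bins")
```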
Going through this process and crunching the numbers for different scenarios, I conclude that ideally we would like to have on the order of hundreds of bins of resolution in the first decade. To achieve that on a 6-log scale, we'd actually need a 24-bit ADC, and the per-decade breakdown would then look like what's shown below.
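Running the same back-of-the-envelope arithmetic for a 24-bit converter (again assuming a linear ADC mapped onto a 1 to 10^6 display) shows why that bit depth gets you into the hundreds-of-bins range for the first decade.

```python
# Same arithmetic as above, for the suggested 24-bit converter.
codes = 2 ** 24                         # 16,777,216 codes
first_decade = codes * 10 / 10 ** 6     # codes landing between 0 and 10
print(round(first_decade))              # ~168 bins -> on the order of hundreds
```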
Take-home message: First of all, is a 6-log scale really necessary? For you the answer may be yes, but for most, probably not. The second question to ask your friendly sales representative is what sort of analog-to-digital conversion is done and what the bit resolution of the converter is. It means nothing to have a 7-log scale displaying data from a 10-bit ADC; no matter how good the optics are, you'll never be able to resolve dim populations from unstained cells. What really matters is pairing a really good optical system with high-speed, high-bit-density electronics that can display all the fine detail of your distributions. Find an instrument like that, and you have a winner.