Monday, December 19, 2011

Is Compensation really necessary?

For some reason, it seems like the idea of compensation gets so much 'publicity'.  Everyone is always talking about compensation and how difficult it is.  New users of flow cytometry tend to think of this idea as something so complex that they end up stumbling over it before they even get started.  So, let's get one thing straight right off the bat: compensation is easy.  In fact, I'd say compensation is ridiculously easy today, now that you really don't have to do anything.  You just identify your single-stained controls, and your software package uses that information to compensate your samples for you.  The real difficulty in performing flow cytometry assays is panel design - determining which colors to use and coming up with a panel where you have the optimal fluorochrome coupled to each antibody to give you the best resolution of your populations.  In fact, I'd go so far as to say that in some cases, compensation isn't even necessary.

Wha, Wha, Wha, What???  That's right ladies and gents - compensation isn't even necessary (in some cases).  And, I'm not just referring to the instances where you're using two colors that don't even overlap, I'm talking about straight-up FITC and PE off a 488nm laser.  Now, before you stop reading and jump over to your Facebook feed, let me just assure you that you first learned of the superfluous nature of compensation when you were about 5 years old.  You see, analyzing flow cytometry data with or without compensation is nothing more than a simple "spot the difference" game like the ones you used to find in the back of Highlights magazine while waiting to get your annual immunizations from the pediatrician.  If you take a look at the figure below, you may be able to recognize the left panel as the FMO (Fluorescence Minus One) control and the right panel as the sample.  Spot the difference?  Instead of seeing the sun missing on the left and then appearing on the right, let's just substitute a CD8-PE positive population for the sun.  It doesn't really matter if the image is compensated; you're just comparing the differences between the two.


Let's make the comparison a bit more direct.  Here we have some flow cytometry data showing CD3 FITC and CD8 PE.  Our goal is to determine what percentage of the cells are CD3+CD8+.  Obviously, there's some overlap of the FITC emission into the PE channel when run on a standard 488nm laser system with typical filters.  If I were to hand you this data set and ask, "What's the % double positive?" you could employ the same strategy used above in the spot-the-difference cartoon without knowing a thing about compensation.  The top two plots below are the FMO controls (in this case, stained with CD3 FITC, but not stained with anything in the PE channel), and the bottom plots are the fully stained sample.  In addition, the left column of plots was compensated using the FlowJo Compensation Wizard, and the right column of plots is uncompensated.  Were you able to "spot the difference"?  If you take a look at the results, you'll see that either way we come up with the same answer.  So what's the point of compensating?

As you can imagine, this is greatly simplifying the situation, and when you start adding more and more colors, you simply cannot create an n-dimensional plot that can easily be displayed on a two-dimensional screen.  This could easily work for 2-color experiments - it could even work for 3-color experiments (maybe using a 3-D plot) - but beyond that, you're going to have to do one of two things: 1. bite the bullet and get on the compensation train, or 2. abandon visual, subjective data display altogether and move to completely objective, machine-driven data analysis.  Compensation, much like display transformation, is a visual aid used to help us make sense of our data, two parameters at a time.  In our example above, we don't magically create more separation between the CD3+ CD8- and CD3+ CD8+ populations.  The separation between them is the same; we're just visualizing that separation on the higher end of the log scale (when uncompensated), where things are compressed, or on the lower end of the log scale (when compensated), where things spread out.  You didn't gain a thing.
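If you want to see what the software is actually doing (and why it can't buy you any extra separation), here's a minimal numpy sketch of compensation as a linear un-mixing step. The 2x2 spillover matrix and the two example events below are made-up numbers, not measured values.

import numpy as np

# Hypothetical spillover matrix: row i gives the fraction of fluorochrome i's
# signal that lands in each detector (FITC detector, PE detector).
spillover = np.array([
    [1.00, 0.20],   # FITC: 20% of its signal spills into the PE detector
    [0.02, 1.00],   # PE: 2% of its signal spills into the FITC detector
])

# Two made-up events: a CD3+CD8- cell (FITC only) and a CD3+CD8+ cell (FITC + PE).
true_signal = np.array([
    [5000.0,    0.0],
    [5000.0, 3000.0],
])

observed = true_signal @ spillover                   # what the detectors report
compensated = observed @ np.linalg.inv(spillover)    # the software "un-mixes" it

# The PE-channel separation between the two events is 3000 either way;
# compensation just moves the populations, it doesn't spread them apart.
print(observed[:, 1])      # [1000. 4000.] -> separation of 3000, uncompensated
print(compensated[:, 1])   # [   0. 3000.] -> separation of 3000, compensated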

Monday, December 12, 2011

10 Steps to a Successful Flow Cytometry Experiment

I've been doing a good amount of application development recently and have had to "practice what I've preached."  Those of us in the flow cytometry world, especially those in core facilities, like to pontificate about all the do's and don'ts of flow cytometry, but how many of us have (recently) struggled through all the intricacies of perfecting a staining assay?  I must say, I was a bit cavalier when I first agreed to set some protocols up for an investigator.  The staining protocols weren't anything novel or difficult; it's just that I personally had not done some of the assays in quite a while.  As I was going through the process I thought, hey, this is not as trivial as one might think...and I've been doing this for a loooooong time.  I could only imagine what someone who is brand new to flow cytometry as a technique must feel like when their PI suggests they use this technology to investigate their hypothesis.  So, I can put forth my top 10 steps to a successful flow experiment with some conviction, because I have now walked in your shoes.

I really wanted to make this a top 10, but as hard as I tried, I could only pare things down to 11.  So, without further ado, I present to you:

11 Steps to a Successful Flow Cytometry Experiment


1. Read lots of protocols (not just the reagent manufacturer's protocol).  Let's face it.  If you ask a dozen people how to make a peanut butter and jelly sandwich, you'll end up with 12 different recipes.  The same goes for FCM protocols.  Everyone finds a different part of the protocol worthy of emphasis.  If you read a few of them, you can start to put the entire story together.

2. Know which colors work best on your instrument.  This is probably a bigger deal when you're using a core facility with a few different platforms.  Let me tell you firsthand: no two cytometers are alike in their capabilities, not even two of the same model of cytometer.  If you're lucky enough to have a flow cytometry core with knowledgeable staff, make sure to ask them what their favorite 4-, 5-, or 6-color panel is.  They should also be able to tell you what the limitations of certain colors on a given instrument may be.

3. When designing your panel, look for max brightness with min spillover.  Ok, let's say you know what sort of antibodies you want to run, and you know what's available, as far as hardware goes, at your institution.  Now comes the fun part.  You have a list of antibodies and a list of fluorochromes - how do you match them up?  You've probably heard the old adage: put your dim fluorochromes on antibodies that target abundant antigens, and your bright fluorochromes on antibodies against sparse antigens.  In addition to that, you want to minimize spillover - fluorescence from probes that are excited by the same laser and whose emission overlaps.  Spillover = Background, and Background = Diminished resolution.  This takes some effort and a bit of know-how, so consult your friendly flow guru for help, or try out some of the new utilities designed to help with this process (namely CytoGenie from Woodside Logic or Fluorish from Treestar).
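To make the "bright dye on sparse antigen" rule concrete, here's a toy Python sketch that pairs dyes and antigens by rank. The brightness scores and antigen-density calls are invented placeholders, and real panel design also has to account for spillover, which tools like Fluorish and CytoGenie handle far better than this.

# Made-up brightness ranks (higher = brighter) and antigen density calls.
brightness = {"PE": 5, "APC": 4, "FITC": 3, "PerCP-Cy5.5": 2}
antigen_density = {"CD4": "high", "CD25": "low", "CD3": "high", "CD127": "low"}

# Pair the brightest dyes with the sparsest antigens.
density_rank = {"low": 0, "high": 1}
antigens = sorted(antigen_density, key=lambda a: density_rank[antigen_density[a]])
dyes = sorted(brightness, key=brightness.get, reverse=True)

panel = dict(zip(antigens, dyes))
print(panel)  # {'CD25': 'PE', 'CD127': 'APC', 'CD4': 'FITC', 'CD3': 'PerCP-Cy5.5'}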

4. Titrate your reagents.  What for?  The manufacturer told me to use 5ul per test (usually 10^6 cells in 100ul of volume).  Without jumping on the conspiracy-theory bandwagon that reagent manufacturers tell you to use too much antibody so that you'll run out and have to buy more, I will say that I've found more often than not that the manufacturer's suggested dilution is too concentrated.  If you want to see why you should titrate your antibodies, check out the figure below.  If you want to see how to titrate your antibodies, click on over to this prior entry to the UCFlow Blog.

CD4 staining of fixed human PBMCs at the properly 
titrated concentration (Left) and the manufacturer's 
recommended concentration (Right).  
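Once you've collected a titration series, picking the right dilution usually comes down to a separation metric rather than eyeballing plots. Here's a minimal Python sketch using a stain index, assuming you've exported the positive median, negative median, and negative robust SD at each dilution (the numbers below are invented).

titration = [
    # (dilution, median_pos, median_neg, rSD_neg) -- made-up example stats
    ("1:25",  48000, 900, 420),
    ("1:50",  47000, 610, 300),
    ("1:100", 45000, 380, 210),
    ("1:200", 30000, 350, 200),
]

for dilution, med_pos, med_neg, rsd_neg in titration:
    # Stain index: separation between positives and negatives, scaled by the
    # spread of the negatives. Pick the highest dilution before it drops off.
    stain_index = (med_pos - med_neg) / (2 * rsd_neg)
    print(f"{dilution:>6}: stain index = {stain_index:.1f}")
# In this made-up series the 1:100 dilution wins, not the 1:25 the vial suggests.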


Example Staining Worklist 
5.  Outline your plan of attack.  Make a detailed work list of your protocol.  Generic protocols are good to help plan your experiment, but when it comes time to perform the steps of an assay, you really want a work list.  As the name implies, this is a step-by-step recipe for how to execute the protocol.  I usually include the step, duration, volume of reagent, temperature, etc...  While you're performing your assay, take copious notes so you can fine-tune the protocol, adding more detail.  The goal is to be able to hand this work list and the reagents to another user and have them get the same successful results.  I like to do this in Excel and write in all the cell formulas so that I can type in how many samples I need to stain and have it automagically do all my dilutions for me.  I also have a summary of the buffers needed and quantities at the bottom.  See below as an example.
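For anyone who'd rather not wrangle Excel formulas, here's a rough Python sketch of the same "automagic" master-mix math. The panel volumes, overage, and staining volume are made-up example values, not a recommendation.

n_samples = 12
overage = 1.10                 # make ~10% extra so you don't run short
stain_volume_ul = 100          # per-test staining volume

# (reagent, titrated volume per test in ul) -- hypothetical numbers
panel = [("CD3 FITC", 2.5), ("CD8 PE", 1.0), ("Live/Dead", 0.5)]

total_tests = n_samples * overage
antibody_ul = {name: vol * total_tests for name, vol in panel}
buffer_ul = stain_volume_ul * total_tests - sum(antibody_ul.values())

print(f"Master mix for {n_samples} samples (+10% overage):")
for name, vol in antibody_ul.items():
    print(f"  {name}: {vol:.1f} ul")
print(f"  Staining buffer: {buffer_ul:.1f} ul")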


6.  Always use a Dead Cell Marker.  Dead cells can really screw up an analysis.  I guarantee there is a color- and assay-compatible dead cell marker available for almost every experiment you will do.  There's no excuse not to use a dead cell marker, so please, please do it.  It makes for a much nicer looking plot, and you really can't do good (dim) double-positive enumeration without it.

Two-parameter plot without using an upstream dead cell 
marker (Left) and the same plot after removing dead 
cells (Right).  Note the diagonal population extending 
out of the negative population (encircled with a region 
in the left plot)


7.  Set up your FMOs as a separate experiment, not on your real samples.  I won't discuss the merits of using an FMO control (Fluorescence Minus One); let's just assume you know that it's pretty much a necessity.  What I will say is that if you try to set up an FMO control on the day that you're using your precious sample, you're likely to either forget it, or omit it because you think you don't have enough cells.  So, if possible, set up your FMO controls ahead of time on a different day so you can take your time getting everything set up properly.  It'd be nice to include it every time, if you have enough sample.

8.  Make compensation controls using beads.  I'm a huge advocate of using capture beads to set up compensation.  It's really a no-brainer.  I've written about this subject before.  Even if your single-stained controls look fine on cells, I'd still use beads because they're always consistent.

9.  Acquire your samples nice and slow to achieve maximum resolution.  If you go through the trouble of perfecting your staining procedure, now's not the time to screw things up.  On a hydrodynamically focused instrument you'll want to concentrate your sample and run it slow in order to keep a narrow core stream and achieve optimal resolution. If you're using another type of flow cell (such as a capillary a la Millipore or an acoustically focused system like the Attune) you should be more focused on increases in background due to insufficient washing rather than a wide sample core.

10.  Analyze your data a couple of different ways.  Even if I have a clear idea of how to go about the analysis, I'm frequently surprised at how many times I've changed axes or started backwards and found I liked the new way better than the old way.  Backgating is one way to help identify a rare population all the way up through its ancestry.  Make sure to take advantage of your live cell channel, as well as gating out aggregates and removing any time slices where there may have been a drift in fluorescence.

11.  QC your instrument and create an application-specific QA protocol.  Science is not about one-shot deals.  If it's not reproducible, it's not real.  In order to give yourself the best possible chance of getting reproducible data, you'll want to minimize the error contributed by the instrument.  Quality control and quality assurance cannot be emphasized enough.  By doing something as simple as running beads at your application-specific voltage settings, you can ensure that the instrument is in the same state as it was the last time you acquired these samples.  For this, I typically use one of the peaks (peak 4, actually) of the 8-peak bead set.  After I have the samples acquired with the proper voltage settings, I run the beads, create target channels for the peaks and save it as a template.  Next time, all I need to do is dial in the voltage to put the beads in the target.  You'll also want to make an acquisition template and probably an analysis template too.
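As a sketch of that day-to-day bead check, here's roughly what the comparison against the saved target channels looks like in Python. The target medians, today's medians, and the 5% tolerance are all assumptions for illustration; use whatever acceptance criteria your QA protocol specifies.

def within_target(measured_median, target_median, tolerance=0.05):
    """Return True if the bead peak landed within the allowed window of the target."""
    return abs(measured_median - target_median) / target_median <= tolerance

target = {"FITC": 12500, "PE": 21000}   # peak-4 medians saved with the template
today = {"FITC": 12950, "PE": 19600}    # today's peak-4 medians (made up)

for channel in target:
    ok = within_target(today[channel], target[channel])
    print(f"{channel}: {'OK' if ok else 'adjust voltage and re-run the beads'}")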

Well, there you have it.  Hopefully this will help you focus your attention on some key aspects of setting up a well-thought-out flow cytometry staining protocol.  Of course, this merely scratches the surface of all the things you need to think about.  Did I miss something major?  Feel free to leave a comment with your #12, #13, and beyond.

Friday, December 2, 2011

The Year of Acquisitions

Like most industries, the flow cytometry industry appears to be shrinking, in that the number of players on the industry side of things is getting smaller.  For many years, there were a few big players, namely Becton Dickinson and Beckman Coulter (which were themselves products of mergers - BD+Ortho, and Beckman+Coulter).  They made instruments and reagents and pretty much sold the whole package.  Seeing the potential for others to capture some of the market share, we experienced a growth of smaller start-ups, focusing on either the hardware or the reagents.  Companies like Cytomation (maker of the MoFlo) and Guava on the instrument side of things introduced some nice products and created some much needed buzz.  A major impact of these companies was that they forced the major players to invest in R&D and come out with more competitive products.  On the antibody side of things, reagent-focused companies like eBioscience and Biolegend gained popularity.  But, I think a real turning point happened when little-known Accuri Cytometers exploded onto the scene with a low-cost, small-footprint cytometer with capabilities similar to a FACSCalibur.  They took a page from the Guava playbook and targeted individual labs instead of the typical cytometer purchaser - a core facility.  Soon other companies were seeing the success of these platforms, and the much larger market outside of the core facility.  Companies like Stratedigm, Life Technologies and iCyt started offering smaller, less expensive cytometers.  It seemed like the cytometry industry - both on the instrument and reagent side - was expanding.  This led to competition and innovation.  The old standbys like BD and Beckman Coulter were forced to come up with new and exciting products to maintain their market share.  And then the recession hit.

So, what happens in a recession?  Well, contrary to what you might think, many companies do just fine in a recession.  Of course their growth may slow, but they also tend to accumulate capital.  In fact, many companies wind up in a situation where they have lots of cash on hand and are sort of waiting to see what's going to happen.  John Waggoner explains in a USA Today piece (http://www.usatoday.com/money/perfi/columnist/waggon/2011-05-05-cash-in-on-mergers-and-aquisitions_n.htm) that this past summer, it was estimated that companies in the S&P 500 stock index had a combined $940 billion in cash.  I postulate that the well-established cytometry companies were/are in a similar boat...but to a much lesser degree.

Mr. Waggoner goes on to explain that companies with cash on hand basically have three things they can do with it.

1.  They can reinvest in the company, hire more people, build more plants, funnel it into R&D, etc...  However, with funding becoming more and more scarce, there's not enough demand in the market to warrant such reinvestment.

2.  They can return money to their investors in the form of dividends.  Some companies are doing this, but probably in moderation.

3.  They can buy another company to position themselves for the recovery.  Mergers and acquisitions have been a pretty huge business in recent years.  In total, M&As are running at a $1.6 Trillion pace for 2011.  A good chunk of this is happening in the healthcare sector.

Bingo.  Herein lies the explanation for the recent increase in mergers and acquisitions.  For example, Accuri raises ~$30 Million to get their business going, BD sees the threat and buys them for $205 Million (not a bad ROI for the Accuri investors).  BD removes the threat, and clears the way for its new flagship, small-footprint, easy-to-use cytometer, the FACSVerse.  This works for reagent companies too.  Affymetrix buys eBioscience, EMD-Millipore buys Guava and now Amnis, Life Technologies licenses the acoustic focusing technology to build the Attune, and on and on it goes.  Even bigger-name companies like Sony and Danaher are getting into the game.  Sony purchased iCyt to see if it can get its foot into the biomedical research arena, and Danaher purchased Beckman Coulter for who knows what reason.  At any rate, it seems like the industry is attempting to go back to the old days when you'd do all your shopping at one company.  Buy your instrument, reagents, analysis software, and all the rest from one company.  You'll end up having BD labs, Millipore labs, Life Technologies labs and maybe even Beckman Coulter labs.  A necessity in the current environment, but I'm sure things will oscillate back to the innovative start-ups taking on the big boys once again.  So, who's next to be gobbled up?  I'm sure companies like Stratedigm, Blue Ocean, and Cyntellect are hoping their phones will start ringing.

Monday, November 21, 2011

Options for Flow Cytometry Training - FloCyte Review

Flow Cytometry (FCM) isn't the easiest technique to learn.  It actually takes quite a while to master both the hardware and software components of sample acquisition and data analysis - let alone the applications utilizing the aforementioned instrumentation.  For many users of flow (in an academic setting), their first encounter with FCM is likely through a core facility, where they'll receive some instruction on how to operate an instrument and then how to analyze the data they collected.  The type and quality of this training varies greatly.  Some institutions I'm familiar with have multi-day courses with wet lab sessions and hands-on instrument time, while others attempt to provide a theoretical base and then do a bit of hand-holding for a few sessions.  The success a user may achieve greatly depends on his or her resourcefulness and overall aptitude for technology.  Some people pick it up quickly; others struggle for years.  I will say that training users in a busy core facility is a huge drain of time and resources.  In our core, for example, we basically have an entire F.T.E. just providing training and consultation, so I'm sure that in smaller cores, where it's just one or two people, training has to be an even greater burden.  The question then becomes, how are we to provide the necessary training and attention our users require with the limited time and personnel resources characteristic of a core facility?

There aren't too many options.  Before I jump into an assessment of the FloCyte courses (which is the whole point of this post), let me briefly highlight the other possibilities.  FYI, I've personally attended all 3 types of training sessions and have viewed all the resources in #4.

1.  The Annual Course in Flow Cytometry - This weeklong course alternates between Los Alamos National Labs (or the University of New Mexico) and Bowdoin College in Brunswick, ME.  It is really geared towards users who already have a basic understanding of the technology.  Also, it focuses on the applications of flow cytometry rather than the operation of a flow cytometer; however, numerous sections also delve into the hardware components.  There's a pretty cool lab where you can assemble your very own (fairly crude) cytometer.  The cost of the course is about $1800, which includes dorm-style accommodations and meals (transportation is not included).

2.  Vendor-specific instrument/software training - Most vendors will provide training for their hardware and associated software.  When you purchase an instrument, you might get some free training included with the purchase, but additional training is going to cost you.  As you'd expect, the training is geared towards the operation of that vendor's hardware.  If you were using multiple cytometers from different vendors, this obviously wouldn't be ideal, but if you were using a single platform it might be a good option.  The vendor training will also include some of the basics of cytometry, but again, it will be skewed towards their instruments, their reagents, and their idea of the technology.  It's also pretty expensive, sometimes as much as $2500 per person.

3. Training courses at meetings - Typically, when you go to some of the bigger conferences, they'll have some workshops on FCM.  Certainly at the CYTO meetings you'll have the opportunity to attend training sessions on various topics.  Also, some of the immunology-focused scientific meetings will have some FCM training associated with them (for example, the AIC meeting in Chicago).  Cost for this training is variable; however, it's usually limited to conference attendees, so unless you were already planning to attend the conference, it might be really expensive.

4.  Online utilities - There is quite a lot of information freely available on the web.  You can certainly start at the Purdue University Cytometry Laboratory web site, where there are a bunch of PowerPoint slides, movies, and resources freely available.  In addition, companies such as Becton Dickinson, Life Technologies, and Beckman Coulter offer overviews of flow cytometry and flow cytometer technology.  Note that the above links point directly to each company's training/support page with the intended materials.  Although these online utilities are readily available and free, you lose the benefit of asking questions and interacting with people who can tailor the training to your specific needs.

So now, I'll walk you through my experience with the training offered by FloCyte Services.  I attended the Comprehensive Training Course from 11/15/11 - 11/17/11, held at Spherotech, Inc.  I won't bother taking up space here to give you the rundown of the company and the mission of the training courses.  You can read all about it here.  However, I will note that the Comprehensive course is designed for novice users of flow.  You can see the course curriculum here.

Day 1, as you'd expect, goes over the basic components of flow cytometry.  This is done in a pretty common fashion, and anyone who's gone through the PowerPoint slides on the Purdue University Cytometry Laboratory web site will recognize the format: the four components - Fluidics, Optics, Electronics, and Data Analysis.  All the standard material you'd expect to be here is here.  There was, however, at least one pretty critical omission - multi-laser systems, laser delays, and how fluorescence emission is spatially separated.  I know this was briefly mentioned during one of the sections, but there was no figure, no reiteration of how it's possible to look at two colors with the exact same emission simultaneously because they're excited by spatially separated laser beams (e.g. PECy7 and APCCy7).  When we broke into small groups to take a look at some of the hardware, I spent most of the time explaining to my other group members how this works.  They were very confused.  The graphics used to talk about emission filtering were all systems like a FACScan or FACSCalibur, which don't have spatially separated beams, so all the light goes through the same "pinhole".  Also on day 1, we finished up with a mathematical explanation of compensation, which went horribly wrong.  The math is complicated and it's probably not something basic users need to understand in order to compensate their data correctly (or, should I say, let FlowJo compensate their data correctly).  Lastly, there was no mention of one very critical component of flow cytometry: quality assurance and quality control.  In all, the basics were handled just fine.  I will say, though, that it seemed to move pretty slowly.  I think for the amount of information covered in that first day, it could've been condensed into a half day.  For example, I feel like the flow basics class given at UCFlow is comparable in its scope but is completed in about 1.5-2 hours.
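For anyone who's never seen the spatially-separated-beams idea spelled out, here's a back-of-the-envelope Python sketch of why it works: the cell crosses each laser at a different time, and the electronics use that laser delay to assign each pulse to the right laser/detector pair, which is how PECy7 and APCCy7 can share an emission window.  The beam spacing and stream velocity below are invented round numbers, not specs for any particular instrument.

beam_spacing_um = 120        # distance between the blue and red interrogation points
stream_velocity_m_s = 6.0    # core stream velocity

laser_delay_us = (beam_spacing_um * 1e-6) / stream_velocity_m_s * 1e6
print(f"laser delay = {laser_delay_us:.0f} us")   # 20 us

# A ~780nm emission pulse arriving at the blue laser's time slot gets booked as
# PECy7; the same wavelength arriving one laser delay later, at the red laser's
# slot, gets booked as APCCy7.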

Day 2 brought in a plethora of applications and tried to reinforce some of the concepts from day 1 while explaining how those concepts affect how you think about the applications.  I think this way of presenting the information is really good.  When we're talking about immunophenotyping, we're also talking about compensation, background due to fluorescence overlap, non-specific binding, etc...   When we're talking about cell cycle, we're also looking at doublet discrimination, coincidence, sample core size, etc...  Here we also start tackling the necessity of controls, including comp controls and the always popular FMO controls.  My big issues with this section solely revolved around the figures.  Many of the figures were at best poor representations of the idea being put forth and at worst blatantly misleading.  This was especially noteworthy in regards to an explanation of biexponential display transformation.  In another instance, the instructors were driving home the idea that we are never to use quadrants to perform gating on our plots, and the very next slide, describing FMO controls, was filled with quadrants used as gates.  A bit contradictory.

Day 3 was all about stats and panel design.  The stats part was very straightforward and pretty easy to follow.  The panel design section was good, and covered many of the issues that arise when trying to put together a multicolor panel.  There was an introduction to a utility from Treestar called Fluorish (which I'm not going to complain about because I like it); however, there wasn't any real mention or demonstration of other available utilities like Chromocyte and CytoGenie.  Also, we spent some time going through some data analysis strategies using FlowJo.

The cost for the 3-day Comprehensive course is $700.  The beauty of the course is that it's brought to you (either your institution can host it, or it is hosted nearby), so you don't have to factor in airfare or hotel costs.  But you'll have to remember that you're getting a comprehensive theoretical overview of flow cytometry; you are not learning how to operate your specific cytometer.  So, if you didn't have a core facility around to show you how to open up FACSDiVa and adjust voltages on your LSRII, you'd still be pretty clueless about how to run your first FCM experiment.  Another positive about the training is that it is modular: you can attend just days 1 and 2, or just 2 and 3, or even just day 3.  That way, if you have some basic knowledge already, you can skip day 1 and just attend days 2 and 3.  Lastly, I'll mention that there are a bunch of other, more advanced courses available outside the comprehensive course, including a multicolor compensation course, a course on "phosflow" assays, and even clinical flow cytometry.

The instructors are well-respected flow cytometry professionals with years of experience under their belts.  They presented most of the material in a clear and concise way.  There was, at times, some confusion regarding what a figure was trying to describe, but this was due to the fact that the slides were recently re-done and the instructors were not 100% comfortable with them.  I feel like I want to give them a pass on that, but then again, I did pay $700 for this course and expected a very polished delivery.  All things considered, they did an excellent job.

I could see this working in a few ways.  1.  You get some initial training on how to operate your cytometer from your core facility and then attend days 2 and 3 of the comprehensive course.  2.  You could attend the entire comprehensive course and then go through the specific instrument training given by your core facility.  3.  Get trained by your core, start running experiments, and then jump in on one of the advanced courses offered by FloCyte.  If you're not fortunate enough to have the support of a core facility, then that makes the FloCyte courses even more attractive.  Relying on them for the basic theoretical training, and on the instrument vendor for training on the actual equipment, is probably your best bet.

Friday, November 11, 2011

Cytometer Service Contract or Self Insure: the Wal-Mart effect.

Instrument maintenance and repair is typically not a huge factor when deciding on a piece of equipment to purchase.  People are much more concerned with the practical things like how many lasers can I put on, how fast can I run my samples, or more simply, can it handle the applications I plan to run?  Even after we have the instrument installed in the lab we're not really thinking about maintenance and repair because we're on the "full-warranty high."  If something breaks, what does it matter?  The company will come out the next day and repair it at no cost.  Right about the halfway point through the warranty period the thought hits you - I'm going to have to start paying for service on this thing.  Herein lies the dilemma.

Although there are many variations, in general there are two schools of thought here.  The first involves some level of service agreement (full, partial, lasers only, instrument minus lasers, etc...) and the second is akin to an "insurance" plan.  By the way, before I go on, I should state that I'm writing this from the standpoint of a private academic institution (namely the University of Chicago), however private companies, public institutions, or individuals may have a vastly different experience.  Let me briefly explain these two systems of instrument maintenance.

Service agreements.  About 6-8 months into your warranty period, a friendly company representative will contact you to try and sell you on a full service contract.  This basically extends the type of service experienced during the warranty period.  Labor and parts will be covered under the service contract costs you pay annually.  Be sure to get a list of what are typically called 'consumable parts'.  These items are parts that will not be covered under the service contract.  Consumables are items that are expected to be used up or worn out through normal use, so replacing them doesn't count as the kind of part failure the contract covers.  Consumables can be expensive; sometimes as much as $1000 - $2000 for a single item that may only last 6-12 months.  You'll need to be sure to add these costs to your total cost of ownership.  Full service contracts are fantastic.  You get rapid response times, an endless supply of new parts, and generally I find the quality of service is of a higher standard.  The downside is the expense.  You can plan on spending about 10% of the original purchase price yearly on a full service contract, which means that after 10 years, you'll have paid for the instrument twice.  You can also look into service contracts that cover only parts of the instrument, such as a 'lasers-only' contract.  This may cover some of the major expenses that might hit, but some of the routine fluidics issues or electronics issues would still need to be paid out-of-pocket.  Lastly, you don't need to rely solely on the Original Equipment Manufacturer (OEM) for service.  In some cases, third-party companies will either provide the service agreement (serving as a middle man between you and the OEM), or they can actually come out and fix some of your older-generation instruments.

Insurance.  If you pass on the service agreement route, either with the OEM or a 3rd-party company, you'll need to carefully plan how you will pay for problems that pop up.  This can be done by including a line item in your budget and simply inserting the cost of the service contract.  Then you'd need to pay for any repairs using those available funds.  If you don't use all the funds, then you have a surplus and possibly a way to do some upgrades, or you can save it for a rainy day.  If, however, you end up paying out-of-pocket more than you have put away as insurance, then you could have some trouble with your institution.  The insurance method also has some unintended consequences.  First, your service calls may be bumped to the bottom of the list if the OEM services customers on service contract first.  Second, I've noticed that the field service engineers tend to do the minimum to get the instrument functional again.  This is not to say they're lazy or anything; they're actually doing you a favor by not replacing non-essential parts and performing the work quickly so the hourly labor charge is not too high.  However, this sometimes leads to more frequent trips to a site to fix a related part that breaks shortly after the instrument was put back into service.

So, what do we do at UCFlow?  Well, a hybrid, of course.  If you can anticipate which instruments will likely have more problems over the years, then you can keep your instruments running for many years without hassle for a lot less money.  Seems impossible, but here are a few tricks.  Obviously, the first thing you're going to do is monitor performance very carefully during the warranty period.  If odd things are happening monthly, or even quarterly, it may be a good idea to consider a service contract.  If you can find out from current owners of the same model of instrument whether they have many service calls, that might help make the decision.  Also, if you or your lab is familiar with the innards of the cytometer and you aren't afraid to do things like replace valves, regulators, or even lasers, then you should be less likely to buy a service contract.  Lastly, the more instruments you have, the more money you'll be wasting on service contracts.  Let's say you have 6 instruments, and the service contract is $15,000 each ($90K total).  It's unlikely that all 6 cytometers will have multiple issues in a given year, so let's say you have 2 instruments with major problems (multiple service calls with big-ticket items totaling $30K).  The other 4 run pretty smoothly, and maybe require another couple of service calls for minor issues ($15K).  If you pay out-of-pocket, then you'll basically be paying 50% of the cost of a full service contract.  This might be a good year; some other years might not be so favorable.  However, it's likely that many years you'll be under budget and a couple of years you might be over budget.

As an example, I'll share a few stories from my experience.  We had multiple 1st-generation FACSCantos that were breaking down monthly.  We were actually paying more out-of-pocket than the cost of a full service contract, so we went ahead and put them on contract.  This was a no-brainer.  We also had an old LSRII that, over the course of 6 years, had not had a single service call placed on it.  All we've had to do is perform the standard Preventative Maintenance (PM).  We never had a contract on this instrument.  After 6 years of spending nothing on this instrument, we had 2 lasers die at the same time, which required replacement at a cost of $50K.  The service contract cost was $22K per year, so 6 years times $22K = $132K, and actual costs were $50K - a 62% savings.  It is a situation like this that tells us to err on the side of NOT getting a service contract until an instrument proves to be unreliable.  Once it is deemed unreliable, we either place it under service contract, or get rid of it and find a more reliable alternative.  It sometimes seems like a gamble, and if I only had 1 or 2 instruments, I'd likely have them on service contracts, but since I have the luxury of duplicate technology and the power of numbers, I'm able to take that gamble and the odds are usually in my favor.
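Here's the LSRII math from that story as a quick Python sketch, just to make the comparison explicit (numbers as quoted above, rounded):

years = 6
contract_per_year = 22_000
actual_repairs = 50_000

contract_total = years * contract_per_year    # $132,000 if we'd kept a contract
savings = contract_total - actual_repairs     # $82,000 kept by self-insuring
print(f"Contract would have cost ${contract_total:,}; repairs cost ${actual_repairs:,}; "
      f"savings = ${savings:,} ({savings / contract_total:.0%})")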

By the way, of the 16 instruments we have in the lab, 3 are on service contract (only the aforementioned early-generation FACSCanto-As).  We're able to save money by having a high number of instruments.  We can also negotiate better contracts if desired.  Larger volumes typically lead to better prices per unit.  This is what we call the Wal-Mart effect.  If that's not your case, then you'll likely want to lean more towards the service contract route.

Tuesday, October 25, 2011

The most sensitive Cytometer available?

Recently, we've had a pretty good bump in the usage of our ImageStream X (Amnis, now a part of EMD-Millipore), but many of the new users are using the technology to confirm things they're seeing on the conventional flow cytometers.  So, needless to say, I've been doing a bit more phenotyping on the ISX instead of the usual nuclear translocation or apoptosis assays that we typically do.  In doing so, I was reminded of some comments thrown out by Amnis at the 2011 CYTO meeting saying (and I'll paraphrase) that the ISX, and by extension the FlowSight, is the most sensitive cytometer available.  The evidence for such a claim was a screen grab of good ole 8-peak beads (please don't get me started).  So, I thought I'd try to test those statements against some recently collected data that makes sense to me.

It's a really simple example, but in short it involves a surface marker (coupled to PE), a Live/Dead dye (Green) and a nuclear dye (Violet).  By conventional flow cytometry, the PE signal was pretty weak and the user was skeptical that the staining was "real."  So, the idea was to make sure the cells were live (Green low/neg), were actually cells (Violet pos) and had surface staining of PE.  After going through the normal gating steps, it came time to look at the PE signal.  Surprisingly, it wasn't bad at all (especially with the 561nm laser cranked up to 200mW); however, there were some dimmer PE+ cells that were hanging out a bit too close to the negatives.

I remember having a discussion with other people in the lab about using carefully calculated masks to pull out the membrane staining and completely remove the cytoplasmic and/or nuclear background, which should bring the negative population pretty much down to zero while retaining the specific PE-positive fluorescence.  This procedure is actually pretty simple, so I'll briefly explain it, and if this whole concept of image masking is foreign to you, just think of masks as parts of a cell, defined by morphology, within which you're going to measure fluorescence.  This is very different from conventional flow cytometry, where you can only measure fluorescence from the entire cell regardless of where that fluorescence is coming from.  With this data, I'm creating a membrane mask, which basically looks like a ring encompassing the outside of the cell.  This should retain most of the specific PE fluorescence and remove background from intracellular autofluorescence as well as from the nuclear dye and live/dead dye.  The figures below demonstrate these findings.
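If you've never built a mask like this, here's a minimal Python sketch of the general trick (not the Amnis/IDEAS masking API - just the idea): erode the whole-cell mask and keep the ring that's left, then measure intensity inside that ring.  It assumes you already have a boolean whole-cell mask and the PE image for a single cell.

import numpy as np
from scipy.ndimage import binary_erosion

def membrane_mask(cell_mask: np.ndarray, thickness_px: int = 3) -> np.ndarray:
    """Ring covering the outer few pixels of a boolean whole-cell mask."""
    interior = binary_erosion(cell_mask, iterations=thickness_px)
    return cell_mask & ~interior

def mean_intensity(image: np.ndarray, mask: np.ndarray) -> float:
    return float(image[mask].mean())

# With a PE image and its whole-cell mask, the membrane-restricted value drops
# the cytoplasmic/nuclear background that inflates the whole-cell measurement:
#   whole_cell = mean_intensity(pe_image, cell_mask)
#   membrane   = mean_intensity(pe_image, membrane_mask(cell_mask))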

The figure to the left is the originally analyzed data.  On the far left is just a gallery of images showing the different fluorescence channels (the green live/dead wasn't shown since dead cells were gated out).  The top dot plot is a simple SSC/PE scatter plot to show the distribution of the negatives and positives.  The image just below it shows the mask used (the bluish semitransparent shape overlaying the PE image).  Below that is the ungated population showing the Live/Dead Green fluorescence spilling into the PE channel using the whole-cell mask.  And lastly, there is a histogram showing the PE fluorescence.  Altogether, a pretty straightforward analysis.  However, I wanted to see what would happen if I restricted the PE mask to only the membrane area, so that is what is shown in the figure below.  Now, it's important to note that this is the exact same data file, analyzing the exact same group of cells.  The only thing changed here is the mask on PE, which is now shown as a ring overlaying the membrane of the cell.  If you now look at the SSC/PE scatter plot at the top, you can see the dramatic tightening of the negative population, which implies a reduction in the high-autofluorescence cells that were trailing to the right of the negative population with the whole-cell mask.  Another benefit of this restrictive masking strategy was the reduction in the spillover of the green dye into the PE channel, as shown in the ungated Live/Dead Green versus PE plot.  And lastly, when you look at the histogram, you can see unequivocally the increase in separation between the negatives and positives.  To drive the point home a bit more, we can overlay the two histograms so you can see exactly how they match up.  Notice that there is a reduction in the intensity of the positive population as well, but this is likely a similar reduction in background fluorescence as is seen in the negative population.  The key here, and really in all of flow cytometry, is RESOLUTION.  This is, in fact, what most people are really thinking about when they say 'sensitivity.'

So, can we confirm the original statement here about these imaging cytometers being the most sensitive cytometers available?  Well, I'm not sure I'm ready to crown this instrument as the winner just yet, but at least in some circumstances, the ability to analyze only the part of the cell that is actually stained (or not stained) could prove to be an extremely valuable tool, especially if you need to resolve dimly stained cells from unstained cells.


Thursday, October 13, 2011

Counting Cells with the EMD-Millipore Scepter 2.0

I recently had the chance to play around with the Scepter 2.0 Automatic Cell Counter from EMD-Millipore.  The Scepter uses the Coulter volume principle to count cells in a microfluidic chamber connected to a handheld device.  I'm basically using it for things like confirming pre- and post-sort cell counts, as well as counting cells being passaged and primary cells such as PBMCs and splenocytes.  The device itself is basically shaped like a pipetteman, and even has a plunger-type action that simulates pipetting.

EMD-Millipore Scepter 2.0
To use the device, you need to attach a single-use 40um or 60um sensor, which provides the microfluidic channel through which the sample is passed.  Once attached, you simply hold down the plunger, submerge the sensor tip into a sample volume of ~100ul and then let go of the plunger.  It takes up about 50ul of your sample through an orifice in the sensor and measures the volume of the cells.  It then plots the volume (or diameter) of the cells in a frequency histogram displayed right there on the device's built-in display.  Using a click-wheel on the finger-grip side of the Scepter, you can adjust the low and high bounds of the histogram in order to remove small (dead/debris) and large (aggregate/larger cells) events.  Once you set these bounds, it displays the events/mL at the bottom of the display.  The sensor tips are single use, and each has a range of cell sizes and sample densities it can handle.  The 40um tip is geared towards cells with a diameter of 3um to 17um and a cell density of 50,000 - 1.5x10^6 cells per mL.  The 60um tip can handle cells with a diameter of 6um to 36um and a cell density of 10,000 - 500,000 cells per mL.  The handheld unit can store up to 42 histograms, but this data can be downloaded to a computer and analyzed with the Scepter Software Pro (Mac/Win - Free!).  In the desktop software, you can add info like original volume, dilution factor, sample names, etc..  You can also re-gate and overlay histograms to create handy figures.  When you have everything set up, you can export reports in table format which you can open up with Excel or other spreadsheet programs.
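The concentration math itself is nothing exotic - here's a rough Python sketch of what the device (and the desktop software, once you enter a dilution factor) is doing for you.  All of the numbers are invented for illustration.

events_in_gate = 12_400      # events between the low/high diameter bounds
sampled_volume_ul = 50       # the ~50ul the sensor actually draws up
dilution_factor = 10         # e.g. 10ul of sample + 90ul of buffer

cells_per_ml = events_in_gate / (sampled_volume_ul / 1000) * dilution_factor
print(f"{cells_per_ml:.2e} cells/mL")   # 2.48e+06 cells/mL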

Scepter Software Pro Screenshot
So, how does it work?  Well, it certainly counts things very accurately.  Previously, I've found that my MoFlo XDP reports sorted cell counts really well (at least when the side streams are behaving themselves), and I have lots of data comparing sorter counts to counts from a standard Coulter counter, or even counting cells on a conventional cytometer using absolute counting beads (which I get from Spherotech, by the way).  The only drawback is you don't get a live/dead report like you might with visual-based systems (e.g. hemacytometer counts with Trypan Blue, Coulter's Vi-Cel, Invitrogen's Countess, or even Nexcelom's Cellometer).  Sure, you can sort of approximate what's live and dead using volume or diameter as a discriminator, but the profiles I have been collecting don't really show a clear distinction.  It's certainly not like staining some cells with PI and throwing them on a FACScan with counting beads.  But, with that said, the system worked pretty darn well.  Getting back to the accuracy thing, it matched my counts from a sorted fraction on my MoFlo to within 1%.  That makes me feel good in two ways:  1.  My MoFlo is sorting well, and 2.  The Scepter can actually count really well in a very short amount of time (30 seconds, by the way).  I did have one small snafu (described below) that was giving me some really weird results, but outside of that, it performed as advertised.  The one thing that I might have to complain about is the cost of the single-use, disposable sensors.  $3 apiece.  Ouch!

Blue histogram represents a collection in the upright
 position, which leads to low-end bubbles showing
 up and screwing up the counts.  The green histogram
 represents a collection upside-down, with no bubbles
 ending up in the chamber, leading to much more accurate
 counts.
Now, about that snafu.  What I kept finding was that after the sample was loaded into the sensor and started traveling through the sensor orifice into the counting micro-channel, bubbles would creep in there.  The effect of this was that I'd start getting these really low-volume events piling up near the end of the counting process.  It was a small number of low-volume events that I could probably gate out (see figure below), but it still messed things up for me.  Since I was doing a 1:10 dilution (10ul sample, 90ul buffer), when I back-calculate (or better yet, let Scepter Software Pro back-calculate for me) the concentrations, I was off by as much as 1x10^6 cells (or a 12% swing in total cell counts).  To solve this problem, I made one modification to the collection process.  As soon as the sample was loaded into the sensor (it beeps at this point), I immediately flipped the entire Scepter apparatus upside-down so as to force any air that begins to enter the sensor to remain near the tip and not enter the orifice and microfluidic channel.  This got rid of all the air bubbles and my counts became extremely accurate.  In one case, my MoFlo told me there should be 8.02x10^6 cells, and the Scepter counted 8.01x10^6 cells.  This made me happy.  To see this awesome flip move in action, check out the video below.  I apologize for the sound; I was filming this in my sorter room, which has the gentle hum of a twin diesel engine for background noise.  Also, you'll just have to trust me when I say "see the bubbles."  UPDATE:  After playing around with volumes a bit more, it's pretty evident that you definitely need 100+ microliters of volume in your tube.  I could get bubbles every time if I only had the requisite 50ul of sample, but if I had 100-120ul, I almost never got bubbles.  With this volume, there's no need to turn the Scepter upside-down.



So, in all, I think this product was successful for my purposes.  It's small.  The counting process is fast.  I can offload the data to my computer, and the counting was very accurate (as long as I remembered to hold it upside-down to avoid the bubbles).  Will I continue to use it?  I guess it sort of depends on whether or not I can get over not 'knowing' the %live/dead.  For what I'm doing, that's probably fine, but could another option be just as easy and accurate and cheap AND give me live/dead?  To be determined.  I will say that I've used early versions of the Countess and the Nexcelom, and neither impressed me so much as to make me want to buy one immediately.  Hopefully I'll be able to check them out again and perhaps put together a head-to-head review.

Wednesday, October 5, 2011

GLIIFCA 20 Wrap-up.

If you're unfamiliar with the Great Lakes International Imaging and Flow Cytometry Association (GLIIFCA) meeting, you can check out this year's program online here.  It's sort of a morph between a technology-focused user group meeting and a smaller-scale scientific meeting.  The focus really is on the utilization of our technology (which I'll refer to under the umbrella term Cytometry) in clinical, translational, and basic research.  There is also a strong cytometry vendor presence: about 30 different companies bringing their latest and greatest products.  If you'd like to see who attends and supports the association, you can see a list of sponsors on the GLIIFCA site.  A part of the meeting that's always a bit disconcerting for me is the Friday night Industrial Science Symposium, which is code language for "vendor sales pitches."  It's been pretty poor some years and not so bad in others.  It really depends on the presentation and the quality of information put forth.  You can tell some people are up there literally just trying to sell a product.  A good presenter will educate the audience so that the individuals sitting in the chairs come to the conclusion on their own that this is the product they need.  And I have to say, we witnessed one of the best examples of this last Friday night in a presentation given by a Chicago favorite, Kelly Lundsten from BioLegend.  Great talk, and actually a pretty good session in total.

A Slide grabbed from Janet Siebert's
(Cytoanalytics) Presentation at GLIIFCA 20
The "theme" of the meeting was Cytoinformatics (as opposed to Bioinformatics).  As far as the scientific program, it was the first time I found myself thinking, maybe these informatics people aren't wacked.  I hear what they're saying, but it usually doesn't strike a chord with me.  The basic idea is that you're generating tons of data of various kinds that needs to be quickly integrated in a consistent format in order to support analysis and subsequent decision-making.  And I think my resistance has always been in the format of, "Well I don't really generate THAT much data, so I don't have to worry about this stuff."  After sitting through a few examples of data generation from some groups that I know pretty well, it got me thinking.  The quantity of data can be pretty big even if you're only doing 8-12 parameter flow cytometry or less.  This isn't something only for the 18-parameter groups, it's for everyone.  Besides the flow data, it would be nice to integrate this info with subject info, imaging info, genomics info, etc..  I think what was pretty successful for this meeting is the fact that it was setup in such a way that you could see the progression of ideas surrounding management of data.  1.  Here's the problem: People collect lots of heterogeneous data types.  2.  Here's the types of tools needed:  Data warehousing, including dimensional models, ETL (extract, transform and load data), and end-user tools to read the relational database.  3.  Here are some examples of how people are using these tools with real data and how it impacts decision-making.  That was basically GLIIFCA 20, Symposium 1, 2, 3.  Kudos to the program committee.

UCFlow's GLIIFCA 20 Poster
There was also a pretty good crop of posters presented this year, including mine (which won a poster award, thank you very much).  Two that stuck with me were the "Increased number of laser lines on your cytometer might mess stuff up, so be careful" poster and the "Look at this awesome temperature control/antagonist injection apparatus I soldered together with some parts from Home Depot" poster.  I'm paraphrasing the titles, of course, and you can find the full poster abstracts in the GLIIFCA 20 program linked above.  The first one is from the folks just up the road at Northwestern (Geoff Kraker and James Marvin), and the second one comes to us from Roswell Park courtesy of Ed Podniesinski and Paul Wallace.  The UCFlow poster was about how "I can't stand looking at QC data, so I'll start using cool Google tools and graphics to make it more interesting and maybe I'll stick with it longer."

So, there you have it.  Another year, another GLIIFCA.  For the record, this was my 11th GLIIFCA attendance.  I have officially attended a majority of GLIIFCA meetings.

Thursday, September 22, 2011

Safety: It's not to be taken lightly

I'll begin by saying, I definitely need to pay closer attention to the various safety concerns in a lab.  All too often we sacrifice our own safety in order to get things done quicker: cutting corners, thinking, "I'll be careful."  And then, bam, you have an incident that you regret.  Fortunately, I haven't had to deal with this firsthand, but what I'm going to describe here happened close enough to home that it caused me to pause for a minute and evaluate my own techniques and protocols in the lab.

Perhaps some of you are aware of a recent incident at the University of Chicago, where a scientist became infected with the same strain of bacteria being studied in the lab (B. cereus).  According to information published on the Science Magazine site, the infected individual was not even working with the microbe but may have transferred it to an uncovered wound via a spill (http://news.sciencemag.org/scienceinsider/2011/09/university-of-chicago-microbiologist.html).  I believe the infected person is going to be fine, though surgery was needed to remove the infected tissue, so that's positive.  As a result of this (and another incident just 2 years ago), the PI is moving these sets of experiments to the Howard T. Ricketts lab at Argonne National Labs, where they are also running experiments on plague, MRSA, and anthrax.

It was roughly two years ago to the day that a researcher in this same Laboratory at the University of Chicago died from exposure to an attenuated form of Y. pestis.  In this case, the researcher may have felt a bit safer than he was, since the strain was determined to be non-lethal.  His co-workers admitted that his glove wearing practices were inconsistent at best.  It just so happened that he also had an undetected/untreated condition known as hemochromatosis, or an overload of iron in the blood.  It may have been this overload of iron that allowed the attenuated version of the bacteria to become virulent (http://en.wikipedia.org/wiki/Malcolm_Casadaban).  


So, as you can see, we have plenty of examples of the potential threat to our safety and those around us, and we should use examples like these, not to place blame on those who made mistakes, but to remind us of the importance of slowing down and thinking about what we're doing and what we need to do to stay safe.  There are really just two reasons why incidents like this happen: carelessness or ignorance.  You have to be aware of what you're working with.  Ask questions if you're unsure.  Educate yourself.  Nothing is so important that you cannot take the extra steps to make sure you and those around you are protected as much as possible.


Those who work in your safety office are not out to get you.  They're there to educate first and foremost, and yes, to enforce standard operating procedures for your protection.  In perusing our own safety department's web site, I stumbled upon this - Shared Responsibility.


Mission:  http://safety.uchicago.edu/about/index.shtml


Environmental Health and Safety provides services and support for efficient, effective, and compliant work practices, while promoting a culture of shared responsibility by students, faculty, staff and visitors for a healthy, safe, and environmentally sound educational and research community at the University of Chicago. 



And specifically regarding Laboratory Safety, they have this to add: http://safety.uchicago.edu/labpersonnel/index.shtml
Research is one of the two main missions of the University, the other being education. Lab personnel are integral in the creation and maintenance of a safe laboratory environment. They are responsible for ensuring safety in laboratories and lab support areas on campus and within the Medical Center. This responsibility includes:
  • Being familiar with University emergency procedures;
  • Responding appropriately in the event of an emergency;
  • Being familiar with Environmental Health and Safety policies and procedures;
  • Maintaining a safe laboratory environment;
  • Knowing the hazards of the materials and/or equipment being used;
  • Following all safety procedures in the laboratory environment;
  • Selecting, using and understanding the limitations of personal protective equipment;
  • Reporting any unsafe conditions to your supervisor and/or Environmental Health and Safety;
  • Reporting any job related injuries or illnesses to your supervisor or Human Resources Administrator immediately; and
  • Participating in all required safety training.

So, if you've taken the time to read through this, you can certainly take the time to re-evaluate the procedures in your lab.  Make a plan to educate your staff and those around you, and open the lines of communication between your lab and those who have been tasked with the safety of your institution.  Stay safe!

Wednesday, August 10, 2011

Where's my Dream Cytometer?

Have the market research groups recently been clamoring at your door?  It seems like a weekly request via email or phone call to take "10 minutes" to answer some questions about "the future directions of flow cytometers and associated reagents."  I've answered these calls so many times in the past few months that I'm starting to sound like a broken record.  My hope is to perhaps just send them this link instead of spending time scoring questions on a scale of 1 - 10 with my stock answer of, "uh, maybe about a 7."  So, what I'm attempting to do here is write down some loose specifications of the sort of instrument I'd like to see in the not-so-distant future, and perhaps comment a bit about reagents as we go along.

Lasers:  I think the real key here, in terms of the number and wavelength of lasers, is options.  If I had an unlimited budget, I think I'd probably put about 8 lasers on my cytometer (UV, Violet, Blue, Green, Yellow, Orange, Red, Far Red) pretty much covering the spectrum.  I'd never dream of running all 8 lasers simultaneously, so they'd all need to have the ability to be shuttered on and off.  I'm not a huge fan of turning lasers on and off constantly throughout the day, so I'd prefer to have them behind an electronic shutter.  It's difficult to imagine purchasing an instrument with fewer than 4 lasers, but perhaps costs may force me to.  I'd probably want to run as many as 5 lasers simultaneously, so we're aiming for 5 interrogation points.  I'd also really like to have the ability to send lasers to different interrogation points.  Most of the time, you'd probably not run UV excitable and Violet excitable dyes at the same time, so they could probably share a pinhole.  But, in the instance where you would like to run them simultaneously, you'd probably want to split them to different pinholes and maybe even separate them by a pinhole or two.  This would require some re-engineering of the way lasers are delivered to the flow cell, but I have a couple of ideas of how this might work, so it looks plausible.  In terms of actual laser wavelength, that's to be determined.  I'd need to weigh the merits of a 550nm laser versus a 561nm laser, etc...  Regarding power, all I'd add is that I don't want to buy a 100mW laser to get 50mW at the point of interrogation.
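
Just to make the "options" idea concrete, here's a minimal sketch (in Python, purely hypothetical) of how acquisition software might represent a shutterable, re-routable laser bank.  None of this reflects any real instrument's API, and the wavelengths are placeholders - as I said, the actual choices are to be determined.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative only: one way acquisition software could model a shutterable,
# re-routable laser bank.  Wavelengths here are placeholders, and pinholes
# 1-5 stand in for the five interrogation points.
@dataclass
class Laser:
    wavelength_nm: int              # placeholder value, not a recommendation
    shutter_open: bool = False
    pinhole: Optional[int] = None   # None = parked, not routed to the flow cell

bank = {
    "UV": Laser(355), "Violet": Laser(405), "Blue": Laser(488), "Green": Laser(532),
    "Yellow": Laser(561), "Orange": Laser(594), "Red": Laser(640), "FarRed": Laser(685),
}

def route(name: str, pinhole: int) -> None:
    """Open a laser's shutter and send it to a given interrogation point."""
    bank[name].shutter_open, bank[name].pinhole = True, pinhole

# Typical day: UV-excitable and violet-excitable dyes aren't run together,
# so UV stays shuttered and Violet takes pinhole 1...
for name, spot in [("Violet", 1), ("Blue", 2), ("Green", 3), ("Red", 4)]:
    route(name, spot)
# ...but for a panel that needs both, split them onto separate pinholes.
route("UV", 5)
```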

Optics:  Spectral overlap among fluorochromes excited off the same laser is to be avoided.  So, it really doesn't make any sense to have more than 3 detectors off any one laser line.  As you put more and more detectors on a single path, you have no choice but to break the light up into smaller and smaller bits, so by default you'll be compromising on photon collection - squeezing down the PE filter so you can run PE, PE-Texas Red, and FITC all off the 488nm laser is absurd.  You'd also want to stagger the emission filters so you're not looking at the same light from different paths for fluorochromes that happen to excite off multiple lasers (think PerCP off the blue and PECy5.5 off the green - change this to PECy5 off the green and PerCPCy5.5 off the blue).  However, it DOES make sense to be able to detect lots of different fluorochromes off any single laser line.  How can this be accomplished?  Through quick-change filters.  For example, let's say we have 3 detectors off the blue laser (SSC, FITC, and PerCP, for example).  I'd like to use that FITC detector for FITC, CFSE, GFP, mVenus, Aldefluor, Sytox Green, etc...  Most people will have a 530/30 filter on their 'FITC' channel, but this may not be optimal for all the different 'green' fluors you may use.  So, one option would be to use a wider bandpass on that channel, say a 525/50.  This is fine until I need to turn on my green laser for excitation of fluorescent proteins like mBanana.  In that case I'd want to change my GFP filter to something like a 510/20, but then change it back when I'm not doing fluorescent protein work.  Ideally, I'd like to tell the software which color combinations I'm using and have it adjust all the filters necessary to optimize fluorescence collection and minimize spectral overlap.  In the meantime, I want a system that can easily change filters, knows which filter is in which detector, and has a place to store the filters not currently being used so they don't get all scratched up and covered in dust.
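
While I'm dreaming, here's a rough sketch of the "tell the software what you're running and let it pick the filters" idea.  The 530/30, 525/50, and 510/20 values are the ones from the scenario above; the rule table, detector name, and fallback behavior are entirely hypothetical.

```python
# Hypothetical rule table: which bandpass to drop into a detector, given
# what else is running.  Values echo the examples above; nothing here is a
# real instrument's configuration.
FILTER_RULES = {
    # (detector, condition) -> recommended bandpass
    ("Blue-FITC", "green_laser_off"): "525/50",  # wide band for FITC/CFSE/GFP/Sytox Green
    ("Blue-FITC", "green_laser_on"):  "510/20",  # tighten up so mBanana off the green
                                                 # laser doesn't bleed into the GFP channel
}

def pick_filter(detector: str, green_laser_on: bool) -> str:
    """Return the recommended bandpass for a detector under the current laser setup."""
    key = (detector, "green_laser_on" if green_laser_on else "green_laser_off")
    return FILTER_RULES.get(key, "530/30")       # fall back to the classic FITC filter

print(pick_filter("Blue-FITC", green_laser_on=False))  # 525/50
print(pick_filter("Blue-FITC", green_laser_on=True))   # 510/20
```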

Electronics:  I used to scoff at those who said they needed 5 and 6+ logs on their cytometer, but I'm coming around a little bit.  I could easily see my next cytometer having at least 5 logs of dynamic range, but only if it has the right electronic components to fill those 5-6 decades.  See this post for some ideas regarding that - Putting an End to the Log Wars.  There's really no reason why our instruments shouldn't have fast processors that can do fine-detail pulse processing.  We're 11 years into the 21st century, yet we're using technology developed in the 1980s.  A great optical system is nothing without an equally great electronics system.

Fluidics:  In my eyes, hydrodynamic focusing is still king, whether it's in a small chip or in a flow cell (see the edit below for clarification on how acoustic focusing is implemented specifically on the Attune; acoustic focusing focuses the cells but not the sample fluid, so you end up picking up fluorescence from unbound fluors in the illumination volume - the same issue you get with capillary systems).  However, the sheath velocity going through the sensing area could be sped up to allow for higher event rates without increasing the size of the sample core stream.  This, of course, would require better electronics with much higher resolution to sample the short pulses, and really good collection optics to capture as high a percentage of emitted photons as possible, not just the small fraction that happens to emit at 90 degrees to the incident light.  I'd also caution against the desire to make super-complicated fluidics systems that tend to break constantly (I'm looking at you, FACSAria I and FACSCanto-A).

Software:  Flow cytometers are built by engineers, and it's usually the case that they find the engineer who knows the most about writing software and say, "Let's get some software written to run this thing."  There's usually not much usability testing, UI design thought, etc...  The last batch of cytometers I've looked at has had a bit more polish on the software, so things look like they're headed in the right direction.  The trend of borrowing from MS Office and using ribbons all over the place is probably a safe bet.  You'd have to assume Microsoft has done a bunch of usability testing, and if it's good enough for them, it's probably good enough for us.  However, we're not word processing or making tables or even making presentations; we're adjusting hardware components using software tools, collecting data, and displaying that data on screen.  So, in reality, we should model our software on tools the everyday Jane Q. Researcher already uses for similar tasks.  I'll throw out a couple of examples.  I love the OSD (On Screen Display) on my Samsung LCD television.  It allows me to easily get in, adjust settings like color mode, brightness, and sound, and get out, all without completely obstructing my view of the picture behind.  Just swap out color mode, brightness, and sound for parameters, voltage, and compensation, and swap the picture for plots, and there you go.  If you're a photographer, you probably use software like Aperture or Lightroom.  These tools allow for some pretty specific settings and adjustments in a clean, easy-to-use interface.  So, let's model our cytometer software after these types of programs instead of a word processor.

Reagents:  I want lots of antibody choices, which is only going to happen with a company that has ties to research institutions that make new antibodies and are willing to license them to companies for profit (ahem, our Monoclonal Antibody Facility has done and continues to do this on a regular basis).  I also want them coupled to a wide range of fluors, especially the new ones like the Brilliant Violet dyes.  I want to be able to try before I commit, whether via a free sample or a really inexpensive small aliquot.  I'm not at all concerned about having reagents tied to my equipment, and I actually dislike that trend.  I'm not going to buy a cytometer because some company made a canned "apoptosis kit" that works specifically for their instrument, only to find out it's using Annexin V FITC and PI.  The 5 questions I ask when looking for an antibody:  Do they have the antibody?  Is it coupled to a range of fluors that work for my cytometer configuration?  Does it fit in my budget?  Are there multiple size options?  Is there support information so I know it's going to work?

So, there you have it.  How much am I willing to spend on this instrument?  Well, I'm willing to buy as much instrument as I need.  If I want a 2-laser, 6-color instrument, I think it has to be priced around $100K.  If I want an 8-laser, 15-FL detector (5 pinholes x 3 detectors) with all the filters I need to look at 45 distinct fluorochromes, I'd say it'd have to be around $350K.

Edit:  The comment above about acoustic focusing may be only partially correct.  Although it is true that acoustic focusing is responsible for focusing the cells and not the sample core stream, this is not quite how it has been implemented in the Life Technologies Attune cytometer.  In fact, the Attune has both acoustic focusing for the cells and hydrodynamic focusing for the sample core stream.  When utilizing the low flow rate on the Attune (25ul/min), you can achieve a significant amount of core stream tapering because the entire stream narrows from a 340um channel (where the cells are being focused by the acoustic waveform) down to a 200um cuvette (where laser interrogation occurs).  In this case, the constriction of the entire stream provides the hydrodynamic force needed to narrow the core stream.  When running the sample at a higher flow rate, you'd increase the size of the core stream proportionally, as happens on non-acoustically focused systems.  However, in the case of the Attune, even at this higher volumetric flow rate the cells remain focused, leading to better and more uniform illumination by the lasers.

Wednesday, June 15, 2011

Putting an end to the "Log Wars"

A long time ago, in a laboratory far far away, there was a lowly FACScan able to display data on a 4-log scale.  Fast-forward to today, and you'll find some instruments with as many as 7 logs of scale.  That's a huge improvement, right?  Well, maybe not.  The origin of the 4-log scale probably had more to do with the Analog-to-Digital Converters (ADCs) being used than with the technological needs of the science being done in the 80s.  With the advancements in ADCs in other markets, flow cytometry manufacturers could now include converters with greater bit density and still provide a relatively affordable product.  The standard for many years was the 10-bit ADC, which yields 1,024 bins of resolution across the scale.  Spreading these 1,024 bins across a 4-log scale appears to give enough resolution while expanding to a reasonable range.  After many years using these solid electronic components, BD completely redesigned the electronics on its cell sorter (the FACSVantage) to give us the FACSDiVa (or Digital Vantage) architecture.  Now, instead of using traditional ADCs and log amplifiers, BD switched things up by using "high" speed Digital Signal Processors (DSPs) to directly digitize the analog pulse and then do the log conversion using look-up tables.  The DSPs converted the linear data at a bin density of 14 bits (16,384 bins), and when the data is log converted, it is upscaled to 18 bits (262,144 bins).  With 18-bit data, they are able to display this data on a 5-log scale.  The reason?  Well, if I were forced to guess, I'd say it was a marketing decision to differentiate BD's new line of cytometers from its old line as well as its competitors.  With this new 5-log display came the "picket fencing" phenomenon, which basically demonstrated that the 18-bit data (which was really 14-bit data) did not have enough bin resolution to display data properly in the 1st decade.  The solution?  Simple: hide the 1st decade and display decades 2 through 5 (right back at a 4-log scale).  Because the BD instruments were so popular, other companies jumped on the bandwagon and thought, well, if BD is doing 5 logs then we should do 6 logs or maybe 7 logs.  And that's how we arrived where we are today, and now I'd like to show you why this is a bad thing.

Let me start with my conclusion first, and then show you how I arrived at it.  The figure to the right shows the minimum analog-to-digital conversion bit density for a given range of log scale.  As you can see, if we wanted to display our data on a 5-log scale, we should have at least a 20-bit ADC.  Side note - Bit(eff) means effective bit density, which takes into account that if you put a 20-bit ADC on your instrument, it probably doesn't actually perform at a full 20 bits, because noise associated with the ADC limits its performance. /Side note.
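
I can't reproduce the figure here, but the arithmetic behind it is simple enough to sketch.  This assumes a linear ADC stretched evenly across the displayed range and uses "at least ~100 bins in the first decade" as the target (that target comes from the number crunching further down); the exact values in the original figure may differ slightly.

```python
import math

def min_adc_bits(decades, first_decade_bins=100):
    """Smallest ADC bit depth leaving at least `first_decade_bins` codes over
    scale values 1-10 when a linear ADC spans `decades` of log scale."""
    # codes in the first decade ~= 2**bits * 10 / 10**decades
    needed_codes = first_decade_bins * 10**decades / 10
    return math.ceil(math.log2(needed_codes))

for d in (4, 5, 6, 7):
    print(f"{d}-log scale -> at least a {min_adc_bits(d)}-bit ADC")
# 4 logs -> 17-bit, 5 logs -> 20-bit, 6 logs -> 24-bit, 7 logs -> 27-bit
```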

So, how did I arrive at this conclusion?  Well, first let me demonstrate that bit density is important with an example.  I created a mock data set of 3 Gaussian distributions (n=1000 data points each) where the means and SDs were chosen such that the populations overlapped significantly.  I then plotted these distributions on 4 histograms with bin resolutions ranging from 3-bit to 8-bit.  It's important to remember that this is the exact same data set, merely binned differently according to the available resolution.  As you can see, the 3 populations are not at all discernible at 3 bits, and it's not until we get to the 6-bit histogram that you can start to see the 3 different populations.  Using this information, we can appreciate the importance of having sufficient bin density to resolve distributions from one another.
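
If you want to recreate the demonstration yourself, something along these lines will do it.  The means and SDs below are invented stand-ins, not the exact values behind the figure, but the point survives: it's the same data set every time, and only the bin resolution changes.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# three overlapping Gaussian populations, n=1000 each (means/SDs are made up)
data = np.concatenate([rng.normal(m, 12, 1000) for m in (100, 135, 170)])

fig, axes = plt.subplots(1, 4, figsize=(14, 3))
for ax, bits in zip(axes, (3, 4, 6, 8)):
    # identical data every time; only the number of bins changes
    ax.hist(data, bins=2**bits, range=(data.min(), data.max()))
    ax.set_title(f"{bits}-bit ({2**bits} bins)")
plt.tight_layout()
plt.show()
```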

As an example of a system that might not have enough bin density, consider the following.  Here we have a 20-bit ADC yielding over 1 million bins of resolution to spread across a 6-log scale.  This may sound sufficient, but when we break it down per log, we see that in the first decade, where we have scale values of 1-10, we would only have about 11 bins of resolution, which would certainly lead to picket fencing and poor resolution of populations in that range.  The Effective bins column shows an example where the noise of the ADC is such that our true bin resolution would be much less than the theoretical 20 bits.
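
The per-decade breakdown is easy to reproduce with the same assumption of a linear ADC spread evenly across the displayed range (I've left out the "effective bins" noise correction shown in the figure):

```python
def bins_per_decade(adc_bits, decades):
    """Approximate number of ADC codes landing in each displayed decade when a
    linear `adc_bits`-bit converter is stretched across `decades` of log scale."""
    full_scale = 10**decades
    codes = 2**adc_bits
    return [round(codes * (10**d - 10**(d - 1)) / full_scale)
            for d in range(1, decades + 1)]

print(bins_per_decade(20, 6))  # [9, 94, 944, 9437, 94372, 943718]
                               # only ~10 codes cover values 1-10, roughly the
                               # 11 bins quoted above depending on how you count
print(bins_per_decade(24, 6))  # [151, 1510, 15099, ...] - hundreds in the 1st decade
```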

Going through the process and crunching numbers for different scenarios, I conclude that ideally we would like to have on the order of hundreds of bins of resolution in the 1st decade.  So, in order to achieve that level on a 6-log scale, we'd actually need a 24-bit ADC.  The breakdown would then look like what's shown below.

Take-home message:  First of all, is a 6-log scale really necessary?  For you the answer may be yes, but for most, probably not.  The second question to ask your friendly sales representative is what sort of analog-to-digital conversion is done, and what the bit resolution of the converter is.  It means nothing to have a 7-log scale displaying data from a 10-bit ADC; no matter how good the optics are, you'll never be able to resolve dim populations from unstained cells.  What really matters is a really good optical system backed by high-speed, high-density electronics that can display all the fine detail of your distributions.  Find an instrument like that, and you have a winner.