Maximising transformation efficiency is a key objective in library creation, alongside the actual mutation spectrum and loading. That rules out USER cloning without ligation, Gibson assembly, dirty DNA, chemically competent cells (or cells simply vortexed with sepiolite), and skipped recovery time. One has to ligate (after restriction digestion or USER cloning), clean up the DNA, make good electrocompetent cells and electroporate. Yet even all that may not be enough.
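For reference, transformation efficiency is conventionally reported as colony-forming units per microgram of DNA. A minimal sketch of the arithmetic (the function name and the example numbers are mine, for illustration):

```python
def transformation_efficiency(colonies: int, ng_dna: float, fraction_plated: float) -> float:
    """Transformation efficiency in CFU per microgram of plasmid DNA.

    colonies        -- colonies counted on the plate
    ng_dna          -- nanograms of DNA added to the cells
    fraction_plated -- fraction of the recovered transformation that was plated
    """
    ug_dna = ng_dna / 1000.0
    return colonies / (ug_dna * fraction_plated)

# 200 colonies from plating 1% of a transformation done with 10 ng of DNA:
print(transformation_efficiency(200, 10, 0.01))  # 2e6 CFU/ug
```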
Often the strain of interest has poor transformation efficiency, e.g. a B strain, MDS42 (the reduced-genome Blattner strain) or some Keio-derived strains (e.g. ∆gua ∆lpd ∆thyA, cf. Patrick et al. 2007), and one has to transform a high-efficiency strain first and grow the resulting lawn as little as possible to limit library redundancy. In some cases this is made worse by the strain being an old B strain with its hsdRMS restriction-modification system intact; I once wasted a month before realising that was my problem.
In other cases the plasmid is simply too big. Keeping the plasmid small for selection is easier said than done:
- One has to work with the large pSC101 origin because the pBR322 and p15A origins are already in use.
- One is shuffling a whole operon around.
- One has a giant protein.
- And so forth.
Supercoiling is heralded as the reason why some DNA gets in better: DNA replicated in vivo is negatively supercoiled, whereas ligated DNA is relaxed. So why is DNA not supercoiled in vitro before transformation? The only paper that seems to have tried this is from 1995 and is largely forgotten. There are three possible explanations:
- Too niche
- Minor benefit
- Severe side-effects
I have not had the chance to give it a go, but my money is on a combination of #1 and #2.
Too niche?
If it were good, it would already be implemented. But there is the possibility that nobody has implemented it because of the 'somebody else's problem' effect. After all, even though methods papers get cited far more than discovery papers, there is a lot of aversion to writing them, mainly for funding reasons. Then again, maybe companies with amazing metagenomic libraries already do this.
Minor benefits?
Let's have a look at the paper. For small plasmids the effect is negligible. For 15 kbp plasmids the improvement is about two-fold, which is still rather meh. For 20 kbp plasmids there is nearly a ten-fold improvement. That is pretty cool, but transformation efficiency normally starts dropping above the 5 kbp mark anyway, so the boost is less miraculous than it sounds. If I were making a library with a shuffled operon on a 10 kbp plasmid, say, I would not bother. But if I were making a metagenomic library with 20+ kbp inserts, I would definitely try it.
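To see why a ten-fold boost at 20 kbp is useful but not magic, here is a back-of-envelope sketch. The baseline efficiencies and fold improvements below are placeholder numbers of my own, not data from the 1995 paper:

```python
# Placeholder numbers for illustration only: a steep size-dependent drop in
# baseline electroporation efficiency, and rough fold improvements from
# in vitro supercoiling of the kind discussed above.
baseline_cfu_per_ug = {5: 1e8, 10: 1e7, 15: 1e6, 20: 1e5}  # kbp -> CFU/ug
supercoiling_fold = {5: 1.0, 10: 1.5, 15: 2.0, 20: 10.0}   # kbp -> fold

def boosted(baseline: dict, fold: dict) -> dict:
    """Apply a size-dependent fold improvement to baseline efficiencies."""
    return {kbp: cfu * fold.get(kbp, 1.0) for kbp, cfu in baseline.items()}

for kbp, cfu in sorted(boosted(baseline_cfu_per_ug, supercoiling_fold).items()):
    print(f"{kbp:>2} kbp: {cfu:.0e} CFU/ug")
```

Even with the ten-fold boost, the hypothetical 20 kbp plasmid only climbs back to the 15 kbp baseline: in this sketch supercoiling mitigates the size penalty rather than abolishing it.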