Dry lab

A dry lab is a laboratory where the nature of the experiments does not involve significant risk. This is in contrast to a wet lab, where it is necessary to handle various types of chemicals and biological hazards. An example of a dry lab is one where computational or applied mathematical analyses are done on a computer-generated model to simulate a phenomenon in the physical realm.[1] Examples of such phenomena include a molecule changing quantum states, the event horizon of a black hole, or anything that otherwise might be impossible or too dangerous to observe under normal laboratory conditions. The term may also refer to a lab that uses primarily electronic equipment, for example, a robotics lab. A dry lab can also refer to a laboratory space for the storage of dry materials.[2]

Dry labbing can also refer to supplying fictional (yet plausible) results in lieu of performing an assigned experiment or carrying out a systematic review.

In silico chemistry

As computing power has grown exponentially, this approach to research, often referred to as in silico (as opposed to in vitro and in vivo), has attracted increasing attention, especially in the area of bioinformatics. Within bioinformatics, a prominent example is proteomics, the study of proteins, and in particular the elucidation of their unknown structures and folding patterns. The general approach to determining a protein's structure has been to first purify the protein, crystallize it, and then pass X-rays through the purified crystal to observe how they diffract into a specific pattern, a process referred to as X-ray crystallography. However, many proteins, especially those embedded in cellular membranes, are nearly impossible to crystallize because of their hydrophobic nature. Although other techniques exist, such as Ramachandran plotting and mass spectrometry, these alone generally do not lead to the full elucidation of protein structure or folding mechanisms.
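
Because the paragraph above mentions Ramachandran plotting, a minimal sketch of how the underlying backbone dihedral angles can be extracted from a solved structure is shown below. It assumes the Biopython package is installed and that a structure file, here given the hypothetical name example.pdb, is available locally.

    # Extract backbone phi/psi dihedral angles from a PDB structure,
    # i.e. the raw data behind a Ramachandran plot.
    # Assumes Biopython is installed (pip install biopython) and that
    # "example.pdb" is a locally available structure file (hypothetical name).
    import math
    from Bio.PDB import PDBParser, PPBuilder

    parser = PDBParser(QUIET=True)
    structure = parser.get_structure("protein", "example.pdb")

    ppb = PPBuilder()
    for peptide in ppb.build_peptides(structure):
        for residue, (phi, psi) in zip(peptide, peptide.get_phi_psi_list()):
            # Terminal residues have an undefined phi or psi (None), so skip them.
            if phi is not None and psi is not None:
                print(residue.get_resname(),
                      round(math.degrees(phi), 1),
                      round(math.degrees(psi), 1))

Plotting phi against psi for many residues gives the familiar Ramachandran plot, which indicates whether a model's backbone geometry falls within sterically allowed regions.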

Distributed computing

As a means of surpassing the limitations of these techniques, projects such as Folding@home and Rosetta@home aim to resolve this problem through computational analysis; this approach to determining protein structure is referred to as protein structure prediction. Although each project takes a slightly different approach, the main concept is to find, among the myriad possible conformations of a protein, the one with the lowest energy or, in the case of Folding@home, to find relatively low-energy conformations that could cause the protein to misfold and aggregate other proteins to itself, as occurs in sickle cell anemia.

The general scheme in these projects is that small batches of computations are sent to volunteers' computers, generally home computers, each of which estimates the likelihood that the protein will adopt a particular conformation based on the energy required to hold it in that shape; this way of processing data is what is generally referred to as distributed computing. The analysis is carried out over an extraordinarily large number of conformations, made possible by the support of hundreds of thousands of home-based computers, with the goal of finding the conformation, or set of closely related conformations, with the lowest possible energy. Although the number of conformations available to any given protein is effectively unlimited (see Levinthal paradox), a sufficiently large sample of conformational energies allows methods of statistical inference to predict fairly closely which conformation, within a range of conformations, is expected to have the lowest energy.

Other factors, such as salt concentration, pH, ambient temperature, and chaperonins (proteins that assist the folding process of other proteins), can greatly affect how a protein folds; however, if the protein in question is shown to fold on its own, especially in vitro, these predictions can be further supported. Once it is known how a protein folds, it becomes possible to study how it works, for example as a catalyst or in intracellular communication such as neuroreceptor-neurotransmitter interaction. It also becomes easier to understand how certain compounds may enhance or prevent the function of such proteins and what role an elucidated protein plays in disease.[3]
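
As an illustration of the general idea only, and not of how Folding@home or Rosetta@home are actually implemented, the sketch below distributes the scoring of many randomly generated "conformations" across local worker processes and reports the lowest-energy one. The conformation representation, the energy function, and all names are invented for the example.

    # Toy illustration of a distributed search for a lowest-energy conformation.
    # NOT the actual Folding@home/Rosetta@home algorithm: the "conformations" are
    # random dihedral-angle vectors and the energy function is invented for the demo.
    import math
    import random
    from multiprocessing import Pool

    N_RESIDUES = 20            # length of the toy backbone (angles per conformation)
    N_CONFORMATIONS = 100_000  # how many random conformations to sample

    def toy_energy(angles):
        # Arbitrary smooth energy surface with its minimum near -60 degrees per angle,
        # standing in for a real molecular-mechanics force field.
        return sum((math.cos(math.radians(a + 60.0)) - 1.0) ** 2 for a in angles)

    def random_conformation(seed):
        rng = random.Random(seed)
        return [rng.uniform(-180.0, 180.0) for _ in range(N_RESIDUES)]

    def evaluate(seed):
        # One "work unit": build a conformation and score it.
        return toy_energy(random_conformation(seed)), seed

    if __name__ == "__main__":
        with Pool() as pool:  # local stand-in for many volunteer machines
            results = pool.map(evaluate, range(N_CONFORMATIONS), chunksize=1000)
        best_energy, best_seed = min(results)
        print(f"lowest toy energy {best_energy:.4f} from conformation seed {best_seed}")

In the real projects, each work unit is a far more expensive molecular dynamics or Monte Carlo calculation, and the results returned by volunteers are combined using statistical inference rather than a simple minimum.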

The dry lab approach has been implemented in many other avenues of research. Other physical phenomena, such as sound, the properties of newly discovered or hypothetical compounds, and quantum mechanical models, have recently[when?] received more attention under this approach.

As a method of deception

Dry labbing, in the sense of claiming results without actually doing the laboratory work, is a notoriously disreputable practice that has been carried out through the ages. While dry labbing remains a serious problem today and in some cases casts doubt on modern research, the practice dates back at least to Aristotle, who claimed that heavier objects fall faster and lighter objects fall slower without actually carrying out the experiments himself. His error would not be fully corrected until the days of Simon Stevin and Galileo.

References

  1. ^ "dry lab". Merriam-Webster. Archived from the original on 28 January 2013. Retrieved 22 February 2013.
  2. ^ "Laboratory: Dry". National Institute of Building Sciences. Retrieved 22 February 2013.
  3. ^ "Folding@home Diseases Studied FAQ". Stanford University. Archived from the original on 25 August 2012. Retrieved 22 February 2013.