History of Biotechnology-

Biotechnology is not limited to medical and health applications (unlike biomedical engineering, which overlaps substantially with biotechnology). Although not normally thought of as biotechnology, agriculture clearly fits the broad definition of "using a biotechnological system to make products," so the cultivation of plants may be viewed as the earliest biotechnological enterprise. Agriculture is theorized to have become the dominant way of producing food after the Neolithic Revolution, and its processes and methods have since been refined by other mechanical and biological sciences. Through early biotechnology, farmers were able to select the best-suited, highest-yielding crops to produce enough food to support a growing population. Other uses of biotechnology became necessary as crops and fields grew increasingly large and difficult to maintain: specific organisms and organism by-products were used to fertilize soil, restore nitrogen, and control pests. Throughout the history of agriculture, farmers have inadvertently altered the genetics of their crops by introducing them to new environments and breeding them with other plants, one of the first forms of biotechnology.

Cultures such as those in Mesopotamia, Egypt, and India developed the process of brewing beer. It is still done by the same basic method: using malted grains (containing enzymes) to convert starch into sugar and then adding specific yeasts to produce beer, in which the carbohydrates in the grains are broken down into alcohols such as ethanol. Later, other cultures developed lactic acid fermentation, which allowed the fermentation and preservation of other forms of food, and fermentation was also used in this period to produce leavened bread. Although fermentation was not fully understood until Pasteur's work in 1857, it was still the first use of biotechnology to convert one food source into another.

For thousands of years, humans have used selective breeding to improve production of crops and livestock to use them for food. In selective breeding, organisms with desirable characteristics are mated to produce offspring with the same characteristics. For example, this technique was used with corn to produce the largest and sweetest crops.

In the early twentieth century, scientists gained a greater understanding of microbiology and explored ways of manufacturing specific products. In 1917, Chaim Weizmann first used a pure microbiological culture in an industrial process: the fermentation of corn starch by Clostridium acetobutylicum to produce acetone, which the United Kingdom desperately needed to manufacture explosives during World War I.

Biotechnology has also led to the development of antibiotics. In 1928, Alexander Fleming discovered the mold Penicillium, and his work led to the purification of the antibiotic penicillin by Howard Florey, Ernst Boris Chain, and Norman Heatley. In 1940, penicillin became available for medicinal use to treat bacterial infections in humans.

The field of modern biotechnology is thought to have largely begun on June 16, 1980, when the United States Supreme Court ruled that a genetically modified microorganism could be patented in the case of Diamond v. Chakrabarty. Indian-born Ananda Chakrabarty, working for General Electric, had developed a bacterium (derived from the Pseudomonas genus) capable of breaking down crude oil, which he proposed to use in treating oil spills.

Revenue in the industry is expected to grow by 12.9% in 2008. Another factor influencing the biotechnology sector's success is improved intellectual property rights legislation—and enforcement—worldwide, as well as strengthened demand for medical and pharmaceutical products to cope with an ageing, and ailing, U.S. population.

Rising demand for biofuels is expected to be good news for the biotechnology sector, with the Department of Energy estimating ethanol usage could reduce U.S. petroleum-derived fuel consumption by up to 30% by 2030. The biotechnology sector has allowed the U.S. farming industry to rapidly increase its supply of corn and soybeans—the main inputs into biofuels—by developing genetically modified seeds which are resistant to pests and drought. By boosting farm productivity, biotechnology plays a crucial role in ensuring that biofuel production targets are met.

   

Origins of Biotechnology-

Biotechnology arose from the field of zymotechnology, which began as a search for a better understanding of industrial fermentation, particularly beer. Beer was an important industrial, and not just social, commodity. In late 19th century Germany, brewing contributed as much to the gross national product as steel, and taxes on alcohol proved to be significant sources of revenue to the government. In the 1860s, institutes and remunerative consultancies were dedicated to the technology of brewing. The most famous was the private Carlsberg Institute, founded in 1875, which employed Emil Christian Hansen, who pioneered the pure yeast process for the reliable production of consistent beer. Less well known were private consultancies that advised the brewing industry. One of these, the Zymotechnic Institute, was established in Chicago by the German-born chemist John Ewald Siebel.

The heyday and expansion of zymotechnology came in World War I in response to industrial needs to support the war. Max Delbrück grew yeast on an immense scale during the war to meet 60 percent of Germany's animal feed needs. Compounds of another fermentation product, lactic acid, made up for a lack of hydraulic fluid, glycerol. On the Allied side, the Russian chemist Chaim Weizmann eliminated Britain's shortage of acetone, a key raw material for explosives, by fermenting maize starch to acetone. The industrial potential of fermentation was outgrowing its traditional home in brewing, and "zymotechnology" soon gave way to "biotechnology."

With food shortages spreading and resources fading, some dreamed of a new industrial solution. The Hungarian Karl Ereky coined the word "biotechnology" in Hungary during 1919 to describe a technology based on converting raw materials into a more useful product. He built a slaughterhouse for a thousand pigs and also a fattening farm with space for 50,000 pigs, raising over 100,000 pigs a year. The enterprise was enormous, becoming one of the largest and most profitable meat and fat operations in the world. In a book entitled Biotechnologie, Ereky further developed a theme that would be reiterated through the 20th century: biotechnology could provide solutions to societal crises, such as food and energy shortages. For Ereky, the term "biotechnologie" indicated the process by which raw materials could be biologically upgraded into socially useful products.

This catchword spread quickly after the First World War, as "biotechnology" entered German dictionaries and was taken up abroad by business-hungry private consultancies as far away as the United States. In Chicago, for example, the coming of prohibition at the end of World War I encouraged biological industries to create opportunities for new fermentation products, in particular a market for nonalcoholic drinks. Emil Siebel, the son of the founder of the Zymotechnic Institute, broke away from his father's company to establish his own called the "Bureau of Biotechnology," which specifically offered expertise in fermented nonalcoholic drinks.

The belief that the needs of an industrial society could be met by fermenting agricultural waste was an important ingredient of the "chemurgic movement." Fermentation-based processes generated products of ever-growing utility. In the 1940s, penicillin was the most dramatic. While it was discovered in England, it was produced industrially in the U.S. using a deep fermentation process originally developed in Peoria, Illinois. The enormous profits and the public expectations penicillin engendered caused a radical shift in the standing of the pharmaceutical industry. Doctors used the phrase "miracle drug", and the historian of its wartime use, David Adams, has suggested that to the public penicillin represented the perfect health that went together with the car and the dream house of wartime American advertising. In the 1950s, steroids were synthesized using fermentation technology. In particular, cortisone promised the same revolutionary ability to change medicine as penicillin had.

    

Applications-

Biotechnology has applications in four major industrial areas: health care (medical); crop production and agriculture; non-food (industrial) uses of crops and other products (e.g., vegetable oil, biofuels); and environmental uses.

For example, one application of biotechnology is the directed use of organisms for the manufacture of organic products (examples include beer and milk products). Another example is using naturally present bacteria by the mining industry in bioleaching. Biotechnology is also used to recycle, treat waste, clean up sites contaminated by industrial activities and also to produce biological weapons.

A series of derived terms have been coined to identify several branches of biotechnology; for example:

Bioinformatics is an interdisciplinary field which addresses biological problems using computational techniques, and makes the rapid organization and analysis of biological data possible. The field may also be referred to as computational biology, and can be defined as, "conceptualizing biology in terms of molecules and then applying informatics techniques to understand and organize the information associated with these molecules, on a large scale." Bioinformatics plays a key role in various areas, such as functional genomics, structural genomics, and proteomics, and forms a key component in the biotechnology and pharmaceutical sector.
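To make the idea concrete, here is a toy sketch of the kind of routine sequence computation that bioinformatics automates at genome scale; the DNA fragment is made up, and real tools work on millions of bases rather than a short string.

```python
# Toy illustration of basic DNA sequence analysis. The fragment is invented.

def gc_content(seq: str) -> float:
    """Fraction of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def reverse_complement(seq: str) -> str:
    """The complementary strand, read in the conventional 5'-to-3' direction."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(seq.upper()))

fragment = "ATGGCGTACGCTTGA"
print(f"GC content: {gc_content(fragment):.2f}")
print(f"Reverse complement: {reverse_complement(fragment)}")
```

The same two operations underlie many larger analyses, such as primer design and genome assembly checks.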

 Blue biotechnology is a term that has been used to describe the marine and aquatic applications of biotechnology, but its use is relatively rare.

Green biotechnology is biotechnology applied to agricultural processes. An example would be the selection and domestication of plants via micropropagation. Another example is the design of transgenic plants to grow under specific environments in the presence (or absence) of chemicals. One hope is that green biotechnology might produce more environmentally friendly solutions than traditional industrial agriculture. An example of this is the engineering of a plant to express a pesticide, thereby eliminating the need for external application of pesticides; Bt corn is one example. Whether green biotechnology products such as this are ultimately more environmentally friendly is a topic of considerable debate.

Red biotechnology is applied to medical processes. Some examples are the designing of organisms to produce antibiotics, and the engineering of genetic cures through genetic manipulation.

White biotechnology, also known as industrial biotechnology, is biotechnology applied to industrial processes. An example is the designing of an organism to produce a useful chemical. Another example is the use of enzymes as industrial catalysts to either produce valuable chemicals or destroy hazardous/polluting chemicals. White biotechnology tends to consume fewer resources than traditional processes used to produce industrial goods. The investment and economic output of all of these types of applied biotechnologies is termed the bioeconomy.

 

Pharmacogenomics-

Pharmacogenomics is the branch of pharmacology which deals with the influence of genetic variation on drug response in patients by correlating gene expression or single-nucleotide polymorphisms with a drug's efficacy or toxicity. By doing so, pharmacogenomics aims to develop rational means to optimise drug therapy, with respect to the patients' genotype, to ensure maximum efficacy with minimal adverse effects. Such approaches promise the advent of "personalized medicine," in which drugs and drug combinations are optimized for each individual's unique genetic makeup.

Pharmacogenomics is the whole genome application of pharmacogenetics, which examines the single gene interactions with drugs.

Pharmacogenomics is being applied to critical illnesses such as cancer, cardiovascular disorders, HIV, tuberculosis, asthma, and diabetes.

In cancer treatment, pharmacogenomic tests are used to identify which patients are likely to experience toxicity from commonly used cancer drugs and which patients will not respond to them. In recent years, pharmacogenomic testing has also come to be known as companion diagnostics, meaning tests bundled with specific drugs. Two good examples are the KRAS test with cetuximab and the EGFR test with gefitinib.

In cardiovascular disorders, the main concern is response to drugs including warfarin, clopidogrel, beta blockers, and statins.

Pharmacogenomics results in the following benefits:

Development of tailor-made medicines. Using pharmacogenomics, pharmaceutical companies can create drugs based on the proteins, enzymes and RNA molecules that are associated with specific genes and diseases. These tailor-made drugs promise not only to maximize therapeutic effects but also to decrease damage to nearby healthy cells.

More accurate methods of determining appropriate drug dosages. Knowing a patient's genetics will enable doctors to determine how well his or her body can process and metabolize a medicine. This will maximize the value of the medicine and decrease the likelihood of overdose.

Improvements in the drug discovery and approval process. The discovery of potential therapies will be made easier using genome targets. Genes have been associated with numerous diseases and disorders. With modern biotechnology, these genes can be used as targets for the development of effective new therapies, which could significantly shorten the drug discovery process.

Better vaccines. Safer vaccines can be designed and produced by organisms transformed by means of genetic engineering. These vaccines will elicit the immune response without the attendant risks of infection. They will be inexpensive, stable, easy to store, and capable of being engineered to carry several strains of pathogen at once.
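The genotype-guided dosing idea above can be sketched as a simple lookup. The metabolizer phenotype names below are standard pharmacogenomic categories, but the scaling factors are invented purely for illustration and are not clinical guidance.

```python
# Hypothetical sketch of genotype-guided dosing. The phenotype names are
# standard pharmacogenomic categories; the dose factors are invented for
# illustration only and are not clinical guidance.
DOSE_FACTOR = {
    "poor metabolizer": 0.5,           # drug cleared slowly: risk of overdose
    "intermediate metabolizer": 0.75,
    "normal metabolizer": 1.0,
    "ultrarapid metabolizer": 1.5,     # drug cleared quickly: risk of underdose
}

def adjusted_dose(standard_dose_mg: float, phenotype: str) -> float:
    """Scale a standard dose by the patient's metabolizer phenotype."""
    return standard_dose_mg * DOSE_FACTOR[phenotype]

print(adjusted_dose(100.0, "poor metabolizer"))   # 50.0
print(adjusted_dose(100.0, "normal metabolizer")) # 100.0
```

Real dosing algorithms (for warfarin, for example) combine genotype with clinical variables such as age and weight, but the principle is the same.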

   

Pharmaceutical products-

Most traditional pharmaceutical drugs are relatively simple molecules that have been found primarily through trial and error to treat the symptoms of a disease or illness. Biopharmaceuticals are large biological molecules such as proteins, and they usually target the underlying mechanisms and pathways of a malady; the biopharmaceutical industry is relatively young. Biopharmaceuticals can address targets in humans that may not be accessible with traditional medicines. A patient is typically dosed with a small molecule via a tablet, while a large molecule is typically injected.

Small molecules are manufactured by chemistry but larger molecules are created by living cells such as those found in the human body: for example, bacteria cells, yeast cells, animal or plant cells.

Modern biotechnology is often associated with the use of genetically altered micro-organisms such as E. coli or yeast for the production of substances like synthetic insulin or antibiotics. It can also refer to transgenic animals or transgenic plants, such as Bt corn. Genetically altered mammalian cells are also used to manufacture certain pharmaceuticals. Another promising new biotechnology application is the development of plant made pharmaceuticals.

Biotechnology is also commonly associated with landmark breakthroughs in new medical therapies to treat hepatitis B, hepatitis C, cancers, arthritis, and cardiovascular disorders. The biotechnology industry has also been instrumental in developing molecular diagnostic devices that can be used to define the target patient population for a given biopharmaceutical. Herceptin, for example, was the first drug approved for use with a matching diagnostic test and is used to treat breast cancer in women whose cancer cells express the protein HER2.

Modern biotechnology can be used to manufacture existing medicines relatively easily and cheaply. The first genetically engineered products were medicines designed to treat human diseases. Insulin, widely used for the treatment of diabetes, was previously extracted from the pancreas of abattoir animals (cattle and/or pigs). Inserting the gene for human insulin into bacteria produced a genetically engineered bacterium that enabled the production of vast quantities of synthetic human insulin at relatively low cost. According to a 2003 study undertaken by the International Diabetes Federation (IDF) on the access to and availability of insulin in its member countries, synthetic 'human' insulin is considerably more expensive in most countries where both synthetic 'human' and animal insulin are commercially available: e.g. within European countries the average price of synthetic 'human' insulin was twice as high as the price of pork insulin. Yet in its position statement, the IDF writes that "there is no overwhelming evidence to prefer one species of insulin over another" and "[modern, highly purified] animal insulins remain a perfectly acceptable alternative."

   

Genetic testing-

Genetic testing involves the direct examination of the DNA molecule itself. A scientist scans a patient's DNA sample for mutated sequences.

There are two major types of gene tests. In the first type, a researcher may design short pieces of DNA ("probes") whose sequences are complementary to the mutated sequences. These probes will seek their complement among the base pairs of an individual's genome. If the mutated sequence is present in the patient's genome, the probe will bind to it and flag the mutation. In the second type, a researcher may conduct the gene test by comparing the sequence of DNA bases in a patient's gene to the normal version of that gene found in healthy individuals.
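The probe-based test in the first approach can be sketched as a string search: a probe "binds" wherever the genome contains its complementary sequence. This toy model ignores strand orientation and hybridization chemistry, and the sequences are made up.

```python
def complement(seq: str) -> str:
    """Base-pairing complement of a DNA sequence (A<->T, G<->C)."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in seq.upper())

def probe_binds(genome: str, probe: str) -> bool:
    """A probe hybridizes where the genome contains its complement
    (a toy model: strandedness and chemistry are ignored)."""
    return complement(probe) in genome.upper()

mutated_site = "GATTACA"          # made-up mutated sequence
probe = complement(mutated_site)  # probe designed against the mutation

print(probe_binds("AAGATTACATT", probe))  # True: mutation present, probe flags it
print(probe_binds("AAGATCACATT", probe))  # False: mutation absent
```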

Genetic testing is now used for:

Carrier screening, or the identification of unaffected individuals who carry one copy of a gene for a disease that requires two copies for the disease to manifest;

Confirmational diagnosis of symptomatic individuals;

Forensic/identity testing;

Newborn screening;

Prenatal diagnostic screening;

Presymptomatic testing for estimating the risk of developing adult-onset cancers;

Presymptomatic testing for predicting adult-onset disorders.

Some genetic tests are already available, although most of them are used in developed countries. The tests currently available can detect mutations associated with rare genetic disorders. Recently, tests have been developed to detect mutation for a handful of more complex conditions such as breast, ovarian, and colon cancers. However, gene tests may not detect every mutation associated with a particular condition because many are as yet undiscovered, and the ones they do detect may present different risks to different people and populations.

The absence of privacy and anti-discrimination legal protections in most countries can lead to discrimination in employment or insurance or other use of personal genetic information. This raises questions such as whether genetic privacy is different from medical privacy.

Reproductive issues. These include the use of genetic information in reproductive decision-making and the possibility of genetically altering reproductive cells that may be passed on to future generations. For example, germline therapy changes the genetic make-up of an individual's descendants. Thus, any error in technology or judgment may have far-reaching consequences (though the same can also happen through natural reproduction). Ethical issues like designer babies and human cloning have also given rise to controversies between and among scientists and bioethicists, especially in the light of past abuses with eugenics.

Clinical issues. These center on the capabilities and limitations of doctors and other health-service providers, people identified with genetic conditions, and the general public in dealing with genetic information.

Effects on social institutions. Genetic tests reveal information about individuals and their families. Thus, test results can affect the dynamics within social institutions, particularly the family.

Conceptual and philosophical implications regarding human responsibility, free will vis-à-vis genetic determinism, and the concepts of health and disease.

 

Gene therapy-

Scientists have taken the logical step of trying to introduce genes directly into human cells, focusing on diseases caused by single-gene defects, such as cystic fibrosis, haemophilia, muscular dystrophy and sickle cell anemia. However, this has proven more difficult than modifying bacteria, primarily because of the problems involved in carrying large sections of DNA and delivering them to the correct site in the genome. Today, most gene therapy studies are aimed at cancer and hereditary diseases linked to a genetic defect. Antisense therapy is not strictly a form of gene therapy, but is a related, genetically-mediated therapy.

The most common form of genetic engineering involves the insertion of a functional gene at an unspecified location in the host genome. This is accomplished by isolating and copying the gene of interest, generating a construct containing all the genetic elements for correct expression, and then inserting this construct into a random location in the host organism. Other forms of genetic engineering include gene targeting and knocking out specific genes via engineered nucleases such as zinc finger nucleases, engineered I-CreI homing endonucleases, or nucleases generated from TAL effectors. An example of gene-knockout mediated gene therapy is the knockout of the human CCR5 gene in T-cells in order to control HIV infection. This approach is currently being used in several human clinical trials.

The biology of human gene therapy remains complex and many techniques need further development. Many diseases and their strict genetic link need to be understood more fully before gene therapy can be used appropriately. The public policy debate surrounding the possible use of genetically engineered material in human subjects has been equally complex. Major participants in the debate have come from the fields of biology, government, law, medicine, philosophy, politics, and religion, each bringing different views to the discussion.

Gene therapy may be used for treating, or even curing, genetic and acquired diseases like cancer and AIDS by using normal genes to supplement or replace defective genes or to bolster a normal function such as immunity. It can be used to target somatic cells (i.e., those of the body) or germline cells (i.e., egg and sperm). In somatic gene therapy, the genome of the recipient is changed, but this change is not passed along to the next generation. In contrast, in germline gene therapy, the egg and sperm cells of the parents are changed for the purpose of passing on the changes to their offspring.

There are basically two ways of implementing a gene therapy treatment:

Ex vivo, which means "outside the body" – Cells from the patient's blood or bone marrow are removed and grown in the laboratory. They are then exposed to a virus carrying the desired gene. The virus enters the cells, and the desired gene becomes part of the DNA of the cells. The cells are allowed to grow in the laboratory before being returned to the patient by injection into a vein.

In vivo, which means "inside the body" – No cells are removed from the patient's body. Instead, vectors are used to deliver the desired gene to cells in the patient's body.

As of June 2001, more than 500 clinical gene-therapy trials involving about 3,500 patients had been identified worldwide. Around 78% of these were in the United States, with Europe having 18%. These trials focus on various types of cancer, although other multigenic diseases are being studied as well. Recently, two children born with severe combined immunodeficiency disorder ("SCID") were reported to have been cured after being given genetically engineered cells.

Gene therapy faces many obstacles before it can become a practical approach for treating disease. At least four of these obstacles are as follows:

Gene delivery tools. Genes are inserted into the body using gene carriers called vectors. The most common vectors now are viruses, which have evolved a way of encapsulating and delivering their genes to human cells in a pathogenic manner. Scientists manipulate the genome of the virus by removing the disease-causing genes and inserting the therapeutic genes. However, while viruses are effective, they can introduce problems like toxicity, immune and inflammatory responses, and gene control and targeting issues. In addition, in order for gene therapy to provide permanent therapeutic effects, the introduced gene needs to be integrated within the host cell's genome. Some viral vectors effect this in a random fashion, which can introduce other problems such as disruption of an endogenous host gene.

High costs. Since gene therapy is relatively new and at an experimental stage, it is an expensive treatment to undertake. This explains why current studies are focused on illnesses commonly found in developed countries, where more people can afford to pay for treatment. It may take decades before developing countries can take advantage of this technology.

Limited knowledge of the functions of genes. Scientists currently know the functions of only a few genes. Hence, gene therapy can address only some of the genes that cause a particular disease. Worse, it is not always known whether a gene has more than one function, which creates uncertainty as to whether replacing such a gene is indeed desirable.

Multigene disorders and effect of environment. Most genetic disorders involve more than one gene. Moreover, most diseases involve the interaction of several genes and the environment. For example, many people with cancer not only inherit the disease gene for the disorder, but may have also failed to inherit specific tumor suppressor genes. Diet, exercise, smoking and other environmental factors may have also contributed to their disease.

The Human Genome Project is an initiative of the U.S. Department of Energy ("DOE") that aims to generate a high-quality reference sequence for the entire human genome and identify all the human genes.

The DOE and its predecessor agencies were assigned by the U.S. Congress to develop new energy resources and technologies and to pursue a deeper understanding of potential health and environmental risks posed by their production and use. In 1986, the DOE announced its Human Genome Initiative. Shortly thereafter, the DOE and National Institutes of Health developed a plan for a joint Human Genome Project ("HGP"), which officially began in 1990.

The HGP was originally planned to last 15 years. However, rapid technological advances and worldwide participation accelerated the completion date to 2003 (making it a 13-year project). Already it has enabled gene hunters to pinpoint genes associated with more than 30 disorders.

 

Cloning-

Cloning in biology is the process of producing similar populations of genetically identical individuals that occurs in nature when organisms such as bacteria, insects or plants reproduce asexually. Cloning in biotechnology refers to processes used to create copies of DNA fragments (molecular cloning), cells (cell cloning), or organisms. The term also refers to the production of multiple copies of a product such as digital media or software.

The term clone is derived from the Greek word for "trunk, branch", referring to the process whereby a new plant can be created from a twig. In horticulture, the spelling clon was used until the twentieth century; the final e came into use to indicate the vowel is a "long o" instead of a "short o". Since the term entered the popular lexicon in a more general context, the spelling clone has been used exclusively.

Molecular cloning refers to the process of making multiple copies of a defined DNA sequence. Cloning is commonly used to amplify DNA fragments containing whole genes, but it can also be used to amplify any DNA sequence such as promoters, non-coding sequences and randomly fragmented DNA. It is used in a wide array of biological experiments and practical applications ranging from genetic fingerprinting to large-scale protein production. Occasionally, the term cloning is misleadingly used to refer to the identification of the chromosomal location of a gene associated with a particular phenotype of interest, such as in positional cloning. In practice, localization of the gene to a chromosome or genomic region does not necessarily enable one to isolate or amplify the relevant genomic sequence. To amplify any DNA sequence in a living organism, that sequence must be linked to an origin of replication, which is a sequence of DNA capable of directing the propagation of itself and any linked sequence. However, a number of other features are needed, and a variety of specialised cloning vectors (small pieces of DNA into which a foreign DNA fragment can be inserted) exist that allow protein expression, tagging, single-stranded RNA and DNA production and a host of other manipulations.

Cloning of any DNA fragment essentially involves four steps:

fragmentation - breaking apart a strand of DNA

ligation - gluing together pieces of DNA in a desired sequence

transfection - inserting the newly formed pieces of DNA into cells

screening/selection - selecting out the cells that were successfully transfected with the new DNA

Although these steps are invariant among cloning procedures, a number of alternative routes can be selected; these are summarized as a 'cloning strategy'.

Initially, the DNA of interest needs to be isolated to provide a DNA segment of suitable size. Subsequently, a ligation procedure is used in which the amplified fragment is inserted into a vector (a piece of DNA). The vector (which is frequently circular) is linearised using restriction enzymes and incubated with the fragment of interest under appropriate conditions with an enzyme called DNA ligase. Following ligation, the vector with the insert of interest is transfected into cells. A number of alternative techniques are available, such as chemical sensitization of cells, electroporation, optical injection and biolistics. Finally, the transfected cells are cultured. As the aforementioned procedures are of particularly low efficiency, there is a need to identify the cells that have been successfully transfected with the vector construct containing the desired insertion sequence in the required orientation. Modern cloning vectors include selectable antibiotic resistance markers, which allow only cells into which the vector has been transfected to grow. Additionally, the cloning vectors may contain colour selection markers, which provide blue/white screening (α-complementation) on X-gal medium. Nevertheless, these selection steps do not absolutely guarantee that the DNA insert is present in the cells obtained. Further investigation of the resulting colonies is required to confirm that cloning was successful. This may be accomplished by means of PCR, restriction fragment analysis and/or DNA sequencing.
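The linearisation-and-ligation step can be sketched in code. The recognition site below is the real EcoRI sequence (GAATTC), but the vector and insert are made-up strings, and the model ignores sticky ends, the cut offset within the site, and insert orientation.

```python
ECORI_SITE = "GAATTC"  # real EcoRI recognition sequence

def linearize(circular_vector: str, site: str = ECORI_SITE) -> str:
    """Open a circular vector at its restriction site. A circle has no fixed
    start, so the cut is modelled by rotating the sequence so that the cut
    point becomes the two ends of a linear string."""
    cut = circular_vector.index(site)  # raises ValueError if no site exists
    return circular_vector[cut:] + circular_vector[:cut]

def ligate(linear_vector: str, insert: str) -> str:
    """Join the insert onto the opened vector (DNA ligase, greatly simplified)."""
    return linear_vector + insert

vector = "TTGAATTCAA"                 # made-up circular vector with one EcoRI site
opened = linearize(vector)            # "GAATTCAATT"
construct = ligate(opened, "ATGAAA")  # made-up insert
print(construct)
```

The later screening step corresponds to checking that the insert sequence is actually present in the construct recovered from a colony.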

Cloning a cell means to derive a population of cells from a single cell. In the case of unicellular organisms such as bacteria and yeast, this process is remarkably simple and essentially only requires the inoculation of the appropriate medium. However, in the case of cell cultures from multi-cellular organisms, cell cloning is an arduous task as these cells will not readily grow in standard media.

A useful tissue culture technique used to clone distinct lineages of cell lines involves the use of cloning rings (cylinders). In this technique, a single-cell suspension of cells that have been exposed to a mutagenic agent or drug used to drive selection is plated at high dilution to create isolated colonies, each arising from a single, potentially clonal cell. At an early growth stage, when colonies consist of only a few cells, sterile polystyrene rings (cloning rings), which have been dipped in grease, are placed over an individual colony and a small amount of trypsin is added. Cloned cells are collected from inside the ring and transferred to a new vessel for further growth.


Agriculture-

Using the techniques of modern biotechnology, one or two genes (SmartStax, from Monsanto in collaboration with Dow AgroSciences, will use eight, starting in 2010) may be transferred to a highly developed crop variety to impart a new character that would increase its yield. However, while increases in crop yield are the most obvious application of modern biotechnology in agriculture, they are also among the most difficult to achieve. Current genetic engineering techniques work best for traits controlled by a single gene. Many of the genetic characteristics associated with yield (e.g., enhanced growth) are controlled by a large number of genes, each of which has only a minimal effect on overall yield. There is, therefore, much scientific work still to be done in this area.

Crops containing genes that enable them to withstand biotic and abiotic stresses may be developed. For example, drought and excessively salty soil are two important limiting factors in crop productivity. Biotechnologists are studying plants that can cope with these extreme conditions in the hope of finding the genes that enable them to do so and eventually transferring these genes to more desirable crops. One of the latest developments is the identification of a plant gene, At-DBF2, from Arabidopsis thaliana, a tiny weed often used for plant research because it is easy to grow and its genome is well mapped. When this gene was inserted into tomato and tobacco cells, the cells were able to withstand environmental stresses such as salt, drought, cold and heat far better than ordinary cells. If these preliminary results hold up in larger trials, At-DBF2 could help in engineering crops that better withstand harsh environments. Researchers have also created transgenic rice plants that are resistant to rice yellow mottle virus (RYMV). In Africa, this virus destroys the majority of the rice crop and makes the surviving plants more susceptible to fungal infections.

Proteins in foods may be modified to increase their nutritional qualities. Proteins in legumes and cereals may be transformed to provide the amino acids needed by human beings for a balanced diet. A good example is the work of Professors Ingo Potrykus and Peter Beyer in creating Golden rice.

Modern biotechnology can be used to slow down the process of spoilage so that fruit can ripen longer on the plant and still reach the consumer with a reasonable shelf life. This alters the taste, texture and appearance of the fruit. More importantly, it could expand the market for farmers in developing countries by reducing spoilage. However, researchers in developed countries sometimes lack an understanding of the actual needs of prospective beneficiaries in developing countries. For example, engineering soybeans to resist spoilage makes them less suitable for producing tempeh, a fermented food that is a significant source of protein. The use of modified soybeans results in a lumpy texture that is less palatable and less convenient to cook with.

The first genetically modified food product was a tomato which was transformed to delay its ripening. Researchers in Indonesia, Malaysia, Thailand, Philippines and Vietnam are currently working on delayed-ripening papaya in collaboration with the University of Nottingham and Zeneca.

Biotechnology in cheese production: enzymes produced by microorganisms provide an alternative to animal rennet, a cheese coagulant, and an alternative supply for cheese makers. This also addresses possible public concerns about animal-derived material, although since there are currently no plans to develop synthetic milk, the argument is less compelling than it might be. Microbial enzymes offer an animal-friendly alternative of comparable quality to animal rennet and are, in theory, also less expensive.

About 85 million tons of wheat flour is used every year to bake bread. By adding an enzyme called maltogenic amylase to the flour, bread stays fresh longer. Assuming that 10–15% of bread is thrown away as stale, if bread could be made to stay fresh another 5–7 days, then perhaps 2 million tons of flour per year would be saved. Other enzymes can make dough expand into a lighter loaf, or alter the loaf in a range of other ways.
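As a rough sanity check on these figures: the flour tonnage, the stale fraction and the 2 Mt saving come from the paragraph above, while the "fraction of waste averted" is an inference, not a figure stated in the text.

```python
# Relating the quoted figures: 85 Mt of flour per year, 10–15% of bread
# discarded as stale, and a projected saving of ~2 Mt of flour.
flour_mt = 85.0                      # million tons of wheat flour per year
stale_lo, stale_hi = 0.10, 0.15      # fraction of bread thrown away stale

waste_lo, waste_hi = flour_mt * stale_lo, flour_mt * stale_hi
# 8.5–12.75 Mt of flour ends up as stale bread each year.
print(f"stale waste: {waste_lo:.2f}–{waste_hi:.2f} Mt")

# The projected 2 Mt saving therefore corresponds to averting roughly
# 16–24% of that stale-bread waste (an inferred, not stated, fraction).
saving_mt = 2.0
print(f"waste averted: {saving_mt / waste_hi:.0%}–{saving_mt / waste_lo:.0%}")
```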

Most current commercial applications of modern biotechnology in agriculture focus on reducing the dependence of farmers on agrochemicals. For example, Bacillus thuringiensis (Bt) is a soil bacterium that produces a protein with insecticidal qualities. Traditionally, a fermentation process has been used to produce an insecticidal spray from these bacteria. In this form, the Bt toxin occurs as an inactive protoxin, which requires digestion by an insect to become effective. There are several Bt toxins, and each one is specific to certain target insects. Crop plants have now been engineered to contain and express the genes for Bt toxin, which they produce in its active form. When a susceptible insect ingests a transgenic crop cultivar expressing the Bt protein, it stops feeding and soon thereafter dies as a result of the Bt toxin binding to its gut wall. Bt corn is now commercially available in a number of countries to control corn borer (a lepidopteran insect), which is otherwise controlled by spraying, a more difficult process.

Crops have also been genetically engineered to acquire tolerance to broad-spectrum herbicides. The lack of herbicides with broad-spectrum activity and no crop injury was a consistent limitation in crop weed management. Multiple applications of numerous herbicides were routinely used to control the wide range of weed species detrimental to agronomic crops. Weed management tended to rely on preemergence applications, that is, herbicides sprayed in anticipation of expected weed infestations rather than in response to the weeds actually present. Mechanical cultivation and hand weeding were often necessary to control weeds not controlled by herbicide applications. The introduction of herbicide-tolerant crops has the potential to reduce the number of herbicide active ingredients used for weed management, reduce the number of herbicide applications made during a season, and increase yield through improved weed management and less crop injury. Transgenic crops that tolerate glyphosate, glufosinate and bromoxynil have been developed. These herbicides can now be sprayed on transgenic crops without damaging the crops while killing nearby weeds.

From 1996 to 2001, herbicide tolerance was the most dominant trait introduced to commercially available transgenic crops, followed by insect resistance. In 2001, herbicide tolerance deployed in soybean, corn and cotton accounted for 77% of the 626,000 square kilometres planted to transgenic crops; Bt crops accounted for 15%; and "stacked genes" for herbicide tolerance and insect resistance used in both cotton and corn accounted for 8%.
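The 2001 trait shares quoted above can be turned into absolute planted areas. The total and the percentage shares come from the text; the computation itself is just arithmetic.

```python
# 2001 transgenic crop area by trait, from the shares quoted in the text.
total_km2 = 626_000
shares = {
    "herbicide tolerance": 0.77,
    "Bt insect resistance": 0.15,
    "stacked traits (HT + IR)": 0.08,
}

areas = {trait: share * total_km2 for trait, share in shares.items()}
for trait, km2 in areas.items():
    print(f"{trait}: {km2:,.0f} km2")

# The three shares sum to 100%, so the trait areas recover the quoted total.
assert abs(sum(areas.values()) - total_km2) < 1e-6
```

Herbicide tolerance alone thus accounted for roughly 482,000 km², more than the area planted to all other transgenic traits combined.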

Biotechnology is also being applied to novel uses other than food. For example, oilseed can be modified to produce fatty acids for detergents, substitute fuels and petrochemicals. Potatoes, tomatoes, rice, tobacco, lettuce, safflowers, and other plants have been genetically engineered to produce insulin and certain vaccines. If future clinical trials prove successful, the advantages of edible vaccines would be enormous, especially for developing countries. The transgenic plants could be grown locally and cheaply. Homegrown vaccines would also avoid the logistical and economic problems of transporting traditional preparations over long distances and keeping them cold in transit. And since they are edible, they would not need syringes, which are not only an additional expense in traditional vaccine preparations but also a source of infection if contaminated. In the case of insulin grown in transgenic plants, it is well established that the gastrointestinal system breaks the protein down, so it could not currently be administered as an edible protein. However, it might be produced at significantly lower cost than insulin made in costly bioreactors. For example, Calgary, Canada-based SemBioSys Genetics, Inc. reports that its safflower-produced insulin could reduce unit costs by 25% or more and could cut the capital cost of building a commercial-scale insulin manufacturing facility by over $100 million, compared to traditional biomanufacturing facilities.


Biological engineering-

Biological engineering, biotechnological engineering or bioengineering (including biological systems engineering) is the application of concepts and methods of physics, chemistry, mathematics, and computer science to solve problems in the life sciences, using engineering's own analytical and synthetic methodologies and also its traditional sensitivity to the cost and practicality of the solutions arrived at. In this context, while traditional engineering applies physical and mathematical sciences to analyze, design and manufacture inanimate tools, structures and processes, biological engineering uses the same sciences, as well as the rapidly developing body of knowledge known as molecular biology, to study many aspects of living organisms.

An especially important application is the analysis and cost-effective solution of problems related to human health, but the field is much more general than that. For example, biomimetics is a branch of biological engineering which strives to understand how living organisms, as a result of the prolonged trial-and-error processes known as evolution, have solved difficult problems in the past, and to find ways to use this knowledge to solve similar problems in artificial systems. Systems biology, on the other hand, seeks to utilize the engineer's familiarity with complex artificial systems, and perhaps the concepts used in "reverse engineering", to facilitate the difficult process of recognition of the structure, function, and precise method of operation of complex biological systems.

Thus biological engineering is a science-based discipline founded upon the biological sciences in the same way that chemical engineering, electrical engineering, and mechanical engineering are based upon chemistry, electricity and magnetism, and classical mechanics, respectively.

Biological engineering can be differentiated from its roots of pure biology or classical engineering in the following way. Biological studies often follow a reductionist approach in viewing a system on its smallest possible scale which naturally leads toward tools such as functional genomics. Engineering approaches, using classical design perspectives, are constructionist, building new devices, approaches, and technologies from component concepts. Biological engineering utilizes both kinds of methods in concert, relying on reductionist approaches to identify, understand, and organize the fundamental units which are then integrated to generate something new. In addition, because it is an engineering discipline, biological engineering is fundamentally concerned with not just the basic science, but the practical application of the scientific knowledge to solve real-world problems in a cost-effective way.

Although engineered biological systems have been used to manipulate information, construct materials, process chemicals, produce energy, provide food, and help maintain or enhance human health and our environment, our ability to quickly and reliably engineer biological systems that behave as expected is at present less well developed than our mastery over mechanical and electrical systems.

The differentiation between biological engineering and biomedical engineering can be unclear, as many universities now use the terms "bioengineering" and "biomedical engineering" interchangeably. According to Prof. Douglas Lauffenburger of MIT, however, biological engineering (like biotechnology) has a broader base: it applies engineering principles to systems spanning an enormous range of sizes and complexities, from the molecular level (molecular biology, biochemistry, microbiology, pharmacology, protein chemistry, cytology, immunology, neurobiology and neuroscience, often but not always involving biological substances) through cellular and tissue-based methods (including devices and sensors) and whole macroscopic organisms (plants, animals), up to whole ecosystems. Neither biological engineering nor biomedical engineering is wholly contained within the other, as there are non-biological products for medical needs and biological products for non-medical needs.

ABET, the U.S.-based accreditation board for engineering B.S. programs, makes a distinction between Biomedical Engineering and Biological Engineering; however, the differences are quite small. Biomedical engineers must have life science courses that include human physiology and have experience in performing measurements on living systems while biological engineers must have life science courses (which may or may not include physiology) and experience in making measurements not specifically on living systems. Foundational engineering courses are often the same and include thermodynamics, fluid and mechanical dynamics, kinetics, electronics, and materials properties.

The word bioengineering was coined by British scientist and broadcaster Heinz Wolff in 1954. The term is also used to describe the use of vegetation in civil engineering construction, and may be applied to environmental modifications such as surface soil protection, slope stabilisation, watercourse and shoreline protection, windbreaks, vegetation barriers (including noise barriers and visual screens), and the ecological enhancement of an area. The first biological engineering curriculum in the United States was created at Mississippi State University in 1967. More recent programs have been launched at MIT and Utah State University.

Biological Engineers or bioengineers are engineers who use the principles of biology and the tools of engineering to create usable, tangible, economically viable products. Biological Engineering employs knowledge and expertise from a number of pure and applied sciences, such as mass and heat transfer, kinetics, biocatalysts, biomechanics, bioinformatics, separation and purification processes, bioreactor design, surface science, fluid mechanics, thermodynamics, and polymer science. It is used in the design of medical devices, diagnostic equipment, biocompatible materials, renewable bioenergy, ecological engineering, and other areas that improve the living standards of societies.

In general, biological engineers attempt to either mimic biological systems to create products or modify and control biological systems so that they can replace, augment, or sustain chemical and mechanical processes. Bioengineers can apply their expertise to other applications of engineering and biotechnology, including genetic modification of plants and microorganisms, bioprocess engineering, and biocatalysis.

Biotechnology is being used to engineer and adapt organisms, especially microorganisms, in an effort to find sustainable ways to clean up contaminated environments. The elimination of a wide range of pollutants and wastes from the environment is an absolute requirement for promoting the sustainable development of our society with low environmental impact. Biological processes play a major role in the removal of contaminants, and biotechnology is taking advantage of the astonishing catabolic versatility of microorganisms to degrade and convert such compounds. New methodological breakthroughs in sequencing, genomics, proteomics, bioinformatics and imaging are producing vast amounts of information. In the field of environmental microbiology, genome-based global studies open a new era, providing unprecedented in silico views of metabolic and regulatory networks as well as clues to the evolution of degradation pathways and the molecular strategies of adaptation to changing environmental conditions. Functional genomic and metagenomic approaches are increasing our understanding of the relative importance of different pathways and regulatory networks to carbon flux in particular environments and for particular compounds, and they will certainly accelerate the development of bioremediation technologies and biotransformation processes.

Interest in the microbial biodegradation of pollutants has intensified in recent years as humanity strives to find sustainable ways to clean up contaminated environments. These bioremediation and biotransformation methods endeavour to harness the astonishing, naturally occurring ability of microbial xenobiotic metabolism to degrade, transform or accumulate a huge range of compounds, including hydrocarbons (e.g. oil), polychlorinated biphenyls (PCBs), polyaromatic hydrocarbons (PAHs), heterocyclic compounds (such as pyridine or quinoline), pharmaceutical substances, radionuclides and metals. Major methodological breakthroughs in recent years have enabled detailed genomic, metagenomic, proteomic, bioinformatic and other high-throughput analyses of environmentally relevant microorganisms, providing unprecedented insights into key biodegradative pathways and the ability of organisms to adapt to changing environmental conditions.

The burgeoning amount of bacterial genomic data provides unparalleled opportunities for understanding the genetic and molecular bases of the degradation of organic pollutants. Aromatic compounds are among the most recalcitrant of these pollutants and lessons can be learned from the recent genomic studies of Burkholderia xenovorans LB400 and Rhodococcus sp. strain RHA1, two of the largest bacterial genomes completely sequenced to date. These studies have helped expand our understanding of bacterial catabolism, non-catabolic physiological adaptation to organic compounds, and the evolution of large bacterial genomes. First, the metabolic pathways from phylogenetically diverse isolates are very similar with respect to overall organization. Thus, as originally noted in pseudomonads, a large number of "peripheral aromatic" pathways funnel a range of natural and xenobiotic compounds into a restricted number of "central aromatic" pathways. Nevertheless, these pathways are genetically organized in genus-specific fashions, as exemplified by the β-ketoadipate and Paa pathways. Comparative genomic studies further reveal that some pathways are more widespread than initially thought. Thus, the Box and Paa pathways illustrate the prevalence of non-oxygenolytic ring-cleavage strategies in aerobic aromatic degradation processes. Functional genomic studies have been useful in establishing that even organisms harboring high numbers of homologous enzymes seem to contain few examples of true redundancy. For example, the multiplicity of ring-cleaving dioxygenases in certain rhodococcal isolates may be attributed to the cryptic aromatic catabolism of different terpenoids and steroids. Finally, analyses have indicated that recent genetic flux appears to have played a more significant role in the evolution of some large genomes, such as LB400's, than others.
However, the emerging trend is that the large gene repertoires of potent pollutant degraders such as LB400 and RHA1 have evolved principally through more ancient processes. That this is true in such phylogenetically diverse species is remarkable and further suggests the ancient origin of this catabolic capacity.

Anaerobic microbial mineralization of recalcitrant organic pollutants is of great environmental significance and involves intriguing novel biochemical reactions. In particular, hydrocarbons and halogenated compounds have long been doubted to be degradable in the absence of oxygen, but the isolation of hitherto unknown anaerobic hydrocarbon-degrading and reductively dehalogenating bacteria during the last decades provided ultimate proof for these processes in nature. Many novel biochemical reactions were discovered enabling the respective metabolic pathways, but progress in the molecular understanding of these bacteria was rather slow, since genetic systems are not readily applicable for most of them. However, with the increasing application of genomics in the field of environmental microbiology, a new and promising perspective is now at hand to obtain molecular insights into these new metabolic properties. Several complete genome sequences were determined during the last few years from bacteria capable of anaerobic organic pollutant degradation. The ~4.7 Mb genome of the facultative denitrifying Aromatoleum aromaticum strain EbN1 was the first to be determined for an anaerobic hydrocarbon degrader (using toluene or ethylbenzene as substrates). The genome sequence revealed about two dozen gene clusters (including several paralogs) coding for a complex catabolic network for anaerobic and aerobic degradation of aromatic compounds. The genome sequence forms the basis for current detailed studies on regulation of pathways and enzyme structures. Further genomes of anaerobic hydrocarbon degrading bacteria were recently completed for the iron-reducing species Geobacter metallireducens (accession nr. NC_007517) and the perchlorate-reducing Dechloromonas aromatica (accession nr. NC_007298), but these are not yet evaluated in formal publications. 
Complete genomes were also determined for bacteria capable of anaerobic degradation of halogenated hydrocarbons by halorespiration: the 1.4 Mb genomes of Dehalococcoides ethenogenes strain 195 and Dehalococcoides sp. strain CBDB1 and the 5.7 Mb genome of Desulfitobacterium hafniense strain Y51. Characteristic for all these bacteria is the presence of multiple paralogous genes for reductive dehalogenases, implicating a wider dehalogenating spectrum of the organisms than previously known. Moreover, genome sequences provided unprecedented insights into the evolution of reductive dehalogenation and differing strategies for niche adaptation.

Recently, it has become apparent that some organisms, including Desulfitobacterium chlororespirans, originally evaluated for halorespiration on chlorophenols, can also use certain brominated compounds, such as the herbicide bromoxynil and its major metabolite as electron acceptors for growth. Iodinated compounds may be dehalogenated as well, though the process may not satisfy the need for an electron acceptor.

Bioavailability, the amount of a substance that is physicochemically accessible to microorganisms, is a key factor in the efficient biodegradation of pollutants. O'Loughlin et al. (2000) showed that, with the exception of kaolinite clay, most soil clays and cation-exchange resins attenuated biodegradation of 2-picoline by Arthrobacter sp. strain R1, as a result of adsorption of the substrate to the clays. Chemotaxis, the directed movement of motile organisms towards or away from chemicals in the environment, is an important physiological response that may contribute to effective catabolism of molecules in the environment. In addition, mechanisms for the intracellular accumulation of aromatic molecules via various transport systems are also important.

Petroleum oil contains aromatic compounds that are toxic to most life forms. Episodic and chronic pollution of the environment by oil causes major ecological perturbations. Marine environments are especially vulnerable, since oil spills in coastal regions and the open sea are poorly containable and mitigation is difficult. In addition to pollution through human activities, about 250 million liters of petroleum enter the marine environment every year from natural seepages. Despite its toxicity, a considerable fraction of the petroleum oil entering marine systems is eliminated by the hydrocarbon-degrading activities of microbial communities, in particular by a remarkable, recently discovered group of specialists, the so-called hydrocarbonoclastic bacteria (HCB). Alcanivorax borkumensis was the first HCB to have its genome sequenced. In addition to hydrocarbons, crude oil often contains various heterocyclic compounds, such as pyridine, which appear to be degraded by mechanisms similar to, though separate from, those for hydrocarbons.

Cholesterol is a steroid that is highly abundant in the environment and plays a major role in the global carbon cycle. Many synthetic steroid compounds, such as some sex hormones, frequently appear in municipal and industrial wastewaters, acting as environmental pollutants with strong metabolic activities that negatively affect ecosystems. Since these compounds are common carbon sources for many different microorganisms, their aerobic and anaerobic mineralization has been extensively studied. The interest of these studies lies in the biotechnological applications of sterol-transforming enzymes for the industrial synthesis of sex hormones and corticoids. Very recently, the catabolism of cholesterol has acquired high relevance because it is involved in the infectivity of Mycobacterium tuberculosis.

Sustainable development requires the promotion of environmental management and a constant search for new technologies to treat vast quantities of wastes generated by increasing anthropogenic activities. Biotreatment, the processing of wastes using living organisms, is an environmentally friendly, relatively simple and cost-effective alternative to physico-chemical clean-up options. Confined environments, such as bioreactors, have been engineered to overcome the physical, chemical and biological limiting factors of biotreatment processes in highly controlled systems. The great versatility in the design of confined environments allows the treatment of a wide range of wastes under optimized conditions. To perform a correct assessment, it is necessary to consider various microorganisms having a variety of genomes and expressed transcripts and proteins. A great number of analyses are often required. Using traditional genomic techniques, such assessments are limited and time-consuming. However, several high-throughput techniques originally developed for medical studies can be applied to assess biotreatment in confined environments.

The study of the fate of persistent organic chemicals in the environment has revealed a large reservoir of enzymatic reactions with a large potential in preparative organic synthesis, which has already been exploited for a number of oxygenases on pilot and even on industrial scale. Novel catalysts can be obtained from metagenomic libraries and DNA sequence based approaches. Our increasing capabilities in adapting the catalysts to specific reactions and process requirements by rational and random mutagenesis broadens the scope for application in the fine chemical industry, but also in the field of biodegradation. In many cases, these catalysts need to be exploited in whole cell bioconversions or in fermentations, calling for system-wide approaches to understanding strain physiology and metabolism and rational approaches to the engineering of whole cells as they are increasingly put forward in the area of systems biotechnology and synthetic biology.

In the ecosystem, different substrates are attacked at different rates by consortia of organisms from different kingdoms. Aspergillus and other moulds play an important role in these consortia because they are adept at recycling starches, hemicelluloses, celluloses, pectins and other sugar polymers. Some aspergilli are capable of degrading more refractory compounds such as fats, oils, chitin, and keratin. Maximum decomposition occurs when there is sufficient nitrogen, phosphorus and other essential inorganic nutrients. Fungi also provide food for many soil organisms.

For Aspergillus, the process of degradation is the means of obtaining nutrients. When these moulds degrade human-made substrates, the process is usually called biodeterioration. Both paper and textiles (cotton, jute, and linen) are particularly vulnerable to Aspergillus degradation. Our artistic heritage is also subject to Aspergillus assault. To give but one example, after Florence, Italy, flooded in 1966, 74% of the isolates from a damaged Ghirlandaio fresco in the Ognissanti church were Aspergillus versicolor.