Friday, May 22, 2020

Rembrandt Essay examples - 874 Words

Rembrandt Harmenszoon van Rijn is generally considered one of the greatest painters in European art history and the most important in Dutch history. Rembrandt was also a proficient printmaker and made many drawings. His contributions to art came in a period that historians call the Dutch Golden Age (roughly equivalent to the 17th century), in which Dutch culture, science, commerce, world power and political influence reached their pinnacles. In all, Rembrandt produced around 600 paintings, 300 etchings, and 2,000 drawings. He was a prolific painter of self-portraits, producing almost a hundred of them (including some 20 etchings) throughout his long career. Together they give us a remarkably clear picture of the man, his looks, and more. By 1631, Rembrandt had established such a good reputation that he received several assignments for portraits from Amsterdam. As a result, he moved to that city and into the house of an art dealer, Hendrick van Uylenburgh. This move eventually led, in 1634, to the marriage of Rembrandt and Hendrick's great-niece, Saskia van Uylenburgh. This was clearly a marriage for love. Although she came from a good family (her father had been burgomaster of Leeuwarden), Saskia was an orphan and was probably not very wealthy. She lived with her sister in Frisia and did not have many grand connections in Amsterdam. These events, however, are widely disputed. In 1639, Rembrandt and Saskia moved to a prominent house in the Jodenbreestraat in the Jewish quarter, which later became the Rembrandt House Museum. Although they were affluent, the couple had several setbacks in their personal life. Three of their children died shortly after birth. Only their fourth child, a son, Titus, who was born in 1641, survived into adulthood. Saskia died in 1642, soon after Titus's birth, probably from tuberculosis. After her death Rembrandt began an affair with Titus's nurse, a widow called Geertje Dircx. This ended in a lawsuit: Geertje claimed that Rembrandt had broken his promise to marry her and demanded that the council force him to do so. The council did not go that far, but Rembrandt was ordered to pay her a substantial sum. He then cooperated with Geertje's family to have her locked up in a house of correction.

Monday, May 18, 2020

Andrea Yates Mother Or Monster Essay - 2418 Words

Andrea Yates: Mother or Monster

Mothers have always been thought of as nurturing caregivers, caregivers who have always done and thought of what is best for their children. But what happens when the nurturing mother becomes the monster? What causes her to undergo such a drastic change? Let's take a look at Andrea Yates. Andrea Yates was born in Houston, Texas on July 2, 1964 and was raised as a devout Roman Catholic. Montaldo (2016) reports that Andrea Yates graduated from Milby High School in 1982 with high honors and was involved in extracurricular activities such as captain of her swim team and an officer in the National Honor Society (p. 1). The mother turned monster then enrolled in a two-year pre-nursing program, graduated in 1986, and went to work as a registered nurse at the University of Texas M.D. Anderson Cancer Center (1986-1994). With a well-educated background and a well-earning, respectable career, what was it that pushed Yates to do what she did? Was it her own personal life or her marital life? Montaldo (2016) writes, "Andrea and her future husband, Rusty Yates, met each other in their Houston apartment complex. Much of their time revolved around religious studies and prayers. They were happily married on April 17, 1993 and later raised five children. The Yates couple's first-born, Noah, was born on 2/26/1994, John on 12/12/1995, Paul on 9/13/1997, Luke on 2/15/1999, and Mary on 11/30/2000" (p. 1).

Thursday, May 7, 2020

A Career as a Decorator Essay - 1257 Words

During my career as a decorator, I was never really sure whether rug pads were truly necessary, especially with larger rugs. Rug dealers always push consumers a little harder on a rug pad, but do you really need to invest in it, or is it an unnecessary upgrade? Sometimes my clients think I am crazy when I tell them that a rug pad is necessary for every type of rug. But honestly, do you really need one? Some experts will tell you a pad is a must; others will tell you it is a rip-off and only needed with small rugs. If you have a large rug, a simple non-slip rug pad is enough to keep it from slipping or shifting out of position. A quality rug pad is really only essential for smaller rugs; for larger area rugs, a basic non-skid pad will prevent sliding. My recommendation is a natural rubber rug pad, made from the sap of the rubber tree: it is more eco-friendly and will not stain your hardwood floors. Natural rubber pads are a bit more expensive than traditional PVC padding imported from China, but I would not risk damaging your hardwood floors, as some PVC padding has been known to yellow or stain the flooring.

Wednesday, May 6, 2020

Franz Kafka and Ismail Kadare - 861 Words

Franz Kafka and Ismail Kadare were two of the most extraordinary storytellers of modern times. Franz Kafka wrote the short story The Metamorphosis, and Ismail Kadare wrote the novel Broken April. In these two stories there is a sense of sadness and darkness that both authors portrayed. The characterization of Gregor (from The Metamorphosis) and Gjorg (from Broken April) is actually quite similar. The similarities are isolation, loneliness, and their father figures. Nevertheless, both stories are magnificent to read in one's spare time. In The Metamorphosis, Gregor lives a melancholy life with his parents and sister. One day Gregor awakes to find that he has been transformed into a bug. Jumping ahead through the story, isolation intertwines the characters Gregor and Gjorg in the same way, but it has different outcomes. In addition, Gregor and Gjorg both suffer from loneliness. This is evident because in The Metamorphosis Gregor feels that his parents do not even want to be around him: "...he (Gregor) would not be in the mood to bother about his family, he was only filled with rage at the way they were neglecting him..." (p. 846, Literature Across Cultures: Fourth Edition). In the story, Gregor's father felt despondent when his son went through a metamorphosis because he was slowly coming to the mindset that Gregor would never be normal again. Gregor is left in his room, deteriorating, feeling lonely because his family does not want to be around him. Not only does he feel lonely about being trapped in his room, but he is also afraid to go outside because of the dangers that prowl out there. Likewise, in Broken April, Gjorg finds himself alone on the highroad; he feels all hope of ever accomplishing his journey fail within him. This shows that Gjorg is suffering from a lack of hope, most likely caused by walking alone all day. Next, "The emptiness of the road on either side seemed emptier still..." (p. 53, Kadare). This is a revelation that emptiness is only a symbol of the loneliness that the author portrays. Loneliness is simply a feeling that causes depression or other emotional sickness.

Disruptive Technology

One of the most consistent patterns in business is the failure of leading companies to stay at the top of their industries when technologies or markets change. Goodyear and Firestone entered the radial-tire market quite late. Xerox let Canon create the small-copier market. Bucyrus-Erie allowed Caterpillar and Deere to take over the mechanical excavator market. Sears gave way to Wal-Mart. The pattern of failure has been especially striking in the computer industry. IBM dominated the mainframe market but missed by years the emergence of minicomputers, which were technologically much simpler than mainframes. Digital Equipment dominated the minicomputer market with innovations like its VAX architecture but missed the personal-computer market almost completely. Apple Computer led the world of personal computing and established the standard for user-friendly computing but lagged five years behind the leaders in bringing its portable computer to market. Why is it that companies like these invest aggressively, and successfully, in the technologies necessary to retain their current customers but then fail to make certain other technological investments that customers of the future will demand? Undoubtedly, bureaucracy, arrogance, tired executive blood, poor planning, and short-term investment horizons have all played a role. But a more fundamental reason lies at the heart of the paradox: leading companies succumb to one of the most popular, and valuable, management dogmas. They stay close to their customers. Although most managers like to think they are in control, customers wield extraordinary power in directing a company's investments. Before managers decide to launch a technology, develop a product, build a plant, or establish new channels of distribution, they must look to their customers first: Do their customers want it? How big will the market be? Will the investment be profitable? The more astutely managers ask and answer these questions, the more completely their investments will be aligned with the needs of their customers. This is the way a well-managed company should operate. Right? But what happens when customers reject a new technology, product concept, or way of doing business because it does not address their needs as effectively as a company's current approach? The large photocopying centers that represented the core of Xerox's customer base at first had no use for small, slow tabletop copiers. The excavation contractors that had relied on Bucyrus-Erie's big-bucket steam- and diesel-powered cable shovels didn't want hydraulic excavators because initially they were small and weak. IBM's large commercial, government, and industrial customers saw no immediate use for minicomputers. In each instance, companies listened to their customers, gave them the product performance they were looking for, and, in the end, were hurt by the very technologies their customers led them to ignore. We have seen this pattern repeatedly in an ongoing study of leading companies in a variety of industries that have confronted technological change. The research shows that most well-managed, established companies are consistently ahead of their industries in developing and commercializing new technologies, from incremental improvements to radically new approaches, as long as those technologies address the next-generation performance needs of their customers.
However, these same companies are rarely in the forefront of commercializing new technologies that don't initially meet the needs of mainstream customers and appeal only to small or emerging markets. Using the rational, analytical investment processes that most well-managed companies have developed, it is nearly impossible to build a cogent case for diverting resources from known customer needs in established markets to markets and customers that seem insignificant or do not yet exist. After all, meeting the needs of established customers and fending off competitors takes all the resources a company has, and then some. In well-managed companies, the processes used to identify customers' needs, forecast technological trends, assess profitability, allocate resources across competing proposals for investment, and take new products to market are focused, for all the right reasons, on current customers and markets. These processes are designed to weed out proposed products and technologies that do not address customers' needs. In fact, the processes and incentives that companies use to keep focused on their main customers work so well that they blind those companies to important new technologies in emerging markets. Many companies have learned the hard way the perils of ignoring new technologies that do not initially meet the needs of mainstream customers. For example, although personal computers did not meet the requirements of mainstream minicomputer users in the early 1980s, the computing power of the desktop machines improved at a much faster rate than minicomputer users' demands for computing power did. As a result, personal computers caught up with the computing needs of many of the customers of Wang, Prime, Nixdorf, Data General, and Digital Equipment. Today they are performance-competitive with minicomputers in many applications. For the minicomputer makers, keeping close to mainstream customers and ignoring what were initially low-performance desktop technologies used by seemingly insignificant customers in emerging markets was a rational decision, but one that proved disastrous. The technological changes that damage established companies are usually not radically new or difficult from a technological point of view. They do, however, have two important characteristics: First, they typically present a different package of performance attributes, ones that, at least at the outset, are not valued by existing customers. Second, the performance attributes that existing customers do value improve at such a rapid rate that the new technology can later invade those established markets. Only at this point will mainstream customers want the technology. Unfortunately for the established suppliers, by then it is often too late: the pioneers of the new technology dominate the market. It follows, then, that senior executives must first be able to spot the technologies that seem to fall into this category. Next, to commercialize and develop the new technologies, managers must protect them from the processes and incentives that are geared to serving established customers. And the only way to protect them is to create organizations that are completely independent from the mainstream business. No industry demonstrates the perils of staying too close to customers more dramatically than the hard-disk-drive industry. Between 1976 and 1992, disk-drive performance improved at a stunning rate: the physical size of a 100-megabyte (MB) system shrank from 5,400 to 8 cubic inches, and the cost per MB fell from $560 to $5.
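Those 1976-1992 figures imply a steady compound rate of improvement, which is what the essay later calls a performance trajectory. A minimal sketch (plain Python; the only inputs are the numbers just quoted, and the compound-rate formula is standard) makes the magnitude concrete:

```python
# Implied compound annual rates of change from the 1976-1992 disk-drive figures
# quoted above: cost per MB fell from $560 to $5, and the volume of a 100 MB
# system shrank from 5,400 to 8 cubic inches.

def annual_rate(start: float, end: float, years: int) -> float:
    """Compound annual rate of change that takes `start` to `end` over `years`."""
    return (end / start) ** (1.0 / years) - 1.0

YEARS = 1992 - 1976  # 16 years

cost_rate = annual_rate(560.0, 5.0, YEARS)    # dollars per MB
size_rate = annual_rate(5400.0, 8.0, YEARS)   # cubic inches per 100 MB system

print(f"cost per MB changed by {cost_rate:+.1%} per year")    # about -25.5% per year
print(f"system volume changed by {size_rate:+.1%} per year")  # about -33.4% per year
```

Sustained rates of this size are what allow an architecture that starts out looking inferior to overtake mainstream requirements within only a few years.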
Technological change, of course, drove these breathtaking achievements. About half of the improvement came from a host of radical advances that were critical to continued improvements in disk-drive performance; the other half came from incremental advances. The pattern in the disk-drive industry has been repeated in many other industries: the leading, established companies have consistently led the industry in developing and adopting new technologies that their customers demanded, even when those technologies required completely different technological competencies and manufacturing capabilities from the ones the companies had. In spite of this aggressive technological posture, no single disk-drive manufacturer has been able to dominate the industry for more than a few years. A series of companies have entered the business and risen to prominence, only to be toppled by newcomers who pursued technologies that at first did not meet the needs of mainstream customers. As a result, not one of the independent disk-drive companies that existed in 1976 survives today. To explain the differences in the impact of certain kinds of technological innovations on a given industry, the concept of performance trajectories (the rate at which the performance of a product has improved, and is expected to improve, over time) can be helpful. Almost every industry has a critical performance trajectory. In mechanical excavators, the critical trajectory is the annual improvement in cubic yards of earth moved per minute. In photocopiers, an important performance trajectory is improvement in the number of copies per minute. In disk drives, one crucial measure of performance is storage capacity, which has advanced 50% each year on average for a given size of drive. Different types of technological innovations affect performance trajectories in different ways. On the one hand, sustaining technologies tend to maintain a rate of improvement; that is, they give customers something more or better in the attributes they already value. For example, thin-film components in disk drives, which replaced conventional ferrite heads and oxide disks between 1982 and 1990, enabled information to be recorded more densely on disks. Engineers had been pushing the limits of the performance they could wring from ferrite heads and oxide disks, but the drives employing these technologies seemed to have reached the natural limits of an S curve. At that point, new thin-film technologies emerged that restored, or sustained, the historical trajectory of performance improvement. On the other hand, disruptive technologies introduce a very different package of attributes from the one mainstream customers historically value, and they often perform far worse along one or two dimensions that are particularly important to those customers. As a rule, mainstream customers are unwilling to use a disruptive product in applications they know and understand. At first, then, disruptive technologies tend to be used and valued only in new markets or new applications; in fact, they generally make possible the emergence of new markets. For example, Sony's early transistor radios sacrificed sound fidelity but created a market for portable radios by offering a new and different package of attributes: small size, light weight, and portability. In the history of the hard-disk-drive industry, the leaders stumbled at each point of disruptive technological change: when the diameter of disk drives shrank from the original 14 inches to 8 inches, then to 5.25 inches, and finally to 3.5 inches.
Each of these new architectures initially offered the market substantially less storage capacity than the typical user in the established market required. For example, the 8-inch drive offered 20 MB when it was introduced, while the primary market for disk drives at that time, mainframes, required 200 MB on average. Not surprisingly, the leading computer manufacturers rejected the 8-inch architecture at first. As a result, their suppliers, whose mainstream products consisted of 14-inch drives with more than 200 MB of capacity, did not pursue the disruptive products aggressively. The pattern was repeated when the 5.25-inch and 3.5-inch drives emerged: established computer makers rejected the drives as inadequate, and, in turn, their disk-drive suppliers ignored them as well. But while they offered less storage capacity, the disruptive architectures created other important attributes: internal power supplies and smaller size (8-inch drives); still smaller size and low-cost stepper motors (5.25-inch drives); and ruggedness, light weight, and low power consumption (3.5-inch drives). From the late 1970s to the mid-1980s, the availability of the three drives made possible the development of new markets for minicomputers, desktop PCs, and portable computers, respectively. Although the smaller drives represented disruptive technological change, each was technologically straightforward. In fact, there were engineers at many leading companies who championed the new technologies and built working prototypes with bootlegged resources before management gave a formal go-ahead. Still, the leading companies could not move the products through their organizations and into the market in a timely way. Each time a disruptive technology emerged, between one-half and two-thirds of the established manufacturers failed to introduce models employing the new architecture, in stark contrast to their timely launches of critical sustaining technologies. Those companies that finally did launch new models typically lagged behind entrant companies by two years, eons in an industry whose products' life cycles are often two years. Three waves of entrant companies led these revolutions; they first captured the new markets and then dethroned the leading companies in the mainstream markets. How could technologies that were initially inferior and useful only to new markets eventually threaten leading companies in established markets? Once the disruptive architectures became established in their new markets, sustaining innovations raised each architecture's performance along steep trajectories, so steep that the performance available from each architecture soon satisfied the needs of customers in the established markets. For example, the 5.25-inch drive, whose initial 5 MB of capacity in 1980 was only a fraction of the capacity that the minicomputer market needed, became fully performance-competitive in the minicomputer market by 1986 and in the mainframe market by 1991. (See the graph "How Disk-Drive Performance Met Market Needs.") A company's revenue and cost structures play a critical role in the way it evaluates proposed technological innovations. Generally, disruptive technologies look financially unattractive to established companies. The potential revenues from the discernible markets are small, and it is often difficult to project how big the markets for the technology will be over the long term.
As a result, managers typically conclude that the technology cannot make a meaningful contribution to corporate growth and, therefore, that it is not worth the management effort required to develop it. In addition, established companies have often installed higher cost structures to serve sustaining technologies than those required by disruptive technologies. As a result, managers typically see themselves as having two choices when deciding whether to pursue disruptive technologies. One is to go downmarket and accept the lower profit margins of the emerging markets that the disruptive technologies will initially serve. The other is to go upmarket with sustaining technologies and enter market segments whose profit margins are alluringly high. (For example, the margins of IBM's mainframes are still higher than those of PCs.) Any rational resource-allocation process in companies serving established markets will choose going upmarket rather than going down. Managers of companies that have championed disruptive technologies in emerging markets look at the world quite differently. Without the high cost structures of their established counterparts, these companies find the emerging markets appealing. Once the companies have secured a foothold in the markets and improved the performance of their technologies, the established markets above them, served by high-cost suppliers, look appetizing. When they do attack, the entrant companies find the established players to be easy and unprepared opponents because the opponents have been looking upmarket themselves, discounting the threat from below. It is tempting to stop at this point and conclude that a valuable lesson has been learned: managers can avoid missing the next wave by paying careful attention to potentially disruptive technologies that do not meet current customers' needs. But recognizing the pattern and figuring out how to break it are two different things. Although entrants invaded established markets with new technologies three times in succession, none of the established leaders in the disk-drive industry seemed to learn from the experiences of those that fell before them. Management myopia or lack of foresight cannot explain these failures. The problem is that managers keep doing what has worked in the past: serving the rapidly growing needs of their current customers. The processes that successful, well-managed companies have developed to allocate resources among proposed investments are incapable of funneling resources into programs that current customers explicitly don't want and whose profit margins seem unattractive. Managing the development of new technology is tightly linked to a company's investment processes. Most strategic proposals, to add capacity or to develop new products or processes, take shape at the lower levels of organizations in engineering groups or project teams. Companies then use analytical planning and budgeting systems to select from among the candidates competing for funds. Proposals to create new businesses in emerging markets are particularly challenging to assess because they depend on notoriously unreliable estimates of market size. Because managers are evaluated on their ability to place the right bets, it is not surprising that in well-managed companies, mid- and top-level managers back projects in which the market seems assured.
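The resource-allocation logic described here can be made concrete with a toy calculation. This is only an illustrative sketch in Python: the $300 million and $50 million market sizes echo the Seagate anecdote quoted below, while the market-share and margin figures are purely assumed inputs, not data from the essay or its sources:

```python
# Toy comparison of two investment proposals, using the kind of spreadsheet
# arithmetic the text describes. Shares and margins are assumed, illustrative inputs.

proposals = {
    # name: (forecast market size $/yr, expected share, gross margin)
    "sustaining (next-generation 5.25-inch line)": (300e6, 0.30, 0.25),
    "disruptive (3.5-inch line)":                  (50e6,  0.30, 0.12),
}

for name, (market, share, margin) in proposals.items():
    expected_profit = market * share * margin
    print(f"{name}: expected gross profit ${expected_profit / 1e6:.1f}M per year")

# sustaining: $22.5M per year; disruptive: $1.8M per year. On current-market
# forecasts the disruptive option looks negligible, which is exactly the dilemma.
```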
By staying close to lead customers, as they have been trained to do, managers focus resources on fulfilling the requirements of those reliable customers that can be served profitably. Risk is reduced, and careers are safeguarded, by giving known customers what they want. Seagate Technology's experience illustrates the consequences of relying on such resource-allocation processes to evaluate disruptive technologies. By almost any measure, Seagate, based in Scotts Valley, California, was one of the most successful and aggressively managed companies in the history of the microelectronics industry: from its inception in 1980, Seagate's revenues had grown to more than $700 million by 1986. It had pioneered 5.25-inch hard-disk drives and was the main supplier of them to IBM and IBM-compatible personal-computer manufacturers. The company was the leading manufacturer of 5.25-inch drives at the time the disruptive 3.5-inch drives emerged in the mid-1980s. Engineers at Seagate were the second in the industry to develop working prototypes of 3.5-inch drives. By early 1985, they had made more than 80 such models with a low level of company funding. The engineers forwarded the new models to key marketing executives, and the trade press reported that Seagate was actively developing 3.5-inch drives. But Seagate's principal customers, IBM and other manufacturers of AT-class personal computers, showed no interest in the new drives. They wanted to incorporate 40-MB and 60-MB drives in their next-generation models, and Seagate's early 3.5-inch prototypes packed only 10 MB. In response, Seagate's marketing executives lowered their sales forecasts for the new disk drives. Manufacturing and financial executives at the company pointed out another drawback to the 3.5-inch drives. According to their analysis, the new drives would never be competitive with the 5.25-inch architecture on a cost-per-megabyte basis, an important metric that Seagate's customers used to evaluate disk drives. Given Seagate's cost structure, margins on the higher-capacity 5.25-inch models therefore promised to be much higher than those on the smaller products. Senior managers quite rationally decided that the 3.5-inch drive would not provide the sales volume and profit margins that Seagate needed from a new product. A former Seagate marketing executive recalled, "We needed a new model that could become the next ST412 [a 5.25-inch drive generating more than $300 million in annual sales, which was nearing the end of its life cycle]. At the time, the entire market for 3.5-inch drives was less than $50 million. The 3.5-inch drive just didn't fit the bill for sales or profits." The shelving of the 3.5-inch drive was not a signal that Seagate was complacent about innovation. Seagate subsequently introduced new models of 5.25-inch drives at an accelerated rate and, in so doing, introduced an impressive array of sustaining technological improvements, even though introducing them rendered a significant portion of its manufacturing capacity obsolete.

Disruptive Technology

Abstract

The objective of this project is to explain the emergence of disruptive technology in the IT industry and how it can enable and help organizations grow in a cost-effective manner. One of the hottest topics in today's IT corridors is the uses and benefits of virtualization technologies.
IT companies all over the globe are adopting virtualization for a diversity of business requirements, driven by the prospect of improving server flexibility and decreasing operational costs. InfoTech Solutions, being a dominant IT solution provider, can benefit broadly from implementing virtualization. This paper is intended to provide the complete details of virtualization, its advantages, and strategies for SMEs to migrate.

Introduction

The 2009 IT buzzword is 'Virtualization'. Small, medium and large business organizations have seriously started to reorganize their e-business strategy around the successful disruptive technology of virtualization. Virtualization of business applications permits IT operations in organizations of all sizes to decrease costs, improve IT services and reduce risk. The most remarkable cost savings come from reduced hardware, space and energy use, and the accompanying productivity gains lead to further savings. In the small business sector, virtualization can be defined as a technology that permits application workloads to be maintained independent of host hardware. Several applications can share a single physical server. Workloads can be moved from one host to another without any downtime. IT infrastructure can be managed as a pool of resources, rather than a collection of physical devices.

Disruptive Technology

Disruptive technology, or disruptive innovation, is an innovation that makes a product or service better by reducing the price or by changing the market dramatically in a way incumbents do not expect. Christensen (2000) stated that ''disruptive technologies are typically simpler, cheaper, and more reliable and convenient than established technologies'' (p. 192). Before doing any research on disruptive technology it is useful and necessary to summarize Christensen's notion of it. Christensen was projected as a "guru" by the business press (Scherreik, 2000). His work has been broadly cited by scholars and researchers working in different disciplines and on different topics, such as new product development and marketing and management strategy. In his book The Innovator's Dilemma (Christensen, 1997), Christensen made significant observations about the circumstances under which established companies or organizations lose a market to an entrant bearing what he called a disruptive technology. This theory became extremely influential in management decision making (Vaishnav, 2008). Christensen's arguments, drawn from his academic papers (Christensen, 1992; Christensen and Rosenbloom, 1995; Christensen, Suarez et al., 1996) rather than from his famous paperbacks (Christensen, 1997; Christensen and Raynor, 2003), explain that the entrant might have an advantage over the incumbent, and that understanding this requires understanding three important forces: technological capability (Henderson and Clark, 1990), organizational dynamics (Anderson and Tushman, 1990), and value (Christensen and Rosenbloom, 1995). He argued further that a company's competitive strategy, and mainly its earlier choices of markets to serve, determines its perceptions of the economic value of a new technology and shapes the rewards it expects to obtain through innovation. Christensen (1995) classifies new technology into two types: sustaining and disruptive.
Sustaining technology depends on incremental improvements to an already established technology, while disruptive technology is new and replaces an established technology unexpectedly. Disruptive technologies may lack refinement and often have performance problems because they are fresh and may not have a proven practical application yet. It takes a lot of time and energy to create something new and innovative that will significantly influence the way that things are done. Most organizations are concerned with maintaining and sustaining their products and technologies instead of creating something new and different that may better the situation. They make minor modifications to improve the current product. These changes give a bit of new life to those products, so that sales increase temporarily and the technology survives a bit longer. Disruptive technologies generally emerge from outside the mainstream. For example, the light bulb was not invented by the candle industry seeking to improve its results. Owners of established technology organizations normally tend to focus on incremental improvements to their existing products and try to avoid potential threats to their business (Techcom, 2004). Compared to sustaining products, disruptive technologies step off in various directions, coming up with ideas that compete with products in the current markets and could potentially replace the mainstream products that are being used. So it is not merely disruption; it is innovation. It is not only replacing what we have now but improving on it, making things better, quicker, and, mostly, cooler. Whether they are called disruptive or innovative, such technologies are turning the "future wave" into reality and have slowly started occupying the world. On one hand, the threat of disruption makes incumbents suspicious about losing the market, while it makes emerging entrants confident of inventing the next disruptive technology. Perhaps such expectations and worries produce more competition in the marketplace. It seems that every year there is a laundry list of products and technologies that are going to "change the world as we know it." One that seems to have the potential to earn the title of a disruptive technology is something that has been around for a while now: virtualization. Gartner (2008) describes a disruptive technology as "causing major change in the accepted way of doing things, including business models, processes, revenue streams, industry dynamics and consumer behaviors". Virtualization is one of the top ten disruptive technologies listed by Gartner (Gartner.com). Virtualization technology is not new to the world. As computers became more common, though, it became obvious that simply time-sharing a single computer was not always ideal, because the system could be misused intentionally or unintentionally and that might bring the entire machine to a halt. To avoid this, the multi-system concept emerged. The multi-system concept provided a lot of advantages in the organizational environment, such as privacy, data security, performance and isolation. For example, in organizational settings it is often necessary to keep certain activities running on different systems. A testing application run on a system may sometimes halt or crash the system completely. So it is obviously better to run such an application on a separate system that won't affect the rest of the network.
On the other hand, placing different applications on the same system may reduce the performance of the system, as they contend for the same available system resources: memory, network input/output, hard disk input/output and priority scheduling (Barham et al., 2003). The performance of the system and its applications will be greatly improved if the applications are placed on different systems so that each can have its own resources. It is very difficult for most organizations to invest in multiple systems, however; at times it is hard to keep all the systems busy to their full potential, they are difficult to maintain, and the asset value keeps depreciating. So investing in multiple systems can become wasteful, even though having multiple systems obviously has its own advantages. Considering this cost and waste, IBM introduced the first virtual machine in the 1960s, which made one system behave as if it were multiple systems. In the beginning, this fresh technology allowed individuals to run multiple applications at the same time, increasing both personal productivity and the computer's ability to multitask. Along with the multitasking capability created by virtualization, it was also a great money saver. The multitasking ability of virtualization, which allowed computers to do more than one task at a time, became more valuable to companies because it let them leverage their investments completely (VMWare.com). Virtualization has recently become a hyped and much discussed topic due to its promising characteristics. Firstly, it has the capacity to use computer resources in a better way, maximizing the company's hardware investment. It is estimated that only 25% of the total resources are utilized in an average data center. Through virtualization, a large number of older systems can be replaced by modern, reliable and scalable enterprise servers, reducing hardware and infrastructure costs significantly. And it is not just server consolidation; virtualization offers much more than that, such as the ability to suspend, resume, checkpoint, and migrate running virtual machines (Chesbrough, 1999a, 1999b). It is exceptionally useful in handling long-running jobs. If a long-running job is assigned to a virtual machine with checkpoints enabled and it stops or hangs, it can be restarted from where it stopped instead of starting from the beginning. The main difference of today's virtualization compared to the older mainframe age is that a virtual machine can be placed in whatever location suits the service; such distributed virtual machines open up a whole range of possibilities, like network monitoring, security policy validation and content distribution (Peterson et al., 2002). The way virtual technology breaks single-operating-system boundaries is what makes it significant enough to fall into the disruptive technology group. It allows users to run multiple applications in multiple operating systems on a single computer simultaneously (VMWare.com, 2009). Basically, this new move takes a single physical server and turns its hardware into software that uses all the available hardware resources to create virtual mirrors of it. The replications created can be used as software-based computers to run multiple applications at the same time. These software-based computers have the complete attributes of physical computers, such as RAM, CPU and a NIC interface. The only difference is that there is one physical system instead of multiple machines running different operating systems; the replications are called guest machines (VMWare.com, 2009).
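The checkpoint-and-restart benefit mentioned above can be illustrated with a small, self-contained sketch. This is not a hypervisor feature implementation, just a plain-Python simulation of a long-running job that periodically saves its state and resumes from the last checkpoint after a crash; the file name, step counts and job logic are illustrative assumptions:

```python
import json
import os

CHECKPOINT = "job_checkpoint.json"  # illustrative file name

def run_long_job(total_steps: int = 1_000_000) -> int:
    """Sum of squares computed step by step, resumable from the last checkpoint."""
    # Resume from the last checkpoint if one exists, otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            state = json.load(f)
    else:
        state = {"step": 0, "total": 0}

    for step in range(state["step"], total_steps):
        state["total"] += step * step
        state["step"] = step + 1
        if state["step"] % 100_000 == 0:          # checkpoint periodically
            with open(CHECKPOINT, "w") as f:
                json.dump(state, f)

    if os.path.exists(CHECKPOINT):                # job finished; clean up
        os.remove(CHECKPOINT)
    return state["total"]

if __name__ == "__main__":
    print(run_long_job())
```

A virtual machine checkpoint works at a lower level (it captures the whole guest's memory and device state rather than one application's variables), but the operational benefit the essay describes is the same: a crash costs only the work done since the last checkpoint.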
Virtual Machine Monitor

Guest virtual machines are hosted by a software layer called a Virtual Machine Monitor, or VMM, which goes hand-in-hand with the virtual machines. In reality, the VMM is referred to as the host and the hosted virtual machines are referred to as guests. The physical resources required by the guests are offered by the software layer of the VMM, or host; the relationship is usually pictured as a layer of guest machines running on top of the VMM, which in turn runs on the physical hardware. The VMM supplies the required virtual versions of the processor and of system devices such as I/O devices, storage and memory. It also provides separation between the virtual machines and their host, so that issues in one cannot affect another.

According to a recent study by Springboard Research, spending related to virtualization software and services will reach US$1.5 billion by the end of 2010. The research also adds that 50% of CIOs are interested in deploying virtualization to overcome issues like poor performance and low capacity utilization and to face the challenges of a growing IT infrastructure. TheInfoPro, a research company, states that more than 50% of new servers installed were based on virtualization, and this number is expected to grow to 80% by the end of 2012. Virtualization will be the highest-impact trend changing infrastructure and operations through 2012. According to Gartner, Inc. (2008), virtualization will transform how IT is bought, planned, deployed and managed by companies. As a result, it is generating a fresh wave of competition among infrastructure vendors that will result in market consolidation over the coming years. The market for PC virtualization is also booming rapidly, expected to grow to 660 million from about 5 million in 2007.

Virtualization strategy for mid-sized businesses

Virtualization has turned out to be a significant IT strategy for small and mid-sized business (SME) organizations. It not only offers cost savings but also answers business continuity issues and allows IT managers to:
• Manage and reduce the downtime caused by planned hardware maintenance, resulting in higher system availability.
• Test, investigate and execute disaster recovery plans.
• Secure the data, with non-destructive backup and restore processes.
• Check the stability of real-time workloads.

In these competitive, demanding times, SME organizations need to simplify their IT infrastructure and cut costs. However, with various storage, server and network requirements, and sometimes without sufficient physical space to house and maintain systems, a company's options can be restricted by both space and budget concerns. Virtualization can offer solutions for these kinds of issues, and SMEs can benefit significantly not only from server consolidation but also from affordable business continuity.

What is virtualization for mid-sized businesses?

In the small business sector, virtualization can be defined as a technology that permits application workloads to be maintained independent of host hardware. Several applications can share a single physical server. Workloads can be moved from one host to another without any downtime. IT infrastructure can be managed as a pool of resources, rather than a collection of physical devices. It is often assumed that virtualization is just for large enterprises, but in fact it is not.
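The consolidation economics behind these claims can be sketched in a few lines of arithmetic. Every number below (server counts, utilisation, per-server cost, migration cost) is an assumed, illustrative input rather than a figure from the essay or its sources; the point is only to show the shape of the calculation an SME would run:

```python
import math

# Back-of-the-envelope server-consolidation estimate.
# All inputs are assumed, illustrative numbers; replace with real figures before use.

physical_servers = 40            # existing hosts, each lightly loaded
avg_utilisation = 0.15           # average utilisation per existing host
target_utilisation = 0.60        # planned utilisation per virtualisation host
cost_per_server_year = 2_500.0   # power, cooling, space, support per host per year
migration_cost = 30_000.0        # one-off licences, hardware and project work

# Hosts needed to carry the same total load at the target utilisation.
total_load = physical_servers * avg_utilisation
hosts_after = math.ceil(total_load / target_utilisation)

annual_saving = (physical_servers - hosts_after) * cost_per_server_year
payback_years = migration_cost / annual_saving

print(f"hosts after consolidation: {hosts_after}")     # 10
print(f"annual saving: ${annual_saving:,.0f}")         # $75,000
print(f"payback period: {payback_years:.2f} years")    # 0.40 years
```

With assumptions in this range the payback lands well under a year, which is consistent with the ROI claims cited in the next section; with different inputs the conclusion can change, which is exactly why the estimate is worth running.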
Although virtualization is often assumed to be only for large enterprises, it is in fact a widely established technology that reduces hardware requirements, increases the utilization of hardware resources, simplifies management and lowers energy consumption.

Economics of virtualization for the midmarket

Research by VMWare.com (2009) shows that SMEs that invested in a virtualization strategy received their return on investment (ROI) in less than a year; in some cases, with the latest Intel Xeon 5500 series processors, this can be less than seven months (http://www-03.ibm.com/systems/resources/6412_Virtualization_Strategy_-_US_White_Paper_-_Apr_24-09.pdf [accessed on 04/09/09]). The image below shows how virtualization simplified a large utility company's infrastructure of 1,000 systems, with their racks and cabling, into a dramatically simpler form. Source: http://www-03.ibm.com/systems/resources/6412_Virtualization_Strategy_-_US_White_Paper_-_Apr_24-09.pdf [accessed on 04/09/09]

Virtualization SME advantages

1. A virtualization and management suite provides a flexible, low-cost development platform and a highly capable environment.
2. Virtualization provides the ability to move live virtual machines between physical hosts. This ability brings numerous advantages, such as business continuity, disaster recovery, workload balancing and even energy savings, by allowing running applications to be moved between physical servers without disturbing the service.
3. Virtualization can help take full advantage of the value of IT spending:
• Business agility in changing markets
• A flexible IT infrastructure that can scale with business growth
• High performance that can handle the most demanding applications
• An industry-standard platform architecture with intelligent management tools
• Servers with enterprise attributes, regardless of their size or form factor
4. Virtualization can help improve IT services:
• Workloads can be provisioned rapidly through automated processes, reducing set-up times from weeks or days to minutes.
• IT responsiveness to business needs improves.
• Planned downtime can be eliminated by shifting running workloads to other hosts during maintenance.
• Unplanned downtime is greatly reduced, and can even be eliminated.
• Costs for technical support, training and maintenance fall.

Conclusion:

This is the right time for small and mid-sized businesses such as InfoTech Solutions to implement a virtualization strategy. Virtualization is a significant element of IT strategy for businesses of all sizes, with a wide range of benefits and advantages. It can help InfoTech Solutions build an IT infrastructure with enterprise-class capabilities and a rapid return on investment. It is expected that more than 80% of organizations will have implemented virtualization by the end of 2012, so SME organizations like InfoTech Solutions should seriously consider virtualization as part of their e-business strategy or risk being left behind by their competitors.

References

1. Adner, Ron (2002). When Are Technologies Disruptive? A Demand-Based View of the Emergence of Competition. Strategic Management Journal 23(8): 667–688.
2. Anderson, P. and M. L. Tushman (1990). "Technological Discontinuities and Dominant Designs: A Cyclical Model of Technological Change." Administrative Science Quarterly 35(4): 604–633.
3. Barham, P., B. Dragovic, K. Fraser, S. Hand, T. Harris, A. Ho, R. Neugebauer, I. Pratt, and A. Warfield (2003). Xen and the Art of Virtualization. In Proc. 19th SOSP, October 2003.
4. Chesbrough, Henry (1999a).
Arrested Development: The Experience of European Hard-Disk-Drive Firms in Comparison with U.S. and Japanese Firms. Journal of Evolutionary Economics 9(3): 287–329.
5. Chintan Vaishnav (2008). Does Technology Disruption Always Mean Industry Disruption? Massachusetts Institute of Technology.
6. Christensen, Clayton M. (2000). The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press.
7. Christensen, C. M. (1992). "Exploring the Limits of the Technology S-Curve: Architectural Technologies." Production and Operations Management 1(4).
8. Christensen, C. M. and R. S. Rosenbloom (1995). "Explaining the Attacker's Advantage: Technological Paradigms, Organizational Dynamics, and the Value Network." Research Policy 24(2): 233–257.
9. Christensen, C. M., F. F. Suarez, et al. (1996). Strategies for Survival in Fast-Changing Industries. Cambridge, MA: International Center for Research on the Management of Technology.
10. Christensen, C. M. (1992). "Exploring the Limits of the Technology S-Curve: Component Technologies." Production and Operations Management 1(4).
11. Christensen, C. M. (1997). The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail. Boston, MA: Harvard Business School Press.
12. Christensen, C. M. and M. E. Raynor (2003). The Innovator's Solution: Creating and Sustaining Successful Growth. Boston, MA: Harvard Business School Press.
13. Cohan, Peter S. (2000). The Dilemma of the "Innovator's Dilemma": Clayton Christensen's Management Theories Are Suddenly All the Rage, but Are They Ripe for Disruption? Industry Standard, January 10, 2000.
14. Gartner Says. http://www.gartner.com/it/page.jsp?id=638207 [accessed on 04/09/09]
15. Henderson, R. M. and K. B. Clark (1990). "Architectural Innovation: The Reconfiguration of Existing Product Technologies and the Failure of Established Firms." Administrative Science Quarterly 35(1): 9–30.
16. MacMillan, Ian C. and McGrath, Rita Gunther (2000). Technology Strategy in Lumpy Market Landscapes. In: Wharton on Managing Emerging Technologies. G. S. Day, P. J. H. Schoemaker, and R. E. Gunther (eds.). New York: Wiley, 150–171.
17. Scherreik, Susan (2000). When a Guru Manages Money. Business Week, July 31, 2000.
18. Peterson, L., T. Anderson, D. Culler, and T. Roscoe (2002). "A Blueprint for Introducing Disruptive Technology into the Internet." In Proceedings of HotNets-I, Princeton, NJ, October 2002.
19. "Virtualization Basics." VMWare.com. http://www.vmware.com/virtualization/ [accessed on 04/09/09]

LDH Purification Lab Report

Abstract

The enzyme lactate dehydrogenase (LDH) catalyzes the last step of anaerobic glycolysis and is important for the normal function of the body. Purification of LDH is essential for understanding its structure and function. The purpose of this experiment was to extract and purify LDH from chicken muscle tissue using a variety of techniques. Analytical methods such as activity and protein assays were employed to determine the presence and purity of LDH. The cells were first disrupted and the proteins solubilized; LDH was then purified from the ammonium sulfate-precipitated protein mixture by affinity chromatography, and its activity was followed by spectrophotometric determination of NADH at 340 nm. The Pierce BCA assay of the crude homogenate gave an initial protein concentration of 100 mg/ml, while the final protein concentration of the pooled affinity sample was 0.2 mg/ml. The specific activity of the purified LDH was 58.5 µmol/min/mg, with a yield of 0.6%. Although the LDH was successfully purified, further steps could be taken to increase the yield.

Materials and Methods

Cell Lysis and Extraction of LDH: Approximately 40 g of minced chicken breast meat (40.327 g) was blended with 75 ml of cold extraction buffer in four 30-second bursts to homogenize the muscle tissue. The extraction buffer contained 10 mM Tris-HCl (pH 7.4), 1 mM 2-mercaptoethanol, 1 mM phenylmethylsulfonyl fluoride (PMSF) and 1 mM ethylenediaminetetraacetic acid (EDTA). Homogenization was carried out in the cold room to prevent denaturation of proteins. The homogenate was centrifuged at 15,000 rpm for 20 minutes at 4 °C, and the supernatant was filtered through two layers of cheesecloth to remove lipids. The total volume was noted and three 0.5 ml aliquots (crude extract) were stored at -20 °C.

Ammonium Sulfate Precipitation: Proteins were precipitated at 60% ammonium sulfate saturation. Ammonium sulfate (0.39 g per ml of supernatant) was added gradually over 15-20 minutes with continuous gentle stirring at 4 °C. The mixture was centrifuged for 20 minutes at 15,000 rpm at 4 °C; the supernatant was discarded and the pellet stored at -20 °C.

Dialysis: Ammonium sulfate precipitation leaves a high concentration of salt in the protein mixture that can interfere with further purification steps, so dialysis was performed to remove the excess salt. The pellet was suspended in Tris-PMSF buffer (10 mM Tris-HCl, pH 8.6, 0.5 mM 2-mercaptoethanol and 1 mM EDTA) and mixed very gently at 4 °C until it dissolved. A 4 ml volume of the protein mixture was placed in dialysis tubing and incubated overnight twice, with two 1 L changes of the same buffer used for cell lysis. After the two incubations, the protein mixture was gently resuspended and centrifuged for 10 minutes at 15,000 rpm at 4 °C. The pellet was discarded, the total volume of the supernatant was noted, and three 0.1 ml aliquots were collected.

Affinity Chromatography: A Cibacron Blue column was used to separate LDH from the other proteins. Thirteen 5 ml fractions were collected. The column was first rinsed with Tris-PMSF buffer, followed by addition of the protein mixture. Then 10 ml of NAD+ buffer (10 mM Tris-HCl, pH 8.6, 0.5 mM 2-mercaptoethanol, 1 mM lithium acetate and 1 mM NAD+) was added, followed by 10 ml of NADH buffer (10 mM Tris-HCl, pH 8.6, 1 mM NADH and 0.5 mM 2-mercaptoethanol). Between each step the column was washed with 10 ml of Tris-PMSF buffer. Each fraction was monitored by absorbance at 280 nm; fractions with an absorbance above 1.5 were diluted 1:10 before re-reading.

Activity Assay: An LDH enzyme assay was used to measure LDH activity in the protein samples. LDH catalyzes the conversion of lactate to pyruvate and of NAD+ to NADH, and the NADH formed can be determined spectrophotometrically at 340 nm. The assay was performed on the crude homogenate, the desalted fraction and six peak fractions from the Cibacron Blue column. A cocktail was prepared by mixing lactate stock solution (120 mM lithium lactate, 10 mM Tris-HCl, pH 8.6), NAD+ stock solution (12 mM NAD+, 10 mM Tris-HCl, pH 8.6) and bicarbonate stock solution (18 mM NaHCO3, 0.5 M NaCl) in a 6:4:2 ratio in a cuvette. Ten microliters of sample was then added and the absorbance was followed at 340 nm. Samples with an absorbance above 1.5 were diluted.
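As a rough illustration of how the rate measurements translate into activity, the following Python sketch converts an observed change in A340 per minute into µmol of NADH formed per minute per ml of sample, using the widely used molar extinction coefficient of NADH at 340 nm (6.22 mM⁻¹ cm⁻¹) and a 1 cm path length. The assay volume of 1.21 ml (0.6, 0.4 and 0.2 ml of the three stocks plus 10 µl of sample) and the example rate are assumptions made for the sake of the example, not measured values from this report.

# Illustrative only: converting a measured rate of A340 increase into LDH activity.
# Assumes a 1 cm path length and epsilon(NADH, 340 nm) = 6.22 per mM per cm.
EXT_COEFF_NADH = 6.22   # mM^-1 cm^-1
PATH_LENGTH_CM = 1.0

def ldh_activity_per_ml(delta_a340_per_min, assay_volume_ml, sample_volume_ml, dilution=1):
    """Return activity in umol NADH per minute per ml of (undiluted) sample."""
    # Rate of NADH formation in the cuvette, in mM/min (equivalent to umol/ml/min).
    nadh_mM_per_min = delta_a340_per_min / (EXT_COEFF_NADH * PATH_LENGTH_CM)
    # Total umol of NADH formed per minute in the whole assay volume.
    umol_per_min = nadh_mM_per_min * assay_volume_ml
    # Normalize to 1 ml of the original, undiluted sample.
    return umol_per_min * dilution / sample_volume_ml

# Example with invented numbers: 0.25 A340 units/min, 1.21 ml assay, 10 ul of sample.
print(ldh_activity_per_ml(0.25, 1.21, 0.010))   # roughly 4.9 umol/min/ml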
Protein Assay: The Pierce BCA Protein Assay (Thermo Scientific) is a detergent-compatible formulation based on bicinchoninic acid (BCA) for the colorimetric detection and quantitation of total protein. A series of bovine serum albumin (BSA) standards ranging from 0 to 2,000 µg/ml was prepared from a 2 mg/ml BSA stock solution. 25 µl of diluted crude homogenate (1:500, 1:250), desalted sample (1:100, 1:50) and the six peak fractions from the Cibacron Blue column (1:10, 1:5) were loaded into a microplate along with 175 µl of BCA working reagent. The plate was incubated for 30 minutes at 37 °C and the absorbance was measured at 562 nm.
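The standard curve step can be summarized with a short sketch. Assuming the BSA standards behave approximately linearly over this range, a least-squares line of A562 against concentration can be inverted to estimate the protein concentration of each diluted sample. The standard readings and the sample absorbance below are invented placeholders, not the data from this experiment.

# Illustrative only: fitting a BSA standard curve (A562 vs. ug/ml) and using it
# to estimate the protein concentration of a diluted sample. All numbers are invented.
import numpy as np

bsa_ug_per_ml = np.array([0, 125, 250, 500, 1000, 2000])
a562_standards = np.array([0.09, 0.18, 0.27, 0.45, 0.80, 1.50])

# Least-squares straight line: A562 = slope * concentration + intercept.
slope, intercept = np.polyfit(bsa_ug_per_ml, a562_standards, 1)

def protein_mg_per_ml(a562_sample, dilution_factor):
    """Convert a sample's A562 back to mg/ml of the undiluted sample."""
    ug_per_ml_diluted = (a562_sample - intercept) / slope
    return ug_per_ml_diluted * dilution_factor / 1000.0

# Example: a 1:500 dilution of crude homogenate reading A562 = 0.23.
print(round(protein_mg_per_ml(0.23, 500), 1))  # estimated mg/ml of the crude extract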
Results/Discussion

The purpose of this experiment was to extract and purify LDH from chicken muscle tissue using homogenization, ammonium sulfate precipitation, dialysis and affinity chromatography. Activity and protein assays were used to track the overall amount of LDH present in the samples.

Crude Extraction: Chicken muscle tissue was homogenized in a blender with cold extraction buffer to lyse the cells, releasing LDH into a slurry of tissue components. Centrifugation moved membranes, nuclei and other large cellular components into a pellet, leaving a supernatant of crude product. Controlling temperature was a major consideration after homogenization: this step releases not only proteins such as LDH, but also proteases that can degrade LDH. Keeping samples on ice, pre-cooling the buffer and avoiding excess kinetic energy through conservative blending were used to minimize protease activity. After filtration through cheesecloth, the final volume of crude homogenate was 74 ml, considerably more than expected. Adding more than 75 ml of buffer could have increased the volume; another possible explanation is that more solid components such as fat were present in the sample, so that more than 20 minutes of centrifugation would have been required.

Desalted Sample: Ammonium sulfate was added to the crude extract to 60% saturation to precipitate LDH. The resulting pellet theoretically contains most of the original LDH, and it was re-suspended in a small volume (4 ml) to create a more concentrated sample. This process leaves a high concentration of salt in the protein mixture, which can interfere with subsequent purification steps, so the 4 ml of protein mixture was dialyzed. The final volume after dialysis was 6 ml. One possible explanation for this increase in volume is that extraction buffer mixed with the protein sample, either through a leak in the tubing or because the tubing clips were not tightened properly.

Affinity Chromatography: The Cibacron Blue column is an affinity column specific for dehydrogenase-type proteins, because a compound structurally similar to NADH is covalently attached to the column matrix. Thirteen fractions were collected and their absorbance was measured at 280 nm to check for the presence of protein. A 1:10 dilution was performed whenever the absorbance exceeded 1.5, since readings above this level indicate saturation, with less than 1% of the light reaching the detector. During loading of the protein mixture (fraction 4), a very high absorbance of about 10 was obtained (Fig. 1), probably because many non-dehydrogenase proteins in the sample eluted first. A second peak appeared after NAD+ was added, since the NAD+ wash removes loosely bound protein, and a third peak appeared after NADH was added, since the NADH wash releases the bulk of the bound LDH (Fig. 1).

Enzyme Activity Assay: LDH activity was measured spectrophotometrically by following the absorbance of NADH at 340 nm. Three peak fractions were selected based on their absorbance values after the NAD+ wash (fractions 6, 7 and 8) and three after the NADH wash (fractions 10, 11 and 12). A very high activity of 141 µmol/min/ml was seen in fraction 7 (PF1), indicating that a large amount of protein was present in that sample. A second activity peak was seen at fraction 10, indicating that more LDH was present there than in fraction 11 (PF2) (Fig. 1). Based on this information, fraction 10 was selected for the protein assay. The desalted sample showed the highest activity of all samples (Table 1), possibly because of the errors during dialysis described above.

Figure 1. Absorbance readings of eluates from affinity chromatography, with LDH activity for the six peak fractions. The desalted fraction was loaded onto the Cibacron Blue column and proteins were eluted with Tris-PMSF, NAD+ and NADH washes in sequence. The absorbance at 280 nm was measured for each collected fraction, and LDH activity was calculated from the absorbance values obtained at 340 nm.

Protein Assay: The Pierce BCA assay was used to determine the protein concentration of each sample. A BSA standard curve was created from a series of dilutions ranging from 0 to 2,000 µg/ml, and the equation of the fitted line was used to calculate the protein concentration of each sample (Table 1). As Table 1 shows, the protein concentration decreases with each purification step as the sample becomes more enriched in LDH. Specific activity should increase, and total activity should decrease, with every purification step as the samples become purer, and a similar trend was observed in this study. The exception is PF1, whose higher specific activity reflects its very high activity, suggesting that more loosely bound proteins were eluted after the NAD+ wash.
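For completeness, the quantities reported in the purification summary follow from a few simple relations, sketched below in Python. The pooled-fraction protein concentration (0.2 mg/ml) and specific activity (58.5 µmol/min/mg) are taken from the abstract; the 11.7 µmol/min/ml activity is simply back-calculated from those two numbers, and the crude-extract values and volumes are invented placeholders, so the printed yield is only an example of the calculation, not the reported 0.6%.

# Illustrative only: how specific activity, total activity and percent yield are derived.
def specific_activity(activity_umol_min_ml, protein_mg_ml):
    """Specific activity in umol/min per mg of protein."""
    return activity_umol_min_ml / protein_mg_ml

def total_activity(activity_umol_min_ml, volume_ml):
    """Total activity in umol/min for the whole sample."""
    return activity_umol_min_ml * volume_ml

def percent_yield(total_activity_step, total_activity_crude):
    """Yield of a purification step relative to the crude extract."""
    return 100.0 * total_activity_step / total_activity_crude

# Placeholder example: crude extract (74 ml) vs. pooled affinity fraction (5 ml).
crude_total = total_activity(30.0, 74.0)      # umol/min, invented crude values
pooled_total = total_activity(11.7, 5.0)      # umol/min, back-calculated activity
print(specific_activity(11.7, 0.2))           # 58.5 umol/min/mg with these numbers
print(round(percent_yield(pooled_total, crude_total), 1))  # example percent yield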