
  1. The popular theatrical shows of the 1850s and 1860s were often lewd, and designed for a male audience. The shows presented a variety of entertainment: dancing girls, comics, singers and musicians ...

  2. Early Movie Audiences. Beginning in the late 1890s, film was becoming the new popular entertainment in cities and towns across the United States. The first film screening in America took place in ...

  4. Sep 29, 2023 · Today the American industry is still thriving. There is a concept of the so-called Hollywood Institute of Movie Stars, which includes many of the brightest actors of the 20th century. In the 21st century, the U.S. is rightly a trendsetter in film fashion. It is in Hollywood where new modern genres of cinema and innovative technology originated.

    In the last years of the 20th century and the early years of the 21st century, the idea of “synergy” dominated the motion-picture industry in the United States, and an unprecedented wave of mergers and acquisitions pursued this ultimately elusive concept. Simply put, synergy implied that consolidating related media and entertainment properties under a single umbrella could strengthen every facet of a coordinated communications empire. Motion pictures, broadcast television, cable and satellite systems, radio networks, theme parks, newspapers and magazines, book publishers, manufacturers of home entertainment products, sports teams, Internet service providers—these were among the different elements that came together in various corporate combinations under the notion that each would boost the others. News Corporation Ltd., originally an Australian media company, started the trend by acquiring Twentieth Century–Fox in 1985. The Japanese manufacturing giant Sony Corporation acquired Columbia Pictures Entertainment, Inc., from The Coca-Cola Company in 1989. Another Japanese firm, Matsushita, purchased Universal Studios (as part of Music Corporation of America, or MCA) in 1990; it then was acquired by Seagram Company Ltd. (1995), became part of Vivendi Universal Entertainment (2000), and merged with the National Broadcasting Co., Inc. (2004), a subsidiary of the Comcast Corporation. Paramount Pictures, as Paramount Communications, Inc., became part of Viacom Inc. In perhaps the most striking of all ventures, Warner Communications merged with Time Inc. to become Time Warner Inc., which in turn came together with the Internet company America Online (AOL) to form AOL Time Warner in 2001. The company then changed its name again, back to Time Warner Inc., in 2003; it was purchased by AT&T in 2018 and renamed WarnerMedia. 
The Disney Company itself became an acquirer, adding Miramax Films, the television network American Broadcasting Company, the cable sports network ESPN, and, in 2019, 20th Century Fox, among other properties. The volume of corporate reshuffling and realignment had an undoubted impact on the studios involved. Nevertheless, the potential for success of such synergistic entities—and, more particularly, the positive or negative effect on their motion-picture units—remained an open question.

    It could well be argued, however, that motion-picture companies’ corporate links with the wider media world and emergent communications forms such as the Internet fostered receptivity to new technologies that rapidly transformed film production in the 1990s and into the 21st century. As early as 1982, the Disney film Tron made extensive use of computer-generated images, which were introduced in a short special-effects sequence in which a human character is deconstructed into electronic particles and reassembled inside a computer. A few years later computer-generated imagery was greatly facilitated when it became possible to transfer film images into a computer and manipulate them digitally. The possibilities became apparent in director James Cameron’s Terminator 2: Judgment Day (1991), in images of the shape-changing character T-1000.

    In the 1990s computer-generated imagery (CGI) made rapid strides and became a standard feature not only of Hollywood action-adventure films but also of nearly any work that required special visual effects. Examples of landmark films utilizing the new technologies included Steven Spielberg’s Jurassic Park (1993); Independence Day (1996), directed by Roland Emmerich; and The Matrix (1999), written and directed by Larry (later Lana) Wachowski and Andy (later Lilly) Wachowski. In Spielberg’s film, based on a best-selling novel by Michael Crichton, a number of long-extinct dinosaur species are re-created through genetic engineering. At the special-effects firm Industrial Light and Magic, models of the dinosaurs were scanned into computers and animated realistically to produce the first computer-generated images of lifelike action, rather than fantasy scenes. In Independence Day, a film combining the science-fiction and disaster genres in which giant alien spaceships attack Earth, an air battle was programmed in a computer so that each individual aircraft maneuvered, fired its weapons, and dueled with other flying objects in intricate patterns of action that would have been too time-consuming and costly to achieve by traditional special-effects means. By the end of the 1990s, the developing new technologies were displayed perhaps more fully than ever before in the Wachowskis’ spectacular film, in which the computer functions as both a central subject and a primary visual tool. For a scene in which actor Keanu Reeves appears to be dodging bullets that pass by him in a visible slow-motion trajectory, a computer program determined what motion-picture and still images were to be photographed, and then the computer assembled the images into a complete visual sequence.

    In part through the expensive and lavish effects attained through the new technologies, American cinema at the end of the 20th century sustained and even widened its domination of the world film marketplace. Domestically, the expansion of ancillary products and venues—which during the 1990s were dominated by the sale and rental of video cassettes and then DVDs for home viewing as well as by additional cable and satellite outlets for movie presentation—produced new revenues that were becoming equal to, or in some cases more important than, income from theatrical exhibition. Nevertheless, exhibition outlets continued to grow, with new “megaplex” theatres offering several dozen cinemas, while distribution strategies called for opening major commercial films on 1,000 or more—sometimes as many as 3,000 by the late 1990s—screens across the country. The competition for box-office returns became something of a spectator sport, with the media reporting every Monday on the previous weekend’s multimillion-dollar grosses and ranking the top-10 films by ticket sales. The exhibition environment seemed to demand more than ever that film production be geared to the tastes of teenage spectators who frequented the suburban mall cinemas on weekends, and commentators within the industry as well as outside it observed what they regarded as the diminished quality of mainstream films. As if reflecting that judgment, in 1996 only one major studio film, Jerry Maguire, was among the five nominees for best picture at the annual Academy of Motion Picture Arts and Sciences awards ceremony (the other nominees were an American independent film, Fargo; an Australian work, Shine; a film from Britain, Secrets & Lies; and the winner, an international production with British stars and based on a novel written by a Canadian, The English Patient).

    The motion-picture industry’s emphasis on pleasing the youth audience with special-effects-laden blockbusters and genre works such as teen-oriented horror films and comedies inevitably diminished the role of directors as dominant figures in the creative process, further reducing the status that Hollywood directors had attained in the auteur-oriented 1960s and ’70s. Still, more than a handful of filmmakers, several of them veterans of that earlier era, maintained their prestige as artists practicing in a commercial medium. Two of the most prominent, who had launched their careers in the early 1970s, were Steven Spielberg and Martin Scorsese. In addition to Jurassic Park, Spielberg’s works in the 1990s included Schindler’s List (1993, winner of an Academy Award for best picture), Amistad (1997), and Saving Private Ryan (1998), with A.I. Artificial Intelligence (2001), Munich (2005), Lincoln (2012), and Bridge of Spies (2015) among his subsequent films. Scorsese directed GoodFellas (1990), The Age of Innocence (1993), Casino (1995), Kundun (1997), Gangs of New York (2002), The Departed (2006; winner of an Academy Award for best picture), and The Irishman (2019), the latter of which made use of CGI to make veteran actors look decades younger.

    The actor-director Clint Eastwood was also prolific in this period, winning the best picture Academy Award with Unforgiven (1992) and directing such other films as Mystic River (2003), Million Dollar Baby (2004; Academy Award for best picture and best director), Letters from Iwo Jima (2006), Gran Torino (2008), Invictus (2009), American Sniper (2014), Sully (2016), and The Mule (2018). Stanley Kubrick died before the release of Eyes Wide Shut (1999), his first film since Full Metal Jacket (1987). Two decades passed between Terrence Malick’s Days of Heaven (1978) and The Thin Red Line (1998), but he became more prolific after the turn of the 21st century, directing The New World (2005), The Tree of Life (2011), Knight of Cups (2015), and A Hidden Life (2019).

  5. The early years of the 20th century, before World War I, continued to see realism as the main development in drama. But starting around 1900, there was a revival of poetic drama in the States, corresponding to a similar revival in Europe (e.g. Yeats, Maeterlinck and Hauptmann).

  6. Western theatre - American, Drama, Performance: The growth of the early American theatre owed more to its actors than to its dramatists. In the early decades of the 19th century, the finest English actors, notably Edmund Kean, William Charles Macready, and Charles Kemble, visited the United States and provided a stimulus for the local actors with whom they worked. Before long, the gesture was ...

  7. By 1908 there were thousands of storefront Nickelodeons, Gems and Bijous across North America. A few theaters from the nickelodeon era are still showing films today. The 1913 opening of the Regent Theater in New York City signaled a new respectability for the medium, and the start of the two-decade heyday of American cinema design.
