Yahoo Web Search

Search results

  1. Sep 29, 2023 · The growth of American film is a remarkable story of innovation, reflecting the country’s evolving cultural landscapes and technical breakthroughs. From the early 1900s’ flickering silent films to the stunning digital effects of the twenty-first century, the silver screen has been a mirror to society’s ...

  2. Charts the rise of film in early twentieth-century America from its origins to 1960, exploring mainstream trends and developments, along with topics often relegated to the margins of standard film histories.

  3. Deeming the mid-1980s through 2010 to be the “Sundance-Miramax” era in independent-film history, the author explores distinctions between indie and mainstream production as perceived by American moviegoers and analyzes marketing and exhibition techniques.

  5. Dec 5, 1994 · Movie-Made America: A Cultural History of American Movies, by Robert Sklar.

    In the last years of the 20th century and the early years of the 21st century, the idea of “synergy” dominated the motion-picture industry in the United States, and an unprecedented wave of mergers and acquisitions pursued this ultimately elusive concept. Simply put, synergy implied that consolidating related media and entertainment properties under a single umbrella could strengthen every facet of a coordinated communications empire. Motion pictures, broadcast television, cable and satellite systems, radio networks, theme parks, newspapers and magazines, book publishers, manufacturers of home entertainment products, sports teams, Internet service providers—these were among the different elements that came together in various corporate combinations under the notion that each would boost the others.

    News Corporation Ltd., originally an Australian media company, started the trend by acquiring Twentieth Century–Fox in 1985. The Japanese manufacturing giant Sony Corporation acquired Columbia Pictures Entertainment, Inc., from The Coca-Cola Company in 1989. Another Japanese firm, Matsushita, purchased Universal Studios (as part of Music Corporation of America, or MCA) in 1990; Universal then was acquired by Seagram Company Ltd. (1995), became part of Vivendi Universal Entertainment (2000), and merged with the National Broadcasting Co., Inc. (2004), then a subsidiary of the General Electric Co.; the resulting NBC Universal was later acquired by the Comcast Corporation. Paramount Pictures, as Paramount Communications, Inc., became part of Viacom Inc. In perhaps the most striking of all ventures, Warner Communications merged with Time Inc. to become Time Warner Inc., which in turn came together with the Internet company America Online (AOL) to form AOL Time Warner in 2001. The company then changed its name back to Time Warner Inc. in 2003; it was purchased by AT&T in 2018 and renamed WarnerMedia.
The Disney Company itself became an acquirer, adding Miramax Films, the television network American Broadcasting Company, the cable sports network ESPN, and, in 2019, 20th Century Fox, among other properties. The volume of corporate reshuffling and realignment had an undoubted impact on the studios involved. Nevertheless, the potential for success of such synergistic entities—and, more particularly, the positive or negative effect on their motion-picture units—remained an open question.

    It could well be argued, however, that motion-picture companies’ corporate links with the wider media world and emergent communications forms such as the Internet fostered receptivity to new technologies that rapidly transformed film production in the 1990s and into the 21st century. As early as 1982, the Disney film Tron made extensive use of computer-generated images, which were introduced in a short special-effects sequence in which a human character is deconstructed into electronic particles and reassembled inside a computer. A few years later computer-generated imagery was greatly facilitated when it became possible to transfer film images into a computer and manipulate them digitally. The possibilities became apparent in director James Cameron’s Terminator 2: Judgment Day (1991), in images of the shape-changing character T-1000.

    In the 1990s computer-generated imagery (CGI) made rapid strides and became a standard feature not only of Hollywood action-adventure films but also of nearly any work that required special visual effects. Examples of landmark films utilizing the new technologies included Steven Spielberg’s Jurassic Park (1993); Independence Day (1996), directed by Roland Emmerich; and The Matrix (1999), written and directed by Larry (later Lana) Wachowski and Andy (later Lilly) Wachowski. In Spielberg’s film, based on a best-selling novel by Michael Crichton, a number of long-extinct dinosaur species are re-created through genetic engineering. At the special-effects firm Industrial Light and Magic, models of the dinosaurs were scanned into computers and animated realistically to produce the first computer-generated images of lifelike action, rather than fantasy scenes. In Independence Day, a film combining the science-fiction and disaster genres in which giant alien spaceships attack Earth, an air battle was programmed in a computer so that each individual aircraft maneuvered, fired its weapons, and dueled with other flying objects in intricate patterns of action that would have been too time-consuming and costly to achieve by traditional special-effects means. By the end of the 1990s, the developing new technologies were displayed perhaps more fully than ever before in the Wachowskis’ spectacular film, in which the computer functions as both a central subject and a primary visual tool. For a scene in which actor Keanu Reeves appears to be dodging bullets that pass by him in a visible slow-motion trajectory, a computer program determined what motion-picture and still images were to be photographed, and then the computer assembled the images into a complete visual sequence.

    In part through the expensive and lavish effects attained through the new technologies, American cinema at the end of the 20th century sustained and even widened its domination of the world film marketplace. Domestically, the expansion of ancillary products and venues—which during the 1990s were dominated by the sale and rental of video cassettes and then DVDs for home viewing as well as by additional cable and satellite outlets for movie presentation—produced new revenues that were becoming equal to, or in some cases more important than, income from theatrical exhibition. Nevertheless, exhibition outlets continued to grow, with new “megaplex” theatres offering several dozen cinemas, while distribution strategies called for opening major commercial films on 1,000 or more—sometimes as many as 3,000 by the late 1990s—screens across the country. The competition for box-office returns became something of a spectator sport, with the media reporting every Monday on the previous weekend’s multimillion-dollar grosses and ranking the top-10 films by ticket sales. The exhibition environment seemed to demand more than ever that film production be geared to the tastes of teenage spectators who frequented the suburban mall cinemas on weekends, and commentators within the industry as well as outside it observed what they regarded as the diminished quality of mainstream films. As if reflecting that judgment, in 1996 only one major studio film, Jerry Maguire, was among the five nominees for best picture at the annual Academy of Motion Picture Arts and Sciences awards ceremony (the other nominees were an American independent film, Fargo; an Australian work, Shine; a film from Britain, Secrets & Lies; and the winner, an international production with British stars and based on a novel written by a Canadian, The English Patient).

    The motion-picture industry’s emphasis on pleasing the youth audience with special effects-laden blockbusters and genre works such as teen-oriented horror films and comedies inevitably diminished the role of directors as dominant figures in the creative process, further reducing the status that Hollywood directors had attained in the auteur-oriented 1960s and ’70s. Still, more than a handful of filmmakers, several of them veterans of that earlier era, maintained their prestige as artists practicing in a commercial medium. Two of the most prominent, who had launched their careers in the early 1970s, were Steven Spielberg and Martin Scorsese. In addition to Jurassic Park, Spielberg’s works in the 1990s included Schindler’s List (1993, winner of an Academy Award for best picture), Amistad (1997), and Saving Private Ryan (1998), with A.I. Artificial Intelligence (2001), Munich (2005), Lincoln (2012), and Bridge of Spies (2015) among his subsequent films. Scorsese directed GoodFellas (1990), The Age of Innocence (1993), Casino (1995), Kundun (1997), Gangs of New York (2002), The Departed (2006; winner of an Academy Award for best picture), and The Irishman (2019), the latter of which made use of CGI to make veteran actors look decades younger.

    The actor-director Clint Eastwood was also prolific in this period, winning the best picture Academy Award with Unforgiven (1992) and directing such other films as Mystic River (2003), Million Dollar Baby (2004; Academy Award for best picture and best director), Letters from Iwo Jima (2006), Gran Torino (2008), Invictus (2009), American Sniper (2014), Sully (2016), and The Mule (2018). Stanley Kubrick died before the release of Eyes Wide Shut (1999), his first film since Full Metal Jacket (1987). Two decades passed between Terrence Malick’s Days of Heaven (1978) and The Thin Red Line (1998), but he became more prolific after the turn of the 21st century, directing The New World (2005), The Tree of Life (2011), Knight of Cups (2015), and A Hidden Life (2019).

  6. 2006 · Wedding Crashers (2005), which earned over $209 million, surpassed There’s Something About Mary (1998) as the top-grossing R-rated comedy in two decades. The year 2005, however, was dominated by PG-13 films, which placed 14 titles among the top 25 moneymakers and accounted for 85% of movie-theatre attendance.

  7. Dec 1, 2011 · Comprising over 90 essays and richly illustrated with over 200 images, The Wiley-Blackwell History of American Film provides a chronological portrait of American film history from its origins to the present day.
