Cinema of the United States
filmmaking industry in the United States
The cinema of the United States, also known as Hollywood, has had a large effect on the global film industry since the early 20th century. The dominant style of American cinema is classical Hollywood cinema, which developed from 1913 to 1969 and is still typical of most films made there to this day.
Quotes
- American capitalism finds its sharpest and most expressive reflection in the American cinema.
- Sergei Eisenstein (1957) Film form [and]: The film sense; two complete and unabridged works. p. 196
- The two-year period immediately following 9/11 was an era in which the media was defined both by its jingoism and patriotism and also by its aversion to images of violence and destruction. The images of gleeful destruction the ’90s had reveled in (think Independence Day and Armageddon) disappeared almost overnight, and the few stragglers that crept by (like 2003’s The Core, which destroys both Rome and San Francisco) were quietly buried.
War movies were essentially nonexistent, with most offerings at the multiplex leaning toward fantasy and family-friendly fare. Until Steven Spielberg pioneered the 9/11 visual parable through heavily codified imagery with 2005’s War of the Worlds, scenes of realistic mass destruction temporarily all but disappeared from the media landscape, a far cry from the explosion-riddled works of Michael Bay, Zack Snyder, and their disciples that we enjoy today.
- Lindsay Ellis, "Movies, patriotism, and cultural amnesia: tracing pop culture’s relationship to 9/11", Vox (updated Sep 11, 2017)
- American motion pictures are written by the half-educated for the half-witted.
- St. John Ervine, New York Mirror (6 June 1963)
External links
- Encyclopedic article on Cinema of the United States on Wikipedia