Filmmaking industry in the United States
The cinema of the United States, consisting mainly of major film studios along with some independent films, has had a large effect on the global film industry since the early 20th century.