What is Indie Film?
Indie film is a term used to describe films made outside of the Hollywood studio system. The term “indie” is short for “independent”: such films are not dependent on major corporate funding and are typically produced with smaller budgets and lesser-known actors.