Below is my final project for my spring 2012 DCC class. It’s my first time trying advanced techniques like green screen, so the quality isn’t the best, but hopefully you’ll get a good sense of what I’m trying to accomplish.
From blowing up the White House to renaming restaurants and creating fake Metro stops, nearly every movie filmed in Washington, D.C., disrupts the landscape in some way — and my goal was to capture that transformation.
I got my project idea after watching St. Elmo’s Fire and noticing that it was partially filmed on this university’s campus, even though it’s supposed to take place at Georgetown University. That got me thinking about how filmmakers interrupt the norm to change the space of a place, much as Bill Wasik explains in his article “#Riot: Self-Organized, Hyper-Networked Revolts — Coming to a City Near You.” In a way, movie sets function like riots: they make us aware of the space and the relationships between people in a specific place. For one thing, movie sets are hard to miss, with all their big cameras and equipment.
I wanted viewers to be aware of how filmmakers change places to accommodate their sets and scenes, and I wanted people not only to learn that a site was the set of a movie but to actually experience part of that movie. So, with my video, you can see how Hollywood transformed a D.C. location by watching a clip from a movie that was filmed there. It’s one thing for someone to tell you that part of Independence Day takes place at the White House; it’s more interesting, I feel, to actually watch the White House being blown up on your iPad or smartphone while you’re standing in front of the physical building.
This brings up Adriana de Souza e Silva’s idea of hybrid spaces and the connection between physical and virtual realities. Mobile technology is progressively blurring the lines between the digital and physical worlds, perhaps to the point where we’ll be living our everyday lives in some sort of augmented space. Similarly, Mark Weiser raises the idea of ubiquitous computing, where computers will become so integrated into our lives that they will no longer be visible. While my tour clearly shows the use of an iPad to view movie scenes, maybe one day ubiquitous computing could allow the scenes to be replayed right in front of our eyes in some sort of augmented reality. I don’t know exactly how it would work, but it would be cool to wear, say, a pair of glasses that let you watch the plane from Night at the Museum 2: Battle of the Smithsonian fly over the National Mall.
Aside from the long commutes into the city due to Metro construction, I enjoyed putting my project together. I always love an excuse to go into D.C. for the day, and it was nice to see the monuments up close again. I’d been on a few tours of D.C. on field trips and with my family back in middle and high school, so seeing the monuments themselves wasn’t anything new. But my project let me view them from a new perspective, the way Hollywood filmmakers might see the city. No longer are the Lincoln Memorial and Reflecting Pool just a pretty scene; they’re the place to hold a peace rally and reunite lovers. Famous stories from history on display in the Smithsonian museums can come to life. Even the White House can be reimagined in a cartoon world.
I compiled my project in iMovie but used a few other programs to design and edit video and photo clips. I learned some very basic video editing tools in Premiere, such as how to combine multiple takes on one frame, and did similar work in After Effects. I used Photoshop to edit the still shots.
While I do love learning new programs, I was a little pressed for time and couldn’t devote all of it to learning new software. So, when I needed a slightly more customized title screen than iMovie could offer, I used the programs I know, PowerPoint and a screen-recording program, to make my title and insert it into iMovie. I downloaded the screen-recording app, Screen Record Pro, from the App Store and found it incredibly useful. In addition to capturing my title animations, I used it to record video clips from YouTube and save my Google Earth tours.
When I had trouble overlaying video clips on top of other clips and images, I discovered that iMovie has slightly more advanced settings that allow for green- and blue-screen editing. My graphics didn’t turn out great, far from flawless, but at least iMovie let me put a movie clip on top of a still image. My ultimate goal is for viewers to understand how watching a movie clip on a mobile device demonstrates how Hollywood changed the space to create a movie scene.
Movies, Photos and Audio
“Aliens Blow Up The White House.” YouTube, 11 Sept. 2008. Web. 30 April 2012. <http://www.youtube.com/watch?v=z3qu-sCei3U>.
“Forrest Gump – Best Movie Scene.” YouTube, 14 Oct. 2010. Web. 30 April 2012. <http://www.youtube.com/watch?v=M2QGUkVqv-M>.
Fun. “Walking the Dog.” Rec. 25 Aug. 2009. Nettwerk, 2009. MP3.
Golub, Evan. Transformers Invade Washington, DC. 10 Dec. 2010. Photograph. Washington, DC. Demotix. Web. 9 May 2012. <http://www.demotix.com/news/473406/transformers-invade-washington-dc>.
Levy, Shawn, dir. Night at the Museum: Battle of the Smithsonian. Twentieth Century Fox Film Corp., 2009.
Schumacher, Joel, dir. St. Elmo’s Fire. Columbia Pictures Corp., 1985.
“Schwarzenegger Simpsons Clip.” YouTube, 7 Aug. 2007. Web. 30 April 2012. <http://www.youtube.com/watch?v=5D3mqiKgquY>.
de Souza e Silva, Adriana. “From Cyber to Hybrid: Mobile Technologies as Interfaces of Hybrid Spaces.” Space and Culture 9.3 (2006): 261-78. Web.
Weiser, Mark. “Ubiquitous Computing.” Stanford University, 29 Apr. 1999. Web. 12 May 2012. <http://library.stanford.edu/weiser/Ubiq.html>.
Wasik, Bill. “#Riot: Self-Organized, Hyper-Networked Revolts—Coming to a City Near You.” Wired. Conde Nast Digital, 16 Dec. 2011. Web. 12 May 2012. <http://www.wired.com/magazine/2011/12/ff_riots>.