The American Film Institute defines Western films as those "set in the American West that [embody] the spirit, the struggle, and the demise of the new frontier" ...