There is no denying that Disney has rooted itself in Hollywood. The company has acquired multiple studios since its founding in 1923, allowing it to expand far beyond animated pictures. Yet even alongside blockbusters like the newest Star Wars and Marvel movies, Disney's animated films hold a special place in many people's hearts.