By now everyone has probably heard that Disney has bought the rights to make Star Wars movies.
And almost every day on my Facebook I see people who are soooo extremely angry about this! I don't understand why.
Disney has a whole bunch of well-made movies behind them, and I think they will make something good of this too. Disney does not automatically mean pink princesses and singing animals, which many seem to forget. If anything, this will open new eyes to the older Star Wars films as well. That's what I think.
What do you think? :D