Thursday, February 16, 2012

finding #1

How did WWI And WWII affect American lit?
After the wars, American literature became deeper. People began writing their feelings and thoughts, and literature spread more widely than before. The literature that emerged from the experience of World War II is different from that of WWI. People, including children, began to write more and more after the wars. African Americans also began to write more after these wars, as their freedom had improved a little, and they started getting their books published under their own names as authors. Many things began to change, and until American literature goes away, it will never stop changing.
