Thursday, October 1, 2009

What is the role of banks?

The role of banks and financial institutions in America is to be there to help Americans in the economy. They let people take out loans when they are struggling financially, expecting the money to be paid back with interest within an agreed period of time.
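As a rough illustration (not from the original post), here is a minimal Python sketch of how the total amount a bank might expect back on a simple-interest loan can be computed. The principal, rate, and term are assumed example numbers, not figures from any real bank.

def total_repayment(principal: float, annual_rate: float, years: float) -> float:
    """Return the principal plus simple interest owed over the term."""
    interest = principal * annual_rate * years
    return principal + interest

# Example (assumed values): a $10,000 loan at 8% annual simple interest
# over 5 years comes to $14,000 total -- well under twice the amount
# borrowed, unless the rate or the term is unusually high.
print(total_repayment(10_000, 0.08, 5))  # 14000.0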
