Can someone provide a straightforward (but not simpler than possible) explanation of a transaction as applied to computing (even if copied from Wikipedia)?
A transaction is a unit of work that you want to treat as "a whole." It has to either happen in full or not at all.
A classic example is transferring money from one bank account to another. To do that you first have to withdraw the amount from the source account, and then deposit it into the destination account. The operation has to succeed in full. If you stop halfway, the money will be lost, and that is Very Bad.
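Here is a minimal sketch of that idea in Python with the standard-library sqlite3 module. The `accounts` table and the `transfer` function are just hypothetical names for illustration; the point is that both UPDATEs are committed together, or neither is.

```python
import sqlite3

# Hypothetical setup: an in-memory database with a simple accounts table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

def transfer(conn, source, dest, amount):
    """Withdraw from source and deposit into dest as one transaction."""
    try:
        conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE name = ?",
            (amount, source),
        )
        conn.execute(
            "UPDATE accounts SET balance = balance + ? WHERE name = ?",
            (amount, dest),
        )
        conn.commit()    # both updates become permanent together
    except Exception:
        conn.rollback()  # on any error, neither update takes effect
        raise

transfer(conn, "alice", "bob", 30)
print(conn.execute("SELECT name, balance FROM accounts ORDER BY name").fetchall())
# [('alice', 70), ('bob', 80)]
```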
In modern databases, transactions also do some other things - like ensuring that you can't read data that another process has only written halfway. But the basic idea is the same: transactions are there to ensure that, no matter what happens, the data you work with will be in a sensible state. They guarantee that there will NOT be a situation where money is withdrawn from one account but never deposited into the other.
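To see that "all or nothing" guarantee in action, the sketch below (same hypothetical table as above) deliberately fails between the withdrawal and the deposit. The rollback undoes the half-finished work, so the balances end up exactly as they started and no money is lost.

```python
import sqlite3

# Same hypothetical accounts table as in the previous sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    # Step 1: withdraw from alice (not yet committed).
    conn.execute("UPDATE accounts SET balance = balance - 30 WHERE name = 'alice'")
    # Simulate a failure before the deposit step ever runs.
    raise RuntimeError("crash between withdrawal and deposit")
    conn.execute("UPDATE accounts SET balance = balance + 30 WHERE name = 'bob'")
    conn.commit()
except Exception:
    conn.rollback()  # undoes the withdrawal: the money is not lost

print(conn.execute("SELECT name, balance FROM accounts ORDER BY name").fetchall())
# [('alice', 100), ('bob', 50)]  -- unchanged, as if the transfer never started
```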