Here is the first part of a series of articles written by Tim Mattson that provide the information you need to understand and correctly use the jargon that has sprung up around parallel computing.
Concurrency: A property of a system in which multiple tasks that make up the system remain active and make progress at the same time.
Parallelism: Exploiting concurrency in a program with the goal of solving a problem in less time.
Concurrency in an algorithm means that instead of a single sequence of steps, you have multiple sequences of steps executing together. These steps are interleaved in different ways depending on how the tasks are scheduled for execution, so the order of memory access operations can vary from one run of a program to the next. If those memory accesses mix reads and writes to the same location, the results can vary as well. This is called a race condition, or just a “race” for short.
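To make the idea concrete, here is a minimal sketch of a race in C, assuming a compiler with OpenMP support (for example, gcc with the -fopenmp flag). The program name and thread count are illustrative choices, not anything prescribed above. Each thread repeatedly performs an unsynchronized read-modify-write on a shared counter, so increments from different threads can interleave and updates can be lost.

```c
#include <stdio.h>
#include <omp.h>

int main(void)
{
    long count = 0;

    /* Four threads each increment the shared counter 100000 times.
     * count++ is a read-modify-write sequence, and without
     * synchronization the interleaving of those sequences differs
     * from run to run: a race. */
    #pragma omp parallel num_threads(4)
    {
        for (int i = 0; i < 100000; i++)
            count++;   /* unsynchronized access to shared memory */
    }

    /* The correct total is 400000, but lost updates from interleaved
     * increments usually produce a smaller, run-to-run varying value. */
    printf("count = %ld (expected 400000)\n", count);
    return 0;
}
```

Run the program a few times and the printed value will typically change between runs, which is exactly the run-to-run variability described above. Guarding the increment (for instance with OpenMP's atomic construct) would remove the race.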