Published on June 12th, 2019
Last updated on June 3rd, 2020
Attention Deficit Disorder (ADD) and Attention Deficit Hyperactivity Disorder (ADHD) are two similar conditions that typically affect an individual's ability to focus on specific tasks. Though many people know ADD/ADHD exists, there are a few common misconceptions about it. The most widespread is the belief that ADD/ADHD affects only children. People tend to think the symptoms go away after adolescence, while some don't believe the condition exists at all.
The truth is that not only is ADD/ADHD a real and debilitating condition, but around 60% of people diagnosed as children will still exhibit noticeable symptoms into adulthood. Misconceptions like these have led many people to go undiagnosed over the years, with even fewer being actively treated. Estimates suggest that around 6.1 million children (9.4%) in the U.S. deal with some level of ADD/ADHD, along with approximately 5.2% of children worldwide and at least 3% of adults.
While it's not entirely known what causes ADD/ADHD, researchers do know it runs in families. Research shows that parents with ADD/ADHD have a one-third to one-half chance of passing it on to their children. There are other risk factors as well, such as premature birth, low birth weight, head injuries early in life, or exposure to toxicants in the womb. That said, there are a few things we know for sure do not cause ADD/ADHD, such as too much sugar or TV.
The truth is that ADD/ADHD involves a chemical imbalance that causes neurotransmitters to function differently. Testing shows that certain parts of the brain are less active or smaller in children with ADD/ADHD, while others may be working overtime. Regardless of the cause, living with ADD/ADHD can be extremely difficult if you don't know your treatment options. So let's take a look at the peer-reviewed evidence.