American innovation has revolutionized the way we think, interact, and explore our creativity. As human beings, it's our brain's ability to draw reasonable conclusions and make logical decisions that separates us from other earthly creatures. With the gift of free cognitive thought, we make our own choices, develop our own visions, and form the ideas, thoughts, and viewpoints that establish our individuality. Like the creatures we are, we learn that some choices we make are good, while others are bad.
Learning has always been a part of mankind's curiosity. During the Renaissance, men read relentlessly, becoming broadly knowledgeable through self-education. As scholarly men rose, deep thinking gave rise to statesmen, politicians, intellectuals, authors, and lawyers, many of whom became our Founding Fathers. It's not surprising that when our Founding Fathers came together in Philadelphia, they converged in various groups to discuss, dispute, and support the making of a document that has transcended time.
As evolution progressed and technology infiltrated our lives, one must ask, “Are we really the same thinkers our ancestors were?”
Since the launch of the first personal computer and digital technology, our adoption of new methods of communication has edged its way into our lives. Today, our developed society functions almost seamlessly through technological devices that make life easier. We're more informed, more socially connected, and free to discuss and respond to timely events that have an impact on our society. But how much does being connected affect our brains? The answer may be more than you think.
In his book The Shallows, Nicholas Carr tackles this very question with some surprising conclusions. Our digital dependency has crept up on us slowly. We've become addicted to the devices we own, never realizing the impact they've made on our lives. There's a growing belief that digital technology is affecting us more than we think, and ironically, many of us don't know it.
On a daily basis we:
• Stare at our screens, spending most of our time on the Web
• Click away at our computers, using keyboards, mice, trackpads, and tablets for work, fun, and play
• Constantly check our smartphones and iPads, rotating them, touching icons, and tapping keys to interact with others
• Share our thoughts and feelings on Twitter, Facebook, LinkedIn, Pinterest, YouTube, and other social networks
• Set up our devices to play music, purchase ringtones, and alert us when new messages arrive
• Make purchases and reservations online, build communities, and support causes to be heard, promoted, and acted upon
Our daily routine is filled with repeated distractions and reactions to information that demands our attention; our days are never ordinary. Research is beginning to show that our devices are affecting our minds' cognitive ability to function. Here's why.
• The Web delivers content to our visual and auditory senses continuously and rapidly
• We see text, colors, and shapes through hyperlinks, videos, animation, buttons, and pictures
• Our fingers click, scroll, touch, and tap, navigating a repetitive stream of instant information
• We hear sound after sound: chimes, alerts, music, and whistles
• We are repeatedly distracted by pop-ups, reminders, RSS feeds, and software updates
• We drag, drop, open and close windows without blinking an eye
The translation: the way we think, perform, concentrate, and focus our attention has changed, possibly in a way we can never get back. The days of quietness once associated with the Renaissance, with its heavy focus on deep reading and thought, have been lost. We've become good skimmers, quickly reviewing the information that is relevant in our need-to-know day. We do this frequently when we look at road signs, menu items, headlines, grocery aisles, and TV channels. We've lost the ability to focus, concentrate, and exercise patience, something once derived from devoted attention.
To really understand how our brain is affected by the intricate process of memory and storage, a simple analogy is the clearest way to explain it. But keep in mind that this analogy does not fully capture the science behind the brain's function.
In its simplest description, to store a memory, regions of the brain record an experience. The frontal lobe is where incoming messages are processed and working memory is held, and it's here that the brain is most impacted. The hippocampus (a region deep within the temporal lobe) weaves new and old memories together, signaling a consolidation process in which the brain's attention is required to pass new information from short-term working memory into long-term memory. If the brain becomes distracted during this cognitive process, there's an increased risk of losing new content and new memories.

Repeated distractions that trigger rapid shifts in attention take a toll on the frontal lobe, affecting judgment, creativity, and focus, and leading to shallow thinking. Changes in our brain happen unconsciously and without warning. Repeated distractions in thought, and our lack of concentration on tasks, are indications that something about us is different, but not obvious. Silently and automatically, the brain makes its transition. The biggest risk is the loss of human qualities, like emotion, deep thinking, and cognitive reasoning, to an Internet Digital Addiction (IDA). Nevertheless, studies are revealing progress in understanding the brain's cognitive secrets. But we shouldn't take our responsibility for cerebral well-being too lightly.
This returns to the question asked earlier, "Are we really the same thinkers our ancestors were?" If our dependency on digital technology has led to less thinking and more fragmented thought, then is it reasonable to conclude that the brain can rewire itself based on how we think? The answer is yes, it's quite possible, and research is backing this up.
While it's important to recognize that our actions in the digital age do have consequences, digital interaction offers benefits too.
• Our hand-eye coordination and reflexes are quicker and more reactive
• Our ability to envision 3D graphics and visual animation is better
• Face-to-face interaction supports teamwork and collaboration
• Online word games often challenge the brain
The next question becomes, “What can we do to protect our brain when we’re so connected to digital technology?” The good news is we are in control. What we remember or forget is based on how present we are in the moment. Here are a few tips.
• Limit exposure to repeated distractions
• Check email and social connections once or twice a day
• Take regular breaks for mental sharpness
• Get exercise
• Solve puzzles and word games to retain mental focus
• Spend time in person with friends and family
Perhaps the Renaissance men really were better thinkers. They thought deeply about the content they read and could recall with certainty the facts and ideas that supported their positions. They embraced learning and creativity through the skills and talents learned from one another. Mankind thought, created, and succeeded in using cognitive thinking to empower the world. But has technology come full circle? Our world of progress has become so seamless, and our connection to digital technology so addictive, that it's now changing the way we think. Ironically, our willpower is a battle we fight within our own minds. The Renaissance men's struggles were rooted in life's challenges; they never encountered a duel within the depths of their own minds.
Nicholas Carr summarizes the Internet by saying, “The Net may well be the single most powerful mind-altering technology that has ever come into general use.” And he’s probably right. Taking control means exercising self-restraint, and trading off one behavior for another. Good brain health is dependent on the decisions we make—or don’t make—and how willing we are to control our own behavior. But this decision . . . is one only our brain can make.