On June 27, 1985, the famous Route 66 was officially decertified as a U.S. highway.
Route 66
What the Oregon Trail was to the 19th century, Route 66 became for the 20th. Like its predecessor, Route 66 carried vast numbers of people westward. It also became a cultural icon memorialized in song, fiction, television, and pop culture.
During the early 1920s, Cyrus Avery of Tulsa, Oklahoma, and John Woodruff of Springfield, Missouri, provided an early impetus for creating a highway link between the Midwest and California. They understood that such a route would provide an economic boost to their home states. Spurred by the burgeoning automotive industry, Congress initiated legislation for a comprehensive plan of public highways in 1916, with revisions in 1921 and a finalized plan in 1925.
Following extended wrangling, Avery's proposed road became Route 66 on November 26, 1925. The road would run from Jackson and Michigan Avenues in Chicago southwest through St. Louis, Missouri, on to Tulsa and Oklahoma City, then straight west through the Texas Panhandle, northern New Mexico, and Arizona, ending in Pacific Palisades Park, California, where Santa Monica Boulevard meets Ocean Boulevard. However, the trauma of the Great Depression held up completion; the entire road would not be paved until 1938.
During the early 1930s, an estimated 210,000 desperate people headed west on Route 66 to escape the Dust Bowl. John Steinbeck re-created this epic migration in 1939 in The Grapes of Wrath. Like countless other families, the Joads joined the migrant stream on Route 66, "the Mother Road." The novel, together with the film the following year, made Route 66 a living legend as the path to opportunity.
Among the host of travelers during the 1930s and 1940s were Robert William "Bobby" Troup of Harrisburg, Pennsylvania, and his wife. He wrote a song about the route, "Get Your Kicks on Route 66." Crooner Nat King Cole recorded the song in 1946, and it became a huge, long-lasting hit.
[...]
During the 1960s, however, CBS television would immortalize the road in the popular series Route 66. The program first aired on October 7, 1960, and ran for 116 episodes, until September 18, 1964. The real star of the show was a flashy Chevrolet Corvette convertible. [...] The thin, contrived plot led the [characters] on various implausible adventures along the fabled route. [...] However, producers shot much of the show on other highways that they believed better represented the true spirit (if not the reality) of Route 66.
Television could not save the road. By 1970 modern four-lane interstate highways had replaced most of the route. In October 1984 the last, poorly maintained stretch of U.S. Highway 66 gave way to Interstate 40 at Williams, Arizona. It took five interstates to replace the Mother Road: I-55, I-44, I-40, I-15 and I-10.
The death of the real road, however, spawned a legion of legendary supporters. Writer Michael Wallis, born near the road, published Route 66: The Mother Road in 1990 and issued a video documentary, Route 66 Revisited, four years later. In 1993 NBC launched another TV series in which two new heroes inherited a Corvette and drove off in further search of adventure. Since the mid-1990s, the Annual Mother Road Ride/Rally has drawn hordes of motorcyclists to tour down the historic route. The Albuquerque Convention and Visitors Bureau and the New Mexico Route 66 Association developed a number of events to celebrate the route's 75th anniversary during July 20–21, 2001. PBS television produced an hour-long documentary.
Museums and associations keep the road's memory alive. The National Route 66 Museum in Elk City, Oklahoma, uses a road motif to carry visitors through all eight states along the original road. Murals and vignettes depict various eras and places of the road. Another museum, the California Route 66 Museum in Victorville, also honors the route. The road has been designated a Historic Monument administered by the National Park Service. As author Michael Wallis observes, many who search out the route today still "find the time holy."
--Excerpt from "Route 66." American History. ABC-CLIO, 2011. Web. 27 June 2011.
Monday, June 27, 2011
Friday, June 24, 2011
Movies and the Real-Life Mob
When special agents of the Boston FBI reported that gangster James "Whitey" Bulger had finally been captured after 16 years in hiding, the story immediately became front-page news across the country. Bulger, a former South Boston mob figure implicated in 19 murders and various other nefarious activities including drugs and prostitution, vanished just hours before he was supposed to be taken into custody by FBI agents in January 1995. As it turned out, Bulger had been tipped off about the impending arrest by one of the FBI's own, former agent John J. Connolly Jr., for whom Bulger had worked as an informant even while running his hugely profitable criminal enterprise. Bulger eventually made his way to Santa Monica, California, where the now 81-year-old fugitive was quietly taken into custody on June 22.
It seems fitting that Bulger was finally found living in Santa Monica, just down the road from Hollywood, as his life played out much like that of a character in a gangster film. Indeed, once the news about Bulger's arrest broke, the Internet lit up with stories about how Jack Nicholson's character in the film The Departed was at least loosely based on Bulger. Directed by Martin Scorsese, The Departed finally earned the renowned filmmaker his first Oscar for Best Director. As the encyclopedia entry on Scorsese that appears in ABC-CLIO's newly released Movies in American History points out, Scorsese grew up in different Italian-American neighborhoods in New York City, where "he experienced a stark contrast between authority figures—wise guys and Catholic priests. Interestingly, as he grew up, Scorsese, although drawn toward the wise-guys, thought seriously about becoming a priest. Though he obviously joined neither group, his fascination with both shines through in his movies."
Scorsese was perfectly placed to direct The Departed, having already made other gangster films, such as Mean Streets, Goodfellas, and Casino. One of the things that have made Scorsese’s gangster movies different from others like those in The Godfather series is that he focuses on the lives of local, neighborhood mobsters—like Whitey Bulger. These distinctions are discussed in detail in Movies in American History, in which you’ll find in-depth pieces not only on particular movies, but also on the people and subjects that make film and its relationship to history so powerful.
Thursday, June 9, 2011
Sarah Palin, Paul Revere, and Wikipedia: A Teachable Moment?
News came early this week that supporters of Sarah Palin attempted to edit Wikipedia in order for its account of Paul Revere's famous ride to match Palin's version of the event. The incident provides yet another opportunity for educators and others to question the free online encyclopedia's status as the arbiter of all "facts."
Getting less attention than the kerfuffle over Palin's remarks and the Wikipedia "edit war" was the key primary source, an after-the-fact account by Revere himself of the stirring events of that April night in 1775:
I told (the British) …that I had alarmed the country all the way up, that their boats were caught aground, and I should have 500 men there soon. One of the (British) said they had 1500 coming; he seemed surprised and rode off into the road, and informed them who took me, they came down immediately on a full gallop. One of them… clapped his pistol to my head, and said he was going to ask me some questions, and if I did not tell the truth, he would blow my brains out.
The entire document—by the way—is featured in the ABC-CLIO online American History education solution. This database is enhanced and updated daily by a staff of three historian editors. It cannot have its "facts" changed or re-interpreted by political partisans to suit their agenda.
On the chance that the Revere-Palin-Wikipedia contretemps is yet another "teachable moment," we offer three possibilities:
- The "he said, she said" debate between Palin's followers and critics on Wikipedia casts doubt on the ubiquitous site's utility as an accurate custodian of American History, especially given that some educators and scholars now consider it a legitimate "first stop" for students beginning their research.
- Primary sources are (with proper context and sourcing) often far more interesting (and certainly more trustworthy) than the axe-grinding interpretations of politicians and their partisans.
- How relevant and exciting (and politically charged) American History remains. Palin couched her version of the famous ride to Lexington in the spirit of the (not-yet-contemplated) Second Amendment. Why? Perhaps to associate her own and her followers' views on gun rights with Revere. Palin's status as one of the standard-bearers of the modern Tea Party Movement and Revere's real-life participation in the "real" Tea Party (Boston Harbor, 1773) make the story even sweeter.
For more on Palin, ABC-CLIO databases, and the Tea Party movements (original and modern), check out the links below!
--Vince Burns, VP Editorial, ABC-CLIO
-------------------------------------------------------------------------------------------------------------
Jacob H. Huebert
This objective, well-researched biography tells the story of the woman whose meteoric rise to the 2008 Republican vice presidential candidacy made history.
From the explorers of the Americas to the issues of today's headlines, American History investigates the people, events, and stories of our nation's evolution.
Jacob H. Huebert
This thorough guide to the burgeoning Tea Party movement goes beyond the typical overheated political rhetoric to discuss where the party came from, what it's about, who's involved, and where it's headed.
Gregory Fremont-Barnes and Richard A. Ryerson, Volume Editors
Wednesday, June 1, 2011
Happy Birthday, Marilyn Monroe
Norma Jean Mortenson, who became known to millions as Marilyn Monroe before her untimely death at the age of 36, was one of the most famous people the United States has ever produced. While her route to fame was through the films she starred in, her bombshell looks, breathy voice, and personal mystique caused her fame to soar far beyond her talents as an actress. Her typecasting as a dumb, sexy blonde contributed to her unhappiness—some sources report that the actress had an IQ of 160. Her suicide has come to serve as a symbol of the destruction that fame and beauty can wreak. ...
Monroe was born on June 1, 1926, in Los Angeles, California. Accounts of her childhood and early years are sketchy, but most agree that Monroe's mother had to be hospitalized for a mental condition when her daughter was very young and that the little girl was shuttled around to a dozen foster families. Monroe also seems to have lived in an orphanage at one point. She received a patchy education at various public schools around Los Angeles, the last of which was Van Nuys High School. When she was 16, Monroe learned that her current foster family had to leave California. To avoid going into yet another foster situation, she accepted a proposal of marriage. Her husband soon left for the U.S. Merchant Marine, however, and their marriage did not survive much past the end of World War II. While he was gone, Monroe took a few jobs to bring in extra money, working as a parachute inspector and an aircraft paint sprayer.
Returning to freelance work as a photographer's model, Monroe also got a tiny part in a Marx Brothers movie. She made a favorable impression on director John Huston, who happened to catch her walk-on, and he signed her to play a prostitute in his 1950 film The Asphalt Jungle. Although she was not even mentioned in the movie's screen credits, the actress received so much fan mail after the film that Twentieth Century-Fox executives asked her to come back to work for them. Monroe accepted and appeared in the hit film All About Eve. Her performance so pleased the studio that she got a new, seven-year contract with options up to $3,500 a week. ... Monroe's movies of this period include The Fireball, Let's Make It Legal, Love Nest, and As Young As You Feel. ...
By 1952, Monroe was starting to become a household name. ... Several top film critics described Monroe as the "most promising actress" and the "most popular actress." By the end of 1953, she had earned more money for Twentieth Century-Fox than any other Hollywood star had earned for his or her studio. By the time she broke her contract with the studio at the end of 1954, Monroe had starred in such smash-hit films as Gentlemen Prefer Blondes, The Seven Year Itch, and There's No Business Like Show Business. Despite her skyrocketing popularity and some critical success for her comic timing, Monroe was increasingly unhappy that she could not seem to win any of the more serious roles she really wanted. The breakup of her 1954 marriage to baseball star Joe DiMaggio after nine months only made matters worse. She left Hollywood for New York to attend the Actors Studio.
In January 1955, Monroe announced that she had founded her own production company, Marilyn Monroe Productions. The next year, the company bought the rights to a play that it would later film under the name The Prince and the Showgirl. Meanwhile, Twentieth Century-Fox signed Monroe to do four films over seven years. The first of these, Bus Stop, came out in 1956 and showed off Monroe's natural talent as a comedienne. Prior to its release, she had married her third husband, playwright Arthur Miller, and immediately afterward the couple flew to London to start filming The Prince and the Showgirl. Revered British actor Sir Laurence Olivier was her costar and the film's director. Although the film did not receive rave reviews, Monroe was once again complimented for her light comic touch. ...
For the next two years, Monroe lived quietly in New York and Connecticut with Miller and did not try to parlay her new success as a serious actress into other roles, although she did continue to study at the Actors Studio. In 1958, however, she returned to Hollywood amid a firestorm of publicity to star in Billy Wilder's film Some Like It Hot. The movie was released in 1959. Critics immediately hailed it as one of the funniest movies ever made and applauded Monroe especially for her "deliciously naïve quality." However, that ephemeral quality was starting to come at a higher and higher price. Monroe, who at this point had already been plagued for years by an obsessive perfectionism, began to worry about her future as an actress. Aware that her appeal as a star would probably not outlast her youthful beauty, she fanatically studied anything she thought would help her acting. Miller wrote a screenplay for her, The Misfits, a troubling movie in which she starred as a wandering beauty who falls in with some other drifters. ... She and Miller divorced shortly before the film's release in 1961. The Misfits was Monroe's last film.
As Monroe became involved in drug and alcohol abuse toward the end of her life, her reputation as a difficult actress grew even worse. Twentieth Century-Fox eventually canceled her contract when she was virtually unable to remember any lines or show up for shooting at all. She died of an overdose of sleeping pills at her Los Angeles home on August 5, 1962.
Excerpt from Pop Culture Universe
Harmon, Justin, et al. "Marilyn Monroe." Pop Culture Universe: Icons, Idols, Ideas. ABC-CLIO, 2011. Web. 1 June 2011.
------------------------------------------------------------------------------------------------------------
Additional Resources
An irresistible and authoritative digital database on popular culture in America and the world, both past and present—in a package as dynamic as the topic it covers.
By Philip C. DiMare, Editor
This provocative three-volume encyclopedia is a valuable resource for readers seeking an understanding of how movies have both reflected and helped engender America's political, economic, and social history.
Tuesday, May 31, 2011
The Future of al-Qaida without bin Laden
By James J.F. Forest
The world is without doubt a safer place now that Osama bin Laden is dead. He personified terrorism; he promoted an ideology that called for killing and massive destruction in order to achieve political change through radicalized Islam. He is responsible for the murder of thousands of people around the world. His death is a major symbolic, tactical and emotional victory for the civilized world. Perhaps the largest impact right now is a sense of closure for the thousands of families who lost loved ones in those attacks on 9/11, as well as the families of victims of the USS Cole bombing, the 1998 Embassy bombings, and other attacks that bin Laden is responsible for.
But at the same time, amid the feelings of relief (and the demand from some that the administration release grisly photos of bin Laden’s corpse), I think it is also safe to say that many Americans, along with many Pakistanis and Europeans, are right now holding their breath, waiting to see what happens next. We are all anxious about where and when any possible retaliation attacks might take place. With that in mind, it is important to take a hard look at what will likely be the short-term future of al-Qaida.
Yes, unfortunately, al-Qaida does have a future despite the elimination of a central figure like bin Laden. After all, al-Qaida is not really a group; it’s more of a movement with a central base of inspiration and support, but with affiliate groups in various parts of the world, and individuals who are inspired to carry out violent acts based on al-Qaida’s ideology. They have embraced what scholars call a “leaderless resistance” model of terrorism, in which local affiliates and individuals are encouraged and guided to raise their own funds, acquire their own weapons, choose targets and carry out their own attacks in support of al Qaida’s ideology and strategic objectives.
Al-Qaida’s ideology is their center of gravity, a collection of beliefs and strategic guidance that can be summarized in just four words: think globally, act locally. Myriad propaganda videos describe the world in a dark “us versus them” narrative in which the Muslim world is being systematically attacked by the international community. Building on this narrative, al-Qaida’s central message encourages individuals to “Think about how much better your lives would be if a global Islamic caliphate ruled mankind; now, do something to help bring this utopian vision closer to reality.” The overall goal is to inspire individuals, and in some cases locally established terrorist or insurgent groups, to consider themselves part of a global movement, and then carry out attacks locally in the name of that movement. So from this perspective, al-Qaida is still a very significant threat despite the death of bin Laden. As long as the ideology resonates among some communities, and is able to influence and inspire violent acts on behalf of that ideology, al-Qaida will live on.
-------------------------------------------------------------------------------------------------------------
James J.F. Forest, Ph.D., is an associate professor at the University of Massachusetts Lowell, in the Department of Criminal Justice and Criminology, where he teaches undergraduate and graduate courses on terrorism, weapons of mass destruction, and contemporary security studies. He is also a senior fellow with the Joint Special Operations University, where he holds a TS/SCI security clearance with the U.S. Department of Defense and conducts research (both classified and unclassified) on insurgencies and emerging terrorist threats for the U.S. Special Forces community.
Dr. Forest is the former Director of Terrorism Studies at the United States Military Academy. During his tenure at West Point (2001-2010), he taught courses on international relations, terrorism, counterterrorism, information warfare, comparative politics, and sub-Saharan Africa. He also directed a series of research initiatives and education programs for the Combating Terrorism Center at West Point, covering topics such as terrorist recruitment, training, and organizational knowledge transfer. Dr. Forest was selected by the Center for American Progress and Foreign Policy as one of "America's most esteemed terrorism and national security experts" and participated in their annual Terrorism Index studies from 2006 through 2010. He is the author of Homeland Security: Protecting America's Targets and The Making of a Terrorist: Recruitment, Training, and Root Causes (Praeger).
Friday, May 27, 2011
Memorial Day
Memorial Day is observed in the United States on the last Monday in May to honor the nation's war dead. The holiday emerged in the wake of the Civil War as "Decoration Day," a name that endured well into the twentieth century and that described the most common commemorative rite, a sorrowful strewing of freshly cut flowers on the graves of soldiers who had fallen in the Civil War. The history of Memorial Day is as complicated, surprising, and paradoxical as the story of any public holiday on the American calendar. Its origins are ambiguous and its transformation dramatic, as it developed from a somber and melancholy fête into a light day of leisure and pleasure, the unofficial opening day of summer. Memorial Day, focused largely on the mortal, patriotic service of men, has been a holiday in which women played a central role, and yet it was hardly feminist and remained profoundly conservative. Although a day originally designed to heighten memory, it has become for most Americans an occasion of blissful, escapist amusement or material consumption. Few Americans today know the origins or original purpose of Memorial Day.
- Excerpt from Encyclopedia of American Holidays and National Days by Len Travers
------------------------------------------------------------------------------------------------------------
Additional Resources
A Political, Social, and Military History, Second Edition
By Spencer C. Tucker, Editor
Medal of Honor Recipients from the Civil War to Afghanistan
By James H. Willbanks, Editor
A Chronology, 1775 to the Present
By John C. Fredriksen
Wednesday, May 25, 2011
Egyptian Pyramids Discovered with Infra-red Satellites
Infra-red satellites have spotted 17 lost Egyptian pyramids, along with more than 1,000 tombs and 3,000 ancient settlements. To read the full BBC article, click here. The find is credited to U.S. Egyptologist Dr. Sarah Parcak of the University of Alabama at Birmingham. Read below for an excerpt from Dr. Parcak's contributing piece on satellite archaeology in the World History: Ancient and Medieval Eras database.
Satellite Archaeology
Satellite archaeology is an exciting new field in which archaeologists use cutting-edge spatial technologies to detect many new archaeological features across the globe. Although the term "remote sensing" can mean anything that allows one to see things remotely (such as a camera), in archaeology it refers to how archaeologists use satellites and aerial photographs to view ancient features otherwise invisible to the naked eye. Remote sensing in general is used in most of the sciences, including geology, physics, environmental studies/biology, and polar studies, to view long- and short-term landscape changes. The same science can be applied not only to detect archaeological features of interest but also to examine modern issues such as population expansion and urbanization, both of which affect archaeological site preservation.
Archaeologists have utilized remote sensing since the early 1900s, when they viewed ancient features such as Stonehenge from balloons. Early World War I aerial photography allowed amateur archaeologists to record archaeological sites in the Middle East, a practice that continued during World War II in Europe, the Mediterranean, and the Far East. The 1970s saw the first uses of satellite remote sensing for archaeology with the launch of the Landsat satellite by NASA. Archaeologists quickly grasped the potential of this technology for detecting long-lost sites in the Americas, and use of satellites only increased in the 1980s and 1990s on every continent with known archaeological material. With the launch of Google Earth, nearly every archaeological project in the world now utilizes some form of remote sensing for project planning, mapping, and survey.
[...]
Many archaeological projects have used remote sensing technology not only to detect sites but also to reconstruct past landscapes and to answer questions about past social, political, economic, and environmental changes. One archaeological project by the author has detected hundreds of previously unknown ancient sites across Egypt. This project used a combination of high-resolution and NASA satellite imagery in places that archaeologists had not surveyed in nearly 200 years. On-the-ground survey allowed the detection of 132 "new" ancient sites, including a major complex dating to the time of the pyramids and a large desert trading post. Based on this research, there are likely thousands of ancient sites left to find in Egypt. Another project has used NASA satellite data in Guatemala and Belize to detect long-lost Maya settlements. The satellite data allowed the scientists to see color changes in rainforest trees too subtle to be seen by the human eye alone, which indicated long-buried limestone monuments. On Easter Island, very high-resolution satellite data has allowed archaeologists to detect the roads used to transport the ancient Moai, the traditional name for the large ceremonial stone heads. [...] Many additional projects exist that use satellite technology, and many more will be developed as the technology improves.
Google Earth allows anyone to use remote sensing for archaeological site detection. For the first time, archaeologists are working together with the general public to investigate features found using satellite data. For example, archaeological features found in fields in France by members of the public turned out to be Roman-period villas. Unlike most satellite imagery, Google Earth is free, although it does not permit advanced remote sensing analysis. Satellite remote sensing analysis can take years of training to master and requires access to memory-intensive computers. It is a combination of both approaches that can be most useful in the detection of archaeological sites. One of the key issues archaeologists face in the 21st century is how to use advanced technology to protect and preserve the past for future generations, especially in the face of archaeological site looting. With ongoing conflicts in some of the richest archaeological landscapes in the world, it may be many years before excavations can resume in places like Iraq or Afghanistan. Archaeologists can use high-resolution imagery to detect previously unknown ancient sites, as well as protect known sites by monitoring them regularly for looting. As a whole, satellite remote sensing has much to contribute to the field of archaeology. It is not a tool to be used on its own, but, combined with ground survey and excavation, it will significantly advance the field of archaeology over the next 25 years.
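For readers curious about the mechanics behind detections like the Maya example above, the workhorse is often a vegetation index computed from a satellite image's red and near-infrared bands: vegetation growing over buried stone or compacted soil reflects light slightly differently, and pixels whose index deviates sharply from the scene average become candidates for ground survey. The short Python sketch below is our own minimal illustration of that general principle, not code from Dr. Parcak's project; the synthetic bands, the two-standard-deviation threshold, and the helper names are assumptions chosen only to keep the example self-contained and runnable.

import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / np.maximum(nir + red, 1e-6)  # guard against divide-by-zero

def anomaly_mask(index, n_std=2.0):
    """Flag pixels whose vegetation index deviates sharply from the scene mean.
    Such localized outliers can mark spots worth checking on the ground."""
    return np.abs(index - index.mean()) > n_std * index.std()

# Toy data: two synthetic 100x100 reflectance bands standing in for a real
# multispectral scene (real data would be loaded with a raster library such as GDAL).
rng = np.random.default_rng(0)
red = rng.normal(0.20, 0.02, (100, 100))
nir = rng.normal(0.60, 0.02, (100, 100))
nir[40:45, 40:45] = 0.35  # simulate stressed vegetation above a buried feature

candidates = anomaly_mask(ndvi(nir, red))
print("candidate pixels flagged for survey:", int(candidates.sum()))

As the excerpt stresses, output like this is only a starting point; flagged pixels mean little until they are checked against ground survey and, where possible, excavation.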
-------------------------------------------------------------------------------------------------------------
This database covers early human history around the globe—from prehistoric times to the beginnings of the Renaissance. Click here for a free trial.