The iPhone is one of the world’s most iconic devices and, in the grand scheme of things, it isn’t very old. But when did it begin? Where did the idea come from?
The very first iPhone was unveiled in January 2007 at the MacWorld convention. Steve Jobs revealed what Apple had been developing for nearly 3 years and, for its time, it represented the cutting edge of technology.
The device was introduced as an iPod with a wider screen, controlled by touch instead of physical buttons. In short, it was a mobile phone and a device to communicate with the internet. At the time, Jobs told the audience that this device would “reinvent the phone”.
While revealing the design of this new device, Jobs took time out to make fun of the current smartphones on the market, which relied on physical keyboards and were unwieldy to use. He showed how easy it was to control a phone with simple touch gestures on a screen, and the audience was hooked.
In the beginning, there was Dear Abby – an American institution since the 1950s. Write in, and get nice, sensible advice on your dating dilemma. But the catch was that it was PG-rated, written by a woman and generally for women. Young men continued to get a lot of their dating advice from their peers in the locker room and, if they were lucky, from an older male mentor who knew the ropes.
Away from the safe, clean-cut mainstream arena, there were always the racy magazines like Playboy which gave a totally different view. After finishing the articles, the attentive reader could go to the back and find books advertised with titles like “How to Pick Up Girls”. This classic by Eric Weber appeared in 1970, and included advice such as wearing bell-bottoms and marching in peace marches to pick up the hot hippies. The 1950s were gone, and men’s dating advice had moved on from how long you should wait to kiss a girl on the cheek at her doorstep. This was red-blooded and unashamed pick-up artistry! The term itself became part of the language, and the 1970s was a time when pick-up artistry flourished, albeit still underground.
Things changed as the 1980s brought in a different atmosphere. Reagan was in the White House, concerned parents were clamping down on rock and roll lyrics, and most importantly the specter of AIDS changed the whole dating landscape. With the free-wheeling 1970s over, dating advice in magazines and on TV understandably focused on staying safe.
Moving into the 1990s, Oprah gave voice to women’s viewpoints on relations between the sexes and ushered in the sensitive metrosexual. It was also a time of such bestselling books as “Men Are From Mars, Women Are From Venus” and “The Rules”. While the former was written by a man, it seemed to cater more to women and required men to “get with the program”. The latter defined the 1990s dating scene with what was effectively game for women. It was the decade of women making the rules, and a tough and confusing time to be a man, as men no longer knew whether to be a traditional macho male or a Sensitive New Age Guy.
Towards the end of the decade, new things started to stir. AIDS was no longer front-and-center in people’s minds, Oprah’s “you go girl!” brand of feminism was mainstream, and the time was right for men to start something of their own. Most importantly, the Web was starting to provide a new platform for men to give advice on the dating arena. The internet would well and truly shake things up.
One of the first gurus of the new era was Ross Jeffries. Selling books and CDs from his website, he told the would-be Casanova that no matter how nebbish you might be, you could learn to charm any woman into bed. Jeffries was big on NLP, effectively a rebranded form of hypnosis. In many ways, he was like a holdover from the 1970s with his unrepentant focus on getting women into bed. Love him or hate him, he showed that there was a huge market on the internet for male-focused dating tips.
Next on the scene was David De Angelo. More family-friendly than Ross Jeffries, who could come across as misogynistic, De Angelo promised you could “Double Your Dating” through attending his seminars or buying his DVD sets. De Angelo offered some key concepts which at the time were breakthroughs to many young men. His audience were the “nice guys” created by the feminism of the 80s and 90s who now found that the jerks seemed to be getting all the girls. He told them to be “cocky and funny”, skating just on the socially acceptable side of jerkdom. De Angelo also introduced the concept of the “neg”, one of the most notorious (and misunderstood) concepts of new-school “Game”.
Basically, a “neg” is a back-handed or ambiguous compliment to a girl, such as “I love your hair – is it real?” or telling an obviously glamorous and beautiful woman “you’re cute – like my bratty little sister”. Designed to get the attention of sought-after women used to receiving fawning compliments, it was easily abused by novices, who would make these kinds of remarks to less attractive and more insecure women, or turn them into outright insults. This was the kind of dating advice that had its place, but could easily go wrong, and tended to get bad press. De Angelo was everywhere on the internet in the first few years of the 2000s; he was essentially a marketer and businessman using the men’s dating advice arena as his stepping stone to bigger things. Now operating under his real name, he has gone on to become a hugely successful and wealthy entrepreneur.
But even more important than De Angelo and the marketers was the ragtag group of men who started to congregate on various internet message boards in the late 1990s and early 2000s. This was something new – men learning and sharing their dating experiences pseudonymously in real time. This new era of sharing allowed them to develop complex theories on everything from how to approach women, to how to succeed on first dates, to what women want in bed.
One of the more famous of these men was a young Canadian by the name of Eric von Markovic, better known as Mystery. Striding around the streets of Toronto in platform boots and a top hat, wearing a feather boa (“peacocking”), Mystery would perform magic tricks to the delight of young women, in the most flamboyant pick-up artist tradition. While perhaps not the greatest or most original of PUAs, he now had an online audience of eager acolytes who took his approach and went out on the streets to try to replicate it.
A movement was forming, and online forum posts were studded with enough jargon and acronyms to require a glossary. The online seduction community, or simply “The Community” as it was known, grew enormously as it bubbled up from the grassroots. Various gurus espoused their own approaches, such as “direct” or “indirect”. The most notorious would simply advise men to “get into sexual state” as they talked to women, and let their raging pheromones do the seducing for them. Others, building from Mystery’s more cerebral approach, built elaborate theoretical models which could be drawn up on a whiteboard like a physics problem.
The time was right for a breakout into the mainstream, and it occurred in 2005 with the publication of “The Game: Penetrating the Secret Society of Pickup Artists” by Neil Strauss. This book provided a window on “the Community” and revealed the techniques of Mystery and his followers, as well as their real-life stories. Suddenly, a whole new generation of young men were getting advice on women which wasn’t given by women or filtered through family values.
Within a couple of years, most young men were at least passingly familiar with this kind of approach to dating and women, especially after more media exposure like the VH1 show “The Pick-Up Artist” which starred none other than Mystery. Some people laughed it off, others drew out what they saw as valuable lessons. In the wake of this mainstream exposure, a number of companies and individuals sprang up, creating a small industry of dating and pickup advice through DVDs and coaching and “boot camps” where – for a fee – you could go out on the town with a seduction guru and learn the “crimson arts”. Many major cities have become accustomed to these groups of young men going out and practicing their techniques, with lesser or greater effectiveness.
This vast array of dating products and services flooding the market started to cause concern among large sections of the ‘community’, due to the complete lack of regulation. Anyone could walk in off the street, proclaim themselves a ‘guru’, and start selling products and services, often for fees in excess of $4,000, to anyone who would purchase them. To combat the rise of fake gurus looking to make a quick dollar, PUA accountability groups started to appear. PUAWatchDog began analysing the claims of different gurus to test their validity, and dating product review sites such as Seduction Review gave customers a voice to share their opinions and experiences with other potential customers. These accountability groups helped ensure that customers weren’t being ripped off by unscrupulous marketers.
With information on pickup techniques reaching saturation point, and every possible scenario for attracting a woman having its own book, DVD, and live coaching program, the seduction community’s focus turned to the effect an individual’s inner psychology had on the outcome of flirting and seduction attempts. This became known as ‘Inner Game’. Companies such as the Authentic Man Program and The Attraction Institute developed structures and tactics for eliminating the inner roadblocks that were getting in the way of a successful seduction.
At this point, the PUA gurus and online communities are no longer new and novel, but they haven’t become exactly mainstream either. More recently, there has been less of a focus on “Game” and specific techniques to seduce women, and more emphasis on general self-improvement – fitness, finances, etc. – in order to increase not only success with women but overall life satisfaction. This definitely seems to be a more balanced approach than going out to hit on dozens of women every night.
There are other trends for men seeking to improve their dating success, such as an increased realization that American and Western women in general are not the only women in the world, and many men are reporting having better love lives from branching out to foreign countries. Whole online communities are devoted to the life of the budding international playboy, who enjoys not only the sights and food of other countries but also reports back on their success with the local women.
Meanwhile, there are totally different online communities, such as popular dating advice subreddits, where a group of peers give advice either to the opposite sex or to their own gender. This could be considered a kind of crowd-sourced dating advice; unlike the “PUA” communities, gurus are shunned in favor of a more democratic and gender-equal approach. Men who identify as feminist or favor a chivalrous approach to women will find a friendlier place for dating advice on mainstream sites like Reddit. One thing is for sure: there are many options and approaches to dating in 2014, and the spread of information is ensuring that men have plenty of advice to help them navigate this aspect of their lives.
Asbestos is a highly versatile, strong, cheap, non-flammable and malleable substance that has been used in building, textiles and construction for the last 2,000 years. Asbestos is also a highly toxic airborne fibrous substance that causes a number of incurable cancers in the humans exposed to it. Asbestos is present in many homes around the world and is still being used.
Asbestos became popular in the building industry for its affordability and desirable physical properties: sound absorption, average tensile strength, and resistance to fire, heat, and electrical and chemical damage. When asbestos is used for its resistance to fire or heat, the fibres are often mixed with cement or woven into fabric or mats. These desirable properties made asbestos a very widely used material, and its use continued to grow throughout most of the 20th century, until the carcinogenic (cancer-causing) effects of asbestos dust caused its effective demise as a mainstream construction and fireproofing material in most countries.
So how did asbestos become so widespread? Where did it come from, and how do we rid ourselves of the asbestos that is in more than a third of the homes around the world?
Asbestos is mined straight from the ground. It is a naturally occurring mineral that can be dug out of the earth’s surface, with Russia the greatest supplier of asbestos. There are six different types of asbestos, defined mostly by their colour.
Asbestos is mined from an open pit and looks a lot like wood in its raw form. After it is separated from the earth and other matter, the asbestos is processed and refined into fluffy fibres. These fibres are then mixed with a binding agent, much like cement. Sheets and pipes made from asbestos are not 100 percent asbestos but simply products that contain asbestos.
Asbestos has been mined and used for over 4,000 years; however, it was not mined on a large scale until the 19th century, when it started to be used in housing. Health issues related to asbestos exposure can be found in records dating back to Roman times.
The word asbestos comes from the ancient Greek, meaning “unquenchable” or “inextinguishable”. Pliny the Younger makes reference to clothes made of asbestinon in his earliest journals, stating that ‘it is rare and impressive and sold for the same price as the finest pearls’. He makes note of people cleaning their napkins by setting them on fire, and also of a sickness among the asbestos miners, though there are few details relating to this.
Pliny the Younger, writing between AD 61 and 114, reportedly noted that slaves who worked with the mineral asbestos became ill, though no exact reference can be found – the account survives by word of mouth only.
Although the damaging effects of asbestos fibres on people had long been suspected, it was not until 1924 that the very first case of asbestosis was diagnosed. The cancer that asbestos causes would later be named mesothelioma, as it affects the mesothelial cells.
Asbestos regained significant popularity as the world, specifically Great Britain, entered the Industrial Revolution. As powered machinery and steam power became more and more prevalent, so did the need for an efficient and effect way to control the heat needed to create and power the machines at the centre of the paradigm shift. Asbestos served as a perfect insulator for high-temperature products like steam pipes, turbines, ovens, and kilns; all things that helped facilitate the Industrial Revolution.
The increase in demand for asbestos sparked the first commercial asbestos mines, which opened in 1879 in the province of Quebec, Canada. Mines opened shortly thereafter in Russia, Australia, and South Africa. By 1900, doctors had started reporting lung sickness and pulmonary fibrosis in patients who had worked in asbestos textile factories and asbestos mines.
Despite the resurgence of health concerns, asbestos became very important in the United States as the railroad infrastructure was put into place. Asbestos became an important solution to prevent heat build-up and temperature fluctuation in steam-powered trains, and again when steam-powered trains shifted to diesel power. By WWII, asbestos was being used in the shipping industry (as insulation for components subjected to high heat), the automobile industry (as brake and clutch lining), and in the construction industry (in a wide variety of products including insulation, siding, and cement).
Whether the toxic risk of asbestos was underestimated, ignored or hidden, it played a huge part in the production and building of railway lines all over the world.
Mesothelioma is the cancer that affects the mesothelial cells. These cells cover almost every organ inside the body, forming a lubricating and protective coating over the organs called the mesothelium.
Almost everyone who is diagnosed with mesothelioma was exposed to asbestos, be it in the workplace, at home or through airborne fibres.
James Hardie was one of the largest manufacturers and distributors of asbestos in Australia, and like many companies over the last 50 years it has been paying compensation to employees who fell victim to asbestos-related diseases and cancers. The history of asbestos is closely linked to its victims, but that history is too enormous to cover in this article.
The removal of asbestos from buildings and homes will be a long and expensive process. Asbestos can only be disposed of at a registered disposal facility. These sites are registered with the Australian government and are the only places permitted to perform the disposal. It is illegal to leave asbestos anywhere else in Australia.
While it is legal to remove asbestos from your home yourself, it is advised that you do not undertake this process alone. Safety equipment, breathing apparatus and the proper means of cleaning up afterwards should all be factored into your asbestos removal.
While the toxic and carcinogenic qualities of asbestos are widely known, a number of countries around the world still mine huge amounts of asbestos for commercial use. We can be sure that no more of it is used in Australia, but there is no such uniform ban on the substance throughout the world.
Hollywood: Perhaps no other place on earth evokes the same air of show-business magic and glamour. The legend of Hollywood began in the early 20th century and is a hallmark of modern American society, rich in history and innovation.
The origin of movies and motion pictures began in the late 1800s, with the invention of “motion toys” designed to trick the eye into seeing an illusion of motion from a display of still frames in quick succession, such as the thaumatrope and the zoetrope. In 1878, Eadweard Muybridge created the first true “motion picture” by placing twelve cameras along a racetrack and rigging them to capture shots in quick sequence as a horse crossed in front of their lenses.
The first film for motion photography was invented in 1885 by George Eastman and William H. Walker. Shortly thereafter, the brothers Auguste and Louis Lumière created a hand-cranked machine called the cinématographe, which could both capture pictures and project still frames in quick succession.
The 1900s were a time of great advancement for film and motion picture technology. Exploration into editing, backdrops, and visual flow motivated aspiring filmmakers to push into new creative territory. One of the earliest and most famous movies created during this time was The Great Train Robbery, created in 1903 by Edwin S. Porter.
Around 1905, “Nickelodeons”, or 5-cent movie theaters, began to offer an easy and inexpensive way for the public to watch movies. Nickelodeons helped the movie industry move into the 1920s by increasing the public appeal of film and generating more money for filmmakers, alongside the widespread use of theaters to screen World War I propaganda. After World War I ended and ushered the United States into a cultural boom, a new industry center was on the rise: Hollywood, the home of motion pictures in America.
According to industry myth, the first movie made in Hollywood was Cecil B. DeMille’s The Squaw Man in 1914, when its director decided last-minute to shoot in Los Angeles, but In Old California, an earlier film by D.W. Griffith, had been filmed entirely in the village of Hollywood in 1910. By 1919, “Hollywood” had transformed into the face of American cinema and all the glamour it would come to embody.
The 1920s were when the movie industry began to truly flourish, along with the birth of the “movie star”. With hundreds of movies being made each year, Hollywood was on the rise as an American force. Hollywood was considered a cultural icon set apart from the rest of Los Angeles, emphasizing leisure, luxury, and a growing “party scene”.
Hollywood was the birthplace of movie studios, which were of great importance to America’s public image in the movie industry. The earliest and most affluent film companies were Warner Brothers Pictures, Paramount, RKO, Metro-Goldwyn-Mayer, and 20th Century Fox, each of which owned its own film production sets and studios. Universal, United Artists, and Columbia Pictures were also considered noteworthy, despite not owning their own theaters, while Disney, Monogram, and Republic were considered third-tier.
This age also saw the rise of two coveted roles in the movie industry: the director and the star. Directors began to receive greater recognition for using and trademarking personal styles in the creation of their films, which previously in history had not been possible due to limitations in filmmaking technology. Additionally, movie stars began to receive greater fame and notoriety due to increases in publicity and shifts in American trends to value faces from the big screen.
The 1930s are considered the Golden Age of Hollywood. A new era in film history began in this decade with the introduction of sound into film, creating new genres such as action, musicals, documentaries, social statement films, comedies, westerns, and horror movies. The use of audio tracks in motion pictures created a new viewer dynamic and also initiated Hollywood’s leverage in the upcoming World War II.
The early 1940s were a tough time for the American film industry, especially after the attack on Pearl Harbor by the Japanese. However, production saw a rebound due to advances in technology such as special effects, better sound recording quality, and the beginning of color film use, all of which made movies more modern and appealing.
Like all other American industries, the film industry responded to World War II with increased productivity, creating a new wave of wartime pictures. During the war, Hollywood was a major source of American patriotism by generating propaganda, documentaries, educational pictures, and general awareness of wartime need. The year 1946 saw an all-time high in theater attendance and total profits.
The 1950s were a time of immense change in American culture and around the world. In the post-war United States, the average family grew in affluence, which created new societal trends, advances in music, and the rise of pop culture – particularly the introduction of television sets. By 1950, an estimated 10 million homes owned a television set.
A shift in demographics created a change in the film industry’s target market, which began creating material aimed at American youth. Instead of traditional, idealized portrayals of characters, filmmakers started creating tales of rebellion and rock n’ roll. This era saw the rise of films featuring darker plot lines and characters played by “edgier” stars like James Dean, Ava Gardner, and Marilyn Monroe.
The appeal and convenience of television caused a major decline in movie theater attendance, which resulted in many Hollywood studios losing money. To adapt to the times, Hollywood began producing film for TV in order to make the money it was losing in movie theaters. This marked the entrance of Hollywood into the television industry.
The 1960s saw a great push for social change. Movies during this time focused on fun, fashion, rock n’ roll, societal shifts like the civil rights movements, and transitions in cultural values. It was also a time of change in the world’s perception of America and its culture, largely influenced by the Vietnam War and continuous shifts in governmental power.
1963 was the slowest year in film production; approximately 120 movies were released, which was fewer than any year to date since the 1920s. This decline in production was caused by lower profits due to the pull of television. Film companies instead began to make money in other areas: music records, movies made for TV, and the invention of the TV series.
Additionally, the average film ticket price was lowered to only a dollar, in the hope of creating greater appeal to former moviegoers. Nevertheless, by 1970 the film industry was mired in a depression that had been developing over the previous 25 years. A few studios struggled to survive and made money in new ways, such as theme parks like Florida’s Disney World. Because of financial struggles, national companies bought out many studios. The Golden Age of Hollywood was over.
With the Vietnam War in full swing, the 1970s began with an essence of disenchantment and frustration within American culture. Although Hollywood had seen its lowest times during the late 1960s, the 1970s saw a rush of creativity due to changes in restrictions on language, sex, violence, and other strong thematic content. American counterculture inspired Hollywood to take greater risks with new alternative filmmakers.
The rebirth of Hollywood during the 1970s was based on making high-action and youth-oriented pictures, usually featuring new and dazzling special effects technology. Hollywood’s financial trouble was somewhat alleviated with the then-shocking success of movies like Jaws and Star Wars, which became the highest-grossing movies in film history (at that time).
This era also saw the advent of VHS video players, laser disc players, and films on videocassette tapes and discs, which greatly increased profits and revenue for studios. However, this new option to view movies at home once again caused a decrease in theater attendance.
In the 1980s, the creativity of the film industry became homogenized and overly marketable. Designed only for audience appeal, most 1980s feature films were considered generic and few became classics. This decade is recognized for the introduction of high concept films that could be easily described in 25 words or less, which made the movies of this time more marketable, understandable, and culturally accessible.
By the end of the 1980s, it was generally recognized that films of that time were intended for audiences who sought simple entertainment, as most pictures were unoriginal and formulaic. Many studios sought to capitalize on advancements in special effects technology, instead of taking risks on experimental or thought-provoking concepts. The future of film looked precarious as production costs increased and ticket prices continued to drop. But although the outlook was bleak, films such as Return of the Jedi, Terminator, and Batman were met with unexpected success.
Due to the use of special effects, the budget of film production increased and consequently launched the names of many actors into overblown stardom. International big business eventually took financial control over many movies, which allowed foreign interests to own properties in Hollywood. To save money, more and more films started to launch production in overseas locations. Multi-national industry conglomerates bought out many studios, including Columbia and 20th Century Fox.
The economic decline of the early 1990s caused a major decrease in box office revenue. Overall theater attendance was nevertheless up, thanks to new multiscreen cineplex complexes throughout the United States. The use of special effects for violent scenes such as car chases and gunfights in high-budget films was a primary appeal for many moviegoers.
Meanwhile, pressure on studio executives to make ends meet while creating hit movies was on the rise. In Hollywood, movies were becoming exorbitantly expensive to make due to higher costs for movie stars, agency fees, rising production costs, advertising campaigns, and crew threats to strike.
VCRs were still popular at this time, and profits from video rentals were higher than the sales of movie tickets. The CD-ROM paved the way for movies on DVD, which hit stores by 1997. DVDs featured much better image quality as well as the capacity for interactive content, and videotapes became obsolete a few years later.
The turn of the millennium brought a new age in film history, with rapid and remarkable advances in technology. The movie industry has already seen achievements and inventions in the 2000s, such as the Blu-ray disc and IMAX theaters. Additionally, movies and TV shows can now be watched on smartphones, tablets, computers, and other personal devices with the advent of streaming services such as Netflix.
The 2000s have been an era of immense change in the movie and technology industries, and more change is sure to come quickly. What new innovations will the future bring us? Only time will tell.
Crohn’s disease is a type of inflammatory bowel disease that may affect any part of the digestive tract, from the mouth through the stomach to the colon and anus. Crohn’s disease may affect its patients in many different ways, with symptoms including pain, cysts, fever, diarrhoea, bleeding from sores in the gut, infection and weight loss. Bowel obstructions and severe constipation are also complications of Crohn’s disease that may result in the patient needing surgery and/or a colostomy bag. Patients with Crohn’s disease are at greater risk of developing bowel cancer.
The exact cause of Crohn’s disease is unknown; however, it has been linked to a combination of environmental, immune and bacterial factors, as well as a patient’s genetic susceptibility to developing the disease.
From these symptoms patients incur a whole range of issues, such as tiredness, lifestyle disruptions, anaemia and nutritional deficiencies. Crohn’s disease may affect a patient’s ability to work, support themselves or even go about their normal lives. A recent survey by Crohn’s and Colitis UK found that patients may even be giving up sport and exercise due to their illness. Due to the wide variance of Crohn’s symptoms, there is no definitive cure or treatment for the disease. Everyone is different and must be treated according to the individual’s needs; however, as Crohn’s directly affects the digestive tract, there is a huge effort to treat and manage symptoms by moderating and altering diet.
Awareness of Crohn’s disease has increased a great deal in the last 40 years, as patients feel more comfortable to discuss the disorder and share their experiences with others. What was once a very taboo topic is now common knowledge. This new awareness for Crohn’s disease may help to explain why the number of patients with diagnosed Crohn’s disease is increasing.
The pointe shoe is synonymous with ballet and ballerinas around the world. While we might take them for granted as having always been a part of the long history of ballet, the pointe shoe has gone through a very long and interesting history itself. It might surprise you to learn that the art of ballet was established 200 years before the pointe shoe was developed and dancers rose up onto the tips of their toes to dance.
The Royal Academy of Dance, the Académie Royale de Danse, was the first dance institution to be founded in the Western world. It was established in France in 1661 as a theatre, dance and opera institution by the French king, Louis XIV. Twenty years after it was founded, the first official Ballet productions took to the stage.
This academy placed Ballet within the creative arts and distinguished it as its own form of dance and performance. While Ballet had been practised in Europe prior to this time, its official birthplace in France cemented French as the international language of Ballet. Ballet classes around the world are still directed and run in French.
Heeled Ballet Slippers?
The first Ballet shoes worn by the dancers of the Royal Academy of Dance were heeled slippers. These shoes were quite difficult to wear and prevented jumps and many technical movements, so the heeled slipper did not stay around for very long. No one knows exactly when the heel was dropped in favour of flat shoes, but its abandonment meant that dancers could do far more than ever before. It is rumoured that Marie Camargo of the Paris Opera Ballet may have been the first dancer to remove the heels from her slippers.
The new flat-bottomed slippers spread quickly throughout the Ballet community as dancers were liberated by the abandonment of the heel. The flat-bottomed slippers worn during the 18th century are much like the demi-pointe rehearsal and learning shoes worn by young ballerinas in classes today. They were secured to the feet with ribbons around the ankle and were pleated under the toes for a better fit. The new slippers allowed for full extension and enabled the dancer to use the whole foot.
Dancing on The Tip of the Toes
The first dancers to rise up onto their toes did so with an invention by Charles Didelot in 1795. His “flying machine” lifted dancers upward, allowing them to stand on their toes before leaving the ground. This lightness and ethereal quality was so well received by audiences that choreographers began to look for ways to incorporate more pointe work into their pieces.
As dance progressed into the 19th century, the emphasis on technical skill increased, as did the desire to dance en pointe without the aid of wires. Marie Taglioni is often credited as the first to dance on pointe but, like many things in the early history of Ballet, no one knows for sure.
In 1832, when Marie Taglioni first danced the entirety of La Sylphide en pointe, her shoes were nothing more than modified satin slippers; the soles were made of leather, and the sides and toes were darned to help the shoes hold their shape. Because the shoes of this period offered no support, dancers would pad their toes for comfort and rely on the strength of their feet and ankles for support.
Dancers looking after their own feet
The next substantially different form of pointe shoe appeared in Italy in the late 19th century, with a modified toe area that was the beginning of what we now call the toe box. Dancers like Pierina Legnani wore shoes with a sturdy, flat platform at the front end of the shoe, rather than the more sharply pointed toe of earlier models.
The Italian school could now push technique to the limit in order to achieve dazzling virtuosic feats. These sturdier toe areas were a ballerina’s secret weapon for turning multiple pirouettes, and a closely guarded trade secret.
These shoes went on to include a box, made of layers of fabric, for containing the toes, and a stiffer, stronger sole. They were constructed without nails and the soles were stiffened only at the toes, making them nearly silent. As the pointe shoe developed, so did Ballet itself: as the shoes allowed dancers to do more and more, the dancers started to want more from their shoes.
The birth of the modern pointe shoe is often attributed to the early 20th-century Russian ballerina Anna Pavlova, who was one of the most famous and influential dancers of her time. Pavlova had particularly high, arched insteps, which left her vulnerable to injury when dancing en pointe. She also had slender, tapered feet, which resulted in excessive pressure on her big toes. To compensate, she inserted toughened leather soles into her shoes for extra support, and flattened and hardened the toe area to form a box.
The soft slippers used by these ballerinas were far different from the “blocked” toe shoes that eventually appeared in their earliest form in the 1880s. (Previously, dancers also spent far less time on pointe than ballerinas do today.)
Ballet dancers in the early part of the 20th century also wore shoes that would seem unmanageably soft today. Tamara Karsavina was said to dance in toe shoes of Swiss goatskin, while the ballerina Pierozi reportedly wore only Moroccan leather. It was fundamental to the development of Ballet technique that pointe shoes be made stiffer and stronger to support longer balances and challenging pirouettes.
Today most toe shoes are fashioned of layers of satin stiffened with glue, with a narrow sole often made of leather.
The life of a Pointe Shoe
Depending on the dancer’s experience and skill, a pair of pointe shoes can last between 2 and 12 hours of dancing. If a dancer attends a one-hour pointe class per week, her pointe shoes will last about three months. A professional dancer’s shoes last far less time: a professional ballerina can go through between 100 and 120 pairs of pointe shoes in a single dancing year, and some pointe shoes will last only a single performance in a heavy-duty role where the shoes are worked hard. Ballet companies will often employ professional pointe shoe makers and fitters to work within the company, producing and buying over 8,000 shoes during the dance year.
Different ballet roles even demand different strength and flexibility in their shoes. “For the technically and physically demanding role of the Black Swan in “Swan Lake,” a strong shoe with a lot of support is required, whereas the role of the sylph in “La Sylphide” has more jumps and less pirouettes, so a light, gentle shoe is needed.” (CNN)
Development and the future of the Pointe Shoe
The pointe shoe has remained very much unchanged for the last 200 years, but recent developments have begun to appear within companies that produce Ballet wear: Nike, in conjunction with Bloch Dancewear, has designed a shoe called the Arc Angel, by Guercy Eugene. These shoes have come from a need to protect and better support a ballerina’s most important asset: her feet!
Christmas trees have become a worldwide symbol for peace and good cheer. They are at the center of one of the largest family gatherings of the year, and religions of all shapes and sizes hold them up as something spiritual and meaningful to their beliefs, while random uncles and cats think of them as a nice bit of shade for a quick nap. The question is, who was first? No, not the cat or the uncle, although that could be quite a tantalizing investigation. What I mean is, where did this idea of the Christmas tree come from? How did it begin, and who was the first to place gifts for loved ones beneath it or put shiny balls on its branches? Well, that is what we are here to find out. Explore and enjoy, dear reader.
Christmas trees haven’t always been associated with Christianity; in fact, Christianity was one of the last religions to jump on this “green religion” bandwagon. Winters were brutal in ancient times, so much so that ancient Egyptians and Romans believed that their gods were struggling just as much as they were. Winter was seen as a time of death, and any plant that remained green during this time, such as firs, evergreens and rushes, was seen as holy and as a sign that the gods would triumph over winter and bring new life onto the earth.
The Egyptians worshiped Ra, the sun god. At the winter solstice, the shortest day and longest night of the year, the people of Egypt believed Ra had fallen ill, and they would decorate their homes with green palm rushes to symbolize his triumph over death.
The Romans treated the solstice as more of a celebration, holding a feast called Saturnalia in honor of Saturn, the god of agriculture. The Romans looked ahead toward spring, knowing that soon everything would be green and thriving once more, and they celebrated its coming by decorating their homes with evergreen boughs. In Northern Europe, Druids also decorated their temples with evergreens as a symbol of everlasting life, while the Vikings in Scandinavia believed that the evergreen was a special plant given to them by their god, Balder.
It wasn’t until the 16th century, in Germany, that the Christmas tree began to look as it does now. Devout Christians would bring trees into their homes, while others would build wooden Christmas pyramids decorated with evergreens and candles. Martin Luther is credited as the first person to add lights to a Christmas tree. The story goes that as he walked through the night preparing a sermon, he was amazed by the stars twinkling among the branches of the evergreens. To truly capture this moment for his family, he erected a tree in his home and attached candles to it with wiring.
Most Puritan Americans in the 17th century saw the Christmas tree as a pagan symbol. Only German settlers were widely known to decorate their homes with them, as Germany already had a long history with community trees. Interestingly, the Puritans during this time were doing their best to stamp out Christmas and Christmas trees altogether, going so far as to fine people found recognizing the “pagan mockery” as a holiday. This is interesting because it was the Catholic Church that attached itself to the holiday in the first place in order to convert pagans. But despite all of the Puritans’ efforts to stamp out the long-lasting tradition, an influx of German and Irish settlers overpowered any attempt to keep it from growing.
And grow it did. The Christmas tree truly became popular in 1846, when the British royals Queen Victoria and Prince Albert were sketched with their family standing in front of a Christmas tree. Victoria was extremely popular, which instantly made Christmas trees the most fashionable thing in town. The 19th century finally saw a rise in acceptance and popularity in the U.S., and the production of ornaments became extremely popular during this time as well.
Everywhere around the world Christmas is being celebrated and trees are being decorated, but each country has its own unique and special brand of holiday cheer.
In the U.K., trees normally stand about four feet in height, while people in the U.S. do everything they can to get their trees to scrape the ceiling.
In Mexico, the nativity scene is the most popular and important facet, and Christmas trees are a luxury for most families; if one is used, it is worked in around the nativity decorations.
Despite Australia’s sweltering summer heat in the month of December, Christmas trees are still delivered to all parts of Sydney. That’s right, summer: since Australia lies in the Southern Hemisphere, families tend to celebrate Christmas outside and at the beach.
Greenland is the complete opposite with temperatures so low that no Christmas trees will even grow!
Christmas trees have been a huge part of a tradition that seems like it will never die out; it has truly stood the test of time. In a few thousand years, our descendants will be looking back on how their metallic, sentient pyramid began as a proud evergreen tree.
Beards have had many uses during the history of humans. Early humans used beards for warmth and intimidation. In current times, they have been used to show masculinity, royalty, fashion, and status.
Prehistoric men grew beards for warmth, intimidation and protection. Facial hair kept prehistoric men warm, and it also protected their mouths from sand, dirt, the sun and many other elements. A beard on a man’s face creates the look of a stronger jawline; this exaggeration helped them appear more intimidating.
From 3000 BCE to 1580 BCE, Egyptian royalty wore a false beard made of metal. This false beard was held onto the face by a ribbon tied over the head, a practice followed by both kings and queens. Ancient Egyptians were also known to dye their chin beards with reddish-brown to strong brown dyes.
Mesopotamian civilizations took great care of their beards. They would use products like beard oil to keep their beards looking healthy, and they would fashion their beards using ancient curling irons to make ringlets, frizzles and tiered effects. The Assyrians dyed their beards black, and the Persians dyed theirs an orange-red color. During ancient times in Turkey and India, a long beard was considered a symbol of wisdom and dignity.
During ancient times in Greece, beards were a sign of honor. Ancient Greeks commonly curled their beards with tongs in order to create hanging curls, and beards were cut only as a punishment. Around 345 BCE, Alexander the Great decreed that soldiers could not have beards. He was afraid that opposing soldiers would grab onto the Grecians’ beards and use them against them in battle.
Ancient Romans preferred their beards trimmed and well groomed. A Roman by the name of Lucius Tarquinius Priscus encouraged the use of razors in order to guide the city to hygienic reform in 616–578 BCE. Although Priscus tried to encourage shaving, it was still not generally accepted until 454 BCE, when a group of Greek Sicilian barbers traveled from Sicily to mainland Italy. They set up barber shops on the main streets of Rome. These barber shops were typically used only by people who didn’t own slaves, because if you owned a slave they would shave you instead. Eventually shaving became the trend in ancient Rome, though philosophers kept their beards regardless.
Anglo-Saxons wore beards until the advent of Christianity in the 7th century, when the clergy were required by law to shave. English princes sported mustaches until William the First (reigned 1066–1087 CE) created a law that required them to shave in order to fit in with Norman fashions. Once the Crusades began, beards returned, and for four centuries all sorts of facial hair were allowed. It was much like current times, when men could choose from beards, mustaches and clean-shaven faces. In 1535 beards became fashionable again, and with them came all sorts of styles and lengths. English men began to starch their beards in the 1560s.
In the early 1600s, a painter named Sir Anthony Vandyke began to paint many aristocrats with pointed beards. This style of beard was called the Vandyke. Men used pomade or wax to shape their beards, applying it with tiny brushes and combs. The people of this time invented various gadgets to keep mustaches and beards in shape while they slept.
There have been many beard styles throughout the ages. A style made popular by Abraham Lincoln is called the chin curtain: facial hair along the jawline that is long enough to hang from the chin. The American essayist Henry David Thoreau had a style called the chinstrap beard, achieved when the sideburns are connected to each other by a narrow line of hair along the jaw. The English heavy metal musician Lemmy Kilmister wore his facial hair in a style called friendly muttonchops, formed when muttonchops are connected by a mustache and there is no chin hair. Another facial hair style is the goatee, in which only the hair around the chin and mustache is left on the face. The American professional wrestler Hulk Hogan was famous for the horseshoe mustache: a full mustache with ends that extend down in parallel straight lines all the way to the chin line.
Currently, about 33% of American males have facial hair of some kind, while 55% of males worldwide have facial hair. In one survey, women found fully bearded men to be only two-thirds as attractive as clean-shaven men.
Contemporary Beard Products
Beard products have come a long way from their humble beginnings. Just as in ancient Egypt, you can still purchase false beards today, although unlike in ancient Egypt, these false beards are no longer made of metal.
And just as the men of Mesopotamia used beard oil, you can still purchase beard oil today.
More Historical Fun Facts
Otto the Great swore on his beard, as someone in current times might swear on their mother’s grave.
During the middle ages, if a man touched another man’s beard it was offensive and could be grounds for a duel.
In the 16th century, men started experimenting with their beards and came up with trends like the forked beard and even a style called the stiletto beard.
The earliest known recorded crochet patterns were printed in 1824, and yet there is a great deal of evidence that women in particular had been recording and sharing crochet patterns since well before then.
While the exact origins of crochet are unclear, as the skill was originally passed on by word of mouth, Lis Paludan theorises that crochet evolved from traditional practices in Iran, South America or China; however, there is no decisive evidence of the craft being performed before its popularity in Europe during the 19th century.
What is Crochet?
Crochet is a process by which yarn or thread and a single hook of any size can be used to make fabric, lace, garments and toys. Crochet may also be used to make hats, bags and jewellery.
Crochet, as we say in the English language, is derived from the French word croche, which literally means hook. Like knitting, crochet stitches are made by pulling the yarn through an active loop. While knitting involves a row of open active loops (or stitches), crochet uses only one loop or stitch at a time. A variety of textures, patterns and shapes can be created by varying tension, dropping and adding stitches, and wrapping the yarn around the hook during a stitch.
According to Brandon Gaille, there are over 152 million blogs on the internet today, with a new blog being added every half second. So what exactly is a blog? Over 32 million Americans currently read blogs every week, yet most people cannot explain what a blog actually is.
At their inception, the first ‘blogs’ were limited to chronicling a single person’s life. These early blogs were online diaries or publicly hosted journals. Today, blogs cover a vast cross-section of topics: personal blogs, niche hobbies, news, health advice, dating advice, parenting advocacy, business coaching, social issues and financial guidance; anything you can imagine, there is already a blog about it.
Darren Rowse, founder of Problogger, defines a blog as a frequent and chronological online publication: a personal or commercial website or web page which is regularly updated with opinions, reviews, articles and links to other websites. While a website or online store may remain unchanged for years, a blog is frequently updated, current and chronological.
The Origins of the Blog?
Blogs, as we know them today, were born from online forum software. In the 1990s, internet forum software companies such as WebEx started to develop running conversation threads through online software. As the forums grew larger, these threads of conversation were organised and connected by topic, and then sorted on a kind of online, metaphorical cork board.
According to Rebecca Blood of the long-standing blog Rebecca’s Pocket, these threads slowly developed into online diaries around 1994. Justin Hall is credited as being one of the first bloggers, beginning his blog, Justin’s Links from the Underground, in 1994. Hall blogged for 11 years, from his years at Swarthmore College onwards, and over time the blog began to focus heavily on the intimate details of his life.
Jorn Barger is credited with coining the word weblog (a web log) in December 1997, but it was not until two years later, in May 1999, that Peter Merholz jokingly split the word into the phrase we blog in the sidebar of his personal blog.
The growth of blogs in the late 1990s coincided with developments in web publishing tools that allowed more non-technical users to post content to their own blogs. Before this point, a knowledge of HTML and FTP was necessary to publish content. In August 1999, Evan Williams and Meg Hourihan launched Blogger.com, a free blog hosting Content Management System.
The Content Management System (CMS) WordPress was released in May 2003. Before 2003, its predecessor, widely known as Cafelog, hosted approximately 2,000 blogs. With the release of more user-friendly and intuitive software over the last 11 years making free blogging software available to more people, there are now over 14.4 billion blog pages hosted on WordPress.
The Authorship of Blogs
When looking at the history of blogging, it is interesting to consider the history of blogging authorship and how that has changed over the last 20 years. The history of blogging has been directly shaped by those with the ability and access to the language, software and platforms used to create blogs.
When blogging was first ‘invented’, authorship was limited to those who were able to write HTML and use FTP; therefore the dialogue and trajectory of blogs was limited to a particular voice and a certain audience. The voice of blogging, in the beginning, was very limited and singular, read only by those who were able to find it.
As the software and knowledge needed to create and maintain a blog became less technical, the distinct class of online publishing that produces the blogs we recognise today emerged. For instance, the use of some sort of browser-based software is now a typical aspect of “blogging”. Blogs may now be hosted online by a dedicated blog hosting company, or run through free and paid online blog software such as WordPress, Movable Type, Blogger or LiveJournal.
The Future of Blogging
Where the world of blogging goes next is anyone’s guess… Everyone from unknown writers in small country towns to large multinational companies run their own blogs today and it’s impossible to know where they’re going to take it next.
As the popularity of blogging continues to rise, a number of blogging awards, programs and even courses are now widely accepted and enrolled in. The Australian Writers’ Centre, in conjunction with Random House Australia, awards annual prizes to the best Australian blogs. The finalists for the best Australian blogs of 2014 can be seen here.
In the last few years, blogging as a writing platform has received a lot of criticism from many writing industry professionals. As a non-regulated publishing profession, blog articles are more likely to contain spelling mistakes, grammatical errors, factual bias and poorly structured arguments. Some may say that blogging will be the downfall of the English language. You only need to look closely at this article to see that they may be correct.
While there are many who openly criticise blogging, there are many who must be praised for their efforts. Through blogging, a direct dialogue between author and audience may be achieved, allowing for a more intimate connection and a more current exchange of communication. Blogs provide decentralised information dissemination across a wide range of opinions.
Praised for providing voices to, and documenting the stories of, the marginalised, blogs have been credited with preserving and nurturing small and diminishing language groups, bringing together survivors of rare illnesses, and providing advice and support to anyone who is seeking it.
Blogs have also been criticised for the very same reasons stated above. In 2006, prison blogging rose in popularity: a means by which offenders, including overseas prisoners, are given a platform to speak to the world and express themselves. Prison blogging is highly controversial and touches the very heart of what blogging is all about. Blogging continues to walk the line between providing a platform for free speech and giving a voice to those who should not, or may not want to, be heard by a wider community.
The internet is a vast and beautiful entity that allows unprecedented access to information. Never before has anyone had as much access to knowledge as today, and that’s thanks to the internet. Not only is the internet a giant repository of information, it is also a vast marketplace. Never before has the world’s market been so easily accessible to individuals and businesses everywhere. The internet has changed the way we do business forever.
These days, ecommerce has become as commonplace as watching TV. It’s something that many people do every single day and probably couldn’t imagine their lives without. Lots of people who started out with small local businesses have used ecommerce to boost sales and make up a large part of their income. It’s exciting to realise that this entirely new system of doing business only came into existence a few decades ago, amazing to see how far it’s come, and even more exciting to speculate on how it will advance in the future.
Electronic Data Interchange (1960-1982)
In the very beginning, there was the development of Electronic Data Interchange (EDI). EDI was very convenient, as it replaced more traditional forms of document exchange such as mailing and faxing. The system was used mainly by trading partners, who utilised it to transfer orders, invoices and pretty much any other business transaction. The data format used met ANSI ASC X12, the main set of standards in America.
When an order was sent, it was examined by a Value Added Network and then processed by the recipient’s system. EDI was a great tool in its time: it allowed quick and easy transfer of data without the need for any human intervention.
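To make the idea of machine-readable business documents concrete, here is a minimal, hypothetical sketch of pulling apart an EDI-style segment. The segment text, field positions and element meanings below are illustrative inventions, not taken from a real X12 transaction set; real X12 interchanges also carry envelope segments (ISA/GS/ST) and strict element definitions.

```python
# A hypothetical, X12-flavoured purchase-order line. In this sketch,
# '~' terminates a segment and '*' separates data elements.
raw = "PO1*1*10*EA*9.95**BP*WIDGET-001~"

segment = raw.rstrip("~")       # drop the segment terminator
elements = segment.split("*")   # split into data elements

seg_id = elements[0]            # segment identifier, e.g. 'PO1'
quantity = int(elements[2])     # quantity ordered
unit = elements[3]              # unit of measure ('EA' = each)
price = float(elements[4])      # unit price

print(seg_id, quantity, unit, price)  # PO1 10 EA 9.95
```

Because every field sits at a fixed, agreed position, both trading partners’ systems could process documents like this with no human intervention, which is exactly what made EDI attractive.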
The man credited with inventing the earliest form of ecommerce is Michael Aldrich, an English inventor and entrepreneur. According to the stories, he was out with his wife one day, complaining about having to make a long trip to the supermarket, when he was struck with a sudden wave of inspiration: the idea of hooking your television up to your supermarket to get them to deliver your groceries. In 1979, he connected his television to a computer designed for processing transactions, and coined the term “teleshopping”, the earliest form of the ecommerce we know today.
The 90’s and Beyond
The web as we know it today was invented by a man called Tim Berners-Lee, formerly an employee of CERN. He and his colleague Robert Cailliau created a proposal to build a “hypertext project” called “WorldWideWeb” in 1990. Later that year, Berners-Lee used his NeXT computer (a product of the company Steve Jobs founded after being ousted from Apple) to create the very first web server and hand-code the first browser. Soon after, he made the web publicly available, on August 6, 1991, and went on to integrate hypertext into the internet and develop the URL, HTML and HTTP.
Initially, there was a ban on ecommerce: people were not allowed to engage in commercial use of the internet. Eventually, the National Science Foundation lifted the ban in 1991, and since then the internet and ecommerce have been experiencing exponential growth. It wasn’t until 1995 that the NSF began charging a fee for registering domain names. There were then 120,000 registered domain names; within 3 years, that number grew to over 2 million. By that point, the NSF no longer controlled the internet.
The 1992 book, Future Shop: How Technologies Will Change The Way We Shop And What We Buy, provided insight and predictions on the future of consumerism. An overview of the book explains:
For hundreds of years the marketplace has been growing more complex and more confusing for consumers to navigate. Published in 1992, long before the Internet became a household word, Future Shop argued that new information technologies, combined with innovative public policies, could help consumers overcome that confusion. A prescient manifesto of the coming revolution in e-commerce, Future Shop’s vision of consumer empowerment still resonates today.
From the early days of the internet, there were many concerns regarding online shopping. In 1994, Netscape developed Secure Sockets Layer (SSL), a new security protocol that protected sensitive information transferred over the web. Browsers could detect whether a site had an SSL certificate, which was a major indicator of the trustworthiness of a site.
Nowadays, SSL encryption (and its successor, TLS) is one of the most important security protocols on the internet. Recently, it was exposed to the Heartbleed exploit, which made waves in the web industry; the scale of the reaction just goes to show how important SSL is to the online community.
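As a small illustration of the certificate checks browsers have performed since SSL’s introduction, here is a minimal sketch using Python’s standard-library `ssl` module (the modern TLS descendant of Netscape’s protocol, not Netscape’s original API). No network connection is made here; the sketch only shows the verification settings a client gets by default.

```python
import ssl

# A default client context, as modern software would create before
# connecting to an https:// site.
context = ssl.create_default_context()

# By default the context refuses peers whose certificate cannot be
# validated against the system's trusted certificate authorities,
# and it checks that the certificate matches the hostname contacted.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True
```

To actually connect, you would wrap a TCP socket with `context.wrap_socket(sock, server_hostname=...)` so that the hostname check can run against the server’s certificate.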
The Dot Com Bubble
The dot-com bubble was one of the darkest times in internet history: a giant bubble in the stock market created by eager investors looking to cash in on the new “dot-com” companies. They were drawn in by the hype and the novelty, which caused them to ignore common-sense business strategies. Eventually, the bubble popped in 2001, resulting in many people going bankrupt, trillions of dollars lost and some very valuable lessons learned. The bubble was so bad that it triggered a small economic recession in the early 2000s.
Numerous factors contributed to the bubble, the period of speculation and investment in internet firms between 1995 and 2001. In 1995, as internet use surged in popularity, many firms saw the new users as potential customers and increased revenue, and as a result many internet start-ups were founded in the late 1990s. They came to be known as “dot-coms” because of the popular “.com” top-level domain that followed their names.
The entire dot-com industry was well known for its rash business practices, which were based on “potential” rather than actual revenue. Its policy was mainly growth over profit, resting on the incorrect assumption that if a company increased its customer base, its profits would also rise. Many companies spent millions of dollars attempting to dominate the market for a specific product or need.
Very few companies survived and thrived after the infamous dot-com bubble; these included the e-commerce giants eBay and Amazon. Today, Amazon and eBay are both amongst the most successful companies on the internet!
Only a few weeks after selling his first ever book online, Jeff Bezos, the founder of Amazon, was selling books to every state in the U.S. and over 40 other countries. Ordering books directly from the publishers simplified the process of order fulfilment and shipping.
Another ecommerce giant, eBay, saw amazing growth as well. It allowed pretty much anyone to buy and sell online, and in just a couple of years the website became a household name, revolutionising ecommerce and turning over hundreds of millions of dollars each year.
From its humble beginnings in 1995, modern ecommerce has become the fastest growing area of business, showing continued growth year after year. Technology has advanced further, making it much more accessible to people from all walks of life, and entire industries have been built around ecommerce which are today the who’s who of the business world.
These days, practically anything can be bought or sourced online, from your dinner, to clothes, to a private jet. Over 60% of adults have purchased something online, and this figure will only increase in the coming years. People clearly love shopping online: the convenience of not having to leave your home and the transparency of user reviews make it irresistible to today’s youth. One of the greatest lures of ecommerce is the fact that anyone with drive and motivation can succeed. The potential for growth and scalability is unprecedented, and the advantages are practically endless.
There has been another surge in tech companies recently, following the rising popularity of mobile and web apps as well as social media. These tech startups are often the epitome of new-age business: seemingly small companies are worth billions of dollars, and in many cases the employees don’t even live in the same city, or even the same country. Such companies are bought and sold for millions of dollars with the help of website brokers, a career based solely on the buying and selling of web properties.
Many speculators are worried that this may be the beginning of another bubble, but Silicon Valley investors are relentless and, for the time being, have been seeing a substantial ROI.
In conclusion, the way we do business has changed substantially since the inception of the internet: e-commerce giants like Amazon and eBay make it easier to get anything you want online, regular mom-and-pop stores extend their reach globally, bloggers earn based on their ability to engage an audience, and tech startups are valued in the billions. The world is changing rapidly and more opportunities are being created; many people no longer have to depend on their local economy to find work, start a business and earn money.
While divorce perhaps doesn’t have the same stigma connected to it as it once did, the practice is still a touchy subject in many parts of America. Indeed, as we will see throughout the article, it has changed drastically in law as well as in the attitudes of the general population across the history of the country.
What was once a forbidden practice, only ever used as a last resort, is now very common. The median length of a marriage in the US these days is around 11 years; divorce rates rose steadily throughout the 20th century, and some 29% of marriages will suffer some form of ‘disruption’, in many cases leading to a divorce.
But how has divorce law changed over time?
Even before the United States officially became the nation we know today, divorce was a hot topic in the colonies.
One of the earliest instances of a divorce law was in the Colony of Massachusetts Bay, which created a judicial tribunal to deal with divorce matters in 1629. This tribunal was allowed to grant divorces on the grounds of adultery, desertion, bigamy and, in many cases, impotence as well. The northern colonies adopted their own approaches that made divorce available, whereas the southern colonies did all they could to prevent the practice, even where they had legislation in place.
After 1776, divorce law actually became less restrictive. Hearing divorce cases took the legislature away from what it deemed more important work, so the responsibility was handed to the judiciary, where it remains today. The big problem at the time, for women at least, was that they were essentially a legal non-entity: it was difficult for them to claim ownership of property or financial assets, which worked against them in the case of a divorce.
The Married Women’s Property Acts of 1848 went some way towards rectifying this; however, throughout the 17th, 18th and 19th centuries divorce remained fairly uncommon by today’s standards, and women were at a tremendous disadvantage from the outset.
Early 20th Century
By the end of the 19th century there were numerous ‘divorce mill’ states, places such as Indiana, Utah, and the Dakotas where you could go to get a divorce. Many towns provided accommodation, restaurants, bars and events centered on this trade. In 1887, Congress ordered the first compilation of divorce statistics at the federal level to see how big the ‘problem’ had become.
The Inter-Church Conference on Marriage and Divorce was held in 1903 in an attempt to use religion to keep divorce to a minimum. However, with the onset of feminism and a general relaxation of societal and moral views towards divorce, the practice was gaining traction.
In the 1920s, trial marriages were established, allowing a couple to try out a marriage without actually being married, having children or taking on any lifelong financial commitments. In essence it was simply two people of the opposite sex living in the same quarters, but for the time it was a new concept, and it was one of the first ways in which the law tried to accommodate prenuptial contracts. Marriage counseling was also beginning to become popular, representing a recognition that marital problems existed even where the law did not strictly address them.
The Family Court
As the years rolled by and the nation found itself embroiled in two world wars, divorce took a back seat as far as lawmakers were concerned. However, the Family Court system that started in the 1950s was the first time in decades that the legislature and judicial system in the US tackled the divorce issue.
For years, couples had to go through the traditional court system to get a divorce, or at least to plead their case. The new laws that established the Family Court created a way for judges to essentially ratify divorce agreements that couples had reached beforehand. Where the law had previously required every case to be heard in a court of law, this now changed.
With these changes, law firms specialising in divorce started appearing all over the country – San Francisco, Chicago, New York, and just about every other large city soon became involved in these family courts.
No Fault Divorces
Possibly the biggest change to divorce law in the history of the United States came with no fault divorces in the 1970s. Up until then there still had to be a party at fault: even in the Family Courts, an adulterer or the like had to be identified before the terms of the divorce could be agreed. With the change in the law, a divorce could be granted even if neither party was at fault.
California actually led the way in 1969, but it wasn’t until the 1970s that other states (Iowa being the second) adopted the law. In many respects it was enacted to bring down the cost of divorce, with its lawyers’ fees and expensive, drawn-out trials, although that didn’t really come to fruition: divorce lawyers and financial advisors still profited greatly from divorce proceedings, even when both parties simply wanted to split and move on.
Something that this change in the law didn’t focus on was child custody, which remained a neglected topic, although legislation was later enacted to address it.
While the law has attempted to create a fair and equal child custody process, it still isn’t quite right in many respects, and even with the legislation that has been enacted over the years there remains work to do.
Modern Day America
Divorce towards the end of the 20th century and into the early 21st century was a much different proposition from a hundred years ago.
While new laws are being enacted all the time to deal with the finer points of divorce, the no fault legislation essentially changed everything about the practice and shaped the divorce proceedings we know today. That being said, attitudes towards divorce are still traditional in many quarters. Even though divorce is established in law and, in general at least, the stigma around it has gone, it still plays a major role in affecting a child’s upbringing and in other societal problems.
Furthermore, the equal division of property and finances is something else the law is still trying to get right. Although this differs from state to state, in most cases who is to blame does not determine who gets the property. The legislature and the court system are still trying to find a balance in modern day America between a system that allows for divorce without evidence of wrongdoing and one that is fair and equal while also addressing the needs of children.
It isn’t easy, but a lot of work is still going on behind the scenes to address it.
Divorces were being carried out before the United States of America was even a nation. The colonies had their own measures and laws for dealing with such matters; however, for centuries divorce was largely reserved for extreme cases. Indeed, up until the No Fault rule it was unusual to see a divorce granted on the basis that both parties simply wanted to break up.
This happens fairly regularly these days, but back then there generally had to be a reason of some sort behind the divorce – a woman cheating on a man, for instance, or a man having several wives.
The big question now is whether or not the law can develop even further and change with the rising divorce cases across the country and the more complicated financial and property ownership models. Up until now at least divorce law in the United States has developed at a fairly fast rate. It might not always have favored the couple given that much of the early legislation was there to deal with extreme cases that were even frowned upon by the religious orders of the day.
Divorce law has been largely reactionary throughout the past 300 years, aside from a few isolated cases. It is still adapting to a growing trend; while the stigma of divorce has largely vanished in many places, the law is still trying to keep up.
The internet is a relatively new invention but boy have things changed in its short life! The internet has changed the way we live and it has been responsible for the creation of thousands upon thousands of jobs that simply would not exist without it.
One of those categories of job is web design, something that we would sorely miss now if it disappeared. What would we do without the animations? The colorful backgrounds, the fancy writing and the music playing in the background?
When did it Begin?
In 1990, Tim Berners-Lee developed the very first web browser, called WorldWideWeb, although it was later renamed Nexus. At that time, only text could be displayed on a web page: no fancy fonts, no pretty pictures or videos, just simple plain text, with links underlined in blue.
In 1993 Mosaic was released, the first ever web browser that allowed developers to add images to their web pages. It was able to support .gif images and web forms, a massive leap forward for the time.
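By way of illustration, a page of this era was little more than marked-up text with the occasional inline image. Here is a hedged sketch of what an early Mosaic-era page might have looked like (the title, link target and file name are invented for illustration, not taken from any real site):

```html
<!-- Illustrative sketch of an early-1990s page: plain text, a link
     (rendered as underlined blue text), and an inline .gif image,
     which Mosaic was the first browser to display. -->
<html>
<head>
<title>My Home Page</title>
</head>
<body>
<h1>Welcome</h1>
<p>Plain text with a <a href="other.html">link to another page</a>.</p>
<img src="logo.gif" alt="Site logo">
</body>
</html>
```

Everything about the presentation was left to the browser; the author could mark up structure, but not control fonts, colors or layout.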
Design was not brilliant, constrained as it was by the browsers and by limited bandwidth, and most websites were built by programmers rather than designers.
Mid-1990s to 2000
By the mid-nineties, Netscape was the top web browser but it was soon knocked off its pedestal by Internet Explorer and so began the war of the browsers. Around this time, web design began to get a little more complex, using frames and tables as well as images.
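Before CSS layout matured, designers commonly pressed the table element into service purely for positioning. A sketch of the typical mid-nineties pattern (the content and colors are invented for illustration):

```html
<!-- Typical 1990s table-based layout: a table used for page
     positioning rather than tabular data, with presentational
     attributes (width, bgcolor) that CSS later replaced. -->
<table width="100%" border="0" cellpadding="10">
  <tr>
    <td colspan="2" bgcolor="#000080">
      <font color="#ffffff">Site Header</font>
    </td>
  </tr>
  <tr>
    <td width="20%" valign="top">Navigation links</td>
    <td width="80%" valign="top">Main page content</td>
  </tr>
</table>
```

Frames worked similarly, splitting the browser window into independent panes, each loading its own HTML document.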
From 1998, we began to see the introduction of web development tool kits. Remember Dreamweaver? GoLive? These grew in popularity as they gave a larger number of users access to web page creation.
Jobs in web design began to grow as more designers were offered jobs to build sites. Flash technology also made its appearance during this era of web site design although it was not all that popular to start off with.
In the year 2000, the bubble burst and hundreds of thousands of web businesses crashed. However, while this may have put the clamps on things for a while, it was not for long. Web design standards began to pick up again.
Now we started to see a better class of design. We got designs that were not based on tables, we got transparency with .png images, and content management systems (CMS) began to grow in popularity. A CMS was a program that allowed designers to publish content on the web, then go back in, edit what they had published and modify it as they saw fit.
2004 – 2007
Web 2.0 was born in 2004. This was the era of bold websites, sites that were aimed at communities. There was bold typography and shiny gradients. Corners became rounded, edges softened and web design, once again, took off at the speed of light.
Websites began to be more functional and needed more in the way of an interface to work properly. Widgets were introduced all over the place to help integrate one site with another. This most often involved social network sites, linking outside feeds into a site or linking from the site to a blog.
This era was also marked by an increase in the accessibility of websites to ordinary people. Developments such as WordPress and Blogger, along with user-friendly guides on how to make a website, helped everyday people build a website without having to learn HTML or CSS.
2008 to the Present
Web site design has evolved incredibly over the last few years and one thing that has given it a push, unbelievably, was the iPhone. Mobile website design was introduced, allowing people to view sites properly on their phones.
Many of the bigger websites created mobile versions of their sites specifically for the smartphones and tablets that were fast becoming popular devices. On the internet itself, the large and fast-growing social network sites created more widgets for users to put on their blogs, and other websites created widgets designed to go on social network sites.
In design, the use of typography expanded tremendously, and grid-based designs are fast becoming the norm.
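Grid-based, mobile-friendly layouts are typically achieved today with CSS. A minimal sketch of the idea, assuming a modern browser with CSS grid and media query support (the class name is invented for illustration):

```css
/* A simple grid layout: three equal columns on wide screens,
   collapsing to a single column on narrow, phone-sized screens. */
.page-grid {
  display: grid;
  grid-template-columns: repeat(3, 1fr); /* three equal columns */
  gap: 16px;
}

@media (max-width: 600px) {
  .page-grid {
    grid-template-columns: 1fr; /* stack into one column on phones */
  }
}
```

The media query is what makes a single design serve both desktop and mobile, which is why separate “mobile versions” of sites have largely given way to responsive designs.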
Today, website design is a huge business. Designs are more complex yet less cluttered. Early websites were difficult to navigate; today, a well-designed website goes a long way towards ensuring your business will succeed.
In terms of design, where the internet goes from now is anyone’s guess. We have color, we have fonts and we have images. We can even embed videos into websites now so who knows where the next trend will take us.
Mental Illness in Antiquity
The label schizophrenia is a recent term, first used in 1908 by Eugen Bleuler, a Swiss psychiatrist, to describe the disunity of functioning between personality, perception, thinking and memory. Whilst the label is new, accounts of schizophrenia-like symptoms can be found in ancient texts dating back to 2000 BC, and across a number of cultural contexts. The oldest of these texts is the ancient Egyptian Ebers papyrus, which dates to the second millennium BC.
There are descriptions of illnesses marked by bizarre behaviour and lack of self-control in the Hindu Atharva Veda, dating to approximately 1400 BC, and in a Chinese text from approximately 1000 BC called The Yellow Emperor’s Classic of Internal Medicine, which attributes insanity and seizures to supernatural and demonic forces.
The Greeks and Romans also showed a general awareness of psychotic illnesses. Plato, who lived in the fifth and fourth centuries BC, spoke of a madness of divine origin, which could inspire poets and create prophets. Demonic possession and supernatural forces as the cause of mental illness are a common theme in the ancient literature.
Whilst we can infer these ancient scribes were reporting on the symptoms and causes of the illness we currently describe as schizophrenia, we cannot be certain of it. Some suggest that the lack of clear diagnostic examples in the older literature points to schizophrenia being an entirely modern affliction. Perhaps cultural differences in the understanding of a sufferer’s behaviour can account for the discrepancy in reporting of the illness in ancient times.
The Middle Ages – A Demonic Affliction
The Medieval era saw the beginnings of formal detention and institutionalisation of those deemed mentally ill. In Europe, sufferers were occasionally cared for in monasteries. Some towns had “fools’ towers”, which housed madmen. In the 1400s, a number of hospitals to treat the insane sprang up throughout Spain.
In England in 1247, The Priory of Saint Mary of Bethlehem was founded – later known as the notorious Bedlam, the word becoming synonymous with madness itself.
Whilst scholars and universities at this time had developed a scientific approach towards mental disturbances, belief in supernatural forces was still widespread among the lay population.
In 15th century Europe, delusions and hallucinations were seen as proof of demonic possession. Treatments to overcome these disturbances included confession and exorcism.
Schizophrenia and Early Psychiatry
It was not until the middle of the 19th century that European psychiatrists began to describe a disease of unknown origin, typically with an adolescent onset and a propensity towards chronic deterioration. Emil Kraepelin, a German psychiatrist, used the term “dementia praecox” to describe a variety of previously separately recognised illnesses, such as adolescent insanity and catatonia syndrome.
Kraepelin’s long-term studies of a large number of cases led him to believe that, despite the diversity of clinical presentations, the commonalities in the progression of the illness meant they could be categorised under the single heading of dementia praecox. Later, he suggested nine categories of the disorder.
This leads us to Eugen Bleuler, who coined the term schizophrenia, meaning “split mind”, replacing the previous terminology of dementia praecox. Bleuler’s “schizophrenia” incorporated an understanding that the disorder was a group of illnesses, and did not always deteriorate into a permanent state of “dementia”, as Kraepelin had previously considered a hallmark of the disease.
Further, Bleuler suggested schizophrenia had four main symptoms, known as the 4 A’s: blunted Affect – a reduction in emotional response to stimuli, loosening of Associations and disordered pattern of thought, Ambivalence, or difficulty making decisions, and Autism, by which he meant a loss of awareness of external events and preoccupation with one’s own thoughts.
Schizophrenia and Eugenics
Increased scientific understanding of schizophrenia and other mental illnesses was overshadowed by persistent stigma and misunderstanding of mental illness. Schizophrenia was thought to be a heritable disorder, and as such sufferers were subjected to eugenics programmes and sterilisation.
In 1910, Winston Churchill wrote to Prime Minister Herbert Asquith, insisting on the implementation of mass forced sterilisations of those deemed feeble-minded and insane.
Churchill was not successful in implementing this policy. Forced sterilisation was, however, practised in parts of the USA throughout the twentieth century, and Nazi Germany used eugenics as justification for extreme measures against those it saw as undesirable, including the mentally ill.
Examples of treatments for what would be recognised today as a mental illness go back thousands of years, and include trepanning, the drilling of holes into the skull to allow evil spirits to exit, and various forms of exorcism. The ancient Greeks and Romans tended to employ somewhat enlightened and humane treatment methods.
The Greeks applied their theory of humoural pathology, or the belief that an imbalance in the body’s various fluids could induce madness, amongst other illnesses.
Treatment involved correcting the imbalance in fluids, ranging from dietary and lifestyle changes to blood-letting and purging. Roman treatments consisted of warm baths, massage and diets, although more punitive treatments were also suggested by Cornelius Celsus, stemming from the belief that the symptoms were caused by having angered the gods; these included flogging and starvation.
We may view some of the older techniques for treating mental illness as deplorable, yet many modern pre-pharmacotherapy treatments were unfortunately not much better in some respects.
Conditions in many asylums were wretched, and treatments such as raising the body temperature by injecting sulphur and oils, insulin shock therapy (which kept the patient in a coma), deep sleep therapy and electroconvulsive therapy were all widely used for schizophrenia and a variety of other mental illnesses before the advent of antipsychotics. Patients could expect widely variable results and the risk of further harm.
Lobotomy, developed in the 1930s, also became a popular treatment for schizophrenia. Initially, the procedure required an operating theatre: holes were drilled into the skull, and either alcohol was injected into the frontal lobes or an instrument called a leucotome was used to create lesions in the brain.
The technique was soon refined and simplified. American psychiatrist Walter Freeman, seeking to make the procedure accessible to patients in asylums with no access to an operating theatre, developed the transorbital lobotomy. Freeman accessed the prefrontal area through the eye socket and, using an instrument similar to an ice pick, made a series of cuts.
The process was quick and, for many, had devastating effects: patients were left with impairments of intellectual, social and cognitive function, and often there was no great improvement in the symptoms for which the procedure was performed.
Current Treatments and Research
Antipsychotic drugs to treat schizophrenia were first introduced in the 1950s. Their success led, in part, to deinstitutionalisation and the integration of sufferers into the community. Antipsychotics, whilst allowing many sufferers of schizophrenia to lead functional lives, have their drawbacks.
Common adverse side effects can include weight gain, involuntary movements, lowered libido, low blood pressure and tiredness. Antipsychotics do not represent a cure for schizophrenia, but used in combination with community based and psychological therapies, sufferers have every chance of recovery.
The internet has also become a useful tool for schizophrenia sufferers and their families, friends and carers, with many useful resources and schizophrenia support sites now available.
Scientific investigations into the causes and treatment of schizophrenia are ongoing, with a focus on genetic research, which will hopefully lead to more effective treatments and possibly prevention.