The pointe shoe is synonymous with ballet and ballerinas around the world. While we might take them for granted as having always been part of ballet's long history, the pointe shoe has an interesting history of its own. It might surprise you to learn that the art of ballet was established 200 years before the pointe shoe was developed and dancers first rose onto the tips of their toes to dance.
The Royal Academy of Dance, the Académie Royale de Danse, was the first dance institution founded in the Western world. It was established in France in 1661 as a theatre, dance and opera institution by the French king, Louis XIV. Twenty years after it was founded, the first official ballet productions went to stage.
This academy placed ballet within the creative arts and distinguished it as its own form of dance and performance. While ballet had been practiced in Europe prior to this time, its official birthplace in France cemented French as the international language of ballet. Ballet classes around the world are still directed and run in French.
Heeled Ballet Slippers?
The first ballet shoes worn by the dancers of the Royal Academy of Dance were heeled slippers. These shoes were quite difficult to wear and prevented jumps and many technical movements, so the heeled slipper did not stay around for very long. No one knows exactly when the heel was dropped in favour of flat shoes, but its abandonment meant that dancers could do far more than ever before. It is rumoured that Marie Camargo of the Paris Opera Ballet may have been the first dancer to remove the heels from her slippers.
The new flat-bottomed slippers spread quickly throughout the ballet community as dancers were liberated by the abandonment of the heel. The flat-bottomed slippers worn during the 18th century are much like the demi-pointe rehearsal and learning shoes worn by young ballerinas in classes today. They were secured to the feet with ribbons around the ankle and were pleated under the toes for a better fit. The new slippers allowed for full extension and enabled the dancer to use the whole foot.
Dancing on The Tip of the Toes
The first dancers to rise up onto their toes did so with an invention by Charles Didelot in 1795. His "flying machine" lifted dancers upward, allowing them to stand on their toes before leaving the ground. This lightness and ethereal quality was so well received by audiences that choreographers began to look for ways to incorporate more pointe work into their pieces.
As dance progressed into the 19th century, the emphasis on technical skill increased, as did the desire to dance en pointe without the aid of wires. Marie Taglioni is often credited as being the first to dance en pointe, but like many things in the early history of ballet, no one knows for sure.
In 1832, when Marie Taglioni first danced the entire La Sylphide en pointe, her shoes were nothing more than modified satin slippers; the soles were made of leather, and the sides and toes were darned to help the shoes hold their shape. Because the shoes of this period offered no support, dancers would pad their toes for comfort and rely on the strength of their feet and ankles for support.
Dancers looking after their own feet
The next substantially different form of pointe shoe appeared in Italy in the late 19th century, with a modified toe area that was the beginning of what we now call the toe box. Dancers like Pierina Legnani wore shoes with a sturdy, flat platform at the front end of the shoe, rather than the more sharply pointed toe of earlier models.
The Italian school could now push technique to the limit in order to achieve dazzling virtuosic feats. These sturdier toe areas were a ballerina's secret weapon, a closely guarded trade secret, for turning multiple pirouettes.
These shoes went on to include a box, made of layers of fabric, for containing the toes, and a stiffer, stronger sole. They were constructed without nails, and the soles were stiffened only at the toes, making them nearly silent. As the pointe shoe developed, so did ballet itself. As the shoes allowed dancers to do more and more, the dancers began to want more from their shoes.
The birth of the modern pointe shoe is often attributed to the early 20th-century Russian ballerina, Anna Pavlova, who was one of the most famous and influential dancers of her time. Pavlova had particularly high, arched insteps, which left her vulnerable to injury when dancing en pointe. She also had slender, tapered feet, which resulted in excessive pressure on her big toes. To compensate for this, she inserted toughened leather soles into her shoes for extra support and flattened and hardened the toe area to form a box.
The soft slippers used by these ballerinas were far different from the “blocked” toe shoes that eventually appeared in their earliest form in the 1880s. (Previously, dancers also spent far less time on pointe than ballerinas do today.)
Ballet dancers in the early part of the 20th century also wore shoes that would seem unmanageably soft today. Tamara Karsavina was said to dance in toe shoes of Swiss goatskin, while the ballerina Pierozi reportedly wore only Moroccan leather. It was fundamental to the development of ballet technique that pointe shoes be stiffened and strengthened to support longer balances and challenging pirouettes.
Today most toe shoes are fashioned of layers of satin stiffened with glue, with a narrow sole often made of leather.
The life of a Pointe Shoe
Depending on the dancer's experience and skill, a pair of pointe shoes can last between 2 and 12 hours of dancing. If a dancer attends a one-hour pointe class per week, her pointe shoes will last about three months. For a professional dancer, her shoes will last far less time: a professional ballerina can go through between 100 and 120 pairs of pointe shoes in a single dancing year. Some pointe shoes will last only a single performance in a heavy-duty role where the shoes are worked hard. Ballet companies will often employ professional pointe shoe makers and fitters within the company, producing and buying over 8,000 shoes during the dance year.
Different ballet roles even demand different strengths and flexibility in their shoes. "For the technically and physically demanding role of the Black Swan in 'Swan Lake,' a strong shoe with a lot of support is required, whereas the role of the sylph in 'La Sylphide' has more jumps and less pirouettes, so a light, gentle shoe is needed." (CNN)
Development and the future of the Pointe Shoe
The pointe shoe has remained very much unchanged for the last 200 years. Recent developments, however, have begun to appear within companies that produce ballet wear: Nike, in conjunction with Bloch Dancewear, has designed a shoe called the Arc Angel, created by Guercy Eugene. These shoes have come from a need to protect and better support a ballerina's most important asset: her feet!
Christmas trees have become a worldwide symbol for peace and good cheer. They are at the center of one of the largest family gatherings of the year, and religions of all shapes and sizes embrace them as something spiritual and meaningful to their beliefs, while random uncles and cats think of them as a nice bit of shade for a quick nap. The question is, who was first? No, not the cat or the uncle, although that could be quite a tantalizing investigation. What I mean is, where did this idea of the Christmas tree come from? How did it begin, and who was the first to place gifts for loved ones beneath it or put shiny balls on its branches? Well, that is what we are here to find out. Explore and enjoy, dear reader.
Christmas trees haven't always been associated with Christianity; in fact, Christianity was one of the last religions to jump on this "green religion" bandwagon. Winters were brutal in ancient times, so much so that ancient Egyptians and Romans believed their gods were struggling just as much as they were. Winter was seen as a time of death, and any plant that remained green during this time, such as firs, evergreens and rushes, was seen as holy and a sign that the gods would triumph over winter and bring new life onto the earth.
The Egyptians worshiped the god Ra, the sun god. At the winter solstice, the shortest day and longest night of the year, the people of Egypt believed Ra had fallen ill, and they would decorate their homes with green palm rushes to symbolize his triumph over death.
The Romans treated the solstice as more of a celebration, holding a feast called the Saturnalia in honor of Saturn, the god of agriculture. The Romans looked ahead toward spring, knowing that soon everything would be green and thriving once more, and celebrated its coming by decorating their homes with evergreen boughs. In Northern Europe, Druids also decorated their temples with evergreens as a symbol of everlasting life, while the Vikings in Scandinavia believed that the evergreen was a special plant given to them by their god, Balder.
It wasn't until the 16th century, in Germany, that the Christmas tree began to look as it does now. Devout Christians would bring trees into their homes, while others would build wooden Christmas pyramids decorated with evergreens and candles. Martin Luther is credited as being the first person to add lights to a Christmas tree. The story goes that as he walked through the night preparing a sermon, he was amazed by the stars twinkling through the branches of the evergreens. To recapture this moment for his family, he erected a tree in his home and attached candles to it with wire.
Most Puritan Americans in the 17th century saw the Christmas tree as a pagan symbol. Only German settlers were widely known to decorate their homes with them, as Germany already had a long history of community trees. Interestingly, the Puritans of this time were doing their best to stamp out Christmas and Christmas trees altogether, going so far as to fine people caught observing the "pagan mockery" as a holiday. This is ironic, because it was the Catholic Church that had attached itself to the holiday in order to convert pagans in the first place. But despite all of the Puritans' efforts to stamp out the long-standing tradition, an influx of German and Irish settlers overpowered any attempt to keep it from growing.
And grow it did. The Christmas tree truly became popular in 1846, when the British royals Queen Victoria and Prince Albert were sketched with their family standing in front of a Christmas tree. Victoria was extremely popular, which instantly made Christmas trees the most fashionable thing in town. The 19th century finally saw a rise in their acceptance and popularity in the U.S., and the production of ornaments became extremely popular during this time as well.
Everywhere around the world Christmas is being celebrated and trees are being decorated, but each country has its own unique and special brand of holiday cheer.
In the U.K., trees normally stand about four feet tall, while people in the U.S. do everything they can to get their trees to scrape the ceiling.
In Mexico, the nativity scene is the most popular and important decoration, and Christmas trees are a luxury for most families; when one is used, it is worked around the nativity scene.
Despite Australia's sweltering summer heat in December, Christmas trees are still delivered to all parts of Sydney. That's right, summer: since Australia lies in the southern hemisphere, families tend to celebrate Christmas outside and at the beach.
Greenland is the complete opposite, with temperatures so low that no Christmas trees will even grow!
Christmas trees have been a huge part of a tradition that seems like it will never die out. It has truly stood the test of time. In a few thousand years, our descendants will be looking back on how their metallic, sentient pyramid began as a proud evergreen tree.
Beards have served many purposes throughout human history. Early humans grew them for warmth and intimidation; in current times, they have been used to signal masculinity, royalty, fashion and status.
Prehistoric men grew beards for warmth, intimidation and protection. Facial hair kept prehistoric men warm, and it also protected their mouths from sand, dirt, the sun and many other elements. A beard creates the look of a stronger jawline; this exaggeration helped them appear more intimidating.
From 3000 BCE to 1580 BCE, Egyptian royalty wore a false beard made of metal, held onto the face by a ribbon tied over the head. This practice was followed by both kings and queens. Ancient Egyptians were also known to dye their chin beards with reddish-brown to strong brown dyes.
Mesopotamian civilizations took great care of their beards. They would use products like beard oil to keep their beards looking healthy, and they would fashion them with ancient curling irons into ringlets, frizzles and tiered effects. The Assyrians dyed their beards black, and the Persians dyed theirs an orange-red color. In ancient Turkey and India, a long beard was considered a symbol of wisdom and dignity.
In ancient Greece, beards were a sign of honor. Ancient Greeks commonly curled their beards with tongs to create hanging curls, and beards were cut only as a punishment. Around 345 BCE, Alexander the Great decreed that soldiers could not have beards; he was afraid that opposing soldiers would grab onto the Greeks' beards and use them against them in battle.
Ancient Romans preferred their beards trimmed and well groomed. A Roman by the name of Lucius Tarquinius Priscus encouraged the use of razors in order to guide the city to hygienic reform in 616-578 BCE. Although Priscus encouraged shaving, it was not generally accepted until 454 BCE, when a group of Greek Sicilian barbers traveled from Sicily to mainland Italy. They set up barber shops on the main streets of Rome. These shops were typically used only by people who didn't own slaves, because if you owned a slave, they would shave you instead. Eventually, shaving became the trend in ancient Rome, though philosophers kept their beards regardless.
Anglo-Saxons wore beards until the advent of Christianity in the 7th century, when the clergy were required by law to shave. English princes sported mustaches until 1066-1087 CE, when a law passed by William the First required them to shave in order to fit in with Norman fashions. Once the Crusades began, so did the return of beards, and for four centuries all sorts of facial hair were allowed. It was much like current times, when men could choose from beards, mustaches and clean-shaven faces. In 1535, beards became fashionable again, bringing with them all sorts of styles and lengths. English men began to starch their beards in the 1560s.
In the early 1600s, a painter named Sir Anthony Vandyke began to paint many aristocrats with pointed beards; this style of beard was called the Vandyke. Men used pomade or wax to shape their beards, applying it with tiny brushes and combs. People of this time invented different gadgets to keep mustaches and beards in shape while they slept.
There have been many beard styles throughout the ages. A style made popular by Abraham Lincoln is called the chin curtain: facial hair along the jawline, long enough to hang from the chin. The American essayist Henry David Thoreau had a style called the chinstrap beard, achieved when the sideburns are connected to each other by a narrow line of hair along the jaw. English heavy metal musician Lemmy Kilmister wore his facial hair in a style called friendly muttonchops, formed when muttonchops are connected by a mustache and there is no chin hair. Another facial hair style is the goatee, in which only the hair around the chin and the mustache is left on the face. American professional wrestler Hulk Hogan was famous for the horseshoe mustache: a full mustache with ends that extend down in parallel straight lines all the way to the chin line.
Currently, about 33% of American males have facial hair of some kind, while 55% of males worldwide do. In one survey, women found fully bearded men to be only two-thirds as attractive as clean-shaven men.
Contemporary Beard Products
Beard products have come a long way from their humble beginnings. Just as the ancient Egyptians wore false beards, you can still purchase false beards today, though unlike the Egyptian originals, they are not made of metal.
And just as the men of Mesopotamia used beard oil, you can still purchase beard oil today.
More Historical Fun Facts
Otto the Great swore on his beard, much as someone in current times would swear on their mother's grave.
During the Middle Ages, touching another man's beard was offensive and could be grounds for a duel.
In the 16th century, men started experimenting with their beards and came up with trends like the forked beard and even a style called the stiletto beard.
The earliest known recorded crochet patterns were printed in 1824, yet there is a great deal of evidence suggesting that women in particular had been recording and sharing crochet patterns well before then.
While the exact origins of crochet are unclear, as the skill was originally passed on by word of mouth, Lis Paludan theorises that crochet evolved from traditional practices in Iran, South America or China; however, there is no decisive evidence of the craft being performed before its rise to popularity in Europe during the 19th century.
What is Crochet?
Crochet is a process by which yarn or thread and a single hook of any size can be used to make fabric, lace, garments and toys. Crochet may also be used to make hats, bags and jewellery.
The English word crochet is derived from the French word croche, which literally means hook. Like knitting, crochet stitches are made by pulling yarn through an active loop, but while knitting involves a row of open active loops (or stitches), crochet uses only one loop or stitch at a time. A variety of textures, patterns and shapes can be created by varying tension, dropping and adding stitches, and wrapping the yarn around the hook during a stitch.
According to Brandon Gaille, there are over 152 million blogs on the internet today, with a new blog being added every half second. So what is a blog, exactly? Over 32 million Americans currently read blogs every week, yet most people cannot explain what a blog actually is.
At their inception, the first 'blogs' were limited to chronicling a single person's life. These early blogs took the form of online diaries or hosted public journals. Today, blogs cover a vast cross-section of topics: personal blogs, niche hobbies, news, health advice, dating advice, parenting advocacy, business coaching, social issues and financial guidance. Whatever you can imagine, there is already a blog about it.
Darren Rowse, founder of Problogger, defines a blog as a frequent and chronological online publication. A blog is a personal or commercial website or web page which is regularly updated with opinions, reviews, articles and links to other websites. While a website or online store may remain unchanged for years, a blog is frequently updated, current and chronological.
The Origins of the Blog
Blogs as we know them today were born from online forum software. In the 1990s, internet forum software companies such as WebEx started to develop running conversation threads through online software. As the forums grew larger, these threads of conversation were organised and connected by topic, sorted on a kind of online, metaphorical cork board.
According to Rebecca Blood of the long-standing blog Rebecca's Pocket, these threads slowly developed into online diaries around 1994. Justin Hall is credited as being one of the first bloggers, beginning his blog, Justin's Links from the Underground, in 1994. Hall blogged for 11 years, starting during his years at Swarthmore College, and over time the blog began to focus heavily on the intimate details of his life.
Jorn Barger is credited with coining the word weblog (a "web log") in December 1997, but it was not until two years later, in May 1999, that Peter Merholz jokingly split the word into the phrase "we blog" in the sidebar of his personal blog.
The growth of blogs in the late 1990s coincided with developments in web publishing tools that allowed more non-technical users to post content to their own blogs. Before this point, knowledge of HTML and FTP was necessary to publish content. In August 1999, Evan Williams and Meg Hourihan launched Blogger.com, a free blog hosting Content Management System.
The Content Management System (CMS) WordPress was released in May 2003. Its predecessor, widely known as Cafelog, hosted approximately 2,000 blogs before 2003. With the release of more user-friendly and intuitive software over the last 11 years, making free blogging software available to more people, there are now over 14.4 billion blog pages hosted on WordPress.
The Authorship of Blogs
When looking at the history of blogging, it is interesting to consider the history of blogging authorship and how it has changed over the last 20 years. The history of blogging has been directly shaped by those with the ability and access to use the language, software and platforms that create blogs.
When blogging was first 'invented', authorship was limited to those who were able to write HTML and use FTP, so the dialogue and trajectory of blogs was limited to a particular voice and a certain audience. The voice of blogging, in the beginning, was very limited and singular, read only by those who were able to find it.
As the software and knowledge needed to create and maintain a blog became less technical, a distinct class of online publishing emerged, producing the blogs we recognise today. For instance, the use of browser-based software is now a typical aspect of "blogging". Blogs may now be hosted online by a dedicated blog hosting company, or run through free and paid blog software such as WordPress, Movable Type, Blogger or LiveJournal.
The Future of Blogging
Where the world of blogging goes next is anyone’s guess… Everyone from unknown writers in small country towns to large multinational companies run their own blogs today and it’s impossible to know where they’re going to take it next.
As the popularity of blogging continues to rise, a number of blogging awards, programs and even courses are now widely available. The Australian Writers' Centre, in conjunction with Random House Australia, awards annual prizes to the best Australian blogs. The finalists for the best Australian blogs of 2014 can be found on the Centre's website.
In the last few years, blogging as a writing platform has received a lot of criticism from writing industry professionals. As an unregulated publishing profession, blog articles are more likely to contain spelling mistakes, grammatical errors, factual bias and poorly structured arguments. Some may say that blogging will be the downfall of the English language. You need only look closely at this article to see that they may be correct.
While there are many who openly criticise blogging, there are many who must be praised for their efforts. Through blogging, a direct dialogue between author and audience can be achieved, allowing for a more intimate connection and more immediate communication. Blogs provide decentralised information dissemination across a wide range of opinions.
Praised for giving voice to and documenting the stories of the marginalised, blogs have been credited with preserving and nurturing small and diminishing language groups, bringing together survivors of rare illnesses, and providing advice and support to anyone who seeks it.
Blogs have also been criticised for the very same reasons stated above: in 2006, prison blogging rose in popularity. Prison blogging is a means by which offenders and overseas prisoners are given a platform to speak to the world and express themselves. It is highly controversial and touches the very heart of what blogging is all about. Blogging continues to walk the line between providing a platform for free speech and giving a voice to those who should not, or may not want to, be heard by a wider community.
The internet is a vast and beautiful entity that allows unprecedented access to information. Never before has one had as much access to knowledge as today, and that is thanks to the internet. Nor is the internet only a giant repository of information; it is also a vast marketplace. Never before has the world's market been so easily accessible to individuals and businesses everywhere. The internet has changed the way we do business forever.
These days, ecommerce has become as commonplace as watching TV. It's something many people do every single day and probably couldn't imagine their lives without. Many people who started out with small local businesses have used ecommerce to boost sales and make up a large part of their income. It's exciting to think that this entirely new system of doing business came into existence only a few decades ago, amazing to see how far it has come, and even more exciting to speculate on how it will advance in the future.
Electronic Data Interchange (1960-1982)
In the very beginning, there was the Electronic Data Interchange (EDI). EDI was very convenient, as it replaced more traditional forms of document exchange such as mailing and faxing. The system was used mainly by trading partners, who used it to transfer orders, invoices and pretty much any other business transaction. The data format used met ANSI ASC X12, the main set of standards in America.
When an order was sent, it was examined by a Value Added Network and then processed by the recipient's system. EDI was a great tool in its time: it allowed quick and easy transfer of data without the need for any human intervention.
The man credited with inventing the earliest form of ecommerce is Michael Aldrich, an English inventor and entrepreneur. According to the story, he was out with his wife one day, complaining about having to make a long trip to the supermarket, when he was struck with a sudden wave of inspiration: the idea of hooking your television up to your supermarket so they could deliver your groceries. In 1979, he connected a television to a computer designed for processing transactions, and coined the term "teleshopping", the earliest form of the ecommerce we know today.
The 90’s and Beyond
The web as we know it today was invented by Tim Berners-Lee, a former employee of CERN. He and his colleague Robert Cailliau created a proposal to build a "hypertext project" called "WorldWideWeb" in 1990. Later that year, Berners-Lee used his NeXT computer (a product of the company Steve Jobs founded after being ousted from Apple) to create the very first web server and hand-code the first browser. He made the World Wide Web publicly available on August 6, 1991, and went on to develop the URL, HTML and HTTP.
Initially, there was a ban on ecommerce: people were not allowed to engage in commercial use of the internet. The National Science Foundation lifted the ban in 1991, and since then the internet and ecommerce have experienced exponential growth. It wasn't until 1995 that the NSF began charging a fee for registering domain names. There were then 120,000 registered domain names; within 3 years, however, that number grew to over 2 million. By that point, the NSF no longer controlled the internet.
The 1992 book, Future Shop: How Technologies Will Change The Way We Shop And What We Buy, provided insight and predictions on the future of consumerism. An overview of the book explains:
For hundreds of years the marketplace has been growing more complex and more confusing for consumers to navigate. Published in 1992, long before the Internet became a household word, Future Shop argued that new information technologies, combined with innovative public policies, could help consumers overcome that confusion. A prescient manifesto of the coming revolution in e-commerce, Future Shop's vision of consumer empowerment still resonates today.
From the early days of the internet, there were many concerns regarding online shopping. In 1994, Netscape developed Secure Sockets Layer (SSL), a new security protocol that protected sensitive information transferred over the web. Browsers could detect whether a site had an SSL certificate, which became a major indicator of a site's trustworthiness.
Nowadays, SSL encryption is one of the most important security protocols on the internet. It was recently exposed to the Heartbleed exploit, which made waves in the web industry; the reaction just goes to show how important SSL is to the online community.
The Dot Com Bubble
The dot-com bubble was one of the darkest times in internet history. It was a giant stock market bubble created by eager investors looking to cash in on the new "dot com" companies. Drawn in by the hype and the novelty, they ignored common-sense business strategies. Eventually, the bubble popped in 2001, resulting in many bankruptcies, trillions of dollars lost and some very valuable lessons learned. The bubble was so bad that it triggered a small economic recession in the early 2000s.
Numerous factors contributed to the bubble, a period of speculation and investment in internet firms between 1995 and 2001. In 1995, as internet use surged in popularity, many firms saw the growing number of users as potential customers and increased revenue. As a result, many internet start-ups were founded in the late 1990s. They came to be known as "dot coms" after the popular TLD ".com" which followed their names.
The entire dot com industry was well known for rash business practices based on "potential" rather than actual revenue. Companies pursued growth over profit, on the flawed assumption that if their customer base grew, profits would follow. Many companies spent millions of dollars attempting to dominate the market for a specific product or need.
Very few companies survived and thrived after the infamous dot-com bubble; among them were the e-commerce giants eBay and Amazon. Today, Amazon and eBay are both amongst the most successful companies on the internet!
Only a few weeks after selling his first book online, Amazon founder Jeff Bezos was selling books to every state in the U.S. and over 40 other countries. A simplified process of order fulfilment and shipping enabled him to order books directly from the publishers.
Another eCommerce giant, eBay, saw amazing growth as well. They allowed pretty much anyone to buy and sell online. In just a couple of years the website became a household name. It revolutionised ecommerce and was turning over hundreds of millions of dollars each year.
From its humble beginnings in 1995, modern ecommerce has become the fastest growing area of business, showing continued growth year after year. Technology has advanced further, making ecommerce accessible to people from all walks of life, and entire industries have been built around it which are today the who's who of the business world.
These days, practically anything can be bought or sourced online, from your dinner to clothes to a private jet. Over 60% of adults have made purchases online, and this figure will only increase in the coming years: people love shopping online. From the convenience of not having to leave your home to the transparency of user reviews, it is irresistible to today's youth. One of the greatest lures of e-commerce is the fact that anyone with drive and motivation can succeed. The potential for growth and scalability is unprecedented, and the advantages are practically endless.
There has been another surge in tech companies recently, following the increased popularity of mobile and web apps, as well as social media. These tech startups are often the epitome of new-age business: seemingly small companies are worth billions of dollars, and in many cases the employees don't even live in the same city, or even the same country. Such companies are being sold for millions of dollars with the help of website brokers, a career based solely on the buying and selling of web properties.
Many speculators worry that this may be the beginning of another bubble, but Silicon Valley investors are relentless and, for the time being, have been seeing a substantial ROI.
In conclusion, the way we do business has changed substantially since the inception of the internet: e-commerce giants like Amazon and eBay make it easier to get anything you want online, regular mom-and-pop stores extend their reach globally, bloggers earn based on their ability to engage an audience, and tech startups reach multi-billion dollar valuations. The world is changing rapidly and more opportunities are being created; many people no longer have to depend on their local economy to find work, start a business and earn money.
While divorce perhaps doesn’t have the same stigma connected to it as it once did, the practice is still a touchy subject in many parts of America. Indeed, as we will see throughout the article, it has changed drastically in law as well as in the attitudes of the general population across the history of the country.
What was once a forbidden practice, only ever used as a last resort, is now very common. The median length of a marriage in the US these days is around 11 years, divorce rates rose steadily throughout the 20th century, and some 29% of marriages will suffer some form of 'disruption' that in many cases leads to a divorce.
So how has divorce law changed over time?
Even before the United States officially became the nation we know today, divorce was a hot topic in the colonies.
One of the earliest instances of a divorce law was in the Colony of Massachusetts Bay, which created a judicial tribunal to deal with divorce matters in 1629. This body was allowed to grant divorces on the basis of adultery, desertion, bigamy and, in many cases, impotence. The northern colonies adopted their own approaches that made divorce available, whereas the southern colonies did all they could to prevent the act, even where they had legislation in place.
After 1776, divorce law actually became less restrictive. Hearing divorce cases took legislatures away from what they deemed more important work, so the task was handed to the judiciary, where it remains today. The big problem at the time, for women at least, was that they were essentially a legal non-entity: it was difficult for them to claim ownership of property or financial assets, which worked against them in the case of a divorce.
The Married Women's Property Acts of 1848 went some way towards rectifying this. Even so, throughout the 17th, 18th and 19th centuries divorce remained fairly uncommon compared with today, and women were at a tremendous disadvantage from the outset.
Early 20th Century
By the end of the 19th century there were numerous 'divorce mill' states and territories, such as Indiana, Utah, and the Dakotas, where you could go to get a divorce. Many towns provided accommodation, restaurants, bars and events centered on this trade. In 1887, Congress ordered the first compilation of divorce statistics at the federal level to see how big the 'problem' had become.
The Inter-Church Conference on Marriage and Divorce was held in 1903 in an attempt to use religion to keep divorce to a minimum. However, with the onset of feminism and a general relaxation of views towards divorce from a societal and moral standpoint, the practice was gaining traction.
In the 1920s, trial marriages were established, allowing a couple to try out a marriage without actually being married, having children or taking on lifelong financial commitments. In a way it was simply two people of the opposite sex living in the same quarters, but for the time it was a new concept and one of the first ways in which the law tried to accommodate prenuptial contracts. Marriage counseling was also becoming popular, representing recognition that a problem existed even where the law did not strictly address it.
The Family Court
As the years rolled by and the nation found itself embroiled in two world wars, divorce took a back seat as far as lawmakers were concerned. The Family Court system that began in the 1950s, however, was the first time in decades that the legislature and judicial system in the US tackled the divorce issue.
For years, couples had to go through the traditional court system to get a divorce, or at least to plead their case for one. The new laws establishing the Family Court created a way for judges to essentially ratify divorce agreements that couples had reached beforehand. Where the law had previously required that every case be heard in a court of law, this now changed.
With these changes, law firms specialising in divorce started appearing all over the country; San Francisco, Chicago, New York, and just about every other large city soon became involved in these family courts.
No Fault Divorces
Possibly the biggest change to divorce law in United States history came with no-fault divorces in the 1970s. Up until then, there still had to be a party at fault: even in the Family Courts, an adulterer or the like had to be identified before the terms of the divorce could be agreed. With the change in the law, a divorce could be granted even if neither party was at fault.
California actually led the way in 1969, but it wasn't until the 1970s that other states (Iowa being the second) adopted the law. In many respects it was enacted to bring down the cost of divorce, in terms of hiring lawyers and expensive court fees with drawn-out trials, although that didn't really come to fruition: divorce lawyers and financial advisors still profited greatly from divorce proceedings, even when both parties simply wanted to split and move on.
Something that this change in the law didn't focus on was child custody, which remained a neglected topic; later legislation attempted to address it.
While the law has attempted to create a fair and equal child custody process, it still isn't quite right in many respects, and even with the legislation enacted over the years there remains work to do.
Modern Day America
Divorce towards the end of the 20th century and into the early 21st century was a much different proposition than it had been a hundred years earlier.
While new laws are being enacted all the time to deal with the finer points of divorce, the no-fault legislation essentially changed everything about the practice and shaped the divorce proceedings we know today. That being said, attitudes towards divorce remain traditional in many quarters. Even though divorce is now set in law and, in general at least, the stigma around it has gone, it still plays a major role in affecting a child's upbringing and is tied to other societal problems.
Furthermore, the equal division of property and finances is something else the law is still trying to get right. Although this differs from state to state, in most cases who is to blame doesn't determine who gets the property. The legislature and the court system are still trying to find a balance, in modern-day America, between a system that allows for divorce without needing evidence of wrongdoing and one that is fair and equal while also addressing the needs of the children involved.
It isn't easy, but a lot of work continues behind the scenes to address it.
Divorces were being carried out before the United States of America was even a nation. The colonies had their own measures and laws for dealing with such matters, but for centuries divorce was largely reserved for extreme cases. Indeed, up until the no-fault rule it was unusual to see a divorce granted on the basis that both parties simply wanted to break up.
This happens fairly regularly these days, but back then there generally had to be a reason of some sort behind the divorce: a woman cheating on a man, for instance, or a man having several wives.
The big question now is whether the law can develop even further and change with the rising number of divorce cases across the country and increasingly complicated models of financial and property ownership. Up to now, at least, divorce law in the United States has developed at a fairly fast rate. It has not always favored the couple, given that much of the early legislation existed to deal with extreme cases that were frowned upon even by the religious orders of the day.
Divorce law has been largely reactionary throughout the past 300 years, aside from a few isolated cases. It is still adapting to a growing trend; while the stigma of divorce has largely vanished in many places, the law is still trying to keep up.
The internet is a relatively new invention but boy have things changed in its short life! The internet has changed the way we live and it has been responsible for the creation of thousands upon thousands of jobs that simply would not exist without it.
One of those categories of job is web design, something we would sorely miss if it disappeared. What would we do without the animations, the colorful backgrounds, the fancy writing and the music playing in the background?
When did it Begin?
In 1990, Tim Berners-Lee developed the very first web browser, called WorldWideWeb, although it was later renamed Nexus. At that time, only text could be displayed on a web page: no fancy fonts, no pretty pictures or videos, just simple plain text, with links underlined in blue.
In 1993 Mosaic was released, the first ever web browser that allowed developers to add images to their web pages. It was able to support .gif images and web forms, a massive leap forward for the time.
Design was not brilliant, owing to the constraints of the browsers and limited bandwidth, and programmers rather than designers built most websites.
Mid 1990’s to 2000
By the mid-nineties, Netscape was the top web browser but it was soon knocked off its pedestal by Internet Explorer and so began the war of the browsers. Around this time, web design began to get a little more complex, using frames and tables as well as images.
From 1998, we began to see the introduction of web development toolkits. Remember Dreamweaver? GoLive? These grew in popularity as they gave a larger number of users access to web page creation.
Jobs in web design began to grow as more designers were offered jobs to build sites. Flash technology also made its appearance during this era of web site design although it was not all that popular to start off with.
In the year 2000, the dot-com bubble burst and hundreds of thousands of web businesses crashed. While this may have put the clamps on things for a while, it was not for long, and web design standards began to pick up again.
Now we started to see a better class of design. We got layouts that were not based on tables, we got transparency with .png images, and the content management system (CMS) began to grow in popularity. A CMS was a program that allowed designers to publish content on the web, then go back in to edit and modify what they had published as they saw fit.
2004 – 2007
Web 2.0 was born in 2004. This was the era of bold websites, sites that were aimed at communities. There was bold typography and shiny gradients. Corners became rounded, edges softened and web design, once again, took off at the speed of light.
Websites became more functional and needed more in the way of an interface to work properly. Widgets were introduced all over the place to help integrate one site with another, most often where a social network was involved: linking outside feeds into the site, or linking from the site to a blog.
This era was also marked by an increase in the accessibility of website creation to ordinary people. Developments such as WordPress and Blogger, along with user-friendly guides, helped everyday people make a website without having to learn HTML or CSS.
2008 to the Present
Website design has evolved incredibly over the last few years, and one thing that gave it a push, unbelievably, was the iPhone. Mobile website design was introduced, allowing people to view sites properly on their phones.
Many of the bigger websites created mobile versions of their sites specifically for the smartphones and tablets that were fast becoming popular devices. On the web itself, the large and fast-growing social networks created more widgets for users to put on their blogs, while other websites created widgets designed to go on social network sites.
In design, the use of typography expanded tremendously, and grid-based designs fast became the norm.
Today, website design is a huge business. Designs are more complex yet less cluttered. Early websites were difficult to navigate; today, a well-designed website can go a long way towards ensuring your business succeeds.
In terms of design, where the internet goes from now is anyone’s guess. We have color, we have fonts and we have images. We can even embed videos into websites now so who knows where the next trend will take us.
Mental Illness in Antiquity
The label schizophrenia is a recent one, first used in 1908 by Eugen Bleuler, a Swiss psychiatrist, to describe a disunity of functioning between personality, perception, thinking and memory. Whilst the label is new, accounts of schizophrenia-like symptoms can be found in ancient texts dating back to 2000 BC, and across a number of cultural contexts. Among the oldest of these texts is the ancient Egyptian Ebers papyrus, roughly three and a half millennia old.
There are descriptions of illnesses marked by bizarre behaviour and lack of self-control in the Hindu Atharva Veda, dating to approximately 1400 BC, and in a Chinese text from approximately 1000 BC, The Yellow Emperor's Classic of Internal Medicine, which attributes insanity and seizures to supernatural and demonic forces.
The Greeks and Romans also had a general awareness of psychotic illnesses. Plato, who lived in the fifth and fourth centuries BC, spoke of a madness of divine origin that could inspire poets and create prophets. Demonic possession and supernatural forces as the cause of mental illness are a common theme in the ancient literature.
Whilst we can infer these ancient scribes were reporting on the symptoms and causes of the illness we currently describe as schizophrenia, we cannot be certain of it. Some suggest that the lack of clear diagnostic examples in the older literature points to schizophrenia being an entirely modern affliction. Perhaps cultural differences in the understanding of a sufferer’s behaviour can account for the discrepancy in reporting of the illness in ancient times.
The Middle Ages – A Demonic Affliction
The Medieval era saw the beginnings of formal detention and institutionalisation of those deemed mentally ill. In Europe, sufferers were occasionally cared for in monasteries, and some towns had "fools' towers" that housed madmen. In the 1400s, a number of hospitals to treat the insane sprang up throughout Spain.
In England in 1247, The Priory of Saint Mary of Bethlehem was founded – later known as the notorious Bedlam, the word becoming synonymous with madness itself.
Whilst scholars and universities of the time had developed a scientific approach towards mental disturbances, belief in supernatural forces remained widespread among the lay population.
In 15th century Europe, delusions and hallucinations were seen as proof of demonic possession. Treatments to overcome these disturbances included confession and exorcism.
Schizophrenia and Early Psychiatry
It was not until the middle of the 19th century that European psychiatrists began to describe a disease of unknown origin, typically with an adolescent onset and a propensity towards chronic deterioration. Emil Kraepelin, a German psychiatrist, used the term "dementia praecox" to describe a variety of previously separately recognised illnesses, such as adolescent insanity and catatonia syndrome.
Kraepelin's long-term studies of a large number of cases led him to believe that, despite the diversity of clinical presentations, the commonalities in the progression of the illness meant they could be categorised under the single heading of dementia praecox. Later, he suggested nine categories of the disorder.
This leads us to Eugen Bleuler, who coined the term schizophrenia, meaning "split mind", to replace the earlier dementia praecox. Bleuler's "schizophrenia" incorporated an understanding that the disorder was a group of illnesses, and that it did not always deteriorate into a permanent state of "dementia", as Kraepelin had previously considered a hallmark of the disease.
Further, Bleuler suggested schizophrenia had four main symptoms, known as the four A's: blunted Affect, a reduction in emotional response to stimuli; loosening of Associations, a disordered pattern of thought; Ambivalence, or difficulty making decisions; and Autism, by which he meant a loss of awareness of external events and a preoccupation with one's own thoughts.
Schizophrenia and Eugenics
Increased scientific understanding of schizophrenia and other mental illnesses was overshadowed by persistent stigma and misunderstanding. Schizophrenia was thought to be an inheritable disorder, and as such sufferers were subjected to eugenics programmes and sterilisation.
In 1910, Winston Churchill wrote to Prime Minister Herbert Asquith insisting on the implementation of mass forced sterilisation of those deemed feeble-minded and insane.
Churchill was not successful in implementing this policy. Forced sterilisation was, however, practised in parts of the USA throughout the twentieth century, and Nazi Germany used eugenics as justification for extreme measures against those it saw as undesirable, including the mentally ill.
Examples of treatments for what would be recognised today as a mental illness go back thousands of years, and include trepanning, the drilling of holes into the skull to allow evil spirits to exit, and various forms of exorcism. The ancient Greeks and Romans tended to employ somewhat enlightened and humane treatment methods.
The Greeks applied their theory of humoural pathology, or the belief that an imbalance in the body’s various fluids could induce madness, amongst other illnesses.
Treatment involved correcting the imbalance in fluids, ranging from dietary and lifestyle changes to blood-letting and purging. Roman treatments consisted of warm baths, massage and diets, although more punitive treatments were also suggested by Cornelius Celsus, who believed the symptoms were caused by having angered the gods; these included flogging and starvation.
We may view some of the older techniques for treating mental illness as deplorable, yet many modern pre-pharmacotherapy treatments were unfortunately not much better in some respects.
Prior to the advent of antipsychotics, widely used treatments for schizophrenia and a variety of other mental illnesses included the wretched conditions of many asylums, the raising of body temperature by injection of sulphur and oils, insulin shock therapy (which kept the patient in a coma), deep sleep therapy, and electroconvulsive therapy. From all of these, patients could expect widely variable results and the risk of further harm.
Lobotomy, developed in the 1930s, also became a popular treatment for schizophrenia. Initially the procedure required an operating theatre: holes were drilled into the skull, and either alcohol was injected into the frontal lobes or an instrument called a leucotome was used to create lesions in the brain.
The technique was soon refined and simplified. The American psychiatrist Walter Freeman, seeking to make the procedure accessible to patients in asylums with no operating theatre, developed the transorbital lobotomy: Freeman accessed the prefrontal area through the eye socket and, using an instrument similar to an ice pick, made a series of cuts.
The process was quick, and for many it had devastating effects: patients were left with impairments of intellectual, social and cognitive function, and often there was no great improvement in the symptoms for which the procedure was performed.
Current Treatments and Research
Antipsychotic drugs to treat schizophrenia were first introduced in the 1950s. Their success led, in part, to deinstitutionalisation and the integration of sufferers into the community. Antipsychotics, whilst allowing many sufferers of schizophrenia to lead functional lives, have their drawbacks.
Common adverse side effects can include weight gain, involuntary movements, lowered libido, low blood pressure and tiredness. Antipsychotics do not represent a cure for schizophrenia, but used in combination with community based and psychological therapies, sufferers have every chance of recovery.
The internet has also become a useful tool for schizophrenia sufferers and their families, friends and carers, with many useful resources and schizophrenia support sites now available.
Scientific investigations into the causes and treatment of schizophrenia are ongoing, with a focus on genetic research, which will hopefully lead to more effective treatments and possibly prevention.
Seated but immense, with his eyes closed in meditation and reflection, the giant, austere statues of the Great Buddha look over a population of adherents that stretches from Indonesia to Russia and from Japan to the Middle East. His gentle philosophy also appeals to many believers scattered all over the world.
Somewhere between 500 million and 1 billion people worldwide are estimated to be Buddhists.
It’s exactly the nebulous nature of Buddha’s philosophy, crisscrossed by many sects of adherents with a dizzying assortment of beliefs and approaches to the faith, that makes it so difficult to estimate exactly how many Buddhists there are. Some scholars go so far as to refuse to define Buddhism as a religion at all, and prefer to refer to it as a personal philosophy, a way of life, rather than a true theology.
Two and a half millennia ago, a boy named Siddhartha Gautama was born into a royal family in a rural backwater in the northeast corner of the Indian subcontinent, in modern-day Nepal. An astrologer told the boy's father, King Suddhodana, that when the child grew up he would become either a king or a monk, depending on his experience of the world. Intent on forcing the issue, Siddhartha's father never let him see the world outside the walls of the palace, keeping him a virtual prisoner until he was 29 years old. When he finally ventured forth into the real world, he was touched by the suffering of the ordinary people he encountered.
Siddhartha dedicated his life to ascetic contemplation until he achieved “enlightenment,” a feeling of inner peace and wisdom, and adopted the title of “Buddha.” For over forty years he crisscrossed India on foot to spread his Dharma, a set of guidelines or laws for behaviors for his followers.
When Buddha died in 483 BC, his religion was already prominent throughout central India. His word was spread by monks seeking to become arhats, or holy men. Arhats believed they could reach Nirvana, or perfect peace, in this lifetime by living an ascetic life of contemplation. Monasteries dedicated to the memory of Buddha and his teachings became prominent in large Indian cities like Vaishali, Shravasti, and Rajagriha.
Shortly after Buddha’s death, his most prominent disciple called a meeting of five hundred Buddhist monks. At this assembly, all of Buddha’s teachings, or sutras, as well as all the rules Buddha had set down for life in his monasteries, were read aloud to the congregation. All of this information together forms the core of Buddhist scripture to this day.
With a defined way of life outlined for all his disciples, Buddhism spread throughout the rest of India. Differences in interpretation crept in as adherents grew more numerous and more distant from one another. One hundred years after the first great assembly, another was convened to try to iron out these differences, achieving little unity but no animosity either. By the third century BC, eighteen separate schools of Buddhist thought were at work in India, but all recognized each other as fellow adherents of Buddha's philosophy.
A third council was convened in the third century BC, after which a sect of Buddhists called the Sarvastivadins migrated west and established a home in the city of Mathura. Over the intervening centuries their disciples came to dominate religious thought throughout much of central Asia and Kashmir, and their descendants form the core of the current-day schools of Tibetan Buddhism.
The third emperor of the Mauryan Empire, Ashoka, became a supporter of the Buddhist religion. Ashoka and his descendants used their power to build monasteries and spread Buddhist influence into Afghanistan, great swathes of central Asia, Sri Lanka, and beyond into Thailand, Burma, Indonesia, and then China, Korea, and Japan. Missionaries went as far west as Greece, where they spawned a hybrid Indo-Greek Buddhism.
Over the centuries, Buddhist thought continued to spread and splinter, with innumerable changes added to its scriptures by a multitude of authors. During the three centuries of the Gupta period, Buddhism reigned supreme and unchallenged throughout India. But then, in the sixth century, invading hordes of Huns raged across India and destroyed hundreds of Buddhist monasteries. The Huns were opposed by a series of kings that defended the Buddhists and their monasteries, and for four hundred years the Buddhists thrived once again in northeastern India.
During the Middle Ages, a great, muscular religion appeared from the deserts of the Middle East to challenge Buddhism. Islam spread quickly east, and by the late Middle Ages Buddhism was wiped almost completely from the map of India. It was the end of the expansion of Buddhism.
Buddhism today is represented by three main strains that cover distinct geographical areas.
Since Buddhist thought is more of a personal philosophy than a well-defined creed, it has always invited an enormous multitude of interpretations. This continual churning continues into the present day with contemporary movements bearing names like Neo-Buddhism and Engaged Buddhism, and an array of truly tiny, sometimes literally individual, traditions in the West.
In the latter half of the 20th century, a movement of Japanese Buddhists calling themselves the Value Creation Society sprang up and spread to neighboring countries. The members of this Soka Gakkai movement are not monks; they consist solely of lay members interpreting and meditating on Buddha's legacy on their own, centuries after Siddhartha first set foot outside his palace walls and looked upon the world that he felt needed his call for peace, contemplation, and harmony.
A rare disorder
Idiopathic thrombocytopenic purpura (ITP) is a misnomer. The rare condition causes antibodies to destroy the platelets important for blood clotting, and can produce a low platelet count and unusual haemorrhaging, including intracranial haemorrhage (rare but potentially life-threatening), mucosal and gingival haemorrhaging, abnormal menstruation, petechiae, purpura and a general propensity to bruise easily. However, some patients may remain asymptomatic apart from a low platelet count. Acute and spontaneously resolving occurrences are more commonly seen in children, whilst adult-onset ITP is more likely to be chronic. The terminology assigned to the disorder has changed and evolved over time, reflecting increased understanding of the mechanisms of ITP through medical and scientific advancement. The misnomer stems from this increased knowledge: as it turns out, ITP is generally not "idiopathic", and purpura is not seen in all patients.
Medicine has long held a fascination with ITP. Stasi and Newland's ITP: a historical perspective notes a number of potential examples of ITP, the first dating back almost a thousand years. A description by Avicenna of purpura with characteristics of ITP can be found in The Canon of Medicine (1025). In 1556, a case of spontaneously resolving purpura and bleeding was reported by the Portuguese physician Amatus Lusitanus in Curationum Medicinalium Centuriae. In 1658, Lazarus de la Riviere, physician to the King of France, proposed that purpura was the manifestation of a systemic bleeding disorder. In 1735, Paul Gottlieb Werlhof, a German physician and poet, provided the first detailed description of a case of ITP, which subsequently became known as Werlhof's disease.
Controversy arose regarding the mechanisms of thrombocytopenia: Frank suggested in 1915 that it was the result of suppression of megakaryocytes by a substance produced in the spleen, while Kaznelson held that it was due to increased destruction of platelets in the spleen. In 1916, Kaznelson persuaded a professor to perform a splenectomy on a patient with chronic ITP; the outcome was a startling postoperative increase in the patient's platelet count and resolution of purpura. Splenectomy became the prevailing treatment for refractory ITP for many years.
The Harrington-Hollingsworth experiment
Self-experimentation in medicine is considered by some to be a historical tradition, and preferable to the unethical treatment of patient subjects, the extremes of which can be seen in examples such as the Tuskegee syphilis experiment. The self-experimentation undertaken in the Harrington-Hollingsworth experiment was risky for the participants, but it is a good example of experimentation that could not ethically be performed on research subjects. In 1950, Harrington and Hollingsworth, hematology fellows at Barnes Hospital in St Louis, set out to test their idea that the cause of ITP was a factor in the blood that destroyed platelets. Harrington, who happened to match the blood type of a patient being treated at the hospital for ITP, received a 500 ml transfusion of the patient's blood. Hours after the procedure Harrington's platelet count plummeted, and he had a major seizure. Bruising and petechiae became conspicuous over four days of low platelet count, with improvement not noted until five days later.
On examination of Harrington's bone marrow, no effect on megakaryocytes could be detected, suggesting an effect on the platelets rather than on the marrow. The experiment was replicated on all viable members of the hospital's hematology department, with all recipients of plasma from ITP patients experiencing a decrease in platelet count within three hours of transfusion. The legacy of the Harrington-Hollingsworth experiment, along with other reports published in 1951, led not only to a new understanding of the disorder but also to a name change: idiopathic thrombocytopenic purpura became immune thrombocytopenic purpura.
Evolution of treatment
The increased understanding of ITP as an autoimmune disorder led to the development of treatments other than splenectomy. Corticosteroids were introduced in the 1950s, and since the 1960s a number of immunosuppressive agents have been utilised; however, the evidence for their efficacy is somewhat lacking.
Intravenous immunoglobulin (IVIG) as a treatment for ITP was first trialled in 1980 on a 12-year-old boy with severe, refractory ITP, resulting in an increased platelet count within 24 hours and continued increases with further daily IVIG administrations. Pilot studies ensued, with results establishing the efficacy of IVIG therapy in increasing platelet counts in ITP patients. Worldwide consumption of IVIG, not only for the treatment of ITP but for various hematologic, inflammatory and autoimmune diseases, has increased from 300 kg per year in 1980 to 1,000 tonnes per year in 2010. Alongside corticosteroid therapy, IVIG remains a first-line treatment for ITP, particularly in patients at high risk of bleeding or preoperatively. Current second-line therapy includes immunosuppressants, corticosteroid-sparing agents, monoclonal antibodies, splenectomy, thrombopoietin receptor agonists and vinca alkaloids. We have come a long way since Werlhof’s apparent cure of ITP with citric acid!
The 1980s also saw new evidence on platelet destruction in ITP emerge from investigators at the Puget Sound Blood Centre. Further studies demonstrated that antibodies from ITP patients inhibit megakaryocyte growth and maturation in vitro.
More recently, an international working group has established two major diagnostic categories of ITP: primary ITP, where other causes of thrombocytopenia are excluded, and secondary ITP, in which the condition arises from another disease or infection, for example HIV or hepatitis C. Further categories have been established to assist with the approach to management of ITP: newly diagnosed ITP, where the diagnosis is less than three months old; persistent ITP, where the diagnosis is between three and twelve months old and the condition has not spontaneously resolved; chronic ITP, lasting longer than twelve months; and severe ITP, described as bleeding at presentation requiring treatment, or new bleeding symptoms demanding additional treatment with a different platelet-enhancing therapy or an increased dosage of current therapy.
Current understanding of Idiopathic Thrombocytopenic Purpura
The pathogenic causes of ITP remain poorly understood, but a multifaceted etiology is suspected. The role of eradication of Helicobacter pylori in raising the platelet counts of ITP patients has recently been explored, with considerable variability found in the response to H. pylori eradication from country to country. This high variation may be due to international differences in strains of H. pylori, with Japanese strains frequently being CagA-positive and American strains usually CagA-negative. Platelet responses to eradication of H. pylori are higher in patients with CagA-positive strains of the bacterium.
Personal blogs of individuals with ITP are also providing doctors and other medical professionals with greater insights into possible causes of the condition.
Massive leaps in the treatment and management of ITP have been achieved within the last hundred years, though clearly there are still gaps in the understanding of its pathogenesis. Treatment for ITP refractory to first- and second-line therapies is an area that may still yield improvements. Greater understanding and management of the course of ITP means more patients are treated appropriately.
From “Werlhof’s disease”, to “Idiopathic Thrombocytopenic Purpura”, to the more recent and more appropriate “Primary Immune Thrombocytopenia” – and a number of variants in between – ITP continues to be a source of emerging medical knowledge almost three hundred years since it was first described in depth by Werlhof. The change in name mirrors our scientific and technological advances in understanding and treating ITP.
While the iPhone has become one of the most iconic products on the market and Apple have redefined how we view cell phones and technology in general, there has always been a view that it can be improved. When the first-generation iPhone hit the market back in 2007 it was a revolutionary product, one that had not been seen before in its innovation and decidedly cool design. While the masses lapped up the iPhone in its default state through the various generations, there was an online community that felt things could be improved.
How could you improve a device such as the iPhone?
Well, hackers and other technological experts decided that the source code could be edited. While Apple tend to keep these things under wraps and protected, many people felt the software could be edited just enough to enhance the device and make it seem like a completely different phone with new features.
Many people have compared jailbreaking an Apple iPhone to the process of rooting an Android device.
While the basic concept may have similarities, the process and what it actually does are quite different. For example, when you root an Android device you can load a whole new operating system onto the phone, whereas this is not possible with the iPhone, even given the skills of many modern-day jailbreakers and those with an interest in hacking.
With a jailbroken iPhone, users can have a whole new experience with the device. While it serves many practical purposes, such as extending battery life by cutting down on unnecessary apps, it also allows for customizable features and access to the Cydia app store with its thousands of new apps.
The Early Stages
As opposed to an elite network of computer hackers spending weeks on end trying complicated methods to jailbreak the iPhone, the whole movement was started by George Hotz, otherwise known as ‘geohot’, a 17-year-old who wanted to change his iPhone’s network. The New Yorker details how Hotz used a screwdriver, a guitar pick, and a soldering tool to jailbreak the first iPhone.
Hotz was by no means the only one to break into the source code and gain control of the iPhone OS. A separate group of hackers did so a few days after the first iPhone hit the stores, and another team calling themselves the iPhone Dev Team released the first public jailbreak in October of that year. While it wasn’t as sophisticated as the jailbreaks we have at the minute, and it came before Cydia was released, it was a major step on the road to offering a jailbreak for everyone to use.
Birth Of Cydia
In 2008 Cydia was born.
We spoke above about what Cydia can offer and it was the iPhone Dev Team again that pioneered this concept of an alternative app store for Apple iPhone first generation users.
It was developed by Jay Freeman, more commonly known online by the pseudonym ‘saurik’, and since its inception in 2008 this alternative to the Apple App Store has grown and grown. That being said, the intention behind Cydia for jailbroken iPhones was never just to offer alternative apps. It has also been used to add new features: while the Apple App Store allows various applications to be installed, Cydia can tweak your whole iPhone and add a new default feature or function via a simple download. For early users of the iPhone, circa 2008-2010, this was a major advancement in technology.
Apple Strikes Back
Jailbreaking has never been a truly ‘underground’ movement, and Apple have always been aware of its existence, right back to its early days. In 2009, when iOS 3.0 was released by Apple, jailbreakers had to rethink their approach. Apple had shut up shop and specifically set out to stop the exploitation of the device via jailbreaking; however, it wasn’t long until this too was breached.
George Hotz was back on the scene with the release of a jailbreak known as purplera1n, which worked for all iOS 3.0 models, and blackra1n came out to coincide with iOS 3.1.2.
The whole jailbreaking process increasingly became a cat-and-mouse affair between the two sides: Apple kept releasing security fixes to make the device ever more difficult to exploit, and jailbreakers such as Hotz and the iPhone Dev Team saw each fix as a fresh challenge.
Another group, calling themselves the Chronic Dev Team, also came onto the scene and started to release jailbreaks of their own, building on the base of Hotz’s jailbreak to make it work on Mac devices.
The whole jailbreaking scene was quickly becoming an Apple-versus-the-hackers scenario. Apple were doing their best to stifle the jailbreakers and keep their iOS and source code as tightly secured as possible; however, within days of every new release and security update they found that their code had been breached.
No matter what Apple tried to do it seemed as though they were always a step behind the jailbreakers.
By 2010 jailbreaking was starting to be seen as a mainstream alternative to simply using the default iPhone device.
By now it was openly being used by hackers, tech geeks, and those interested in exploiting the iPhone in general, and while there had been attempts to bring it into the mainstream, they had so far remained futile.
In 2010, however, Comex released JailbreakMe 2.0, a jailbreak that could be applied by simply visiting a website and having your iPhone jailbroken in a short space of time. Up until this point, jailbreaking had required a certain degree of technical know-how even for the average user. With this new process, a normal user of the iPhone range could simply visit a website and have their iPhone jailbroken.
Apple wasn’t having any of this however.
Only a few weeks after the release of JailbreakMe 2.0, they brought out iOS 4.1 with a security fix that essentially stopped the tool from working. Like all Apple security fixes, it was only a matter of time before it was breached again, and rather than holding off jailbreakers and hackers for months at a time, this was quickly turning into weeks, and even days in some cases. As soon as a new iOS security fix hit the device, a new jailbreak was on the scene shortly after.
Jailbreaking Becomes Permanent
The Chronic Dev Team were quickly becoming the most prominent iPhone jailbreaking group around, and they came up with an ingenious permanent jailbreak. Exploiting a bootrom vulnerability, they released it in 2010 and called it ‘SHAtter’; it would jailbreak all current iOS models for life, not just until a new security fix came out.
Since then, JailbreakMe 3.0 has been released, even though Apple have been attempting to patch their security breaches, as BGR has explained. This has become particularly true since iOS 7 was released; however, over the past four years since jailbreaking became permanent, there has been a much wider acceptance of the process.
For example the first jailbreaking convention was held in London in 2011. Called MyGreatFest the event brought together jailbreakers and hackers from all over the world including the likes of the iPhone Dev Team, Chronic Dev Team and George Hotz.
Is Jailbreaking Here To Stay?
The history of jailbreaking the iPhone devices is an interesting one.
What started with a teenager messing about with the iPhone using a screwdriver and guitar pick has evolved into a million-dollar industry, with countless services offering to jailbreak all iPhone and baseband models. A simple Google search yields thousands of results for iPhone jailbreaking, and the process really has become a mainstream attraction for users of the phone.
Apple themselves are struggling to counteract the popularity of jailbreaking. Its legal status is a grey area; in most countries, including America, it is currently legal by definition, though unlocking has since been outlawed, and the issue of jailbreaking is due to be looked at again next year, in 2015.
The creation of a permanent jailbreak has really boosted its appeal, as there is no longer any need to have the process done over and over again. Indeed, Apple have attempted to match the hackers by incorporating some of the main features of jailbreaking into the default device, as was evident in iOS 7 and the subsequent iOS releases, including iOS 7.1.
The fact remains that jailbreaking doesn’t look like it’s going anywhere soon.
Its history is currently very short in the grand scheme of things. Trends come and go, and six years is not a long time by any means; however, with literally millions of people jailbreaking their phones at the minute, there is no suggestion that it will disappear in the near future.