Wednesday, March 23, 2016

The Historical Roots and Stages in the Development of ISIS


This study was originally published by The Meir Amit Intelligence and Terrorism Information Center. It is an overall analysis of ISIS, also known as ISIL, Islamic State (or IS). The study is structured in nine sections,[1] which, if read in conjunction with each other, draw a complete picture of ISIS. You can also download the study in PDF format here.

Historical background
ISIS took root in the new era created in Iraq after the Americans took control of the country in 2003. The Second Gulf War led to the overthrow of Saddam Hussein’s regime, the dismantling of the Iraqi army and the destruction of the existing governmental structure. As a result, a security and governmental vacuum was created and the country’s fragile social fabric (in the middle of which was the volatile Sunni-Shi’ite schism) was severely damaged.
During the almost nine years (2003-2011) that the United States army was stationed in Iraq, the Americans failed to establish effective Iraqi army and security forces to fill the newly-created security vacuum. While in Iraq, the Americans encouraged the establishment of what was supposed to be a democratic national Shi’ite regime headed by Nouri al-Maliki. However, the regime alienated the Sunni population, which had traditionally controlled the country even though it was a minority (about 22% of the Iraqi population is Sunni Arab, alongside the Kurds, who are also Sunnis, while about 60% of Iraqis are Shi’ites).
The branch of Al-Qaeda in Iraq, established in 2004, entered the security vacuum and took advantage of the increasing political-societal Sunni alienation: it became an important actor among the insurgent organizations fighting the American army, grew stronger after the withdrawal of the American troops at the end of 2011, and spread to Syria after the civil war began in March 2011. The establishment of Al-Qaeda and ISIS in Iraq and Syria occurred in four stages:
  1. Stage One (2004-2006) — The establishment of the branch of Al-Qaeda in Iraq led by Abu Musab al-Zarqawi and called “Al-Qaeda in Mesopotamia:” It waged a terrorist-guerilla war against the American and coalition forces and against the Shi’ite population. The first stage ended when Abu Musab al-Zarqawi was killed in an American targeted attack in June 2006.
  2. Stage Two (2006-2011) — Establishment of the Islamic State in Iraq (ISI): ISI served as an umbrella network for several jihadi organizations that continued waging a terrorist-guerilla campaign against the United States, its coalition allies and the Shi’ite population. ISI was weakened towards the end of the American presence in Iraq following successful American military moves and a wise foreign policy that supported the Sunni population and knew how to win their hearts and minds.
  3. Stage Three (2012-June 2014) — The strengthening of ISI and the founding of ISIS: After the American army withdrew from Iraq, ISI became stronger. Following the outbreak of the Syrian civil war, ISI established a branch in Syria called the Al-Nusra Front (“support front”). Dissension broke out between ISI and its Syrian branch, leading to a rift between ISI and Al-Qaeda and the establishment of the Islamic State in Iraq and Greater Syria (ISIS).
  4. Stage Four (as of June 2014) — Dramatic ISIS military achievements: The most prominent was the takeover of Mosul, the second largest city in Iraq. At the same time ISIS established its control in eastern Syria where it set up a governmental center (its “capital city”) in Al-Raqqah. In the wake of its success, ISIS declared the establishment of an “Islamic State” (IS) (or “Islamic Caliphate”) headed by an ISIS leader named Abu Bakr al-Baghdadi. In September 2014 the United States declared a comprehensive campaign against ISIS, which is currently waging a fierce struggle against its many enemies both at home and abroad.

In the ITIC’s assessment, historically speaking, there are similarities between the results of the American invasion of Iraq, the Soviet invasion of Afghanistan and the Israeli invasion of Lebanon. In all three instances the invading country failed to establish a new political order or to stabilize an effective, supportive regime. In effect, all three invasions had a deleterious effect on the existing delicate political-social fabric: in Afghanistan and Iraq they caused changes that contributed to the establishment of radical Sunni jihadi terrorist organizations, and in Lebanon to a radical Shi’ite terrorist organization following Iranian ideology and receiving Iranian support. The terrorist organizations established in Iraq (the branch of Al-Qaeda), Afghanistan (Al-Qaeda) and Lebanon (Hezbollah) exist to this day. ISIS, which developed from a branch of Al-Qaeda, has become strong in Iraq and Syria and today threatens the order and stability of the Middle East and the entire world.
Establishment of Al-Qaeda’s branch in Iraq led by Abu Musab al-Zarqawi and the beginning of the campaign against the United States and its allies
The establishment of Al-Qaeda and the global jihad in Iraq began when Abu Musab al-Zarqawi, a Jordanian global jihad operative, went to Iraq in 2002 (before the entrance of the Americans). Al-Zarqawi (a nickname for Ahmad Fadil al-Nazal al-Khalayleh) was influenced by the Jordanian Salafist-jihadi movement headed by Abdullah Azzam, Abu Muhammad al-Maqdisi and Abu Qatada (all three of whom are of Palestinian origin). While in Afghanistan in 1989, Abu Musab al-Zarqawi underwent ideological indoctrination and operational training conducted by Abdullah Azzam (Osama bin Laden’s ideological mentor). Al-Zarqawi returned to Jordan in 1993, was detained and imprisoned in 1994, and was released in 1999, at which point he went back to Afghanistan.
After September 11, 2001, al-Zarqawi fled from Afghanistan and sought refuge in Iran. In 2002, before the American entrance into Iraq, he went to the Kurdish region of northern Iraq. While there he collaborated with a Kurdish jihadi Islamist organization called Ansar al-Islam, established in September 2001 (which is still operative and belongs to the coalition in Iraq collaborating with ISIS). Al-Zarqawi later established his own Islamic jihadi organization, Al-Tawhid wal-Jihad (“the oneness [of Allah] and jihad”). After the Americans invaded Iraq in March 2003 he joined the insurgents fighting the United States and became a prominent figure until he was killed in a targeted American attack.
In October 2004 al-Zarqawi’s organization joined Al-Qaeda. He swore allegiance to Osama bin Laden and was declared the leader (emir) of Al-Qaeda in Iraq (in Arabic, al-qaeda fi bilad al-rafidayn, Al-Qaeda in the country of the two rivers, i.e., Mesopotamia). It was the first branch Al-Qaeda established beyond the borders of Afghanistan and Pakistan. With its founding, al-Zarqawi was no longer the leader of a local Islamic jihadi organization but rather had become the official representative of Al-Qaeda in Iraq, and later one of the most prominent terrorists among the global jihad networks. The jihad network al-Zarqawi established in Iraq, initially composed of operatives who had been affiliated with him in Pakistan and Afghanistan, later enlisted operatives from Iraq, Syria and other Arab countries.
As the emir of Al-Qaeda in Iraq, al-Zarqawi formulated a strategy for the campaign against the United States. He had the following objectives: harm U.S. forces and their allies; discourage Iraqi collaboration by targeting government infrastructure and personnel; target reconstruction efforts in Iraq with attacks on Iraqi civilian contractors and aid workers; and draw the U.S. military into a sectarian Sunni-Shi’ite war by targeting Shi’ites.[6] The wave of terrorism he initiated against the Shi’ite population, the result of his strong anti-Shi’ite doctrine, was carried out by suicide bombers and car bombs which caused many civilian casualties, sowed chaos throughout Iraq, made it difficult to stabilize the internal situation and added a murderous gene to the ISIS DNA.
Abu Musab al-Zarqawi’s strategy, which stressed broad attacks on the Shi’ite population (and sometimes on Sunni civilians as well), was criticized by both Osama bin Laden and his deputy Ayman al-Zawahiri. They were concerned that the indiscriminate killing of innocent Muslim civilians would erode public support for Al-Qaeda throughout the entire region. In July 2005 they criticized his strategy and instructed him to stop attacking Shi’ite religious and cultural sites. He refused, and his relations with the Al-Qaeda leadership deteriorated.[7] The dispute held the seeds of the later tensions and rivalry between the branch of Al-Qaeda in Iraq and the central Al-Qaeda leadership, which would eventually be manifested in ISIS’s independent actions and policy and in the rift between ISIS and the Al-Qaeda leadership headed by Ayman al-Zawahiri.
The terrorist-guerilla campaign of the branch of Al-Qaeda in Iraq was mainly carried out in and around Baghdad and in western Iraq. The local Sunni population in those regions became hostile to the central Iraqi government and to the United States, and today forms ISIS’s societal and political power base. The most important city in the Sunni region was Fallujah, located in Al-Anbar, the largest province in the country; the province became al-Zarqawi’s power base and symbolized the jihadi campaign against the American army. Al-Zarqawi’s main campaign was concentrated in Iraq, but he also made attempts to export jihadi terrorism to other Arab states, including Jordan, his country of origin (see below).
Ideologically, Abu Musab al-Zarqawi handed down to his heirs a radical Islamic, uncompromising legacy whose traces are evident in ISIS’s actions to this day. Noteworthy is its hostility toward Shi’ites in general and Iraqi Shi’ites in particular, whom he referred to in strong terms (“human scum,” “poisonous snakes,” “deadly poison”). He regarded the Shi’ites as a fifth column who, along with pro-American Sunnis, were trying to institute a new Shi’ite regime in Iraq, anti-Sunni and pro-American. That anti-Shi’ite legacy, based on Arabic Islamic sources from the Middle Ages, gave al-Zarqawi what he considered “Islamic legitimacy” to carry out mass-killing attacks on Shi’ites and the Shi’ite-affiliated central government. His objective was to instigate a Shi’ite-Sunni civil war that would destabilize public order, prevent the establishment of a Shi’ite regime and support Al-Qaeda’s takeover of Iraq. ISIS has continued its brutality towards the Shi’ite population in Iraq and Syria, implementing the legacy of al-Zarqawi who, after his death, became a revered figure and role model.[8]

Political “Left” and “Right” Properly Defined

I’m often asked versions of the following: Given that the political right is so corrupted by conservatives who seek to limit liberty in countless ways, wouldn’t it be better to abandon the language of “left” vs. “right” and adopt new terminology?
My answer is that, because the terms “left” and “right” are already widely used to denote the basic political alternative, and because that alternative is in fact binary, the best approach for advocates of freedom is not to reject the prevalent terminology but to clarify it—by defining the relevant terms.
The problem with conventional approaches to the left-right political spectrum is that they either fail to define the alternatives in question, or proceed to define them in terms of non-essentials.
One common approach, for instance, fails to specify the precise nature of either side, yet proceeds to place communism, socialism, and modern “liberalism” on (or toward) the left—and fascism, conservatism, and capitalism on (or toward) the right.
This makes no sense, at least in terms of the right. Capitalism—the social system of individual rights, property rights, and personal liberty—has nothing in common with conservatism or fascism. Take them in turn.
Conservatism is not for individual rights or personal liberty; rather, it is for religious values (euphemistically called “traditional values” or “family values”) and a government that enforces them. Although conservatism calls for some economic liberties, it simultaneously demands various violations of individual rights in order to support certain aspects of the welfare state (e.g., Social Security and government-run schools), in order to shackle or control “greedy” businessmen (e.g., Sarbanes-Oxley and anti-immigration laws), and in order to forbid certain “immoral” acts or relationships (e.g., drug use and gay marriage). Thus, conservatism is utterly at odds with capitalism.
And fascism, far from having anything in common with capitalism, is essentially the same atrocity as communism and socialism—the only difference being that whereas communism and socialism openly call for state ownership of all property, fascism holds that some property may be “private”—so long as government can dictate how such property may be used. Sure, you own the factory, but here’s what you may and may not produce in it; here’s the minimum wage you must pay employees; here’s the kind of accounting system you must use; here are the specifications your machinery must meet; and so on. (Thomas Sowell makes some good observations about the nature of fascism.)
Another ill-conceived approach to the left-right political spectrum is the attempt by some to define the political alternatives by reference to the size or percentage of government. In this view, the far left consists of full-sized or 100 percent government; the far right consists of zero government or anarchy; and the middle area subsumes the various other possible sizes of government, from “big” to “medium” to “small” to “minimal.” But this too is hopeless.
The size of government is not the essential issue in politics. A large military may be necessary to defend citizens from foreign aggressors, especially if there are many potential aggressors—say, multiple communist or Islamist regimes—who might combine forces against a free country. Likewise, a large court system might be necessary to deal with the countless contracts involved in a large free market and with the various disputes that can arise therein.
A small government, by contrast, can violate rights in myriad ways—if its proper purpose is not established and maintained. Observe that governments in the antebellum South were relatively small, yet their laws permitted and enforced the enslavement of men, women, and children. Likewise, the U.S. government was quite small during the 1890s—even though the Sherman Antitrust Act had passed and was violating businessmen’s rights to liberty, property, and the pursuit of happiness.
The essential issue in politics is not the size but the function of government; it’s not whether government is big or small but whether it protects or violates rights. (Ari Armstrong addresses this issue with excerpts from Ludwig von Mises.)
The proper purpose of government is to protect individual rights by banning the use of physical force from social relationships and by using force only in retaliation and only against those who initiate its use. A properly conceived political spectrum must reflect this fact. Whatever terms are used to identify the positions of political ideologies or systems must be defined with regard to the fundamental political alternative: force vs. freedom—or, more specifically, rights-protecting vs. rights-violating institutions.
Because the term “left” is already widely used to denote social systems and ideologies of force (e.g., socialism, communism, “progressivism”), and the term “right” is substantially used to denote social systems and ideologies of freedom (e.g., capitalism, classical liberalism, constitutional republicanism), the best approach for advocates of freedom is not to develop new terminology for the political spectrum, but to define the existing terminology with respect to political essentials—and to claim the extreme right end of the spectrum as rightfully and exclusively ours.
A notable advantage of embracing the political right as our own is that the term “right” happens to integrate seamlessly with the philosophical and conceptual hierarchy that supports freedom. This is a historic accident, but a welcome one. Although “left” and “right” originally referred to seating arrangements of 18th-century legislators in France—arrangements unrelated to anything in contemporary American politics—the term “right” conceptually relates to fundamental moral truths on which freedom depends.
Capitalism—the social system of the political right—is the system of individual rights. It is the system that respects and protects individual rights—by banning physical force from social relationships—and thus enables people to live their lives, to act on their judgment, to keep and use their property, and to pursue personal happiness. This observation grounds the political right in the proper goal of politics: the protection of rights.
Related, and still more fundamental, capitalism is morally right. By protecting individual rights, capitalism legalizes rational egoism: It enables people to act on the truth that each individual is morally an end in himself, not a means to the ends of others, and that each individual should act to sustain and further his own life and happiness by means of his own rational judgment. This observation deepens the significance of the term “right” and anchors it in the only code of morality that is demonstrably true.
In short, seen in this light, the right morality gives rise to the principle of individual rights, which gives rise to the need of a political system that protects rights, which system is properly placed on the political right—in opposition to all systems that in any way violate rights.
Observe the clarity gained by this conception of the political spectrum. The far left comprises the pure forms of all the rights-violating social systems: communism, socialism, fascism, Islamism, theocracy, democracy (i.e., rule by the majority), and anarchism (i.e., rule by gangs). The far right comprises the pure forms of rights-respecting social systems: laissez-faire capitalism, classical liberalism, constitutional republicanism—all of which require essentially the same thing: a government that protects and does not violate rights. The middle area consists of all the compromised, mixed, mongrel systems advocated by modern “liberals,” conservatives, unprincipled Tea Partiers (as opposed to the good ones), and all those who want government to protect some rights while violating other rights—whether by forcing people to fund other people’s health care, education, retirement, or the like—or by forcing people to comply with religious or traditional mores regarding sex, marriage, drugs, or what have you.
Importantly, on this essentialized conception of the political spectrum, the right does not entail degrees; only the left does. This is because degrees of force are degrees of force; violations of rights are violations of rights. Freedom and rights are absolutes: Either people are free to act on their judgment, to keep and use their property, to pursue their happiness—or they are not free; they are to some extent coerced. Either government protects and does not violate rights—or it violates rights to some extent.
If people are not fully free to run their businesses and voluntarily contract with others as they see fit, to engage in voluntary adult romantic relationships, to engage in their own preferred recreational activities, to purchase or forgo health insurance as they deem best, and so forth, then they are not free; they are victims of coercion.
We who advocate freedom—whether we call ourselves Objectivists or laissez-faire capitalists or classical liberals or Tea Partiers or whatever—should claim the political right as our own. And we should let conservatives who advocate any kind or degree of rights violations know that their proper place on the political spectrum is somewhere in the mushy, unprincipled middle with their modern “liberal” brethren. Perhaps such notice and company will cause them to think about what’s right.
The political right properly belongs to those who uphold the principle of rights—not merely in theory, but also in practice.

Tuesday, March 22, 2016

Another CEO Makes Dire Predictions About America’s Future


Scary warnings about the future of the American economy are becoming rather commonplace. Everyone from Nobel Prize-winning economists to business leaders and politicians is throwing in their two cents, and they all have pretty much the same thing to say: the economy is bound for some unprecedented friction due to income inequality, the expansion of technology, and the automation of jobs.
The question is how to fix it. So far, there’s been a lot of lip service but not much actual progress in addressing the predicted scenarios. And now another prominent business leader has entered the conversation, with a rather startling and worrisome hypothesis: a huge chunk of American businesses, 40% to be exact, will go under within the next decade.
“Forty percent of businesses in this room, unfortunately, will not exist in a meaningful way in 10 years,” said outgoing Cisco CEO John Chambers, at the company’s consumer conference, as Business Insider reported. “It will become a digital world that will change our life, our health, our education, our business models at the pace of a technology company change.”
“If I’m not making you sweat, I should be,” he added.
Chambers’ warnings came as he described to the audience how he believes that America’s businesses will, for the most part, be unsuccessful in trying to go digital. Basically, Chambers is one of many CEOs who see the writing on the wall: with the advent of the Internet and the explosive growth of technology, the old ways of doing business, no matter the industry, are being targeted for extinction. For example, look at what Uber has done to transportation, or what the new slate of mobile payment apps is doing to banking. Industries that seemed impermeable ten years ago are in the crosshairs of a new generation of entrepreneurs, and nobody is totally safe.
What we’re really talking about here is that the economy has undergone some dramatic shifts. For a long time, a handful of big corporations effectively ruled the roost in American business, and those are now being supplanted or overtaken by new ones. A few decades ago, General Electric, GM, and a small host of other companies were the de facto leaders of the economy. Today, companies like Facebook, Google, and Apple are taking over, using their huge stores of capital to disrupt other industries, like the automotive sector.
These big companies are warning everyone about the coming onslaught of creative destruction, in which entrepreneurs will find ways to exploit weaknesses in any conceivable business structure they can. Even big companies like Apple, as a recent Slate article pointed out, are merely rehashing ideas from other companies and putting their own logos on them.
Since they have the resources to do that, and to successfully snuff out those other companies, a lot of small and mid-size businesses are destined to go down. Either they go down, or they get swallowed up, which is something we’re seeing in the technology sector. Companies like Google and Facebook have been gobbling up smaller companies at an extremely fast rate, before those companies can get a real foothold in the market. That strategy serves a couple of functions: it smothers would-be competition before it has the chance to become a threat, and it folds new talent, ideas, and products into the existing amalgam.
This is exactly why Facebook went out and purchased WhatsApp and Instagram: the acquisitions expand its own capabilities while taking out potential competition at the same time.
This is what’s at the heart of the warnings being passed down by people like Cisco’s outgoing CEO. We may not actually see businesses die and jobs disappear (although that will occur to some extent, in all likelihood). We will, however, probably see a lot of business sectors become more concentrated. Just take a look at the telecom industry: there’s a new merger announced among cable, Internet, and telephone service providers seemingly every day.
Of course, whether Chambers’ predictions come to fruition is anyone’s guess. New startups and business models are being thought up every day, and any number of them can turn the business world on its head. That doesn’t mean we shouldn’t be concerned, to be sure. But things rarely play out according to plan.


Monday, March 21, 2016

Understanding Kim Jong Un



Everyone knows that North Korea’s leader is a bloodthirsty madman and buffoon—or is he really? Mark Bowden digs into the hard facts for an unusual portrait.
Does anyone make an easier target than Kim Jong Un? He’s Fatboy Kim the Third, the North Korean tyrant with a Fred Flintstone haircut—the grinning, chain-smoking owner of his own small nuclear arsenal, brutal warden to about 120,000 political prisoners, and effectively one of the last pure hereditary absolute monarchs on the planet. He is the Marshal of the Democratic People’s Republic of Korea, the Great Successor, and the Sun of the 21st Century. At age 32 the Supreme Leader owns the longest list of excessive honorifics anywhere, every one of them unearned. He is the youngest head of state in the world and probably the most spoiled. On the great grade-school playground of foreign affairs, he might as well be wearing across his broad bottom a big KICK ME sign. Kim is so easy to kick that the United Nations, which famously agrees on nothing, voted overwhelmingly in November to recommend that he and the rest of North Korea’s leadership be hauled before the International Criminal Court, in The Hague, and tried for crimes against humanity. He has been in power for a little more than three years.
In the world press, Kim is a bloodthirsty madman and buffoon. He is said to be a drunk, to have become so obese gorging on Swiss cheese that he can no longer see his genitals, and to have resorted to bizarre remedies for impotence, such as a distillation from snake venom. He is said to have had his uncle, Jang Song Thaek, and the entire Jang family mowed down by heavy machine guns (or possibly exterminated with mortar rounds, rocket-propelled grenades, or flamethrowers), or to have had them fed live to ravenous dogs. He is reported to have a yen for bondage porn and to have ordered all young men in his country to adopt his peculiar hairstyle. It is said that he has had former girlfriends executed.
All of the above is untrue—or, perhaps safer to say, unfounded. The Jang-fed-to-dogs story was actually invented by a Chinese satirical newspaper, as a joke, before it began racing around the world as a viral version of truth. (And to be sure, he did send Uncle Jang to his death.) It says something about Kim that people will believe almost anything, the more outrageous the better. In light of this, is it worth considering that the conventional take on Kim Jong Un does not come close to providing an accurate picture?
What if, despite the well-documented horrors of the Stalinist regime he inherited in 2011, while still in his 20s, Kim has ambitions at home that one might be tempted to describe—within carefully defined limits—as well intentioned? What if, against terrific odds, he hopes to improve the lives of his subjects and alter North Korea’s relationship with the rest of the world?
There is no shortage of evidence to the contrary—evidence, namely, that Kim is little more than a bad, and erratic, approximation of his canny father. Kim has continued his father’s military-first policies: the same saber rattling and shrill denunciations come screaming out of Pyongyang, the same emphasis on building nuclear weapons and ballistic missiles, the same unabashed political oppression. For years, North Korea has engaged in what experts in Washington have called “a provocation cycle”—ramping up provocative behavior, such as launching missiles or conducting nuclear tests, followed by charm offensives and offers to begin a dialogue. Under Kim Jong Un, the provocation cycle continues to spin dangerously. When Sony Pictures suffered a damaging and embarrassing breach of its internal computer network weeks before the scheduled December release of the comedy The Interview, little prompting was needed before fingers started pointing at Pyongyang. In the movie, Seth Rogen and James Franco play Americans who land an interview with Kim and then are enlisted by the C.I.A. to try to assassinate him. Earlier, in June, North Korea had promised to unleash a “merciless countermeasure” should the film be shown.
Whatever his true character, Kim faces a problem peculiar to dictators. His power in North Korea is so great that not only does no one dare criticize him, no one dares advise him. If you are too closely associated with the king, your head might someday share the same chopping block. Safer to adopt a “Yes, Marshal” approach. That way, if the king stumbles, you are simply among the countless legion who were obliged to obey his orders. One way to read the confusing signals from Pyongyang in recent years is that they show Kim, isolated and inexperienced, clumsily pulling at the levers of state.
Kim is, in fact, playing a deadly game, says Andrei Lankov, a Russian expert on Korea who attended Kim Il Sung University, in Pyongyang, in 1984 and 1985, and now teaches at Kookmin University, in Seoul. “He has had a spoiled, privileged childhood, not that different than the children of some Western billionaires, for whom the worst thing that can happen is that you will be arrested while driving under the influence. For Kim, the worst that can actually happen is to be tortured to death by a lynch mob. Easily. But he doesn’t understand. His parents understood it. They knew it was a deadly game. I’m not sure whether Kim fully understands it.”

RUNNING WITH THE BULLS

We’re not even sure how old he is. Kim was born on January 8 in 1982, 1983, or 1984. To tidy up their historical narrative, Pyongyang’s propagandists have placed his birthday in 1982. The original Kim, the current leader’s grandfather and national founder, Kim Il Sung, for whom universal reverence is mandatory, was born in 1912. As the story goes, in 1942 his son and heir, Kim Jong Il, came along; for this second Kim, a slightly lesser wattage of reverence is mandatory. In truth, Kim II was born in 1941, but in North Korea myth trumps fact to an even greater extent than elsewhere, and numeric symmetry hints at destiny, like a divine wink. That is why 1982 was seen to be an auspicious year for the birth of Kim III. For reasons of their own, South Korean intelligence agencies, which have a long history of being wrong about their northern cousins, have placed his birthday in the Orwellian year 1984. Kim himself, who occasionally shows magisterial disdain for the slavish adulation of his underlings, has said that he was born in 1983—this according to the American statesman, rebounder, and cross-dresser Dennis Rodman, who had been drinking heavily when he met Kim, in 2014 (and who shortly afterward went into rehab). Whichever date is correct, the Sun of the 21st Century has walked among us for three decades.

The world isn't going to do shit! Sanctions? Limit his EBT card.

Innovation in a Rapidly Changing World


The world is changing at an ever-increasing pace. There may have been times in history when the current rate of change was matched, for example the industrial revolution in the UK in the 19th century, when the balance of society shifted from agriculture to industry in the space of a couple of generations. But it could certainly be argued that we are in a phase where so many markets are changing so fast that the companies competing in them must seriously innovate or die.
Some industries, like mobile telephones, change almost quarterly. The old “floor area” approach to retail is in rapid decline, as shown by the recent demise in the UK of Blockbusters and HMV, a decline also influenced by the rise of “showrooming,” as pointed out by Tom Fishburne. Of course there are some markets that experience relatively little change, even ones we will all use, like the funeral industry. But many other examples give the impression of an increasing pace of change, so it’s no wonder corporate leaders are experiencing Innovation Vertigo, as described by Paul Hobcraft.
Even the relatively slow creep of demographic change and its impact on economies can influence business in many different ways. A single statistic I read recently underlined this well: the number of adult diapers (nappies) sold in Japan in 2012 is likely to be higher than the number sold for infants. So, with the occasional rare exception, an individual company cannot change its part of the world; it must adapt in order to survive (thank you, Darwin).
There are three key principles to consider in positioning a business to be ahead of the competition in this context of rapid change. First, companies should become more agile and able to respond to external change. Second, innovation should be developed to accelerate the pace of market change to your advantage. Third, resources should be deployed where they will make the biggest difference to the new agenda.
There is a fine line to be drawn between embarking on a program of determined, sensible and rapid change, and making a panic-stricken call to McKinsey. So, in the context of innovation, corporate leaders should start by asking themselves some key questions as a prompt for action:
1. When did you last overhaul your development process? Stages and Gates may be fine in principle, but too many of them means too much time spent preparing for, and seeking, approval. This is time wasted in getting to market fast. You should revisit your processes with the key objective of shortening time to market. Even if you work in a regulated area, like pharmaceuticals, where a lot of the timing is out of your hands, it’s the same for everybody in the industry. The objective is to be faster. It’s also important to bear in mind that disruptive lines of business may need different evaluation criteria.
2. Do you feel fully in control of everything that happens in your company? If you do, there’s probably a stifling bureaucracy hindering agility. The feeling of control may help you sleep more easily at night, but in the process will waste inordinate amounts of time while people conform to internal needs at the expense of external opportunities. Give your people more credit to do the right thing and to try new options. Bureaucracy and over-control are the enemies of agility.
3. Are you truly passionate about your company’s products and innovation? It’s not good if all the management time is spent analyzing numbers, whether they be financial or performance-related, at the expense of the heart of the company, which is its customer offering. Steve Jobs is a good example to look at, which may seem paradoxical after question #2 above, but you should blend these two points. Don’t over-control your people, but be passionate about what they produce and drive them to achieve more.
4. Is your innovation portfolio well balanced in the context of the 3 Horizons (or similar)? It’s always easy to let the urgent outweigh the important. In order to anticipate change you need to make preparing to exploit it an important item on the C-suite agenda, then to make it urgent. Your strategic perspective should therefore include a view beyond the here and now, to incorporate technology changes and market disruption.
5. Do your senior managers have the appetite to change? Many managers have an inbuilt preference for turf protection and a fierce defence of existing practice, founded on what has worked in the past. When the world is changing rapidly, a sense of crisis is needed, as shown by Stephen Elop at Nokia. Facing up to major change requires brave leadership, and it may also require the replacement of some change-resistant managers.
6. When did you last place a “Little Bet”? Is all your innovation geared to today and focused on what you already know? How many initiatives do you have that are “different” to today’s business, and could form the foundation for entirely new lines of revenue? How many of these “little bets” have you placed? It’s probably time to move some resource to planning a different future.
7. How much do you rely on today’s customer paradigms? Don’t frame all your innovation in the context of what your customers do today. They can only give feedback on what they already know; inevitably, your research report is a historical document. You need the ambidextrous approach of looking after today’s business at the same time as working on disruptive opportunities for the future. So, how can you anticipate or even induce change in customer habits and needs to your advantage?
See more at: http://www.innovationexcellence.com/blog/2013/01/31/innovation-in-a-rapidly-changing-world/

Monday, March 7, 2016

Revisited

If you can't find something to live for, find something to die for

Top 9 Ways to Avoid Looking Like a Gringo in Latin America


I realize there are numerous obvious reasons you can think of for not wanting to stick out as an obvious tourist in Latin America: safety (criminals are far more likely to target an obvious tourist), social acceptance, not feeling stupid, simply wanting to blend in by dressing in the local fashion, and so on. But the best reason isn't any of those. It's one that requires a bit of explaining and delves right into the culture of Latin America, and it has to do with poverty…
Normal people dress more formally in Latin America than elsewhere. The reason is that a much, much larger proportion of the population is relatively poor than in wealthier developed nations like the U.S., Canada, and Western Europe, and consequently it isn't, and never has been, considered fashionable to dress down or to dress like you're poorer than you really are.
No one wants to be mistaken for the lowest lower class (Latin America is also a much more class-centered society). No one wears jeans that are intentionally torn (if your jeans are torn, it must be because you're too poor to afford new ones); no one wears clothes that are baggy and don't fit (if they don't fit, it must be because you can't afford proper clothes that do); and no one dresses informally because it looks "cool" (because it doesn't look cool there).
Latin America is an extremely class-conscious society, and the A-number-one way that people communicate to everyone else that they’re respectable, not a criminal, and not a violent delinquent is by dressing as smartly and as nicely as they can possibly afford to.
Even very poor people will still do this: they'll own just one nice pair of dress pants that they wear every single day and wash and iron every single night if they have to. Only the worst of the worst don't. They're not being snobs; this isn't our culture, and it's not the same as if you were to do this here.
When you dress shabbily (shabbily by their standards, normal by ours), you immediately associate yourself with some very 'undesirable' people that no one else wants to be associated with. People will avoid being seen with you, and any friends you make will not want to be seen out with you, but they'll be too polite to tell you that your dressing habits make you look like a desperate heroin addict.
Please, before you start ranting at me in the comments, understand that I'm not saying you can't wear what you want, and I'm not telling you how to dress. I'm just saying people are going to judge you for it, and you really can't hold that against them, since they're just being normal (you're in their culture, right?) and you're the one being weird. I'm just telling you what's socially acceptable, what's not, and why.
Just as an example of how this can cause problems (this exact experience has been related to me by several backpackers in several different Latin American countries): you will get turned away at the door of clubs and even bars if you're wearing sneakers, or shorts, or a t-shirt (without a nice button-up shirt on top of it), and frequently even jeans, and god help you if you're wearing three or four of those at once.

The Top 9 Gringo Giveaways

The following list contains what I've found to be the most common and obvious things that gringos tend to do and that you would never see a native doing, which makes them the things natives most readily recognize as signs that someone isn't from around there (most of these tend to be associated with the stereotypical white American/Canadian/European tourist).
Follow these tips to avoid looking like a gringo in Latin America:
1. People don’t usually wear just a t-shirt when they go out.
A t-shirt on its own is something that would be worn around the house after work, or perhaps while working out or doing some gardening or landscaping at home. People do wear them underneath a nice button-up shirt, though, so that's fine.
2. They don’t wear sneakers unless they’re going running or they’re doing (or on their way to do) some sort of physical or athletic activity that requires them.
And even then, many people would wear their normal clothes on the way over while bringing their running/sports clothes with them that they’ll change into when they get there.
Also, white socks are only worn with sneakers, never with normal dress shoes that people wear day-to-day.
3. They would never wear a tracksuit, exercise shorts, or exercise pants unless they were actually exercising.
Even going to and from the gym they’d wear something nicer and bring their workout clothes back and forth with them and change at the gym (which would almost certainly involve a shower post-workout prior to changing back into their nice clothes).
4. Fanny packs.
No. Never. Not ever. This makes you a walking target as far as muggers are concerned, and with there being plenty of other less obtrusive options such as money belts, backpacks (student-style backpack that is: students are poor, they have no money, don’t bother robbing them, you know?), briefcases/man-purses, etc. there just isn’t a good reason to have one.
5. Generally dressing like a hippy.
You already know if this applies to you: looking like you just rolled out of Woodstock is fine in most places in the U.S., and fine with me personally, by the way (I have a bit of a soft spot for hippie chicks; I think they're cute, especially when they have dreadlocks). I have nothing against them, but the problem is that Latin Americans will perceive you as dirty, in a heroin-addict-who-might-just-stab-you sort of way.
Sorry, but you’ll get significantly better treatment and service if you take note of the fact that the locals will frequently be dressed in nice trousers/skirts and a starched button-up shirt even in sweltering heat and do what you can to blend in.
6. Very skimpy clothing.
Make no mistake, the women will certainly go to great lengths to show off their "assets" sometimes, especially if they're going out clubbing, and plenty of them frequently sport a very respectable amount of cleavage (I'm looking at you, Medellín). But what you won't ever see is really revealing stuff: shorts so short your ass is practically hanging out, a top so small that it's essentially a bra, itty-bitty mini-skirts (again, with your ass practically hanging out), and the like.
This is especially a no-no in a church, and it's one complaint I've heard from locals where the reaction goes from "oh, that's slutty," which is how they would normally see it, to "that's really f*ing offensive, someone should throw her out." Be careful what you wear to churches; if you don't normally bother, please just this once make the effort to wear something nice. It's really a big deal (this isn't a religion thing, since I'm agnostic; it's a respect thing, because it's their culture you're in).
7. Cargo pants.
Nope, they don't do them. Cargo pants never caught on down there, and consequently no one wears them; they'll immediately peg you as a gringo (whether that's good, bad, or irrelevant is entirely up to you, by the way).
8. Flip-flops and sandals.
Sorry girls, outside of the beach or the swimming pool they're never worn and are considered far too casual for everyday wear (kind of like walking around in bedroom slippers here). For guys, this includes sandals, with socks or without; it doesn't matter.
9. I’ve saved the worst offender for last: the men do not wear shorts. Ever.
This is the stereotypical gringo thing to do; it's the one that everyone jokes about. Exceptions: working out, the beach, walking around the house, the swimming pool.
That's it. I honestly hope this helps you. Please keep in mind that the above list is not some strict "don't do this unless you're a jerk" type of thing; it's just meant to be informative so that you can use it to help you make a decision about what to wear and when, that's all.
This is really meant only for the people who would actually be concerned about this in the first place. If you're not worried about blending in, then don't worry about it; I don't think there's anything really wrong with that, and even then this should still help you understand part of the culture you'll be interacting with.
***This does not apply to Black folks.
