Saturday 29 October 2016

History Of Education

The term 'educational technology' was used during the post-World War II era in the United States for the integration of implements such as film strips, slide projectors, language laboratories, audio tapes, and television. Presently, the computers, tablets, and mobile devices integrated into classroom settings for educational purposes are most often referred to as 'current' educational technologies. It is important to note that educational technologies continually change; the term once referred to the slate chalkboards used by students in early schoolhouses in the late nineteenth and early twentieth centuries. The phrase 'educational technology', a composite of 'technology' and 'education', refers to the most advanced technologies available for teaching and learning in a particular era.


In 1994 federal legislation for both the Educate America Act and the Improving America's Schools Act (IASA) authorized funds for state and federal educational technology planning. One of the principal goals listed in the Educate America Act is to promote the research, consensus building, and systemic changes needed to ensure equitable educational opportunities and high levels of educational achievement for all students (Public Law 103-227). In 1996 the Telecommunications Act provided the systemic change necessary to ensure equitable educational opportunities by bringing new technology into the education sector. The Telecom Act requires affordable access and service to advanced telecommunications services for public schools and libraries. Many of the computers, tablets, and mobile devices currently used in classrooms operate through Internet connectivity, particularly those that are application based, such as tablets. Schools in high-cost areas and disadvantaged schools were to receive higher discounts on telecommunications services such as Internet, cable, satellite television, and the management component.

A chart in the "Technology Penetration in U.S. Public Schools" report states that 98 percent of schools reported having computers in the 1995-1996 school year, with 64 percent having Internet access and 38 percent working via networked systems. The ratio of students to computers in the United States stood at 15 students per computer in 1984; it now stands at an all-time low averaging 10 students per computer. From the 1980s into the 2000s, the most substantial issue to examine in educational technology was school access to technologies, according to the 1997 Policy Information Report, Computers and Classrooms: The Status of Technology in U.S. Schools. These technologies included computers, multimedia computers, the Internet, networks, cable TV, and satellite technology, among other technology-based resources.

More recently ubiquitous computing devices, such as computers and tablets, are being used as networked collaborative technologies in the classroom. Computers, tablets and mobile devices may be used in educational settings within groups, between people and for collaborative tasks. These devices provide teachers and students access to the World Wide Web in addition to a variety of software applications.

Technology education standards
National Educational Technology Standards (NETS) have served since 1998 as a roadmap for improved teaching and learning by educators. These standards are used by teachers, students, and administrators to measure competency and to set higher goals for skill development.

The Partnership for 21st Century Skills is a national organization that advocates for 21st-century readiness for every student. The most recent national technology plan, "Transforming American Education: Learning Powered by Technology," was released in 2010. This plan outlines a vision "to leverage the learning sciences and modern technology to create engaging, relevant, and personalized learning experiences for all learners that mirror students' daily lives and the reality of their futures. In contrast to traditional classroom instruction, this requires that students be put at the center and encouraged to take control of their own learning by providing flexibility on several dimensions." Although tools have changed dramatically since the beginnings of educational technology, this vision of using technology for empowered, self-directed learning has remained consistent.

Pedagogy
The integration of electronic devices into classrooms has been cited as a possible way to bridge access and close achievement gaps for students subject to the digital divide: those who, because of social class, economic inequality, or gender, lack the cultural capital required to access information and communication technologies. Several motivations have been cited for integrating high-tech hardware and software into schools: (1) to make schools more efficient and productive than they currently are, (2) to transform teaching and learning into an engaging and active process connected to real life, and (3) to prepare the current generation of young people for the future workplace. The computer offers graphics and other functions that students can use to express their creativity. Technology integration does not always involve the computer; it can be the use of an overhead projector, student response clickers, and so on. Enhancing how the student learns is central to technology integration, and when integrated thoughtfully, technology can help students learn and explore more.

Paradigms
Most research in technology integration has been criticized for being atheoretical and ad hoc, driven more by the affordances of the technology than by the demands of pedagogy and subject matter. Armstrong (2012) argued that multimedia transmission tends to limit learning to simple content, because complicated content is difficult to deliver through multimedia.

One approach that attempts to address this concern is a framework aimed at describing the nature of teacher knowledge for successful technology integration. The technological pedagogical content knowledge or TPACK framework has recently received some positive attention.

Another model that has been used to analyze technology integration is the SAMR framework, developed by Ruben Puentedura. This model attempts to measure the level of technology integration with four levels that range from Enhancement to Transformation: Substitution, Augmentation, Modification, and Redefinition.

Constructivism
Constructivism is a crucial component of technology integration. It is a learning theory that describes the process of students constructing their own knowledge through collaboration and inquiry-based learning. According to this theory, students learn more deeply and retain information longer when they have a say in what and how they will learn. Inquiry-based learning, thus, is researching a question that is personally relevant and purposeful because of its direct correlation to the one investigating the knowledge. As stated by Jean Piaget, constructivist learning is based on four stages of cognitive development. In these stages, children must take an active role in their own learning and produce meaningful works in order to develop a clear understanding. These works are a reflection of the knowledge achieved through active, self-guided learning: students are active leaders in their learning, and the learning is student-led rather than teacher-directed.

Many teachers use a constructivist approach in their classrooms assuming one or more of the following roles: facilitator, collaborator, curriculum developer, team member, community builder, educational leader, or information producer.

Counter argument to computers in the classroom
Is technology in the classroom needed, or does it hinder students' social development? We have all seen a table of teenagers on their phones, all texting rather than socializing or talking to one another. How do they develop social and communication skills? Neil Postman (1993) concludes:

"The role of the school is to help students learn how to ignore and discard information so that they can achieve a sense of coherence in their lives; to help students cultivate a sense of social responsibility; to help students think critically, historically, and humanely; to help students understand the ways in which technology shapes their consciousness; to help students learn that their own needs sometimes are subordinate to the needs of the group. I could go on for another three pages in this vein without any reference to how machinery can give students access to information. Instead, let me summarize in two ways what I mean. First, I'll cite a remark made repeatedly by my friend Alan Kay, who is sometimes called "the father of the personal computer." Alan likes to remind us that any problems the schools cannot solve without machines, they cannot solve with them. Second, and with this I shall come to a close: If a nuclear holocaust should occur some place in the world, it will not happen because of insufficient information; if children are starving in Somalia, it's not because of insufficient information; if crime terrorizes our cities, marriages are breaking up, mental disorders are increasing, and children are being abused, none of this happens because of a lack of information. These things happen because we lack something else. It is the "something else" that is now the business of schools."

Technology integration

Technology integration is the use of technology tools in general content areas in education, allowing students to apply computer and technology skills to learning and problem-solving. Generally speaking, the curriculum drives the use of technology and not vice versa. Technology integration is defined as the use of technology to enhance and support the educational environment. Technology integration in the classroom can also support instruction by creating opportunities for students to complete assignments on the computer rather than with the usual pencil and paper, and can encourage students to explore more.

The International Society for Technology in Education (ISTE) has established technology standards for students, teachers and administrators in K-12 classrooms. The ISTE, a leader in helping teachers become more effective users of technology, offers this definition of technology integration:

"Curriculum integration with the use of technology involves the infusion of technology as a tool to enhance the learning in a content area or multidisciplinary setting... Effective integration of technology is achieved when students are able to select technology tools to help them obtain information in a timely manner, analyze and synthesize the information, and present it professionally. The technology should become an integral part of how the classroom functions—as accessible as all other classroom tools. The focus in each lesson or unit is the curriculum outcome, not the technology."

Integrating technology with the standard curriculum can not only give students a sense of power but also allow for more advanced learning among broad topics. However, these technologies require infrastructure and continual maintenance and repair, one determining element, among many, in how these technologies can be used for curricular purposes and whether or not they will be successful. Examples of the infrastructure required to operate and support technology integration in schools include, at the basic level, electricity, Internet service providers, routers, modems, and personnel to maintain the network, beyond the initial cost of the hardware and software.

Technology integration alongside standard education curriculum can provide tools for advanced learning among a broad range of topics. Integration of information and communication technology is often closely monitored and evaluated due to the current climate of accountability, outcome based education, and standardization in assessment.

Technology integration can in some instances be problematic. A high ratio of students to technological devices has been shown to impede or slow learning and task completion. In some instances, dyadic peer interaction centered on integrated technology has proven to develop a more cooperative sense of social relations. Success or failure of technology integration is largely dependent on factors beyond the technology itself. The availability of appropriate software for the technology being integrated is also problematic in terms of its accessibility to students and educators. Another issue is the lack of long-range planning for these tools within the districts where they are used.

Technology contributes to global development and diversity in classrooms and helps develop the fundamental building blocks students need to achieve more complex ideas. For technology to make an impact within the educational system, teachers and students must have access to technology in a context that is culturally relevant, responsive, and meaningful to their educational practice and that promotes quality teaching and active student learning. In the former mindset of teaching with technology, the teacher was the focal point of the classroom, creating (often time-consuming) interactive and multimedia presentations to add shock and awe to his or her lessons and capture the attention of the 21st-century child. Once educators realize that their students are capable, independent technology users who can create inspiring digital work, a new mindset of teaching through technology can emerge, one that depends on a vital shift in teacher and student roles and benefits both simultaneously. The four C's are at the heart of the International Society for Technology in Education's National Educational Technology Standards (NETS) for Students, providing a substantial framework for defining the focus of technology objectives for K-12 students. For example, in implementing these standards, even the youngest 21st-century learners have proven capable of independently creating digital storybooks, artwork, presentations, and movies.






United National Party



The United National Party, often abbreviated as UNP (Sinhalese: එක්සත් ජාතික පක්ෂය, pronounced Eksath Jathika Pakshaya, Tamil: ஐக்கிய தேசியக் கட்சி), is a political party in Sri Lanka. It currently is the main ruling party in the government of Sri Lanka and is headed by Ranil Wickremesinghe. The UNP is considered to have right-leaning, pro-capitalist, and liberal conservative policies.

At the last legislative elections in Sri Lanka, held on 2 April 2004, the UNP was the leading member of the coalition United National Front, which won 37.8% of the popular vote and 82 out of 225 seats in Parliament. It came in second to the United People's Freedom Alliance, a left-leaning coalition, which won 45.60% of the vote. The Front previously held a majority in parliament from December 2001 until April 2004, when it had 109 seats, with Ranil Wickremesinghe as prime minister. The UNP had previously been the governing party or in the governing coalition from 1947 to 1956, from 1965 to 1970 and from 1977 to 1994. In total, the UNP governed Sri Lanka (formerly known as Ceylon) for 33 of 57 years of its independent history. The UNP also had control of the executive presidency from the presidency's formation in 1978 to 1994.

The UNP is a conservative party to the right of the Sri Lanka Freedom Party, favouring a more neo-liberal market-oriented economy. The UNP is also member of the International Democrat Union.

Founding
The UNP was founded on 6 September 1946 by amalgamating three right-leaning pro-dominion parties from the majority Sinhalese community and the minority Tamil and Muslim communities. It was founded by Don Stephen Senanayake, who was at the forefront of the struggle for independence from the United Kingdom, having resigned from the Ceylon National Congress because he disagreed with its revised aim of 'the achieving of freedom' from the British Empire. The UNP represented the business community and the landed gentry. However, Senanayake also adopted populist policies that made the party accepted at the grassroots level. Owing to his agricultural policies, many landless people were relocated to the fertile dry zone, which was covered in thick jungle, and new agricultural colonies were built, resulting in a rise in Sri Lankan agricultural production. D. S. Senanayake is considered the "father of the nation".


D.S. Senanayake, the founder of the party
After independence he refused a knighthood but maintained good relations with Britain and was a Privy Counsellor. He launched major irrigation and hydropower projects such as the Gal Oya project, the Udawalawa tank, and the Senanayake tank, and several other multipurpose projects were launched during this period. He also renovated historic sites in Anuradhapura and Polonnaruwa, and he played a major role in the Colombo Plan.

However his government proceeded to disenfranchise the plantation workers of Indian descent, the Indian Tamils, using the Ceylon Citizenship Act of 1948 and the Parliamentary Elections Amendment Act of 1949. These measures were intended primarily to undermine the Left electorally.



Split
In 1952 Prime Minister Senanayake died in a riding accident, and his son Dudley became Prime Minister. This irked the long-standing UNP stalwart S.W.R.D. Bandaranaike, a Buddhist nationalist leader known for his centre-left views. Bandaranaike quit the party to found the Sri Lanka Freedom Party (SLFP) as a balancing force between the UNP and the Marxist parties.

During his tenure Dudley Senanayake launched several projects to further develop the agricultural sector and was termed "Bath Dun Piya" (the father who gave rice to the nation). The Bathalegoda paddy research centre, the Thalawakele tea research centre, and the Lunuwila coconut research centre were created by him to further develop the agricultural sector, and he also founded the Moratuwa University, the Ampara Higher Technology Institution, and many technical colleges. The commencement of the Bhikku University and the declaration of Poya days as government holidays also took place during this period.

In 1953 the UNP attempted to reduce the rice ration and a hartal ensued, which caused Dudley Senanayake to resign. He was replaced by his cousin, Major John Kotelawala, who launched several major power-generation and infrastructure projects, such as the Lakshapana hydropower project, the Bambalapitiya housing project, which provided houses to the homeless, the modernization of the Ratmalana airport, and the construction of the Kelaniya bridge, among others, including the development of Buddhist religious sites.

There was growing disaffection with the UNP particularly because of its support of minority religious groups, most notably Catholics, to the consternation of the predominantly Buddhist Sinhalese. Bandaranaike was able to take advantage and lead the SLFP to victory in the 1956 elections. Soon afterwards he passed the controversial Sinhala Only Act, which led to communal clashes in 1958.

The UNP again came to power in 1965 in coalition with the Mahajana Eksath Peramuna, the Tamil ethnic Federal Party under Dudley Senanayake, but it lost in a 1970 landslide to the SLFP, which had formed an electoral alliance with Marxist parties known as the United Front.

A bitter leadership battle soon developed between the populist Dudley Senanayake and the more conservative J. R. Jayewardene, a strong supporter of free market policies and a pro-American foreign policy. For the latter, he was called "Yankee Dickey".

Health technology



Health technology is defined by the World Health Organization as the "application of organized knowledge and skills in the form of devices, medicines, vaccines, procedures and systems developed to solve a health problem and improve quality of lives". This includes the pharmaceuticals, devices, procedures and organizational systems used in health care.


Medical technology
Medical technology encompasses a wide range of healthcare products and is used to diagnose, monitor or treat diseases or medical conditions affecting humans. Such technologies (applications of medical science) are intended to improve the quality of healthcare delivered through earlier diagnosis, less invasive treatment options, and reductions in hospital stays and rehabilitation times. Recent advances in medical technology have also focused on cost reduction. Medical technology may broadly include medical devices, information technology, biotech, and healthcare services.

The impacts of medical technology may involve social and ethical issues. For example, physicians may seek objective information from technology rather than listening to subjective patient reports.

A major driver of the sector's growth is the consumerization of MedTech. Supported by the widespread availability of smartphones and tablets, providers are able to reach a large audience at low cost, a trend that stands to be consolidated as wearable technologies spread throughout the market.

In the five years leading up to the end of 2015, venture funding grew 200%, allowing US$11.7 billion to flow into health-tech businesses from over 30,000 investors in the space.


Allied professions
The term medical technology may also refer to the duties performed by clinical laboratory professionals in various settings within the public and private sectors. The work of these professionals encompasses clinical applications of chemistry, genetics, hematology, immunohematology (blood banking), immunology, microbiology, serology, urinalysis and miscellaneous body fluid analysis. Depending on location, educational level and certifying body, these professionals may be referred to as Biomedical Scientists, Medical Laboratory Scientists (MLS), Medical Technologists (MT), Medical Laboratory Technologists and Medical Laboratory Technicians.




History of occupational therapy in New Zealand


The early use of occupation to support, treat and rehabilitate people in New Zealand is evident in services for returned soldiers after World War I (Hobcroft 1949). There are glimpses in mental health services during the 1930s too (Skilton 1981). However, the first qualified occupational therapist, Margaret Buchanan, arrived in New Zealand in 1941 (Buchanan 1941). Initially employed in the then Auckland Mental Hospital, she was rapidly involved not only in the development of occupational therapy services there, but also in the development of the first training programmes and in advice to government. Initially those trained had previous health or education backgrounds (Skilton 1981). A formal two-year training programme was established by 1940 (NZNJ 1940), and state registration was provided for in the Occupational Therapy Act 1949, administered by the New Zealand Occupational Therapy Registration Board from 1950 and since replaced by the Occupational Therapy Board of New Zealand under the Health Practitioners Competence Assurance Act 2003. From its early services in mental health and returned servicemen's settings, occupational therapy expanded into general rehabilitation, work with children with disabilities and services for the elderly (Wilson 2004).

Educational programmes moved from the health sector to the education sector in 1971 (New Zealand Occupational Therapy Registration Board 1970b, 17 July). Occupational therapy career training is now provided by the Schools of Occupational Therapy at the Auckland University of Technology and at Otago Polytechnic in Dunedin. An advanced diploma in occupational therapy was first made available in 1989 (Packer 1991), and bachelor programmes have been available since the 1990s. However, it was not until a review of the Education Act that master's degree programmes could be made available, as they now are through both schools. The first New Zealand occupational therapist to complete a PhD in the country in a programme related to occupational therapy was Linda Robertson, who completed her PhD in 1994 (NZJOT 1996). The development of distance education technology has enabled large numbers of therapists to participate in post-graduate distance education.

An association for practitioners was formed in 1948 (New Zealand Registered Occupational Therapists Association 1949) and since renamed as the New Zealand Association of Occupational Therapists (Inc) or NZAOT. The NZAOT provides a bi-annual conference, representation at government levels, a journal and a monthly newsletter.

Technology during World War II



Technology played a significant role in World War II. Some of the technologies used during the war had been developed during the interwar years of the 1920s and 1930s, much was developed in response to needs and lessons learned during the war, and other technologies were only beginning to be developed as the war ended. Many wars have had major effects on the technologies we use in our daily lives; however, compared with previous wars, World War II had the greatest effect on the technology and devices used today. Technology also played a greater role in the conduct of World War II than in any other war in history, and it had a critical role in the war's final outcome.

World War II was the first war where military operations widely targeted the research efforts of the enemy. This included the exfiltration of Niels Bohr from German-occupied Denmark to Britain in 1943; the sabotage of Norwegian heavy water production; and the bombing of Peenemunde.

Military operations were also conducted to obtain intelligence on the enemy's technology; for example, the Bruneval Raid for German radar and Operation Most III for the German V-2.

Between the wars

In August 1919 the British Ten Year Rule declared that the government should not expect another war within ten years. Consequently, the British conducted very little military R & D. In contrast, Germany and the Soviet Union were dissatisfied powers that, for different reasons, cooperated with each other on military R & D. The Soviets offered Weimar Germany facilities deep inside the USSR for building and testing arms and for military training, well away from Treaty inspectors' eyes. In return, they asked for access to German technical developments and for assistance in creating a Red Army General Staff.

The great artillery manufacturer Krupp was soon active in the south of the USSR, near Rostov-on-Don. In 1925, a flying school was established at Vivupal, near Lipetsk, to train the first pilots for the future Luftwaffe. Since 1926, the Reichswehr had been able to use a tank school at Kazan (codenamed Kama) and a chemical weapons facility in Samara Oblast (codenamed Tomka). In turn, the Red Army gained access to these training facilities, as well as military technology and theory from Weimar Germany.

In the late 1920s, Germany helped Soviet industry begin to modernize, and to assist in the establishment of tank production facilities at the Leningrad Bolshevik Factory and the Kharkov Locomotive Factory. This cooperation would break down when Hitler rose to power in 1933. The failure of the World Disarmament Conference marked the beginnings of the arms race leading to war.

In France the lesson of World War I was translated into the Maginot Line, which was supposed to hold the line at the border with Germany. The Maginot Line did achieve its political objective of ensuring that any German invasion had to go through Belgium, which in turn ensured that France would have Britain as a military ally. France and Russia had more, and much better, tanks than Germany at the outbreak of their hostilities in 1940. As in World War I, the French generals expected that armour would mostly serve to help infantry break the static trench lines and storm machine-gun nests. They thus spread the armour among their infantry divisions, ignoring the new German doctrine of blitzkrieg, based on fast movement using concentrated armour attacks, against which there was no effective defense but mobile anti-tank guns, infantry anti-tank rifles not being effective against medium and heavy tanks.

Air power was a major concern of Germany and Britain between the wars. Trade in aircraft engines continued, with Britain selling hundreds of its best engines to German firms, which used them in a first generation of aircraft and then improved on them considerably for use in German aircraft. These developments led the way to major successes for Germany in World War II. Germany had always been, and continued to be, at the forefront of internal combustion engine development. Göttingen was the world center of aerodynamics and of fluid dynamics in general, at least until the highly dogmatic Nazi party came to power. This contributed to the German development of jet aircraft and of submarines with improved underwater performance.

Induced nuclear fission was discovered in Germany in 1939 by Otto Hahn (and interpreted by expatriate physicists in Sweden), but many of the scientists needed to develop nuclear power had already been lost, due to anti-Jewish and anti-intellectual policies.

Scientists have been at the heart of warfare, and their contributions have often been decisive. As Ian Jacob, the wartime military secretary of Winston Churchill, famously remarked on the influx of refugee scientists (including 19 Nobel laureates), "the Allies won the [Second World] War because our German scientists were better than their German scientists."

Allied cooperation

The Allies of World War II cooperated extensively in the development and manufacture of new and existing technologies to support military operations and intelligence gathering during the Second World War. There are various ways in which the allies cooperated, including the American Lend-Lease scheme and hybrid weapons such as the Sherman Firefly as well as the American-led Manhattan Project. Several technologies invented in Britain proved critical to the military and were widely manufactured by the Allies during the Second World War.

The cooperation originated in a 1940 visit by the Aeronautical Research Committee chairman Henry Tizard, who arranged to transfer U.K. military technology to the U.S. in case the invasion of the U.K. that Hitler was planning as Operation Sea Lion succeeded. Tizard led a British technical mission, known as the Tizard Mission, carrying details and examples of British technological developments in fields such as radar, jet propulsion, and the early British research into the atomic bomb. One of the devices brought to the U.S. by the Mission, the resonant cavity magnetron, was later described as "the most valuable cargo ever brought to our shores".

History of science and technology in the Indian subcontinent



The history of science and technology in the Indian subcontinent begins with prehistoric human activity, continues through the Indus Valley Civilization, and extends to the early states and empires. Following independence, science and technology in the Republic of India has included automobile engineering, information technology, and communications, as well as space, polar, and nuclear sciences.

Prehistory

By 5500 BCE a number of sites similar to Mehrgarh had appeared, forming the basis of later chalcolithic cultures. The inhabitants of these sites maintained trading relations with the Near East and Central Asia.

Irrigation was developed in the Indus Valley Civilization by around 4500 BCE. The size and prosperity of the Indus civilization grew as a result of this innovation, which eventually led to more planned settlements making use of drainage and sewerage. Sophisticated irrigation and water storage systems were developed by the Indus Valley Civilization, including artificial reservoirs at Girnar dated to 3000 BCE, and an early canal irrigation system from c. 2600 BCE. Cotton was cultivated in the region by the 5th–4th millennia BCE. Sugarcane was originally from tropical South and Southeast Asia. Different species likely originated in different locations, with S. barberi originating in India and S. edule and S. officinarum coming from New Guinea.

The inhabitants of the Indus valley developed a system of standardization using weights and measures, as is evident from the excavations made at Indus valley sites. This technical standardization enabled gauging devices to be used effectively in angular measurement and in measurement for construction. Calibration was also found in measuring devices, along with multiple subdivisions in some devices. One of the earliest known docks is at Lothal (2400 BCE), located away from the main current to avoid deposition of silt. Modern oceanographers have observed that the Harappans must have possessed knowledge relating to tides in order to build such a dock on the ever-shifting course of the Sabarmati, as well as exemplary hydrography and maritime engineering.

Excavations at Balakot (c. 2500–1900 BCE), in present-day Pakistan, have yielded evidence of an early furnace. The furnace was most likely used for the manufacture of ceramic objects. Ovens dating back to the civilization's mature phase (c. 2500–1900 BCE) were also excavated at Balakot. The Kalibangan archeological site further yields evidence of pot-shaped hearths, which at one site have been found both above ground and underground. Kilns with fire and kiln chambers have also been found at the Kalibangan site.

Based on archaeological and textual evidence, Joseph E. Schwartzberg (2008)—a University of Minnesota professor emeritus of geography—traces the origins of Indian cartography to the Indus Valley Civilization (c. 2500–1900 BCE). The use of large-scale construction plans, cosmological drawings, and cartographic material was known in India with some regularity since the Vedic period (2nd–1st millennium BCE). Climatic conditions were responsible for the destruction of most of the evidence; however, a number of excavated surveying instruments and measuring rods have yielded convincing evidence of early cartographic activity. Schwartzberg (2008)—on the subject of surviving maps—further holds that: "Though not numerous, a number of map-like graffiti appear among the thousands of Stone Age Indian cave paintings; and at least one complex Mesolithic diagram is believed to be a representation of the cosmos."

Archeological evidence of an animal-drawn plough dates back to 2500 BCE in the Indus Valley Civilization. The earliest available copper swords discovered at Harappan sites date back to 2300 BCE. Swords have been recovered in archaeological findings throughout the Ganges–Jamuna Doab region of India, made of bronze but more commonly of copper.

Science, engineering and technology

                                 Science, engineering and technology


The distinction between science, engineering and technology is not always clear. Science is the reasoned investigation or study of natural phenomena, aimed at discovering enduring principles among elements of the phenomenal world by employing formal techniques such as the scientific method. Technologies are not usually exclusively products of science, because they have to satisfy requirements such as utility, usability and safety.

Engineering is the goal-oriented process of designing and making tools and systems to exploit natural phenomena for practical human means, often (but not always) using results and techniques from science. The development of technology may draw upon many fields of knowledge, including scientific, engineering, mathematical, linguistic, and historical knowledge, to achieve some practical result.

Technology is often a consequence of science and engineering — although technology as a human activity precedes the two fields. For example, science might study the flow of electrons in electrical conductors, by using already-existing tools and knowledge. This new-found knowledge may then be used by engineers to create new tools and machines, such as semiconductors, computers, and other forms of advanced technology. In this sense, scientists and engineers may both be considered technologists; the three fields are often considered as one for the purposes of research and reference.

The exact relations between science and technology in particular have been debated by scientists, historians, and policymakers in the late 20th century, in part because the debate can inform the funding of basic and applied science. In the immediate wake of World War II, for example, in the United States it was widely considered that technology was simply "applied science" and that to fund basic science was to reap technological results in due time. An articulation of this philosophy could be found explicitly in Vannevar Bush's treatise on postwar science policy, Science—The Endless Frontier: "New products, new industries, and more jobs require continuous additions to knowledge of the laws of nature ... This essential new knowledge can be obtained only through basic scientific research." In the late-1960s, however, this view came under direct attack, leading towards initiatives to fund science for specific tasks (initiatives resisted by the scientific community). The issue remains contentious—though most analysts resist the model that technology simply is a result of scientific research.

History

The use of tools by early humans was partly a process of discovery and of evolution. Early humans evolved from a species of foraging hominids which were already bipedal, with a brain mass approximately one third that of modern humans. Tool use remained relatively unchanged for most of early human history. Approximately 50,000 years ago, the use of tools and a complex set of behaviors emerged, believed by many archaeologists to be connected to the emergence of fully modern language.

Stone tools

Hominids started using primitive stone tools millions of years ago. The earliest stone tools were little more than a fractured rock, but approximately 40,000 years ago, pressure flaking provided a way to make much finer work.

Clothing and shelter

Other technological advances made during the Paleolithic era were clothing and shelter; the adoption of both technologies cannot be dated exactly, but they were a key to humanity's progress. As the Paleolithic era progressed, dwellings became more sophisticated and more elaborate; as early as 380,000 BC, humans were constructing temporary wood huts. Clothing, adapted from the fur and hides of hunted animals, helped humanity expand into colder regions; humans began to migrate out of Africa by 200,000 BC and into other continents, such as Eurasia.



Friday 28 October 2016

Technological singularity

                    Technological singularity


The technological singularity (also, simply, the singularity) is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization. According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a 'runaway reaction' of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence. Science fiction author Vernor Vinge said in his essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.

The first use of the term "singularity" in a technological context was attributed in 1958 to John von Neumann. In the same year, Stanislaw Ulam described "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". In the 1990s, Vinge popularized the concept, linking it to I. J. Good's "intelligence explosion", and predicting that a future superintelligence would trigger a singularity.

Ray Kurzweil predicts the singularity to occur around 2045, whereas Vinge predicts some time before 2030. At the 2012 Singularity Summit, Stuart Armstrong did a study of artificial general intelligence (AGI) predictions by experts and found a wide range of predicted dates, with a median value of 2040.

                                          Manifestations

Intelligence explosion
I. J. Good speculated in 1965 that artificial general intelligence might bring about an intelligence explosion. Good's scenario runs as follows: as computers increase in power, it becomes possible for people to build a machine that is more intelligent than humanity; this superhuman intelligence possesses greater problem-solving and inventive skills than current humans are capable of. This superintelligent machine then designs an even more capable machine, or re-writes its own software to become even more intelligent; this (ever more capable) machine then goes on to design a machine of yet greater capability, and so on. These iterations of recursive self-improvement accelerate, allowing enormous qualitative change before any upper limits imposed by the laws of physics or theoretical computation set in.
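Good's scenario can be sketched as a toy numerical model. The starting level, per-generation gain, and generation count below are arbitrary illustrative assumptions, not estimates from the literature; the point is only that multiplicative self-improvement compounds geometrically:

```python
# Toy model of I. J. Good's "intelligence explosion": each generation
# designs a successor that is better by a multiplicative factor, so
# capability compounds geometrically. All numbers are illustrative.

def intelligence_explosion(start=1.0, gain=1.5, generations=10):
    """Return the capability level of each successive generation."""
    levels = [start]
    for _ in range(generations):
        # The current machine designs a more capable successor.
        levels.append(levels[-1] * gain)
    return levels

levels = intelligence_explosion()
# Capability grows geometrically: 1.0, 1.5, 2.25, 3.375, ...
```

A real "explosion" would require the gain itself to grow with capability; even this fixed-gain sketch shows how quickly compounding outruns linear improvement.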



Emergence of superintelligence

Vernor Vinge and Ray Kurzweil define the concept in terms of the technological creation of superintelligence. They argue that it is difficult or impossible for present-day humans to predict what human beings' lives would be like in a post-singularity world.


Plausibility

Many prominent technologists and academics dispute the plausibility of a technological singularity, including Paul Allen, Jeff Hawkins, John Holland, Jaron Lanier, and Gordon Moore, whose Moore's Law is often cited in support of the concept.



Claimed cause: exponential growth


The exponential growth in computing technology suggested by Moore's Law is commonly cited as a reason to expect a singularity in the relatively near future, and a number of authors have proposed generalizations of Moore's Law. Computer scientist and futurist Hans Moravec proposed in a 1998 book[17] that the exponential growth curve could be extended back through earlier computing technologies prior to the integrated circuit.

Kurzweil postulates a law of accelerating returns in which the speed of technological change (and more generally, all evolutionary processes) increases exponentially, generalizing Moore's Law in the same manner as Moravec's proposal, and also including material technology (especially as applied to nanotechnology), medical technology and others.[19] Between 1986 and 2007, machines' application-specific capacity to compute information per capita roughly doubled every 14 months; the per capita capacity of the world's general-purpose computers has doubled every 18 months; the global telecommunication capacity per capita doubled every 34 months; and the world's storage capacity per capita doubled every 40 months.
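The doubling times quoted above translate into overall growth by simple exponent arithmetic: doubling every d months over m months gives a factor of 2^(m/d). A brief sketch, using the article's doubling periods over the 1986–2007 window:

```python
# Growth implied by a fixed doubling time: capacity doubling every
# `doubling_months` over `months_elapsed` grows by 2 ** (m / d).
# The doubling periods are the figures quoted in the text above.

def growth_factor(months_elapsed, doubling_months):
    return 2 ** (months_elapsed / doubling_months)

span = 21 * 12  # 1986-2007, in months
for label, d in [("general-purpose computing", 18),
                 ("telecom capacity", 34),
                 ("storage capacity", 40)]:
    print(f"{label}: x{growth_factor(span, d):,.0f}")
```

An 18-month doubling time over 21 years amounts to 14 doublings, a factor of 2^14 = 16,384; the slower 40-month doubling still compounds to roughly 79-fold growth over the same span.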

Kurzweil reserves the term "singularity" for a rapid increase in intelligence (as opposed to other technologies), writing for example that "The Singularity will allow us to transcend these limitations of our biological bodies and brains ... There will be no distinction, post-Singularity, between human and machine".[21] He also defines his predicted date of the singularity (2045) in terms of when he expects computer-based intelligences to significantly exceed the sum total of human brainpower, writing that advances in computing before that date "will not represent the Singularity" because they do "not yet correspond to a profound expansion of our intelligence."

The Information Technology Act, 2000

                      The Information Technology Act, 2000


The Information Technology Act, 2000 (also known as ITA-2000, or the IT Act) is an Act of the Indian Parliament (No 21 of 2000) notified on 17 October 2000. It is the primary law in India dealing with cybercrime and electronic commerce. It is based on the United Nations Model Law on Electronic Commerce 1996 (UNCITRAL Model Law), recommended by the General Assembly of the United Nations by a resolution dated 30 January 1997.


Resolution for amendment
In December 2012, P Rajeev, a Rajya Sabha member from Kerala, tried to pass a resolution seeking to amend Section 66A. He was supported by D. Bandyopadhyay, Gyan Prakash Pilania, Basavaraj Patil Sedam, Narendra Kumar Kashyap, Rama Chandra Khuntia and Baishnab Charan Parida. P Rajeev pointed out that cartoons and editorials allowed in traditional media were being censored in the new media. He also said that the law was barely debated before being passed in December 2008.

Rajeev Chandrasekhar suggested that Section 66A should apply only to person-to-person communication, pointing to a similar section under the Indian Post Office Act, 1898. Shantaram Naik opposed any changes, saying that the misuse of the law was insufficient to warrant changes. Then Minister for Communications and Information Technology Kapil Sibal defended the existing law, saying that similar laws existed in the US and UK. He also said that a similar provision existed under the Indian Post Office Act, 1898. However, P Rajeev said that the UK law dealt only with communication from person to person.


Petitions challenging constitutionality
In November 2012, IPS officer Amitabh Thakur and his wife, social activist Nutan Thakur, filed a petition in the Lucknow bench of the Allahabad High Court claiming that Section 66A violated the freedom of speech guaranteed in Article 19(1)(a) of the Constitution of India. They said that the section was vague and frequently misused.

Also in November 2012, a Delhi-based law student, Shreya Singhal, filed a Public Interest Litigation (PIL) in the Supreme Court of India. She argued that Section 66A was vaguely phrased and, as a result, violated Articles 14, 19(1)(a) and 21 of the Constitution. The PIL was accepted on 29 November 2012. A similar petition was also filed by the founder of MouthShut.com, Faisal Farooqui, and the NGO Common Cause, represented by Prashant Bhushan. In August 2014, the Supreme Court asked the central government to respond to the petition filed by MouthShut.com and a later petition filed by the Internet and Mobile Association of India (IAMAI), which claimed that the IT Act gave the government power to arbitrarily remove user-generated content.



Future changes
On 2 April 2015, the Chief Minister of Maharashtra, Devendra Fadnavis, revealed to the state assembly that a new law was being framed to replace the repealed Section 66A. Fadnavis was replying to a query from Shiv Sena leader Neelam Gorhe. Gorhe had said that repeal of the law would encourage online miscreants and asked whether the state government would frame a law in this regard. Fadnavis said that the previous law had resulted in no convictions, so the new law would be framed such that it would be strong and result in convictions.

On 13 April 2015, it was announced that the Ministry of Home Affairs would form a committee of officials from the Intelligence Bureau, the Central Bureau of Investigation, the National Investigation Agency, the Delhi Police and the ministry itself to produce a new legal framework. This step was reportedly taken after complaints from intelligence agencies that they were no longer able to counter online posts that involved national security matters or incited people to commit an offence, such as online recruitment for ISIS. Former Minister of State with the Ministry of Information Technology, Milind Deora, has supported a new "unambiguous section" to replace 66A.



Tuesday 11 October 2016

Technology

                                       Technology 

Technology ("science of craft", from Greek techne, "art, skill, cunning of hand", and -logia) is the collection of techniques, skills, methods and processes used in the production of goods or services or in the accomplishment of objectives, such as scientific investigation. Technology can be the knowledge of techniques, processes, etc., or it can be embedded in machines, computers, devices and factories, which can be operated by individuals without detailed knowledge of the workings of such things.

The human species' use of technology began with the conversion of natural resources into simple tools. The prehistoric discovery of how to control fire and the later Neolithic Revolution increased the available sources of food, and the invention of the wheel helped humans to travel in and control their environment. Developments in historic times, including the printing press, the telephone, and the Internet, have lessened physical barriers to communication and allowed humans to interact freely on a global scale. The steady progress of military technology has brought weapons of ever-increasing destructive power, from clubs to nuclear weapons.

Technology has many effects. It has helped develop more advanced economies (including today's global economy) and has allowed the rise of a leisure class. Many technological processes produce unwanted by-products, known as pollution, and deplete natural resources, to the detriment of Earth's environment. Various implementations of technology influence the values of a society, and new technology often raises new ethical questions. Examples include the rise of the notion of efficiency in terms of human productivity, a term originally applied only to machines, and the challenge of traditional norms.

Philosophical debates have arisen over the use of technology, with disagreements over whether technology improves the human condition or worsens it. Neo-Luddism, anarcho-primitivism, and similar reactionary movements criticise the pervasiveness of technology in the modern world, arguing that it harms the environment and alienates people; proponents of ideologies such as transhumanism and techno-progressivism view continued technological progress as beneficial to society and the human condition.

Until recently, it was believed that the development of technology was restricted only to human beings, but 21st-century scientific studies indicate that other primates and certain dolphin communities have developed simple tools and passed their knowledge to other generations.

Definition and usage

The spread of paper and printing to the West, as in this printing press, helped scientists and politicians communicate their ideas easily, leading to the Age of Enlightenment; an example of technology as a cultural force.

The use of the term "technology" has changed significantly over the last 200 years. Before the 20th century, the term was uncommon in English, and usually referred to the description or study of the useful arts. The term was often connected to technical education, as in the Massachusetts Institute of Technology (chartered in 1861).

The term "technology" rose to prominence in the 20th century in connection with the Second Industrial Revolution. The term's meanings changed in the early 20th century when American social scientists, beginning with Thorstein Veblen, translated ideas from the German concept of Technik into "technology". In German and other European languages, a distinction exists between technik and technologie that is absent in English, which usually translates both terms as "technology". By the 1930s, "technology" referred not only to the study of the industrial arts but to the industrial arts themselves.

In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them." Bain's definition remains common among scholars today, especially social scientists. But equally prominent is the definition of technology as applied science, especially among scientists and engineers, although most social scientists who study technology reject this definition. More recently, scholars have borrowed from European philosophers of "technique" to extend the meaning of technology to various forms of instrumental reason, as in Foucault's work on technologies of the self (techniques de soi).

Dictionaries and scholars have offered a variety of definitions. The Merriam-Webster Dictionary offers a definition of the term: "the practical application of knowledge especially in a particular area" and "a capability given by the practical application of knowledge". Ursula Franklin, in her 1989 "Real World of Technology" lecture, gave another definition of the concept; it is "practice, the way we do things around here". The term is often used to imply a specific field of technology, or to refer to high technology or just consumer electronics, rather than technology as a whole. Bernard Stiegler, in Technics and Time, 1, defines technology in two ways: as "the pursuit of life by means other than life", and as "organized inorganic matter."

Technology can be most broadly defined as the entities, both material and immaterial, created by the application of mental and physical effort in order to achieve some value. In this usage, technology refers to tools and machines that may be used to solve real-world problems. It is a far-reaching term that may include simple tools, such as a crowbar or wooden spoon, or more complex machines, such as a space station or particle accelerator. Tools and machines need not be material; virtual technology, such as computer software and business methods, falls under this definition of technology. W. Brian Arthur defines technology in a similarly broad way as "a means to fulfill a human purpose".

The word "technology" can also be used to refer to a collection of techniques. In this context, it is the current state of humanity's knowledge of how to combine resources to produce desired products, to solve problems, fulfill needs, or satisfy wants; it includes technical methods, skills, processes, techniques, tools and raw materials. When combined with another term, such as "medical technology" or "space technology", it refers to the state of the respective field's knowledge and tools. "State-of-the-art technology" refers to the high technology available to humanity in any field.

The invention of integrated circuits and the microprocessor (here, an Intel 4004 chip from 1971) led to the modern computer revolution.

Technology can be viewed as an activity that forms or changes culture. Additionally, technology is the application of math, science, and the arts for the benefit of life as it is known. A modern example is the rise of communication technology, which has lessened barriers to human interaction and, as a result, has helped spawn new subcultures; the rise of cyberculture has, at its basis, the development of the Internet and the computer. Not all technology enhances culture in a creative way; technology can also help facilitate political oppression and war via tools such as guns. As a cultural activity, technology predates both science and engineering, each of which formalize some aspects of technological endeavor.