Posted by cadsmith on May 15, 2010
Information layers seem to spiral as data doubles every year or so. That data may find its way into storage structures for accessibility and analysis. In the meantime, semantics do the heavy lifting, while users choose a centricity, e.g. actor, method, device, network, or space, and match measures to media. Any location may resolve to a surface, signal, transform, translation, filter or fractal. Recent links:
Visualization: VTK, Protovis, prefuse. 3D: Autodesk Inventor, camera, Gemvision. Robotics: Anybots telepresence. Socnets: OpenID Connect, OneSocialWeb, Udemy online courses. Translation: Google audio. Security: KHOBE shreds Windows walls, Verizon cloud. Quantum: atomic optical computing, location-based cryptography. Energy: EIA Annual Energy Outlook has a national energy modeling system (NEMS) and scenarios to 2035; smart grid, again. Space: Moon Zoo, multiverse. Psychology: Analyze Words twitter sentiment, internet improves well-being. Documents of note: The Fate of the Semantic Web, Pew Research, 2010 (pdf); On the Essence of Truth, Martin Heidegger, 1943.
Re-Designing Learning Contexts, Luckin 2010. This book concentrates on technological literacy for personal and collaborative education. Luckily, it was available for Kindle. Scholars are defining what knowledge is, and it is up to the participant to try to find improvements, learn from failure and synthesize resources. The author, an expert from the UK who will be keynoting the ICICTE conference, highlights interaction and presents the equivalent of a slideshow narrative on the status and needs of the learner. This follows previous work on learning outside of school, holistically within a lifetime. About four hundred publications are organized into nine chapters relating to three parts: background, ecology and future. Context is internalized enough to become independent of elements such as place, culture and technology; Paul Dourish, for example, is significant for the notion of embodied interaction. Historical approaches include information processing, behaviorism, constructivism, scaffolding in the form of hints or next steps, the zone of collaboration, and cognition. Types of learners are supported by more able partners (MAPs). Many software solutions are discussed which emphasize combinations of discussion, tutorials, videos, multiple representations, visualization and simulation, distributed scaffolding, recognizing learner beliefs, collaboration and props such as toys. An ecology of resources model defines filter elements for knowledge and skills, tools and people, and environment (a rough sketch follows below). Case studies are distinguished by creation process (brainstorm, focus of attention, categorization, filters, resources, MAPs), relationships, and scaffolds. Interaction models structure conversation, scaffolding design, learner context, and locale and mobile frameworks. A practice example uses lesson plans, a whiteboard, tablets and homework activities. Some web 2.0 terms are used, such as tagging and crowdsourcing. Links are given for resources, e.g. rixcentre.org, which handles learning disabilities. An attempt to use http://www.autotutor.org appeared inconclusive: it did not seem to understand typed answers on well-known subjects, such as how the internet might be (re)designed, and did not allow hand-waving, e.g. drawing gravity wells to show how differently sized objects respond to each other. To be fair, the same word problem was completely incomprehensible to a cloud math engine, so there is a ways to go before these approaches are general purpose or compatible with other large-scale efforts such as EarthGame.
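As a rough illustration only, not Luckin's own notation, the ecology-of-resources idea of resource elements narrowed by filters might be sketched as a small data structure. The class and field names here are hypothetical:

```python
# Minimal sketch of an "ecology of resources" style model: each resource
# category (knowledge and skills, tools and people, environment) is paired
# with a filter that narrows it to what a particular learner can reach.
# Class and field names are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class ResourceElement:
    category: str                 # e.g. "knowledge_and_skills", "tools_and_people", "environment"
    items: List[str]
    passes_filter: Callable[[str], bool] = lambda item: True   # e.g. curriculum, access, locale

    def available(self) -> List[str]:
        """Resources that survive the filter for this learner's context."""
        return [item for item in self.items if self.passes_filter(item)]


# Example: a more able partner (MAP) is only reachable if the learner is online.
learner_is_online = True
tools_and_people = ResourceElement(
    category="tools_and_people",
    items=["whiteboard", "tablet", "remote MAP"],
    passes_filter=lambda item: item != "remote MAP" or learner_is_online,
)
print(tools_and_people.available())
```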
The Nature of Technology: What It Is and How It Evolves, Arthur 2009. This book presents a theory of technology as something that directs human life, is beneficial, and results in economic arrangements and activities. Technology is alive in the sense of a coral reef, yet separate from incremental biology, which may itself yet become technology. It involves combinatorial evolution shaped by demand, modularity, recursive structure, and mechanistic bootstrapping (a toy sketch follows below). Structures are deepened by subcomponents for performance, monitoring, adaptation, and reliability. The theory is modeled on scientific evolution and self-correcting paradigms. Technology also drives economics, which is non-deterministic. The result is physical, yet digitization is the currency. Needs arise from the growth of society, support of technology itself, and fixes. Innovation tends to be nation-centric, based on the deep craft of local cultures. Technology is a means to fulfill a human purpose: the programmed use of a phenomenon for some purpose, along with the resultant cultural practices, components, devices, engineering principles and architecture. Standard engineering aims to solve problems. Invention occurs by mental association. Concepts are realized in physical form. Clusters of common theory form domains, which can be categorized hierarchically. Innovation may involve redomaining, e.g. in economics. Solutions become components for further developments.
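A toy sketch, mine rather than Arthur's, of the recursive, combinatorial structure the book describes: a technology is assembled from existing technologies as components, so new combinations become building blocks for later ones. All names are illustrative.

```python
# Toy model of combinatorial evolution: each technology is a named node whose
# components are themselves technologies, so the structure is recursive and
# new combinations become components for later developments.
# All names are illustrative, not taken from the book.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Technology:
    name: str
    components: List["Technology"] = field(default_factory=list)

    def depth(self) -> int:
        """How many layers of sub-components sit beneath this technology."""
        if not self.components:
            return 0
        return 1 + max(c.depth() for c in self.components)


# Existing technologies combine into a new one, which is then reused in turn.
magnet = Technology("permanent magnet")
coil = Technology("wire coil")
motor = Technology("electric motor", [magnet, coil])
pump = Technology("coolant pump", [motor, Technology("impeller")])
print(pump.depth())  # 2: pump -> motor -> magnet/coil
```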
The reader can make some criticisms. When it comes to theories, testing and falsification of premises and claims are usually required; this one does not show the math nor discuss limits, e.g. due to scale or sustainability. It is predictive only in a causal sense, since scientific instruments are themselves a form of technology that discovers new phenomena, which in turn result in new technologies. Human evolution includes all knowledge and activities, so any particular area, e.g. technology, may already be considered part of it. There are discussions elsewhere that global networks are more for communication than economics and may be prone to politicization. The ROI example of Columbus in the New World may actually be cautionary from the perspective of indigenous peoples. There are risks of ownership being controlled by big corporations on one side, or going to open source and the commons on the other. Discussion of robotics, or of requirements for education or ethics, for example, would require additional sources.
The Grammar of Technology Development, edited by Tsubaki and others, 2008. The theme is methodologies for quantifying technology development. The idea is based on "The Grammar of Science" (1892) and uses statistics. A grammar here is a description of approaches from which the most effective can be selected. Three parts provide a total of fourteen papers covering systematic modes, design of experiments and statistical methods. Digital engineering makes use of computer-aided engineering, quality control, simulation, verification and validation. The technology development process can be modeled for interactions between virtual and real society. Systems science can also make use of intuition in micro-theories of knowledge creation, which are several orders of magnitude better at information processing than verbalization and include the collective unconscious. Spiral processes in a creative space use dimensions of objectivity, subjectivity, intersubjectivity, justification and reflection. Ecodesign is environmentally conscious and has compatible tools. Social networks show useful gaps in three types of communication: interactive, distributed, and soapbox. Simulation matches theoretical and actual conditions and tests the model; its own issues can be compensated for by calibration in the design of experiments (a toy sketch follows below). Measures for performance evaluation include hardware simulation, software simulation, and approximate analysis. Several case studies are shown, for an example product, a digital factory, a web-legacy client-server system, and musical search by humming. Many of the articles introduce keywords for the statistical concepts used in that type of grammar and use figures and tables to illustrate the most significant data. The book is the result of a conference with several different authors, so it is left to the reader to correlate the ideas. The case seems promising, and the combined use will likely lead to additional efforts that can be evaluated for improvements or automation.
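A minimal sketch of the calibration idea, my own toy example rather than one from the book: a simulator's systematic offset is estimated against a few real measurements, and the calibrated simulator is then run over a small two-factor full-factorial design. The simulator, factors and data are all made up.

```python
# Toy illustration of calibrating a simulation against measured data, then
# using it inside a small full-factorial design of experiments.
# The "simulator", factors and measurements are fabricated for the sketch.
from itertools import product


def simulate(temperature, pressure, offset=0.0):
    """Stand-in simulator: some response as a function of two factors."""
    return 0.8 * temperature + 1.5 * pressure + offset


# Real measurements at known settings (made up for the example).
measurements = [((20, 1.0), 18.1), ((30, 1.0), 26.2), ((20, 2.0), 19.6)]

# Calibration: choose the offset that minimizes squared error vs. the measurements.
def sse(offset):
    return sum((simulate(t, p, offset) - y) ** 2 for (t, p), y in measurements)

candidates = [i / 10 for i in range(-50, 51)]
best_offset = min(candidates, key=sse)

# Two-factor full-factorial design run with the calibrated simulator.
for temperature, pressure in product([20, 30], [1.0, 2.0]):
    response = simulate(temperature, pressure, best_offset)
    print(f"T={temperature} P={pressure} -> {response:.2f}")
```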
Technically Speaking: Why All Americans Need to Know More About Technology, edited by Pearson and Young, 2002. Planning for the future involves significant emphasis on society, economics, and environment. These in turn depend upon development of technology to solve problems, lower costs, and use resources more efficiently. Somewhere along the line from familiarity with a single tool, to science experience, to engineering, there is a general technology skill. As values and philosophy are revealed, there seem to be no clear-cut definitions of exactly what an essential approach would be, or how to recognize the better ones among many attempts. A question arises as to how to remedy this. Education is the usual answer, which then raises another issue: how to teach technology. This book centers around the notion of technology literacy, including capabilities, knowledge and ways of doing. The theme is nationalistic, though international attitudes are described. Some of the problems, e.g. lack of understanding of the electrical grid, have since seen new solutions, such as the smart grid, which compensate by automating some of the decision-making. The book is clearly written and formatted like a textbook, with boxed summaries and reading lists. This is a subject which can be taken for granted only at our peril, since the ongoing changes in population, demographics, ideologies and climate seem to be increasing in complexity at faster rates over time. Free complete online preview at http://books.nap.edu/catalog.php?record_id=10250.