Test Information Space

Journal of Tech, Testing and Trends

Archive for June, 2010

Blind’s Eye View

Posted by cadsmith on June 25, 2010

100623f

Society values both creativity and heritage. This may also be true in the digital dimensions. A convergent look at mathematics, computers and biology reveals clues to what may emerge. The capacity of the imagination remains unbounded so far.

Recent links (about eleven): debug: Jinx for multicore, search: LingLink, animation: Xtranormal, analytics: boomerang, space: weather prediction, IceCube telescope, medical: lung-on-a-chip.

Book reviews:

Explanation and Proof in Mathematics: Philosophical and Educational Perspectives, edited by Hanna and others, 2010

How many mathematicians does it take to prove whether a lightbulb is screwed in? These essays come from a 2006 workshop in Essen. The ways the practice of mathematics has developed over the last three decades, driven by computerized visualization and experimentation, may hint at its future. The book has three parts comprising seventeen chapters by eighteen contributors on proofs: their nature, aspects of teaching, cognitive development, experiments and diagrams. Ancient mathematicians established the correctness of algorithms by writing proof texts. Solutions are often based on common intuitions which can be further explored. Proofs are often discovered by mathematical experimentation rather than deduction, involving intuitive, inductive, or analogical reasoning through conjecture, verification, global or heuristic refutation, and understanding. A tool may itself embody a proof of a mathematical form, or serve as a way to explore a domain, via the notion of the semiotic potential of an artifact. Three different worlds of mathematics can be distinguished: the conceptual-embodied (such as gears), the perceptual-symbolic (such as conics), and the axiomatic-formal (such as deductions). Individuals hold warrants for truth that compensate for uncertainties in their mathematical proofs and that become more sophisticated over time. Research of this kind requires situations that reinforce theoretical proofs over pragmatic ones. Methods of proof can be reused in other mathematical contexts. Though the history of mathematics encourages perseverance, each step of a proof must stand without historical context, since changes in language use can become a source of fallibility. The philosophy of mathematics shows the evolution of proofs and how they support empirical science and other symbolic endeavors. Much mathematical theorizing also occurs prior to the formulation of the axioms used as contextual definitions.
Theses as to why the Greeks invented proof include the socio-political, the internalist, and the philosophical-influence accounts. Descartes’ arithmetization of geometry and the calculation of magnitudes was refined by Arnauld and Lamy. One can compare Frege and Russell, Peirce and Dewey, or Wittgenstein on how proof-as-picture shows what was proved and should yield the same result, while proof-as-experiment shows a procedure which can remain static yet yield different results. Lakoff and Núñez considered mathematics to be a cognitive system of conceptual metaphors grounded in the sensorimotor system.

Creative Environments: Issues of Creativity Support for the Knowledge Civilization, edited by Wierzbicki and Nakamori, 2007

Ba, Japanese for place or environment, is also the notion of computerized creativity support. Heidegger described technology as a quest for truth through creativity. Social science needs to better understand knowledge creation in science and tech. A constructive evolutionary objectivism episteme has ten postulates based on several principles. The evolutionary falsification principle measures fitness by number of tests passed. The emergence principle states that qualitatively different properties emerge from complexity, e.g. as software is different from hardware. The multimedia principle holds that historical records of knowledge will stimulate creativity by including complex visual and preverbal elements in addition to words. New concepts in science will be based on horizontal changes in mathematics. Technology and basic science form a feedback loop. The intellectual environment is a heritage of humanity worth preserving. Creative holism has a systemic approach to organization. Academic knowledge creation involves social, technical and mathematical approaches. Interdisciplinary approaches to mathematical modeling attempt to provide qualitative improvements. The book develops a testable creative environment (CE) to support scientific research. Roadmapping is a kind of knowledge creation process which can use various types of IT principles and tools for academic research. Software and tools for brainstorming and group debate are biased towards commercially goal-oriented organizations and need significant changes for academic use. Knowledge discovery requires interactions between AI and human reviewers, e.g. inclusion of user preferences in data mining. Seven creative spirals are proposed as tools for prescriptive synthesis in the process of learning. Survey results are presented for questions related to knowledge creation support. The book has twenty-one authors. 
There are four parts comprising eighteen chapters on models of creative processes, tools, diverse tools, and philosophical issues. It has many figures and tables, including the spiral representations of processes, the triple-helix model and the JAIST Nanatsudaki model. Major points are emphasized in box outlines, and each chapter, as well as the book as a whole, has hierarchical introduction and conclusion summaries. Other topics include machine learning, statistics, virtual labs, gaming, criteria, and distance and e-learning. This is a follow-up to the editors’ previous publication, Creative Space, 2005. The book may also interest inventors and innovators outside purely academic domains, since learning advantages are key to most other pursuits.

Radical Evolution, Joel Garreau, 2005

The scenario planner’s philosophy involves stories, patterns in uncertainties, common solutions, and simulation. This age was formed between 800 and 200 BC by ideas which arose simultaneously in the East and West. The rate of change is quickening. According to the author’s Law of Unintended Consequences, human nature itself will likely be changed. The book tries to look ahead by defining scenarios which conform to the facts and identify the predetermineds, critical uncertainties, wild cards, embedded assumptions and early warnings. These are based on progress in genetics, robotics, info, and nano (GRIN). Each may have its own philosophy, e.g. connecting living and nonliving things, open source, unlimited creativity or skepticism. There are eight chapters, suggested readings, and notes. Seven major scenarios, some associated with well-known figures, include the Law of Unintended Consequences above, the Curve of exponential increase, Singularity (Vinge), Heaven (Kurzweil), Hell (Joy), Prevail (Lanier), and Transcend (Bostrom). The topic was inspired by work at Darpa, e.g. meals which sustain days of exertion, or treatments for muscular dystrophy. This title was recommended for its discussion of risks by sci-fi & philosophy editor Schneider (2009).
The reader may wonder what set of rules is actually developed for each case. Also, there may be situations where science exceeds technology, in which case the artifacts are less evident; and though intelligence may expand wisdom, paradox seems to increase due to censorship.

Write Good or Die, edited by Scott Nicholson, 2010

According to Ray Bradbury, it takes a million words to become proficient at writing. At an average of 200-250 words per page in a novel, that is 4,000 to 5,000 pages; with the average novel at 100k words, or 300-400 pages, it would take about ten novels to become good. One of the Kindle authors, in contrast, sold over 29k ebooks at about 2 dollars each in a year and expects Amazon to double royalties soon. This is a tutorial for novelists written by published and award-winning writers. It has three parts, on art, craft, and business, comprising thirty-three chapters by eighteen contributors. It might be classified as self-help, since it assumes that the reader has finished, or is working on, a novel that needs to be published. The book contains a series of blog-like entries and links on why and how the authors write. In most cases, the rules were personally discovered through trial and error; so, while one contributor debunks the agent career-planning myth and says to be an artist, another shows what customers want and includes a pitch letter and instructions for how to get an agent. The Writer’s Market also has such lists. The anecdotes are absorbing and evoke various emotions, e.g. suspense or laughter. The reader is reminded that fame ups the odds of a bestseller, and that recommendations improve sales, e.g. through social networking. Other tips include: know what the book is about in a premise sentence or story line; include protagonist, antagonist, setting, conflict, stakes, atmosphere and genre; and write from research once immersed or overwhelmed. It shows the advantages and disadvantages of each POV and how to choose. The anatomy of the three-act story structure is described, along with the use of imagery and dialogue. This book is not about the internet, except to say that editors may notice a web page and readers appreciate newsletters. It has a glossary of terms for writing professionals. The title wasn’t in Google Books yet; I originally found it in the Kindle directory, and it can also be downloaded free from its companion website.
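The word-count arithmetic above can be double-checked with a quick back-of-the-envelope script (the figures are the ones cited in the review, not from the book itself):

```python
# Back-of-the-envelope check of Bradbury's "million words" rule of thumb,
# using the page and novel sizes cited in the review above.
WORDS_TO_PROFICIENCY = 1_000_000
WORDS_PER_PAGE = (200, 250)    # typical range for a novel page
NOVEL_WORDS = 100_000          # average novel length

# Pages needed at each density: 1M/200 = 5000, 1M/250 = 4000.
pages = [WORDS_TO_PROFICIENCY // wpp for wpp in WORDS_PER_PAGE]

# Novels needed: 1M / 100k = 10.
novels = WORDS_TO_PROFICIENCY // NOVEL_WORDS

print(pages)   # [5000, 4000]
print(novels)  # 10
```

So the "ten novels" figure follows directly from the 100k-word average.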

Blogs of interest:

Wolfram on Turing


Posted in Uncategorized | Tagged: , , , , | Leave a Comment »

Grain of And

Posted by cadsmith on June 21, 2010

100603

The computational paradigm has several facets, including network, social, artificial, biological, and spiritual. How it arises, and where it is leading, are popular topics. In some futures, the grand elements of identity include better personalization and representation amid the complexity, so the group dynamics that approach or avoid these can be studied.

Recent Links (about twenty links): AI: IBM Watson, semantics: News patterns, science: Sun musicmetrics, art: Sci-fi urban illustration, history: Turing archive, digital libraries: SpringerLink, telecom: Iridium satellite, dark pulses, search: Google commerce, user interface: GoogleCL, entertainment: Rdio, OnLive games.

Book reviews:

Science & technology in China: a roadmap to 2050 : strategic general report of the Chinese Academy of Sciences, edited by Yongxiang Lu and others, 2010

The Chinese Academy of Sciences reports how the country is modernizing science and technology, and the accompanying social changes, for a developed world expected to triple in population and economic size over the next five decades. Revolutions in S&T require moving from imitation to innovation, independence, and institutions. Breakthroughs are expected in information science that will outpace technology. Computational thinking combines man-cyber-physical in a ternary universe. The plan is to absorb global innovations and intellectual resources. The format is like a brochure that defines structure, characteristics, steps and research support. The text is supported by data formatted in tables, charts, and highlighted boxes which detail characteristic indicators. There are five chapters by a committee of five writers and forty reviewers representing over three hundred members. It is written at the level of principles and categories rather than specifics like the design of a new plane, and compares the rate of modernization of twenty-four countries. The major topics are economics, emerging areas, security, basic science, sustainability, and strategic efforts. It will be followed up by actual research, publications, workshops, peer review and priorities. It refers to relevant past plans, the most recent of which lasted four years. It adds integration between demonstration and application, e.g. the topic of social computing and how it goes from electronic to ubiquitous. By 2050, China sees itself as an open society, advanced in culture, ethics, politics, materials, and conservation. Eight basic and strategic systems for economic development include energy, materials and manufacturing, networking, agriculture and biology, health, ecology and environment, space and ocean, and security. Three emerging cross-disciplinary areas are nanotech, space, and complex systems.
Security recognizes open-source intelligence and has two areas: space situational awareness, and social computing with parallel management systems. Four basic science areas are dark matter and energy, controlling the structure of matter, synthetic biology, and photosynthesis. Seven sustainability efforts comprise 4,000-meter "transparent underground" imaging to see ore deposits, renewable energy, deep geothermal, nuclear, marine, stem cells and regenerative medicine, and early diagnosis and intervention in chronic diseases. Six strategic efforts are post-IP networking, green manufacturing, process engineering, ubiquitous sensing, exascale supercomputing, and molecular design. The primary milestones are shown for the years 2020, 2030 and 2050.

A Companion to the Philosophy of Technology, edited by Olsen and others, 2009

Technical literacy and individual and social decision-making are among the challenges that philosophy attempts to address. It is useful to get a high-level summary, and this book gathers many technology-related issues in a single volume. There are ninety-eight chapters by about seventy-nine contributors under the seven themes of history, science, philosophy, environment, politics, ethics, and the future. The authors are Western, i.e. European and American, though there are discussions of Eastern references. The chapters read like the introductions to books on each separate topic. The notion of convergence appears in the Future section, written by Bainbridge, one of the writers with multiple entries. There would probably be value in further integration, perhaps through discussions among various subsets. This might or might not improve prediction-market accuracy, depending upon how participants actually influenced each other. The text compares well to previous philosophy books, which were more in-depth and are likely included in the reading lists. It presents questions, terminology and some handy visualizations, and would be a good place to begin.

New Computational Paradigms: Changing Conceptions of What is Computable, edited by Cooper and others, 2008

This book comprises proofs of neo-Turing theories of logic and mathematics, in technically advanced papers from the Computability in Europe (CiE) conference in 2005. It advocates the dynamic turn of interactions between observers and systems, and between observers socially: conversation is computation. Turing’s boss in 1948, Darwin’s grandson, dismissed his paper on “intelligent machinery” as merely “a schoolboy essay”, so it was not published for two decades. It turned out to be a manifesto for at least AI, connectionism, and neural computing, and was accompanied by another paper discussing evolutionary computing. Turing machines, the basis for modern computers, were derived as a model of computation. The computable analysis problem was to decide what was computable and how long to expect it to take. Applications include wireless mobile nets, neural nets, analog computers, topological spaces, graphics and hardware. There are machines that do not fall into these classes, e.g. algebraic calculations done by planar mechanisms such as rigid bars joined by rotatable rivets, or viewing an eternity in finite time using relativity equations. Information processing is emphasized, e.g. regulatory genomes. Biological computing has new operations such as splicing, crossover, point mutation and annealing, which demonstrate parallelism, reversibility, nondeterminism, energy efficiency, self-healing and evolution. Membrane computing structures have local reaction rules for evolving objects in multisets, e.g. DNA software. Computational models can be classified by space and time, discrete and continuous in each case. The authors look at how nature computes, or what the universe computes. Additional directions are pursued, including continuous-time computations, derivatives of continuous functions and infinite-time computation. There are about thirty international contributors beside the three editors. The format combines twenty papers in four parts. New paradigms were expected to follow.
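The biological-computing operations mentioned above can be illustrated on DNA-like strings. These are toy sketches of the intuitions only, not the formal operators of splicing systems or membrane computing:

```python
# Toy versions of the biological computing operations discussed above
# (splicing, crossover, point mutation) applied to DNA-like strings.
# Illustrative only; not the formal DNA-computing or P-system operators.

def point_mutation(strand: str, pos: int, base: str) -> str:
    """Replace the base at `pos` with `base`."""
    return strand[:pos] + base + strand[pos + 1:]

def crossover(a: str, b: str, cut: int) -> tuple[str, str]:
    """Exchange the suffixes of two strands at a common cut point."""
    return a[:cut] + b[cut:], b[:cut] + a[cut:]

def splice(a: str, b: str, site: str) -> str:
    """Join the prefix of `a` through `site` to the suffix of `b` after `site`."""
    i, j = a.index(site), b.index(site)
    return a[: i + len(site)] + b[j + len(site):]

print(point_mutation("ACGT", 1, "T"))   # ATGT
print(crossover("AAAA", "CCCC", 2))     # ('AACC', 'CCAA')
print(splice("AAGTC", "TTGCG", "G"))    # AAGCG
```

Each operation is inherently local and order-independent, which is where the parallelism and nondeterminism claimed for biological computing come from: many strands can react at once, at any matching site.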

Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality, Robert M. Geraci, 2010

Apocalypticism consists of dualism, alienation, transcendence, and bodily purification, all of which are present in AI. Apocalyptic AI is a social strategy for research funding, as well as an ideology for online life. It is argued philosophically, legally and theologically, and is about commitment to actions and attitudes. This book approaches the technology and philosophy from the perspective of divinity, and has five chapters, a pair of appendices, notes and references. There are descriptions of the work of many researchers in AI and robotics, e.g. Turing, Minsky, Kurzweil, de Garis, and Warwick. Newell observed that Prometheus denotes tragedy, whereas technology actually leads to magic. Moravec wrote an essay in 1978 about converting nonlife to immortal mind, and in 1988 predicted that humans would eventually be capable of uploading their minds into a fractal-like robot “bush” body. A “mind fire” transforms the cosmos at lightspeed. Nationalism and war become obsolete. We are living in a simulation created by a god. Identity is a pattern and process within the brain and body which is possible in other materials. The Order of Cosmic Engineers believes that its members will become the new creators. This may result in a virtualization of identity, available anywhere. There are also intermediate separate personalities among Transhumanists, e.g. Second Life’s Stenvaag. Games and digital worlds are precursors of digital paradise, and are primarily forms of social contact. Bainbridge’s sociology work for NSF is discussed. Actor-Network Theory, e.g. Latour’s trials of strength, observes that understanding scientific advance requires both natural and social actors. Transmutation is also a topic of religious history. There are methodologies common to science and religion, though the two are distinct. Religion affects how robots are integrated into society in the US, Europe and Asia. Japanese karakuri may descend from da Vinci’s automata through missionaries.
Relationships between humans and robots are worth study since the two may become indistinguishable. The major funding of robotics in the US is from defense, which may also provide the ethics. The robots may be more objective and humane.

Blogs of interest:

MSDN cashto on unit testing

Barnett reviews Natural Computing

Videos of interest:

Philip Low at TEDMED 2009 on cell phone brain cognition display


Imprinters

Posted by cadsmith on June 16, 2010

100614c

Nanotech is a computer-aided approach which is transforming many fields and stimulating new industries. Expectations are that it is a next-big-thing equivalent to what personal computers, the internet and web were when they began. The ethics, engineering, effects on medicine and exciting scifi are reviewed below. If all goes well, perhaps this civilization’s legacy will be more than just space junk.

Recent links (about 23): computer: quantum, semantic web: Kingsley Idehen, finance: startups in Boston, minerals in Afghanistan, nanotech: architecture, FDA, futurist: singularity, space: lunar water, NSWP, Kepler exoplanets, presentation: glogster, virtualization: skype, energy: urban, disaster recovery: gulf oil spill, historical chart, photography: photoshop.

Book reviews:

What Is Nanotechnology and Why Does It Matter: From Science to Ethics, by Fritz Allhoff and others, 2010

After a basic introduction, e.g. recalling Drexler’s molecular assemblers from 1987, this book delves into the social concerns about nanotechnology. The authors are a scientist and a pair of philosophers. Scale reduces energy consumption. Tools include the electron microscope, the scanning transmission electron microscope (STEM), scanning probe microscopy (SPM), and the atomic force microscope (AFM). The engineering challenge is to industrialize scientific development in terms of specification, monitoring and mass production. Among the major philosophical topics is risk, in terms of conditions, probability and expected impact. Though the state of the field incrementally improves existing products, present laws do not account for the downsides to humans and animals, and better testing processes are necessary. There is a detailed analysis of the objections to stricter laws. Enhancement integrates tools into anatomy, always on, and is expected to revolutionize engineering; sleep may become more of a bimonthly rather than a nightly requirement. Nanomedicine ethics are discussed, e.g. Bawa and Johnson. The developing world may not be seen as profitable. In the context of distributive justice, the book holds that nanotechnology is not unique in the unfairness of access to cognitive advances, e.g. similar to university costs. Privacy has been demonstrated to be an issue, e.g. with RFID tags. Potential uses to impose biases for individual control by bureaucracy, e.g. patriotism, may themselves be hard to limit. Defense probably develops war robots, and the arms race turns to miniaturization.
This does not cover longevity, space or molecular manufacturing, laws or regulations, or economic impacts.
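The risk framing above (conditions, probability, expected impact) can be sketched as a simple expected-value ranking. The scenarios and numbers below are invented for illustration and are not from the book:

```python
# Hypothetical expected-impact ranking in the spirit of the book's risk
# framing: risk = probability of occurrence x impact if it occurs.
# The conditions and numbers here are illustrative assumptions only.
scenarios = [
    # (condition, probability, impact on an arbitrary 0-10 scale)
    ("nanoparticle toxicity in consumer products", 0.10, 7.0),
    ("environmental accumulation",                 0.05, 9.0),
    ("privacy erosion via embedded RFID sensing",  0.30, 4.0),
]

def expected_impact(p: float, impact: float) -> float:
    """Expected impact of a risk condition: probability times severity."""
    return p * impact

# Rank the conditions from highest to lowest expected impact.
ranked = sorted(scenarios, key=lambda s: expected_impact(s[1], s[2]), reverse=True)
for name, p, impact in ranked:
    print(f"{name}: {expected_impact(p, impact):.2f}")
```

Note how the ranking can differ from ranking by severity alone: a moderate but likely harm can outweigh a severe but unlikely one, which is one reason the book treats probability and impact as separate dimensions.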

Handbook of Nanoscience, Engineering and Technology, William A. Goddard, 2007

This textbook presents a set of themes describing the current state of nanotech. There are five sections containing twenty-four chapters on potential, concepts, processes, assembly, and functions. About two dozen organizations contributed, from the US, Russia and Venezuela. Most are academic, e.g. the universities of Illinois, North Carolina or Northwestern, and there are some US government space and defense researchers. Feynman introduces the subject. Most chapters have multiple authors, some have a single author, and a couple of authors wrote or participated in a pair of chapters, e.g. Karl Hess of the University of Illinois or Sergey Lyshevski of RIT. The contents are technical, including equations and graphs, and there is some Matlab source code. Chapters have intros and conclusions, acknowledgements and many references. There is no glossary, though there is an index, and digital versions would have search. As an example, the final section has eight chapters on functional structures and mechanics. Nanomechanics links science and engineering, e.g. multiscale multiphysics schemes. Figure 20.1 shows the history of the tech from Mayan-age ceramics after 10k BC to synthetic control of macromolecular structure now, and discusses biomimicry through dendrimer assembly. Atomic simulation resolution doubles every 19 months. Strength and fracture properties are outlined. A challenge is to control carbon nanotube growth chirality and diameter for computing-related applications. The optical properties of materials are engineered in photonic crystals. Preparation techniques are being developed for bulk production of nanostructured materials. Modeling and CAD are used in multidisciplinary confluent engineering, e.g. for nanoarchitectronics. In summary, there is a lot of general interest in the convergence of nano, bio, info, quantum and cognitive tech, and this book has supporting examples.

The Handbook of Nanomedicine, Kewal K. Jain, 2008

The title of this book denotes types of tools and approaches rather than a medical specialty. It is derived from biotech and nanotech. The initial applications are expected to be for personalized medicine, e.g. cancer therapies. New tools include 3D nanomaps and the scanning mass spectrometer probe (SMS) used for drug design at cellular level. Nanoparticles can be coated or chemically altered so as to be nontoxic, though they can also be effective for nanoviricide. There are many types of applications, e.g. sunscreens or donor-derived exosomes for organ transplant acceptance. A lab-on-a-chip has chemical experiments for use in battlefield exposure testing. It allows platforms for precise imaging, diagnosis, targeting, drug delivery, destruction, treatment, and therapy. Nanomedicine can also be used in combination with other approaches, e.g. radiotherapy or physical modalities of therapy. There are public misconceptions and fears, so education is warranted, and there will probably be FDA regulation. The detailed table of contents hints at the depth of coverage in the eighteen chapters. There are many new structures and techniques, e.g. devices, machines, chips, robotics, materials, implants, barcodes, needles, tweezers, motors, shells, tubes, fibers, scaffolds, valves, pores, filters, coatings, crystals, emulsions, filaments, lasers, fluidic channels and wire. The nano prefix can be applied to several new fields including biotech, systems biology, bacteria, antibodies, genomics, proteomics, pharma, encapsulation, diagnostics, surgery, therapeutics, dermatology, dentistry, immunology, geriatrics, pulmonology, neurology, and regenerative medicine. The author also lists vendors and academic research centers.

Small Miracles, Edward M. Lerner, 2009

“Speech was so old species,” says one of the emergent characters, who considers humans to be Neanderthals in this transhuman techno adventure. Where Daniel Suarez had a parasitic AI influencing a group of people, and Robert J. Sawyer had one further connected cybernetically through an eye implant, Lerner adds nanobots. This doesn’t go as far as Paolo Bacigalupi in genetically engineering creatures, but it does have a lot of detail about how humans might be medically enhanced. Initially intended to support first aid for government security personnel equipped with new nanosuits, the temptation of hybrid augmented-reality awareness captures human nature. Without further spoilers, it is clear that the author researched the topic. His characters and dialogue are vivid. Italics are used occasionally for thoughts. The backgrounds of at least three of the main characters are fleshed out in separate parts well into the story. Settings are briefly sketched except, for example, to indicate heightened visual acuity in places, or where necessary for action such as weather conditions. There are eight sections comprising about four dozen brief chapters. It is told in the third person omniscient, including emotions, with an occasional machine perspective. Medical terms and R&D equipment get added detail. The plot may be more convincing since it is near future and there are not a lot of other inventions. The year is 2015 and the pacing opens dramatically with a threat to the main character’s survival. The total duration is about two years. Each chapter is titled by a date, a few have times when a couple fall on the same day, and the Reaping has nine times before the epilogue. The success of the author’s series above may hint at a sequel.

Documents of interest:
Communicating Nanotechnology, European Commission, 2010 (16.4MB ZIP PDF)
Human Enhancement Ethics: The State of the Debate, Bostrom and Savulescu, 2008 (PDF)

Blogs of interest:

Nick Bostrom home page

Videos of interest:
David Byrne: How architecture helped music evolve
Reducing Existential Risks [UKH+] (1/3)
John Underkoffler points to the future of UI


Superintent

Posted by cadsmith on June 12, 2010

100527

Seeming contradictions are grist for the engineering mill. While hints of future issues raise the value of cooperation, vested domains attempt to defend their turf, especially the big-budget items. Drama is usually followed by system reorganization, with shorter durations between phases. There are various models for the dynamics, including networking, and these are being shaken out by new entrants. Artifacts tell the story of their environment and culture.

Recent links (about 15): test: usability, cross-browser, debug: lldb, internet of things: Cape Cod, computation: astronomy, education: computers and data mining, sustainability: contest, population and consumption, art: Sculptris, cognitive: Pinker on mass media.

Book reviews:

How It Ends: From You To the Universe, Chris Impey, 2010

Science doesn’t end in this one; rather, it evolves to handle complexity. It is assumed that a general theory of intelligence will be forthcoming. In the meantime, the author seeks to debunk myths, but observes that endings create meaning, and that stories, in addition to facts, are important. Practical limits are respected where known, but measurement of the end depends upon the tool, and adjusting the threshold changes the results. The view is systemic. The set of twelve chapters begins from the individual perspective and scales up. Each has an introduction that encapsulates the general idea in a scene or person. There are plenty of diagrams and photographs which illustrate the instances, terms, relationships or conclusions. Many of the feared impacts would cause a fractional loss of population and a regression of civilization, but not extinction. This book covers a lot of current thought, e.g. transhumanism, and names or quotes the signature personalities. Much of life is shown as part of a web. Bacteria can survive space and entry to the atmosphere. It is likely that there are other forms of life in the universe or the multiverse. An extensive glossary, notes and reading list are appended.
Incidentally, this does not cover Aubrey de Grey’s theory of regenerative medicine and longevity.

Holistic Engineering Education: Beyond Technology, editors Domenico Grasso and Melody Brown Burkins, 2010

Design is a common topic across nineteen papers by thirty authors from three countries (US, China, Peru) and thirteen US states (mostly CA, MA and VA). This is due to the attribute of creativity as part of technology. (It may also overlap educational rivalries between scientific evolution and intelligent design.) The projected shortage of engineers has sharpened interest in improving education, from early schooling through undergrad to faculty. There is an established history that can be improved upon as practice and standards become global. An emphasis on interdisciplinary collaboration has several elements, such as changes in segmentation from, for example, electrical/mechanical/industrial, proposed curricula, ethical values, and study abroad. A set of recommended personal values includes analysis, translation and perception. Skills of interest include asking, labeling, modeling, decomposing, gathering, visualizing, and communicating. A case study of the global positioning system highlights systems design, and technical and business leadership. Holistic contexts include system, strategic, implementation and stakeholder. Cultural approaches reinforce unity of effort. Engineering is eligible to become a guild, like the learned professions of medicine, law, and accounting. Most chapters have conclusions and suggested readings. Sustainability issues are often reported in the news.
Experienced engineers probably have many stories about what could be changed in education and practice and professional societies attempt to be a conduit for this. Some of the skills are innate and show up in play or the use and innovation of tools and artifacts. Many fields are becoming more sophisticated in the use of instrumentation for measurement, visualization, computation and control. Most of these can be scaled to educational versions that include the newest areas of R&D. If not supplied institutionally, they probably will have some free or affordable public or web versions. Where there are few people to handle the tasks, expert automation would be required.

The Shallows: What the Internet Is Doing to Our Brains, Nicholas Carr, 2010

The question is whether people are losing their minds or society is constructing a new type of one. Tools have sticky cognitive effects on their users, and the internet, while figuratively turning on the light for many, may also tend to make it harder to look as deeply as before. In order to write the book, the author attempted to disconnect and find seclusion for a while. He cites how changes of this magnitude have been perceived in the past, e.g. Socrates’ lament that writing destroyed the capacity for individual memory, or how the typewriter changed authors’ styles since they could not dwell on the feeling of writing in longhand. Information is meted out in lots of brief interlinked pieces. Email has become streams. Ads are pervasive. The ten chapters review the mind, book, maps, clocks, tech, computers, and AI, amid the dimension of networking. Rather than point to URLs, the story is told in flashbacks, e.g. how Weizenbaum’s ELIZA could earn the empathic confidence of people even though it was mindless. There are also four digressions, the last of which looks at the irony of a book on the disappearance of long-term concentration. Notes and further reading are appended. Many of these are valid issues worth further study. Whether the realtime flow and exponential increase in data to analyze can be paused often or long enough is unknown. More direct types of mind links may not be too far off in the future.
For the attention-challenged, a way to get through this book might be to survey it quickly, then skim a few times to form raw impressions, not word for word, rather similar to becoming familiar with a song or painting, then read it backwards for verbal reassurance. The reader can increase the pass-throughs to pick up more detail where necessary and as time allows, thereby rendering textual memory as well as consideration and opinion. It may turn out that reading is more of a creative process than previously thought, or that there are better tools for the task, as there are for other kinds of digital composition, e.g. sculpting 3D art. It may then still be possible to frequently parse titles in dedicated slices while otherwise attending to the network. Eventually a learning process may be discovered, akin to the development of Gladwell’s outlier mastery status. And, of course, each of the chapters can become a book or digital museum or web-service in the interim, so none of the 3R’s may remain sacrosanct for much longer. There may be a video about this floating around somewhere.

Reverse Engineering: An Industrial Perspective, Raja and Fernandes, 2008

This textbook details how reverse engineering is used for copying, design abstraction and reengineering. This is based on high-resolution digitization and 3D CAD. Results have included reduced inspection time and improved workflow. Quality assurance benefits include standardization, interchangeable parts, and reduced manufacturing cost. There are actual examples from the automotive, aerospace, and medical device industries, and tables refer to more. Eleven papers discuss definitions, methodologies, system selection and rapid prototyping. The authors diagram a generic process and show how it is customized in each case. The product development cycle includes testing. Taxonomies are given for measuring and positioning systems. Legal concerns arise from fair use and patents, and may be handled by a recommended sui generis system. Organizational considerations are listed including a champion, management support, resource coordination, competition, and user participation, e.g. “tribal knowledge” in an aerospace firm. This does get technical and there are some equations. Terms include computer-aided reverse engineering (CARE), coordinate measuring machine (CMM), nonuniform rational B-splines (NURBS), NC machine, multijet modeling (MJM), and computer-aided inspection (CAI).
Merging data and semantic web approaches is outside the scope of this book.

Posted in Uncategorized | Tagged: , , , , | Leave a Comment »

Netropic

Posted by cadsmith on June 9, 2010

100606

“Where there’s a bit, there’s a bot” may become an aphorism of digital consciousness, at least until something is picked as the universal standard. There may be some new principles of order, somewhere between the anthropic and the entropic, introduced by the impending Singularity. Approximations of these turn up in the literature of fiction as well as technology. This may also be an engineering topic for successive approximation by neural nets or intermediate computational agents which can parse relevant external worlds.

Recent links (about 22): semantic-web: RDB2RDF, computer: future of BIOS, flexible OLEDs, internet: trends, documentation: office live docs, location: Flook browser, Bing map SDK, social-networks: Facebook video, networks: IBM Mote Runner SDK, search: ReputationDefender, alerts: SproutRobot, business: IBM ecurrency tokens.

 

Book reviews:

Philosophy and Engineering: An Emerging Agenda, by Ibo van de Poel and David Goldberg, 2009

This book details a variety of claims which, taken together, do not appear consistent, but may provide methods for further study. Engineering applies science and produces technology. The effects on society can be ethically evaluated through cooperation with philosophers. The recognized set of philosophical problems is still being determined, including epistemological, methodological, metaphysical and ontological ones. It may also address other existing philosophical problems. The book has contributions by thirty-two authors in three parts for twenty-eight papers.
Engineering as a discipline is historically distinct from architecture. A well-defined philosophy does not exist, though efforts date from the start of the 21st century, having arisen independently in the East and West, and following, yet distinct from, the philosophies of science and technology. A pluralistic approach can be linguistic, phenomenological, post-modern, analytic, pragmatic, and Thomist. It is within the field of philosophy of technology. Science and engineering are often treated as simplified notions based on politics of funding rather than examination of what people actually do in particular. Generalization differs in engineering from the natural sciences, including artifact type, function and structure, which combine causes and concepts. The models used to represent reality are idealized, tested, and compared to each other. Sociotechnical system boundaries include the behaviors and relations of the elements impacted by the system. Integrity is uniquely complex for engineers, the profession and its education. The engineering priority of technical ingenuity over helping people needs to be rebalanced to avoid becoming lost in the labyrinth of technology. Engineering ethics needs a global foundation based on principles of public safety, human rights, environmental and animal preservation, engineering competence, scientifically founded judgment, openness and honesty. Research in engineering ethics has spread to Asia and Europe from North American origins. The scale extends through individual, group, company, profession and planet. Imagination of the engineering world is a way to deal with conditions of epistemic opacity. Responsibility for artifacts eventually transfers from engineer to user through knowledge of their workings.
Ethics concerns the amount of harm from artifacts produced by solutions to engineering problems. Ethicists have observed an actual design project where participants were characterized as actors in a network, and intermediate results were presented which affected the outcome of the project. This is helpful in mapping risks, responsibilities and ethical issues. Future comparisons may be made between engineering and medical science. Role-playing games can be used to teach ethics if they are felt and articulated, have a lengthy process, use case studies, and are realistically up to date. The Norms Evolving in Response to Dilemmas (NERD) platform was used for experimentation in the ethics of technology as a form of stress testing. There is a crisis of a creative era which is resulting in philosophical interest similar to what Kuhn showed had occurred in science, and which leads to dialectics, data mining, and reliance upon brute facts, social facts, or institutional artifacts; it may be short-lived. Wittgenstein had engineering training and his philosophy was based on the real world of things rather than ideology. Design methodologies include top-down, layered, platform-based or network-based, and are related to human organizational structures and national cultural emphases. Computer science builds abstractions from bits, engineering configures solutions, and stigmergic design in nature is bottom-up. The settings of engineering are the ad hoc real world or the systematic hyperreal world. Technology is ubiquitous; engineering is either denial or determinacy; where survival of the human species is the goal, all is heuristic; a quantitative measure of ethics is defined.
Issues concerning posthumanist theories would require other sources.

Science Fiction and Philosophy: From Time to Superintelligence, edited by Susan Schneider, 2009

This book is an advanced treatment of philosophy of mind, cognitive science, and scifi, which can be considered a narrative of thought experiments about puzzling scenarios. The editor is especially interested in neural enhancements, AI and the problems of disparity between Humans 2.0, whose flaws can only be judged by their own, and those who have not been upgraded. The book presents a variety of views and methods rather than a concluding thesis. Authors range from classic philosophers, such as Plato and Descartes, through scifi writers such as Asimov and Bradbury. Others of each type are discussed in the contents. There are a couple of entries each from the modern literature of Dennett, Kurzweil and Andy Clark. There are five parts for twenty-seven papers, some of which have additional references. Each part lists related works of scifi, mostly from movies. There are diagrams for some of the mathematical and scientific concepts. Rather than commenting on each entry, there is a lengthy introduction by the editor about the themes and philosophical questions including reality as simulation, free will, mind and ethics and politics, and spacetime. A few recommendations that provide more depth in technology and risks are listed. Superintelligence is expected to arise due to the computational theory of mind, and identity based on information patternism. The philosophies of the reader’s favorite authors may yield to the kinds of approaches here, but there would probably be interest in more of such comparative volumes, also for the newest engineering fields, at least until a cyborg editor can do this in realtime for anyone as hinted by the iRobot-style cover picture.

The Philosophy of Science Fiction Film, edited by Steven M. Sanders, 2008

The editor lists three types of analysis: context, film, and topics. Classic films were selected for philosophical treatment, e.g. the Matrix is likened to Plato’s Cave. Other popular philosophers are Descartes, Heidegger, Hobbes, Hume and Nietzsche. There are three parts having four papers each, by a total of thirteen contributors. Films often quote influential predecessors and seek either general or improved solutions, e.g. Metropolis’ machine woman is like Wizard of Oz’ tin woodsman later echoed in C3PO. Settings are often case studies for logic problems that may introduce new assumptions, e.g. previously hidden forces or actors. Paradoxes are highlighted and heuristics proposed. The look and feel may have unique aesthetic texture, e.g. tech noir. Ethical questions often form themes and may be treated mythically, displaced by alien culture or time travel, for a different perspective that changes the intellectual and political constraints, e.g. involving power, laws, sex or war. Metaphysical questions around death are pursued, e.g. resurrection. The future may be seen as utopian or dystopian, or time may be flexible so that future or past can be changed. Reviewers are sometimes aware of their own cognitive processes so that interpretation is an art.

Minds and Computers: An Introduction to the Philosophy of Artificial Intelligence, Matt Carter, 2007

This book is a basic introduction to AI as a philosophical theory of mind. It covers cognitive science topics on the human mind, computation, reasoning, language and philosophical considerations. For example, humans recognize repetitive sensory patterns and dedicate response structures to them; embodied experience is a basis for semantics. Each chapter indicates theory and objections. The style is mildly technical and philosophical. History of the field is broadly sketched and problems are not really delved into, e.g. consciousness, identity and emotions are briefly summarized in a chapter at the end. It does get into some detail about functional neuroanatomy and neural networks. There are twenty chapters, occasional exercises, some of which are labeled “challenge”, further readings, glossary, and index.
Further advanced conclusions are out of the scope of this text, e.g. by Minsky, Kurzweil, Hawkins or Wolfram on computation, or Noë on consciousness. It does not discuss biological reuse for robotics, e.g. as has been demonstrated using animal brains, or cloning for this purpose. Trends such as functional brain emulation models from scopes and visualization, quantum mechanics and computation, or synthetic life would need additional sources.
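As a hedged aside (not code from the book), the kind of neural network such introductions typically describe can be sketched in a few lines; this is the classic Rosenblatt perceptron learning rule on an OR gate, a standard textbook illustration rather than anything of Carter’s:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt perceptron: learn a linear threshold unit from
    (inputs, target) pairs via the error-correction rule."""
    n = len(samples[0][0])
    w = [0.0] * n  # weights start at zero
    b = 0.0        # bias term
    for _ in range(epochs):
        for x, target in samples:
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - out  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Boolean OR is linearly separable, so the perceptron converges on it.
or_gate = train_perceptron([((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)])
print([or_gate(x) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 1]
```

The same rule fails on XOR, which is the classic limitation that motivated multilayer networks.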

 

 

Documents of interest:

An Experimental Philosophy Manifesto, Joshua Knobe & Shaun Nichols, 2008 (PDF)

 

Blogs of interest:

The New Atlantis – A Journal of Technology & Society

Thrilling Tales of the Downright Unusual: Illustrated Interactive Fiction from Retropolis and Beyond

 

Videos of interest:

Authors@Google: Paolo Bacigalupi

Posted in Uncategorized | Tagged: , , , , , , | Leave a Comment »

Deep Destiny

Posted by cadsmith on June 4, 2010

100604

Responses to complexity include modeling and innovative technology. Networks provide computational and social leverage. Tools are adaptive to realtime, combinatorial, fractal, and quantum considerations. There are various opinions about where all this may be leading, with respect to order or limits for example, and what degrees of freedom can be exercised.

Recent Links: (of about 23): 3D: car wrap; semantic web: RDFa checker; robotics: transportation; mobile: local ads; security: civilian net lockdown; surveillance: text stream, electrical network frequency analysis; tracking: eye movement, sleep monitor; quantum: simulation; complexity: matrix decomposition; business: internet of things, travel guide; finance: investing dashboard.

Book Reviews:

Complexity: A Guided Tour, Melanie Mitchell, 2009

The author covers the field in a readable narrative, rather than mathematical, fashion. There is no common measure of complexity since theory and science are still undefined. Research involves interdisciplinary collaboration. It is compared to cybernetics, which had more extent than content, though complexity is more mainstream. There are both adaptive and nonadaptive complex systems. A proposed definition is “a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution”, or briefly “a system that exhibits nontrivial emergent and self-organizing behaviors”. Measures include entropy, algorithmic information content, logical depth, thermodynamic depth, computational capacity, statistical or effective measure complexity, fractal dimension, degree of hierarchy and near-decomposability. Some new areas of research are listed, e.g. self-organized criticality and computational mechanics. These fall into two groups, either more specific applications, or higher level mathematical theories. Historically, emergence arose as a reply to reductionism. Computers mimic evolution and nature mimics computation. Network thinking is more concerned about relationships than entities. The web is a scale-free network. This also occurs in physiology since cells do not scale with body size; space is filled using a fourth dimension of fractal circulatory networks. Ecology extends the food chain to the food web. The book has five parts for nineteen chapters, a bibliography of several hundred authors, and extensive notes and index. It is dedicated to Douglas Hofstadter, her doctoral adviser for analogy making programs, and John Holland for genetic algorithms.
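Of the measures listed above, entropy is the easiest to make concrete; the following is a generic Shannon-entropy sketch, not an example from the book:

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy, in bits per symbol, of a sequence's
    empirical symbol distribution."""
    counts = Counter(sequence)
    total = len(sequence)
    # max() clamps the -0.0 that floating point produces for a
    # single-symbol (zero-entropy) sequence.
    return max(0.0, -sum((c / total) * log2(c / total)
                         for c in counts.values()))

# Four equally likely symbols carry 2 bits each; a constant
# sequence carries none.
print(shannon_entropy("ABCD" * 25))  # 2.0
print(shannon_entropy("A" * 100))    # 0.0
```

As Mitchell notes, entropy alone is a poor complexity measure: both a random string and a constant string score as "simple" structures by other measures, which is why logical depth and the rest were proposed.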

Network Science: Theory and Practice, Ted G. Lewis, 2009

The author is a pioneer of network science as a modeling activity that combines complex adaptive systems, chaos, and mean-field theory. This text is dense mathematically and includes Java code. There are thirteen chapters with exercises, a bibliography, and index. The history of significant events is outlined from Euler’s Bridges of Konigsberg in 1736 to Gabbay in 2007. Topics include structure, emergence, dynamism, autonomy, bottom-up evolution, topology, power, and stability. Graph theory describes properties, matrix representation, classes, modeling and simulation. Regular networks are constructed by a generative procedure. Network-centric organizations reduce links and path lengths to lower costs and latency. A new metric, link efficiency, compares network types. Entropy initially increases as nodes are added, flattens, then diminishes to zero as structure predominates. Networks have topological phase transitions as rewiring probability increases. Network emergence describes macroscale properties resulting from microscale rules. Hub emergence is not scale-free. Cluster emergence is not small world. Feedback-loop, adaptive or environmental emergence connects the next state to input microrules on goal-oriented networks. A network epidemic, characterized by spectral radius, propagates state or condition via links, as do antigen countermeasures which use superspreaders to decrease time and peak incidences. The classic is the Kermack-McKendrick model from 1927. Networks which follow Kirchhoff’s first law are shown, where commodity flow in and out is equal. Influence networks model social networks where nodes are actors. Network vulnerability is the probability that an attempted attack will succeed. Strategies such as linear are good for the defender, exponential for the attacker. Risk is reduction of vulnerability or consequence.
Resilience is defined for links, where small-world networks have the highest, followed by random then scale-free; for stability; and for flows, where expected flow is availability times actual flow. Percolation adds links, depercolation removes them. Game theory assumes independent success probabilities. The attacker-defender problem is asymmetric. Netgain is a property where nodes compete for a value proposition, such as preferential attachment. Multiproduct emergence shows how shakeouts and monopolies can occur. Other market emergence types include nascent, creative-destructive, or merger and acquisition. Network science can be used to model metabolism. Biology includes protein expression using Boolean networks. Chemistry uses bounded mass kinetic networks. Readers interested in quantum mechanics would seek additional sources.
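The 1927 Kermack-McKendrick model mentioned above is simple enough to sketch numerically; this minimal Euler-step simulation is an illustration in Python rather than the book's Java, and the parameter values are arbitrary:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of the Kermack-McKendrick SIR equations:
    ds/dt = -beta*s*i, di/dt = beta*s*i - gamma*i, dr/dt = gamma*i."""
    new_infections = beta * s * i * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(s0=0.99, i0=0.01, beta=0.3, gamma=0.1, dt=0.1, days=200):
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate()
# beta/gamma = 3 is the basic reproduction number R0; with R0 > 1 the
# epidemic burns through most of the susceptible fraction and dies out.
print(f"final susceptible {s:.3f}, peak infected {peak:.3f}")
```

Lewis's network epidemics replace the uniform-mixing assumption here with propagation over explicit links, which is where the spectral radius enters.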

Complex and Adaptive Dynamical Systems: A Primer, Claudius Gros, 2008

A theme of this book is that scientific common-sense shows that a long-term perspective is essential and that particular quantitative, technical formulations reveal behaviors which apply to many areas. Rather than verbosity, illustrations from network theory are used, such as graph statistics and probability generating functions. There are seven chapters, each having exercises and further readings. Ideas are linked across fields and demonstrated by examples, e.g. oscillators, neural nets or epidemics. The brain is the most complex adaptive system and life is an adaptive network. Complexity theory is a tool for modeling and scenarios, and is used for futurology, e.g. universal prediction tasks. The origin of life involved molecular cooperation in autocatalytic networks. Fitness landscapes are a function of chances of survival for species. Coevolution involves effects across multiple species within time and space scales, resulting in a red queen phenomenon of running in place. Kauffman gene regulation is an example of a random Boolean network. Critical variables dominate the dynamic at point of phase transition, e.g. temperature. This results in punctuated equilibrium and synchronization phenomena exhibited by evolving realworld networks. An order parameter measures degree of asymmetry. There is a small world effect where distance between nodes is a small fraction of the number of nodes. In social networks, diffusion has a role in transport, reported in the 1960s by Milgram. Game theory looks at survival strategies. Adaptive systems alternate between absorbing energy and dissipation.
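The Kauffman random Boolean networks mentioned above lend themselves to a short sketch; the network size, connectivity and seed below are arbitrary illustrative choices, not taken from the book:

```python
import random

def random_boolean_network(n=8, k=2, seed=42):
    """Kauffman-style NK network: each node reads k randomly chosen
    inputs through a random Boolean function (a 2**k lookup table).
    Returns a deterministic update function on whole states."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(state):
        new = []
        for node in range(n):
            idx = 0
            for src in inputs[node]:
                idx = (idx << 1) | state[src]  # pack input bits into an index
            new.append(tables[node][idx])
        return tuple(new)

    return step

step = random_boolean_network()
state = (0, 1, 0, 1, 0, 1, 0, 1)
seen = {}
t = 0
while state not in seen:  # finite state space, so a state must repeat
    seen[state] = t
    state = step(state)
    t += 1
cycle = t - seen[state]
print(f"attractor of length {cycle} reached after {seen[state]} steps")
```

The attractors such trajectories fall into are what Kauffman identified with gene-expression cell types, and varying k moves the dynamics between frozen and chaotic regimes.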

The Intelligent Universe: AI, ET and the Emerging Mind of the Cosmos, James N. Gardner, 2007

This book’s thesis is that the universe is becoming alive according to a Selfish Biocosm hypothesis which is consilient, falsifiable and retrodictable. Emergence is the controller and intelligence the copier. Some of it is related to the Singularity as outlined in the foreword by Kurzweil, or to a “deep DNA” universal genetic code. It looks at many types of artificial life research, and various alternatives to singularity theory such as Virtual Cambrian or Omega point. Humanity becomes the missing link, but its track record of maintaining lesser species is hopefully not repeated up the chain. The future of religion may include cosmotheology, becoming a subject of scientific study, or a “biocosm aborning”. There is a chart of NASA’s mission telescopes to observe black holes, dark matter and the big bang. Rather than “the long hello” of direct communications, SETI would look for computational meaning in seemingly natural noise, prediction markets, artificial exo-society modeling, or artifacts of cosmic macroengineering. An Intelligence Principle results in a post-biological universe.
The style is narrative. There are three parts, nine chapters, an afterword, three appendices, notes, bibliography, and index. It quotes from other scientists’ publications at length. Shaded boxes highlight key concept definitions and explanations, e.g. a notion of quantum evolution.

Blogs:

Barnett on India telecom

Robb on US censorship. Commented: A general risk is revealed. It recalls the notion of ecotechnology in Stewart Brand’s “Whole Earth Discipline”. If they are going to treat the planet like a gas station, then a spill side-effect suppression system would be nice, e.g. a genetic engineering solution capable of changing crude to something friendlier. Of course, this becomes a controlled substance to avoid a Vonnegut Ice-nine scenario where green refinery is used to vanish all the reserves.

Shirky on complex business models

Documents:

Measures of Complexity a non-exhaustive list, Seth Lloyd (PDF)

Videos:
The Secret Life of Chaos (Part 1/6),  Jim Al-Khalili, 2010 on Turing morphogenesis, Belousov nonlinear chemical oscillator, Lorenz chaos theory, Mandelbrot fractal geometry.
Irreducible Complexity 01/04, The Cassiopeia Project, 2010 adds quantum mechanics to organic chemistry and biology, and amino acids from interstellar gas clouds.
Authors@Google: Christos Papadimitriou, 2010 wrote Logicomix and researches algorithms and complexity.
An Evening with Dr. Atul Gawande, 2010 wrote The Checklist Manifesto to handle complexity.
The Most Exhilarating Ode to the Future You’ll See All Day (Batteries Not Included) | Motherboard on Singularity

Posted in Uncategorized | Tagged: , , , | Leave a Comment »