Test Information Space

Journal of Tech, Testing and Trends

Posts Tagged ‘technology’

In Form Fate

Posted by cadsmith on March 28, 2011

Testing news includes uTest Express, user experience, threats versus vulnerabilities (PDF), actors, NASA Mars, and radioactivity. Printing has expanded from 3D prototypes to add mobile antennas, computers, insect bots, and human kidneys. Technology discusses exoskeletons. Trends include a plastic computer processor, Chinese chips, an artificial leaf, biological computers, and nerve cell chips. The beta site was WorkFlowy. There were 71 recent links.

Book reviews:

The Information: A History, A Theory, A Flood, Gleick, 2011

The book covers how information has been treated from the precursors of the Information Age up to and including Google, Twitter and DNA databases. There is no standard definition, but there is the theory of Claude Shannon, to which the title refers, and the overload of too much information (TMI). The only irreversible process, and thus the only one with a cost in physical energy such as heat generation according to Landauer and Bennett, is erasure. The text briefly comments on data mining and machine intelligence, but does not dwell on those directions, nor does it consider metadata, ontology, the semantic web or augmented reality. Some of the author’s previous interests in chaos theory, quantum mechanics, entropy and thermodynamics are summarized. It goes more deeply into the abstractions of concepts such as meaning, language, writing, cryptography, paradoxes, numbers, measurement, logic, communication, transmission, computers, networks and genetics, along with the major contributions of a host of researchers. There are fifteen chapters plus extensive notes and a bibliography. Author video.
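Both ideas the review touches on, Shannon's measure of information and Landauer's erasure cost, can be stated numerically. A minimal sketch (the probabilities and temperature are my own illustrative choices, not the book's):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit per toss; a biased one carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits

# Landauer's bound: erasing one bit dissipates at least kT ln 2 joules.
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300             # room temperature, K
print(k_B * T * math.log(2))  # ~2.87e-21 J per erased bit
```

The second figure is why erasure, and only erasure, carries an unavoidable thermodynamic price in the Landauer-Bennett picture.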

Alone Together: Why We Expect More from Technology and Less from Each Other, Turkle, 2011

The two parts address the author’s concerns about social intimacy and solitude, based on practice and research. People decide how to keep devices busy. Questions about privacy and civil society bear on democracy and sacred spaces. People tend to identify with their machines. They are vulnerable to overanthropomorphizing the capabilities of social robots. The companions offer safe havens. Some of them require nurture and others can double as caregivers for self or valued others. They may be an alternative to digital immersion. Online self-presentation has become a constant, in some cases even in physical proximity, like cyborgs, and in others as a retreat for rehearsal or confession. There are fourteen chapters. Author site.

Storytelling for User Experience, Quesenbery and Brooks, 2010

There are sixteen chapters which appropriately have anecdotal stories. Business narratives are usually told in either reports or presentations. The latter can be oral, written or multimedia. User experience can be structured as prescriptive, hero, familiar or foreign, framed, layered and contextual interludes. Ingredients include perspective, characters, context, imagery, and language. These are intended to engage the audience in some way. They put a human face on research data. They can describe usability tasks for tests and reviews, and design ideas and requirements. Analysis activities identify fragments which are built into stories and personas. These can be found from listening, questioning, instructing, logs and note-taking. Good research ethics are relevant.


Posted in Uncategorized

Info Ops

Posted by cadsmith on January 30, 2011

Data deluge includes new terms. State unites in Sputnik Moment prioritizing tech innovation. All-Seeing Eye test results in contention. Egyptian election revolt disconnects media excepting network surveillance. Cyber police activated in Iran. Facebook handles hacks from Tunisia. Bluetooth is useful to insurgents. Mobile phones are to be tested in space. Android 3.0 SDK announced. Digital docs use steganography. Universal memory seeks to replace flash and DRAM. Alternative energy may take about three more decades. Robot hands become more robust. Cloud bots get smarter. Visual microscopy automated. Autom supports weight reduction. There were twenty recent links.

Book review:

The Next Decade, George Friedman, 2010

This book looks at the near future of US foreign relations in terms of a realignment of the balance of power through actions of the President as Commander-in-Chief. There are two themes: the unintended empire, and whether it can be managed to allow the republic to survive. The US global military supports economic policy. Its President is always engaged in the art of war. This will move beyond recent fear of rising oil prices and Jihadist war and establish surrogates in each region. While democracy, human rights and social progress are still important, strategy becomes more of a concern than ideology. The issues are economic, geopolitical, demographic and technological. There is an aging population, a contracting workforce, and a lack of water. The state, in the form of the DoD, is more powerful than the market for long-term investment. On the American continents, Cuba is likely to be a target of influence, and Latin America will include Brazil, Argentina and Mexico. The latter’s violence and corruption will be resisted at the border, and cartels are expected to be in control there, while US hypocrisy will scapegoat members of its government staff during investigations. Canada is stable. In Asia, Korea, Australia and Singapore balance Chinese splintering and Japanese assertiveness. India surges economically, but is not a threat to China, and is balanced by Pakistan to keep its expenditures on its army and air force. In Europe, Britain’s interests are closer to those of the US, and Poland is important to containment, since Germany dominates economically, backed by France and allied with the Russian military, which seeks to balance the US with radical Islam. Denmark blocks the Baltic Sea exits. NATO is irrelevant. In the Middle East, the US withdraws from Iraq, distances itself from Israel, and reaches detente with Iran. Sunni Turkey eventually rises and is important to Russian containment in the Balkans and Caucasus. There are fourteen chapters.

Posted in Uncategorized

Cell Complex

Posted by cadsmith on October 18, 2010

Usability test sites are listed below. Visualization has Circos tool. Machine learning can automatically read the web and derive laws from data. Biotech do-it-yourself has reached the garage stage. Japan has machines that win at chess, economize both heat and smartgrid, sing like humans, and crawl like snakes. Surveillance has video analytics and privacy concerns. Movies have character makeover. Security has memory hardware ID, Bugat trojan, cyber cold war and first strike alert. Space tourism is less than a couple of years away. Water shortages are prevented by intelligence, startups and solar power. Ebooks have self-publishing, search, blog converter, and writing nook.

Book reviews:

Undercover User Experience Design, Cennydd Bowles and James Box, 2010

This is a comprehensive do-it-yourself description of many facets of the topic. The undercover manifesto values bottom-up change, delivery, timeliness, sociability, and action. There are details about critiques, deliverables, design process and problems, research, usability testing, and UX design. UX adoption begins in web design, and proceeds to check-up, integration, ownership, allies, education, persuasion, trust, stories, skills, and ROI. The content of the book has dynamic highlights; notes appear in side-boxes. It has tips for Agile as well as waterfall design. There are recommendations for using the process with various types of customers including developers, visual designers, content specialists, product owners, marketers, SEO specialists and senior managers. It talks about metrics, A/B testing, and common design review pitfalls. Types of test include rapid iterative testing and evaluation, and remote testing. Tools range from sketching wireframes and storyboards, to apps, to dedicated sites. Research methods include feedback, surveys, and third parties. Author sites http://www.cennydd.co.uk and twitter.com/boxman.
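Since the review mentions metrics and A/B testing, here is a minimal sketch of how two design variants might be compared. The conversion counts are invented for illustration; this is the standard two-proportion z-test, not a method from the book:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate differ from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical numbers: 120/1000 conversions for A vs 150/1000 for B.
z = two_proportion_z(120, 1000, 150, 1000)
print(round(z, 2))  # |z| > 1.96 suggests significance at the 5% level
```

In practice a UX team would reach for a stats library rather than hand-rolling this, but the arithmetic is small enough to show inline.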

Washington Rules: America’s Path to Permanent War, Andrew J. Bacevich, 2010

The title simultaneously refers to the beltway, its namesake, a possible result, and the name of the national security consensus since WWII, which is no longer as effective. The sacred trinity now holds that the US needs global presence, power projection and interventionism. The inability to distinguish institutional well-being from that of the nation has led to the present conditions. This affected historical figures such as Allen Dulles, Curtis LeMay and Maxwell Taylor. The author takes issue with the way things have turned out for the US. The arguments are nonpartisan. The return to counterinsurgency demonstrates an abandonment of victory as an objective. The US could revert to the tradition of military for defense and Just War. Americans would see soldiers stationed in the country as citizen-protectors. This frees up resources to restore the economy.

Previous links (from about fifty-four):


Turn Visitors into Customers with Performable.
Easy User Experience research – whatusersdo.com
Webnographer Home
Remote Usability testing, online customer experience research, usability testing software. Userzoom
User Experience | Website Usability Testing and Evaluation
Usability Testing
Treejack :: Optimise your site structure using tree testing.
Remote & Online Usability Testing Tool | Loop11
IntuitionHQ, make website usability testing part of every website project
CommandShift3 – It’s like Hot or Not for web design
Chalkmark :: First impression testing.

astronomy Planet hunters no longer blinded by the light | International Space Fellowship

automotive CarWoo!

climate Old Weather – Our Weather’s Past, the Climate’s Future

community Get Satisfaction | Customer Community Software – Love your Customers.

economics Coming Soon: World Government and Global Currency – Beyond Money

education Next Gen Learning Challenges

invention Dean of Invention : Planet Green – On TV

maps The Web 2.0 Summit Points of Control Map

social Multitude

social-networks MYCUBE

technology Gartner’s Newest Hype Cycle: Discuss

transportation Everyone’s Private Driver / UberCab

windows Windows Live Mesh 2011

Posted in Uncategorized

Last Resort

Posted by cadsmith on August 10, 2010


Society seems to be in the midst of another big shuffle. Part of it is seasonal, but there are major ideological shifts as previous institutions shed the characters that embodied them. Despite oft-repeated scripts, new contacts made online or in person are superficial and find it difficult to pierce the compartmentalization of previous history or biography quickly enough. Discontinuities in authority may occur. Risk mitigation strategies then need review. A tag or stereotype invites examination.

Recent links (about sixty): grouped using Bookmarks2LiveWriter, which plugs into the Windows Live Writer app and downloads Delicious bookmarks for a date range; these can then be sorted by leading tag. Manually edited to skip the newline for single entries.

agile: Tasktop Agile Planner

ajax: Dr Dobbs – Open Source Community Paves Way for Developers to Improve Internet Access for the Aging, Disabled

animation: the macula


Fluther: Tap the Collective

architecture: International Architecture Database


Who’s in the Blogosphere? / Flowtown (@flowtown)


Defending the Undefendable, Walter Block, 2008
Writing Tools: 50 Essential Strategies for Every Writer, Roy Peter Clark, 2006

business: rule.fm | let your work flow…

drawing: News: deviantART Muro: It’s Time to Draw!

ebooks: calibre – E-book management

education: Bill Gates: Forget university, the web is the future for education – Tech Products & Geek News | Geek.com

hardware: UCLA Professor Warns of Hardware Hackers – International Business Times


IDS Readies Data Centers on Ships « Data Center Knowledge
A sneak preview of enterprise IT in 2020 – Computerworld Blogs

media: Renewed Interest in Financing Original Web Shows – NYTimes.com

mobile: Yes I am Precious


Collaborative Analysis of Competing Hypotheses, available soon under GPL


Young Engineer Uses Webcam, Laser to Build Budget 3-D Scanner | Gadget Lab | Wired.com
Technology Review: Computing at the Speed of Light
Medical Daily: Purple light means go, ultraviolet light means stop

phone: http://apps.facebook.com/vonage-talkfree/

photography: Computer Scientists Build Pedestrian Remover

publishing: B.V. Larson Official Author’s site


Scientists provide a new angle on quantum cryptography
Physicists develop model that pushes limits of quantum theory

reading: The Monkey Cage: Social Highlighting


IEEE Spectrum: Engineers Turn Robot Arm into Formula 1 Simulator
The University of Utah: Mechanical Engineering: Remote controlled, multi-tasking climbing machine
Technology Review: Blogs: TR Editors’ blog: A Strange New Take on Telepresence


The Fuller Memorandum, Charles Stross, 2010
Mech, B. V. Larson, 2010
Containment, Cantrell, 2009
Living Digitally: Fiction by Christian Cantrell

security: Feds admit storing checkpoint body scan images | Privacy Inc. – CNET News

singularity: Singularity Institute for Artificial Intelligence | AI

space: Sharpest Image Yet of Massive Galaxy Collision | Wired Science | Wired.com

storytelling: Storytelling Part 1: Change of Storytelling on Vimeo

technology: More Than Human: Embracing the Promise of Biological Enhancement, Ramez Naam, 2005


6 Ways Eye Tracking Is Changing the Web
Injecting Errors for Fun and Profit – ACM Queue
Dr Dobbs – Time and Testing: The Biggest Developer Headaches

translation: Translation and undo smartquotes in documents – Official Google Docs Blog

ui: Dolphin uses iPad as way to communicate with humans – Boing Boing

use-case: Advanced Use Case Modeling: Software Systems, Armour and Miller, 2000


LoiLo inc
Light Reading – Service Provider IT – Verizon Tailors Video to Criminal Justice – Telecom News Analysis

wave: Official Google Blog: Update on Google Wave

web: Domain Names & Web Hosting : 1&1 Internet Inc.


Poynter Online – Writing Tools
The Soulmen | We Got It! Ulysses 2.0
Welcome – Ommwriter
The Organized Writer
17 Fantastic Apps Made Especially for Writers | tripwire magazine

Posted in Uncategorized

Grain of And

Posted by cadsmith on June 21, 2010


The computational paradigm has several facets including network, social, artificial, biological, and spiritual. How it arises, and where it is leading to, are popular topics. In some futures, the grand elements of identity include better personalization and representation amid the complexity, so the group dynamics that approach or avoid these can be studied.

Recent links (about twenty): AI: IBM Watson, semantics: News patterns, science: Sun musicmetrics, art: Sci-fi urban illustration, history: Turing archive, digital libraries: SpringerLink, telecom: Iridium satellite, dark pulses, search: Google commerce, user interface: GoogleCL, entertainment: Rdio, OnLive games.

Book reviews:

Science & technology in China: a roadmap to 2050 : strategic general report of the Chinese Academy of Sciences, edited by Yongxiang Lu and others, 2010

The Chinese Academy of Sciences reports how the country is modernizing science, technology and social structures for a developed world expected to triple in population and economic size over the next five decades. Revolutions in S&T require changing from imitation to innovation, independence, and institutions. Breakthroughs are expected in information science that will outpace technology. Computational thinking combines man-cyber-physical in a ternary universe. The plan is to absorb global innovations and intellectual resources. The format is like a brochure that defines structure, characteristics, steps and research support. The text is supported by data formatted in tables, charts, and highlighted boxes which detail characteristic indicators. There are five chapters by a committee of five writers and forty reviewers representing over three-hundred members. It is written at the level of principles and categories rather than specifics like the design of a new plane, and compares the rate of modernization of twenty-four countries. The major topics are economics, emerging areas, security, basic science, sustainability, and strategic efforts. This will be followed up by actual research, publications, workshops, peer review and priorities. It refers to relevant past plans, the most recent of which lasted for four years. It adds integration between demonstration and application, e.g. the topic of social computing and how it goes from electronic to ubiquitous. By 2050, China sees itself as an open society, advanced in culture, ethics, politics, materials, and conservation. Eight basic and strategic systems for economic development include energy, materials and manufacturing, networking, agriculture and biology, health, ecological and environmental, space and ocean, and security. Three emerging cross-disciplinary areas are nanotech, space, and complex systems.
Security recognizes open-source intelligence and has two areas: space situational awareness, and social computing and parallel management systems. Four basic science areas are dark matter and energy, controlling the structure of matter, synthetic biology, and photosynthesis. Seven sustainability efforts comprise 4,000-meter “transparent underground” imaging to see ore deposits, renewable energy, deep geothermal, nuclear, marine, stem cells and regenerative medicine, and early diagnosis and intervention of chronic diseases. Six strategic efforts are post-IP networking, green manufacturing, process engineering, exa-scale supercomputing, ubiquitous sensing, and molecular design. The primary milestones are shown for the years 2020, 2030 and 2050.

A Companion to the Philosophy of Technology, edited by Olsen and others, 2009

Technical literacy and individual and social decision-making are among the challenges that philosophy attempts to address. It is useful to get a high-level summary. This book includes many technology-related issues in a single volume. There are ninety-eight chapters by about seventy-nine contributors under the seven themes of history, science, philosophy, environment, politics, ethics, and the future. The authors are Western, i.e. European and American, though there are discussions of Eastern references. The chapters are like the intros to books on each separate topic. The notion of convergence appears in the Future section, written by Bainbridge, who is one of the writers having multiple entries. There would probably be value in further integration, perhaps through discussions among various subsets. This might or might not improve prediction market accuracy, depending upon how participants actually influenced each other. This text compares well to previous philosophy books which were more in-depth and are likely included in the reading lists. It presents questions, terminology and some handy visualizations, and would be a good place to begin.

New Computational Paradigms: Changing Conceptions of What is Computable, edited by Cooper and others, 2008

This book comprises proofs of neo-Turing theories of logic and mathematics in technically advanced publications from the Computability in Europe (CiE) conference in 2005. It advocates the dynamic turn of interactions between observers and systems, and with each other socially. Conversation is computation. Turing’s boss in 1948, Darwin’s grandson, dismissed his paper on “intelligent machines” as merely “a schoolboy essay”, so it was not published for two decades. It turned out to be a manifesto for at least AI, connectionism, and neural computing, and was accompanied by another discussing evolutionary computing. Turing machines, the basis for modern computers, were derived as a model of computation. The computable analysis problem was to decide what was computable and how long to expect it to take. Applications include wireless mobile nets, neural nets, analog computers, topological spaces, graphics and hardware. There are machines that do not fall into these classes, e.g. algebraic calculations done by planar mechanisms such as rigid bars joined by rotatable rivets, or viewing an eternity in finite time using relativity equations. Information processing is emphasized, e.g. regulatory genomes. Biological computing has new operations such as splicing, crossover, point mutations and annealing, which demonstrate parallelism, reversibility, nondeterminism, energy efficiency, self-healing and evolution. Membrane computing structures have local reaction rules for evolving objects in multisets, e.g. DNA software. Computational models can be classified by space and time, discrete and continuous in each case. The authors look at how nature computes, or what it is that the universe computes. Additional directions are pursued, including continuous-time computations, derivatives of continuous functions and infinite-time computation. There are about thirty international contributors beside the three editors. The format combines twenty papers in four parts. New paradigms were expected to follow.
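The Turing machine the review refers to is simple enough to make concrete. A toy simulator (my own minimal encoding, not anything from the book):

```python
def run_tm(rules, tape, state="start", pos=0, max_steps=1000):
    """Simulate a one-tape Turing machine.
    rules maps (state, symbol) -> (new_state, write_symbol, move),
    with move in {-1, +1} and "_" as the blank symbol.
    The machine halts when no rule matches; returns the final tape."""
    tape = dict(enumerate(tape))
    for _ in range(max_steps):
        key = (state, tape.get(pos, "_"))
        if key not in rules:
            break  # no applicable rule: halt
        state, tape[pos], move = rules[key]
        pos += move
    return "".join(tape[i] for i in sorted(tape))

# A machine that flips every bit left to right, then halts at the blank.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_tm(flip, "1011"))  # → "0100"
```

The dictionary-of-rules encoding mirrors the formal transition function; richer machines differ only in how many states and symbols the table carries.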

Apocalyptic AI: Visions of Heaven in Robotics, Artificial Intelligence, and Virtual Reality, Robert M. Geraci, 2010

Apocalypticism consists of dualism, alienation, transcendence, and bodily purification, all of which are present in AI. Apocalyptic AI is a social strategy for research funding, as well as an ideology for online life. It is argued philosophically, legally and theologically. It is about commitment to actions and attitudes. This book approaches the technology and philosophy from the perspective of divinity, and has five chapters, a pair of appendices, notes and references. There are descriptions of the work of many researchers in AI and robotics, e.g. Turing, Minsky, Kurzweil, de Garis, and Warwick. Newell observed that Prometheus denotes tragedy, whereas technology actually leads to magic. Moravec wrote an essay in 1978 about converting nonlife to immortal mind, and in 1988 predicted that humans would eventually be capable of uploading their minds into a fractal-like robot “bush” body. A “mind fire” transforms the cosmos at lightspeed. Nationalism and war are obsolete. We are living in a simulation created by a god. Identity is a pattern and process within the brain and body which is possible in other materials. The Order of Cosmic Engineers believes that its members will become the new creators. This may result in a virtualization of identity, available anywhere. It also has intermediate separate personalities of Transhumanists, e.g. Second Life’s Stenvaag. Games and digital worlds are precursors of digital paradise. These are primarily forms of social contact. Bainbridge’s sociology work for the NSF is discussed. Actor-Network Theory, e.g. Latour’s trials of strength, observes that understanding scientific advance requires both natural and social actors. Transmutation is also a topic of religious history. There are methodologies common to science and religion, though the two are distinct. Religion affects how robots are integrated into society in the US, Europe and Asia. Japanese karakuri may descend from da Vinci’s automata through missionaries.
Relationships between humans and robots are worth study since the two may become indistinguishable. The major funding of robotics in the US is from defense, which may also provide the ethics. The robots may be more objective and humane.

Blogs of interest:

MSDN cashto on unit testing

Barnett reviews Natural Computing

Videos of interest:

Philip Low at TEDMED 2009 on cell phone brain cognition display

Posted in Uncategorized


Posted by cadsmith on June 16, 2010


Nanotech is a computer-aided approach which is transforming many fields and stimulating new industries. Expectations are that it is a next-big-thing equivalent to what personal computers, the internet and web were when they began. The ethics, engineering, effects on medicine and exciting scifi are reviewed below. If all goes well, perhaps this civilization’s legacy will be more than just space junk.

Recent links (about 23): computer: quantum, semantic web: Kingsley Idehen, finance: startups in Boston, minerals in Afghanistan, nanotech: architecture, FDA, futurist: singularity, space: lunar water, NSWP, Kepler exoplanets, presentation: glogster, virtualization: skype, energy: urban, disaster recovery: gulf oil spill, historical chart, photography: photoshop.

Book reviews:

What Is Nanotechnology and Why Does It Matter: From Science to Ethics, by Fritz Allhoff and others, 2010

After a basic introduction, e.g. recalling Drexler’s molecular assemblers from 1987, this book delves into the social concerns about nanotechnology. The authors are a scientist and a pair of philosophers. Scale reduces energy consumption. Tools include the electron microscope, scanning transmission electron microscopy (STEM), scanning probe microscopy (SPM), and the atomic force microscope (AFM). The engineering challenge is to industrialize scientific development in terms of specification, monitoring and mass production. Among the major philosophical topics is risk in terms of conditions, probability and expected impact. Though the state of the field incrementally improves existing products, present laws do not account for the downsides to humans and animals. Better testing processes are necessary. There is a detailed analysis of the objections to stricter laws. Enhancement integrates tools into anatomy, always on, and is expected to revolutionize engineering. Sleep may become more of a bimonthly rather than nightly requirement. Nanomedicine ethics are discussed, e.g. Bawa and Johnson. The developing world may not be seen as profitable. In this book, in the context of distributive justice, nanotechnology is not unique in unfairness of accessibility to cognitive advances, e.g. similar to university costs. Privacy has been demonstrated as an issue, e.g. related to RFID tags. Potential uses to impose biases for individual control by bureaucracy, e.g. patriotism, may themselves be hard to limit. Defense probably develops war robots and the arms race turns to miniaturization.
This does not cover longevity, space or molecular manufacturing, laws or regulations, or economic impacts.

Handbook of Nanoscience, Engineering and Technology, William A. Goddard, 2007

This textbook presents a set of themes describing the current state of nanotech. There are five sections containing twenty-four chapters on potential, concepts, processes, assembly, and functions. About two dozen organizations contributed from the US, Russia and Venezuela. Most are academic, e.g. the universities of Illinois, North Carolina or Northwestern, and there are some US government space and defense researchers. Feynman introduces the subject. Most chapters have multiple authors, some have a single author, and a couple of authors wrote or participated in a pair of chapters, e.g. Karl Hess of the University of Illinois or Sergey Lyshevski of RIT. The contents are technical, including equations and graphs, and there is some Matlab source code. Chapters have intros and conclusions, acknowledgements and many references. There is no glossary, though there is an index, and digital versions would have search. As an example, the final section has eight chapters on functional structures and mechanics. Nanomechanics links science and engineering, e.g. multiscale multiphysics schemes. Figure 20.1 shows the history of the tech from Mayan-age ceramics after 10k BC to synthetic control of macromolecular structure now, and discusses biomimicry through dendrimer assembly. Atomic simulation resolution doubles every 19 months. Strength and fracture properties are outlined. A challenge is to control carbon nanotube growth chirality and diameter for computing-related applications. The optical properties of materials are engineered in photonic crystals. Preparation techniques are being developed for bulk production of nanostructured materials. Modeling and CAD are used in multidisciplinary confluent engineering, e.g. for nanoarchitectronics. In summary, there is a lot of general interest in the convergence of nano, bio, info, quantum and cognitive tech, and this book has supporting examples.

The Handbook of Nanomedicine, Kewal K. Jain, 2008

The title of this book denotes types of tools and approaches rather than a medical specialty. It is derived from biotech and nanotech. The initial applications are expected to be for personalized medicine, e.g. cancer therapies. New tools include 3D nanomaps and the scanning mass spectrometer probe (SMS) used for drug design at cellular level. Nanoparticles can be coated or chemically altered so as to be nontoxic, though they can also be effective for nanoviricide. There are many types of applications, e.g. sunscreens or donor-derived exosomes for organ transplant acceptance. A lab-on-a-chip has chemical experiments for use in battlefield exposure testing. It allows platforms for precise imaging, diagnosis, targeting, drug delivery, destruction, treatment, and therapy. Nanomedicine can also be used in combination with other approaches, e.g. radiotherapy or physical modalities of therapy. There are public misconceptions and fears, so education is warranted, and there will probably be FDA regulation. The detailed table of contents hints at the depth of coverage in the eighteen chapters. There are many new structures and techniques, e.g. devices, machines, chips, robotics, materials, implants, barcodes, needles, tweezers, motors, shells, tubes, fibers, scaffolds, valves, pores, filters, coatings, crystals, emulsions, filaments, lasers, fluidic channels and wire. The nano prefix can be applied to several new fields including biotech, systems biology, bacteria, antibodies, genomics, proteomics, pharma, encapsulation, diagnostics, surgery, therapeutics, dermatology, dentistry, immunology, geriatrics, pulmonology, neurology, and regenerative medicine. The author also lists vendors and academic research centers.

Small Miracles, Edward M. Lerner, 2009

“Speech was so old species,” says one of the emergent characters, who considers humans to be Neanderthals in this transhuman techno adventure. Where Daniel Suarez had a parasitic AI influencing a group of people, and Robert J. Sawyer had one further connected cybernetically through an eye implant, Lerner adds nanobots. This doesn’t go as far as Paolo Bacigalupi in genetically engineering creatures, but it does have a lot of detail about how humans might be medically enhanced. Initially intended to support first aid for government security equipped with new nanosuits, the temptation of hybrid augmented-reality awareness captures human nature. Without further spoilers, it is clear that the author researched the topic. His characters and dialogue are vivid. Italics are used occasionally for thoughts. The backgrounds of at least three of the main characters are fleshed out in separate parts well into the story. Settings are briefly sketched except, for example, to indicate heightened visual acuity in places or where necessary for action such as weather conditions. There are eight sections for about four dozen brief chapters. It is told in the third person omnisciently, including emotions, and an occasional machine perspective. Medical terms and R&D equipment get added detail. The plot may be more convincing since it is near future and there are not a lot of other inventions. The year is 2015 and the pacing opens dramatically with a threat to the main character’s survival. The total duration is about two years. Each chapter is titled by a date, a few have times if a couple are on the same day, and the Reaping has nine times before the epilogue. Success of the author’s series above may hint at a sequel.

Documents of interest:
Communicating Nanotechnology, European Commission, 2010 (16.4MB ZIP PDF)
Human Enhancement Ethics: The State of the Debate, Bostrom and Savulescu, 2008 (PDF)

Blogs of interest:

Nick Bostrom home page

Videos of interest:
David Byrne: How architecture helped music evolve
Reducing Existential Risks [UKH+] (1/3)
John Underkoffler points to the future of UI



Posted by cadsmith on June 12, 2010


Seeming contradictions are grist for the engineering mill. While hints of future issues raise the value of cooperation, vested domains attempt to defend their turf, especially the big-budget items. Drama is usually followed by system reorganization, with shorter durations between phases. There are various models for the dynamics, including networking, and these are being shaken out by new entrants. Artifacts tell the story of their environment and culture.

Recent links (about 15): test: usability, cross-browser, debug: lldb, internet of things: Cape Cod, computation: astronomy, education: computers and data mining, sustainability: contest, population and consumption, art: Sculptris, cognitive: Pinker on mass media.

Book reviews:

How It Ends: From You To the Universe, Chris Impey, 2010

Science doesn’t end in this one; rather it evolves to handle complexity. It is assumed that a general theory of intelligence will be forthcoming. In the meantime, the author seeks to debunk myths, but observes that endings create meaning, and that stories, in addition to facts, are important. Practical limits are respected where known, but measurement of the end depends upon the tool, and adjusting the threshold changes results. The view is systemic. The set of twelve chapters begins from the individual perspective and scales up. Each has an introduction that encapsulates the general idea in a scene or person. Plenty of diagrams and photographs illustrate the instances, terms, relationships or conclusions. Many of the feared impacts result in fractional loss of population and regression of civilization, but not extinction. This book covers a lot of current thought, e.g. transhumanism, and names or quotes the signature personalities. Much of life is shown as part of a web. Bacteria can survive space and entry to the atmosphere. It is likely that there are other forms of life in the universe or the multiverse. An extensive glossary, notes and reading list are appended.
Incidentally, this does not cover Aubrey de Grey’s theory of regenerative medicine and longevity.

Holistic Engineering Education: Beyond Technology, editors Domenico Grasso and Melody Brown Burkins, 2010

Design is a common topic across nineteen papers by thirty authors in three countries (US, China, Peru), spanning thirteen US states (mostly CA, MA and VA). This is due to the attribute of creativity as part of technology. (It may also overlap educational rivalries between scientific evolution and intelligent design.) The projected shortage of engineers has sharpened interest in improving education, from the early years through undergraduate to faculty. There is an established history that can be improved upon as practice and standards become global. An emphasis on interdisciplinary collaboration has several elements, such as changes in segmentation from, for example, electrical/mechanical/industrial, proposed curricula, ethical values, and study abroad. A set of recommended personal values includes analysis, translation and perception. Skills of interest include asking, labeling, modeling, decomposing, gathering, visualizing, and communicating. A case study of the global positioning system highlights systems design, and technical and business leadership. Holistic contexts include system, strategic, implementation and stakeholder. Cultural approaches reinforce unity of effort. Engineering is eligible to become a guild, like the learned professions of medicine, law, and accounting. Most chapters have conclusions and suggested readings. Sustainability issues are often reported in the news.
Experienced engineers probably have many stories about what could be changed in education and practice, and professional societies attempt to be a conduit for this. Some of the skills are innate and show up in play or in the use and innovation of tools and artifacts. Many fields are becoming more sophisticated in the use of instrumentation for measurement, visualization, computation and control. Most of these can be scaled to educational versions that include the newest areas of R&D. If not supplied institutionally, they will probably have some free or affordable public or web versions. Where there are few people to handle the tasks, expert automation would be required.

The Shallows: What the Internet Is Doing to Our Brains, Nicholas Carr, 2010

The question is whether people are losing their minds or society is constructing a new type of one. Tools have sticky cognitive effects on their users, and the internet, while figuratively turning on the light for many, may also tend to make it harder to look as deeply as before. In order to write the book, the author attempted to disconnect and find seclusion for a while. He cites how changes of this magnitude have been perceived in the past, e.g. Socrates’ lament that writing destroyed the capacity for individual memory, or how the typewriter changed authors’ styles since they could not dwell on the feeling of writing in longhand. Information is meted out in lots of brief interlinked pieces. Email has become streams. Ads are pervasive. The ten chapters review the mind, book, maps, clocks, tech, computers, and AI, amid the dimension of networking, and are interleaved with four digressions, the last of which looks at the irony of writing a book on the disappearance of long-term concentration. Rather than point to URLs, the story is told in flashbacks, e.g. how Weizenbaum’s ELIZA could earn the empathic confidence of people even though it was mindless. Notes and further reading are appended. Many of these are valid issues and worth further study. Whether the realtime flow and exponential increase in data to analyze can be paused often or long enough is unknown. More direct types of mind links may not be too far off in the future.
For the attention-challenged, a way to get through this book might be to survey it quickly, then skim a few times to make raw impressions, not word for word, rather similar to becoming familiar with a song or painting, then read it backwards for the verbal reassurance. The reader can increase the pass-throughs to pick up more detail where necessary and as time allows, thereby rendering textual memory as well as consideration and opinion. It may turn out that reading is more of a creative process than previously thought, or that there are better tools for the task, as there are for other kinds of digital composition, e.g. sculpting 3D art. It may then still be possible to frequently parse titles in dedicated slices while otherwise attending to the network. Eventually a learning process may be discovered, akin to development of Gladwell’s outlier mastery status. And, of course, each of the chapters can become a book or digital museum or web service in the interim, so none of the 3R’s may remain sacrosanct for much longer. There may be a video about this floating around somewhere.

Reverse Engineering: An Industrial Perspective, Raja and Fernandes, 2008

This textbook details how reverse engineering is used for copying, design abstraction and reengineering, based on high-resolution digitization and 3D CAD. Results have included reduced inspection time and improved workflow. Quality assurance benefits were standardization, interchangeable parts and reduced manufacturing cost. There are actual examples from the automotive, aerospace, and medical device industries, and tables refer to more. Eleven papers discuss definitions, methodologies, system selection and rapid prototyping. The authors diagram a generic process and show how it is customized in each case. The product development cycle includes test. Taxonomies are given for measuring and positioning systems. Legal concerns arise from fair use and patents, and may be handled by a recommended sui generis system. Organizational considerations are listed including a champion, management support, resource coordination, competition, and user participation, e.g. “tribal knowledge” in an aerospace firm. This does get technical and there are some equations. Terms include computer-aided reverse engineering (CARE), coordinate measuring machine (CMM), nonuniform rational B-splines (NURBS), NC machine, multijet modeling (MJM), and computer-aided inspection (CAI).
Merging data and semantic web approaches is outside the scope of this book.



Posted by cadsmith on June 9, 2010


“Where there’s a bit, there’s a bot” may become an aphorism of digital consciousness, at least until something is picked as the universal standard. There may be some new principles of order, somewhere between anthropic and entropy, introduced by the impending Singularity. Approximations of these turn up in the literature of fiction as well as technology. This may also be an engineering topic for successive approximation by neural nets or intermediate computational agents which can parse relevant external worlds.

Recent links (about 22): semantic-web: RDB2RDF, computer: future of BIOS, flexible OLEDs, internet: trends, documentation: office live docs, location: Flook browser, Bing map SDK, social-networks: Facebook video, networks: IBM Mote Runner SDK, search: ReputationDefender, alerts: SproutRobot, business: IBM ecurrency tokens.


Book reviews:

Philosophy and Engineering: An Emerging Agenda, by Ibo van de Poel and David Goldberg, 2009

This book details a variety of claims which, taken together, do not appear consistent, but may provide methods for further study. Engineering applies science and produces technology. The effects on society can be ethically evaluated through cooperation with philosophers. The recognized set of philosophical problems is still being determined, including epistemological, methodological, metaphysical and ontological ones. It may also address other existing philosophical problems. The book has contributions by thirty-two authors in three parts for twenty-eight papers.
Engineering as a discipline is historically distinct from architecture. A well-defined philosophy does not exist, though efforts date from the start of the 21st century, having arisen independently in the East and West, and following, yet distinct from, the philosophies of science and technology. A pluralistic approach can be linguistic, phenomenological, post-modern, analytic, pragmatic, and Thomist. It is within the field of philosophy of technology. Science and engineering are often treated as simplified notions based on the politics of funding rather than examination of what people actually do in particular. Generalization differs in engineering from the natural sciences, including artifact type, function and structure, which combine causes and concepts. The models used to represent reality are idealized, tested, and compared to each other. Sociotechnical system boundaries include the behaviors and relations of the elements impacted by a system. Integrity is uniquely complex for engineers, the profession and its education. The engineering priority of technical ingenuity over helping people needs to be rebalanced to avoid becoming lost in the labyrinth of technology. Engineering ethics needs a global foundation based on principles of public safety, human rights, environmental and animal preservation, engineering competence, scientifically founded judgment, openness and honesty. Research in engineering ethics has spread to Asia and Europe from North American origins. The scale extends through individual, group, company, profession and planet. Imagination of the engineering world is a way to deal with conditions of epistemic opacity. Responsibility for artifacts eventually transfers from engineer to user through knowledge of their workings.
Ethics concerns the amount of harm from artifacts produced by solutions to engineering problems. Ethicists have observed an actual design project where participants were characterized as actors in a network, and intermediate results were presented which affected the outcome of the project. This is helpful in mapping risks, responsibilities and ethical issues. Future comparisons may be made between engineering and medical science. Role-playing games can be used to teach ethics if they are felt and articulated, have a lengthy process, use case studies, and are realistically up to date. The Norms Evolving in Response to Dilemmas (NERD) platform was used for experimentation in the ethics of technology as a form of stress testing. There is a crisis of a creative era which is resulting in philosophical interest similar to what Kuhn showed had occurred in science, and which leads to dialectics, data mining, and reliance upon either brute or social facts or institutional artifacts; it may be short-lived. Wittgenstein had engineering training and his philosophy was based on the real world of things rather than ideology. Design methodologies include top-down, layered, platform-based or network-based, and are related to human organizational structures and national cultural emphases. Computer science builds abstractions from bits, engineering configures solutions, and stigmergic design in nature is bottom up. The settings of engineering are the ad hoc real world or the systematic hyperreal world. Technology is ubiquitous; engineering is either denial or determinacy; where survival of the human species is the goal, all is heuristic; a quantitative measure of ethics is defined.
Issues concerning posthumanist theories would require other sources.

Science Fiction and Philosophy: From Time to Superintelligence, edited by Susan Schneider, 2009

This book is an advanced treatment of philosophy of mind, cognitive science, and scifi, which can be considered a narrative of thought experiments about puzzling scenarios. The editor is especially interested in neural enhancements, AI and the problems of disparity between Humans 2.0, whose flaws can only be judged by their own, and those who have not been upgraded. The book presents a variety of views and methods rather than a concluding thesis. Authors range from classic philosophers, such as Plato and Descartes, through scifi writers such as Asimov and Bradbury. Others of each type are discussed in the contents. There are a couple of entries each from the modern literature of Dennett, Kurzweil and Andy Clark. There are five parts for twenty-seven papers, some of which have additional references. Each part lists related works of scifi, mostly from movies. There are diagrams for some of the mathematical and scientific concepts. Rather than commenting on each entry, there is a lengthy introduction by the editor about the themes and philosophical questions including reality as simulation, free will, mind and ethics and politics, and spacetime. A few recommendations that provide more depth in technology and risks are listed. Superintelligence is expected to arise due to the computational theory of mind, and identity based on information patternism. The philosophies of the reader’s favorite authors may yield to the kinds of approaches here, but there would probably be interest in more of such comparative volumes, also for the newest engineering fields, at least until a cyborg editor can do this in realtime for anyone as hinted by the iRobot-style cover picture.

The Philosophy of Science Fiction Film, edited by Steven M. Sanders, 2008

The editor lists three types of analysis: context, film, and topics. Classic films were selected for philosophical treatment, e.g. the Matrix is likened to Plato’s Cave. Other popular philosophers are Descartes, Heidegger, Hobbes, Hume and Nietzsche. There are three parts having four papers each, by a total of thirteen contributors. Films often quote influential predecessors and seek either general or improved solutions, e.g. Metropolis’ machine woman is like Wizard of Oz’ tin woodsman later echoed in C3PO. Settings are often case studies for logic problems that may introduce new assumptions, e.g. previously hidden forces or actors. Paradoxes are highlighted and heuristics proposed. The look and feel may have unique aesthetic texture, e.g. tech noir. Ethical questions often form themes and may be treated mythically, displaced by alien culture or time travel, for a different perspective that changes the intellectual and political constraints, e.g. involving power, laws, sex or war. Metaphysical questions around death are pursued, e.g. resurrection. The future may be seen as utopian or dystopian, or time may be flexible so that future or past can be changed. Reviewers are sometimes aware of their own cognitive processes so that interpretation is an art.

Minds and Computers: An Introduction to the Philosophy of Artificial Intelligence, Matt Carter, 2007

This book is a basic introduction to AI as a philosophical theory of mind. It covers cognitive science topics on the human mind, computation, reasoning, language and philosophical considerations. For example, humans recognize repetitive sensory patterns and dedicate response structures to them; embodied experience is a basis for semantics. Each chapter indicates theory and objections. The style is mildly technical and philosophical. History of the field is broadly sketched and problems are not really delved into, e.g. consciousness, identity and emotions are briefly summarized in a chapter at the end. It does get into some detail about functional neuroanatomy and neural networks. There are twenty chapters, occasional exercises, some of which are labeled “challenge”, further readings, glossary, and index.
Further advanced conclusions are out of the scope of this text, e.g. by Minsky, Kurzweil, Hawkins or Wolfram on computation, or Noë on consciousness. It does not discuss biological reuse for robotics, e.g. as has been demonstrated using animal brains, or cloning for this purpose. Trends such as functional brain emulation models from scopes and visualization, quantum mechanics and computation, or synthetic life would need additional sources.



Documents of interest:

An Experimental Philosophy Manifesto, Joshua Knobe & Shaun Nichols, 2008 (PDF)


Blogs of interest:

The New Atlantis – A Journal of Technology & Society

Thrilling Tales of the Downright Unusual: Illustrated Interactive Fiction from Retropolis and Beyond


Videos of interest:

Authors@Google: Paolo Bacigalupi


Deep Destiny

Posted by cadsmith on June 4, 2010


Responses to complexity include modeling and innovative technology. Networks provide computational and social leverage. Tools are adaptive to realtime, combinatorial, fractal, and quantum considerations. There are various opinions about where all this may be leading, with respect to order or limits for example, and what degrees of freedom can be exercised.

Recent Links: (of about 23): 3D: car wrap; semantic web: RDFa checker; robotics: transportation; mobile: local ads; security: civilian net lockdown; surveillance: text stream, electrical network frequency analysis; tracking: eye movement, sleep monitor; quantum: simulation; complexity: matrix decomposition; business: internet of things, travel guide; finance: investing dashboard.

Book Reviews:

Complexity: A Guided Tour, Melanie Mitchell, 2009

The author covers the field in a readable narrative, rather than mathematical, fashion. There is no common measure of complexity since theory and science are still undefined. Research involves interdisciplinary collaboration. It is compared to cybernetics, which had more extent than content, though this field is more mainstream. There are both adaptive and nonadaptive complex systems. A proposed definition is “a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution”, or briefly “a system that exhibits nontrivial emergent and self-organizing behaviors”. Measures include entropy, algorithmic information content, logical depth, thermodynamic depth, computational capacity, statistical or effective measure complexity, fractal dimension, degree of hierarchy and near-decomposability. Some new areas of research are listed, e.g. self-organized criticality and computational mechanics. These fall into two groups, either more specific applications or higher-level mathematical theories. Historically, emergence arose as a reply to reductionism. Computers mimic evolution and nature mimics computation. Network thinking is more concerned about relationships than entities. The web is a scale-free network. This also occurs in physiology since cells do not scale with body size; space is filled using a fourth dimension of fractal circulatory networks. Ecology extends the food chain to a food web. The book has five parts for nineteen chapters, a bibliography of several hundred authors, and extensive notes and index. It is dedicated to Douglas Hofstadter, her doctoral adviser for analogy-making programs, and John Holland for genetic algorithms.
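As a toy illustration of the first measure in that list, Shannon entropy of a symbol sequence can be computed in a few lines. This is a minimal Python sketch; the example string is arbitrary.

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence):
    """Shannon entropy in bits per symbol: H = -sum over symbols of p * log2(p)."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A maximally mixed binary string carries one bit per symbol;
# a constant string carries none.
print(shannon_entropy("abababab"))  # 1.0
```

Note that entropy only sees symbol frequencies, which is why the book needs the other measures (logical depth, effective complexity) to separate "random" from "complex".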

Network Science: Theory and Practice, Ted G. Lewis, 2009

The author is a pioneer of network science as a modeling activity that combines complex adaptive systems, chaos, and mean-field theory. This text is dense mathematically and includes Java code. There are thirteen chapters with exercises, a bibliography, and index. The history of significant events is outlined from Euler’s bridges of Königsberg in 1736 to Gabbay in 2007. Topics include structure, emergence, dynamism, autonomy, bottom-up evolution, topology, power, and stability. Graph theory describes properties, matrix representation, classes, modeling and simulation. Regular networks are constructed by a generative procedure. Network-centric organizations reduce links and path lengths to lower costs and latency. A new metric, link efficiency, compares network types. Entropy initially increases as nodes are added, flattens, then diminishes to zero as structure predominates. Networks have topological phase transitions as rewiring probability increases. Network emergence describes macroscale properties resulting from microscale rules. Hub emergence is not scale-free. Cluster emergence is not small world. Feedback-loop, adaptive or environmental emergence connects the next state to input microrules on goal-oriented networks. A network epidemic, characterized by spectral radius, propagates state or condition via links, as do antigen countermeasures, which use superspreaders to decrease time and peak incidence. The classic is the Kermack-McKendrick model from 1927. Networks which follow Kirchhoff’s first law are shown where commodity flow in and out is equal. Influence networks are good models of social networks where nodes are actors. Network vulnerability is the probability that an attempted attack will succeed. Strategies such as linear are good for the defender and exponential for the attacker. Risk is reduction of vulnerability or consequence.
Resilience is defined for links, where small-world has the highest, followed by random, then scale-free, as well as for stability, and for flows, where expected flow is availability times actual flow. Percolation adds links; depercolation removes them. Game theory assumes independent success probabilities. The attacker-defender problem is asymmetric. Netgain is a property where nodes compete for a value proposition such as preferential attachment. Multiproduct emergence shows how shakeouts and monopolies can occur. Other market emergence types include nascent, creative-destructive, or merger and acquisition. Network science can be used to model metabolism. Biology includes protein expression using Boolean networks. Chemistry uses bounded mass kinetic networks. Readers interested in quantum mechanics would need additional sources.
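The classic Kermack-McKendrick epidemic mentioned above can be sketched in a few lines of Python. The Euler integration, step size, and rate constants here are illustrative assumptions, not code from the book.

```python
def sir_step(s, i, r, beta, gamma, dt=0.01):
    """One Euler step of the 1927 Kermack-McKendrick SIR equations."""
    ds = -beta * s * i           # susceptibles infected at rate beta
    di = beta * s * i - gamma * i  # infected grow, then recover at rate gamma
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

# Fractions of a population, 1% infected at the start; beta and gamma are made up.
s, i, r = 0.99, 0.01, 0.0
for _ in range(2000):
    s, i, r = sir_step(s, i, r, beta=0.5, gamma=0.1)
# The susceptible fraction falls as the epidemic propagates; s + i + r stays 1.
```

On a network rather than a well-mixed population, the threshold behavior depends on the spectral radius of the adjacency matrix, which is the book's angle.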

Complex and Adaptive Dynamical Systems: A Primer, Claudius Gros, 2008

A theme of this book is that scientific common sense shows that a long-term perspective is essential and that particular quantitative, technical formulations reveal behaviors which apply to many areas. Rather than verbosity, illustrations from network theory are used, such as graph statistics and probability generating functions. There are seven chapters, each having exercises and further readings. Ideas are linked across fields and demonstrated by examples, e.g. oscillators, neural nets or epidemics. The brain is the most complex adaptive system and life is an adaptive network. Complexity theory is a tool for modeling and scenarios, and is used for futurology, e.g. universal prediction tasks. The origin of life involved molecular cooperation in autocatalytic networks. Fitness landscapes are a function of a species’ chances of survival. Coevolution involves effects across multiple species within time and space scales, resulting in a Red Queen phenomenon of running in place. Kauffman gene regulation is an example of a random Boolean network. Critical variables dominate the dynamic at the point of phase transition, e.g. temperature. This results in punctuated equilibrium and synchronization phenomena exhibited by evolving real-world networks. An order parameter measures degree of asymmetry. There is a small-world effect where the distance between nodes is a small fraction of the number of nodes. In social networks, diffusion has a role in transport, reported in the 1960s by Milgram. Game theory looks at survival strategies. Adaptive systems alternate between absorbing energy and dissipation.
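A random Boolean network of the Kauffman sort mentioned above is easy to sketch. This minimal Python version (node count, wiring and seed are arbitrary choices, not from the text) updates all nodes synchronously until a state repeats, which locates an attractor cycle.

```python
import random

def random_boolean_network(n, k, seed=0):
    """Kauffman-style NK network: each node reads k random inputs via a random Boolean rule."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    rules = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]  # truth tables
    return inputs, rules

def step(state, inputs, rules):
    """Synchronous update: each node looks up its rule with its inputs' current bits."""
    return tuple(rule[sum(state[j] << b for b, j in enumerate(ins))]
                 for ins, rule in zip(inputs, rules))

inputs, rules = random_boolean_network(n=8, k=2)
state, seen, t = (0, 1, 0, 1, 0, 1, 0, 1), {}, 0
while state not in seen:          # deterministic finite dynamics must revisit a state
    seen[state] = t
    state = step(state, inputs, rules)
    t += 1
cycle_length = t - seen[state]    # length of the attractor cycle reached
```

With only 2**8 possible states the loop terminates quickly; Kauffman's interest was in how attractor count and length scale with n and k.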

The Intelligent Universe: AI, ET and the Emerging Mind of the Cosmos, James N. Gardner, 2007

This book’s thesis is that the universe is becoming alive according to a Selfish Biocosm hypothesis which is consilient, falsifiable and retrodictable. Emergence is the controller and intelligence the copier. Some of it is related to the Singularity as outlined in the foreword by Kurzweil, or to a “deep DNA” universal genetic code. It looks at many types of artificial life research, and various alternatives to singularity theory such as Virtual Cambrian or Omega point. Humanity becomes the missing link, but its track record of maintaining lesser species is hopefully not repeated up the chain. The future of religion may include cosmotheology, becoming a subject of scientific study, or a “biocosm aborning”. There is a chart of NASA’s mission telescopes to observe black holes, dark matter and the big bang. Rather than “the long hello” of direct communications, SETI would look for computational meaning of seemingly natural noise, prediction markets, artificial exo-society modeling, or artifacts of cosmic macroengineering. An Intelligence Principle results in a post-biological universe.
The style is narrative. There are three parts, nine chapters, an afterword, three appendices, notes, bibliography, and index. It quotes from other scientists’ publications at length. Shaded boxes highlight key concept definitions and explanations, e.g. a notion of quantum evolution.


Blogs of interest:

Barnett on India telecom

Robb on US censorship. Commented: a general risk is revealed. One may be reminded of the notion of ecotechnology in Stewart Brand’s “Whole Earth Discipline”. If they are going to treat the planet like a gas station, then a spill side-effect suppression system would be nice, e.g. a genetic engineering solution capable of changing crude to something friendlier. Of course, this becomes a controlled substance to avoid a Vonnegut Ice-nine scenario where green refinery is used to vanish all the reserves.

Shirky on complex business models


Documents of interest:

Measures of Complexity: a non-exhaustive list, Seth Lloyd (PDF)

Videos of interest:

The Secret Life of Chaos (Part 1/6), Jim Al-Khalili, 2010, on Turing morphogenesis, the Belousov nonlinear chemical oscillator, Lorenz chaos theory, and Mandelbrot fractal geometry.
Irreducible Complexity 01/04, The Cassiopeia Project, 2010, adds quantum mechanics to organic chemistry and biology, and amino acids from interstellar gas clouds.
Authors@Google: Christos Papadimitriou, 2010, who wrote Logicomix and researches algorithms and complexity.
An Evening with Dr. Atul Gawande, 2010, who wrote The Checklist Manifesto to handle complexity.
The Most Exhilarating Ode to the Future You’ll See All Day (Batteries Not Included) | Motherboard on Singularity


Who Goes There

Posted by cadsmith on May 29, 2010


When comparing models, measurement and ranking may involve complexity. There are various types, e.g. software, computational and process. Conceptually, a user may consider the number of models necessary to cover a phenomenon, the length of time required to explain the system, or whether the latter can be captured in an intuitive diagram. During design or build, factors might be dimensionality, manufacturing difficulty, amount of software, fixing difficulty, price, sensitivity to environmental conditions or time to configure. Usage adds amount of calculation, time for calculation or halting, difficulty of operation, and stability or failure rates. Scaling adds the number of components and connections and the types thereof. Automation may have tradeoffs between factors. There can be a composite derived from multiple methods. See computational complexity for discussion of metrics.
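The composite mentioned at the end can be read as a weighted average over normalized per-factor scores. A hedged Python sketch, in which every factor name, score and weight is a hypothetical placeholder:

```python
def composite_complexity(scores, weights):
    """Weighted average of per-factor complexity scores normalized to [0, 1]."""
    total = sum(weights[name] for name in scores)
    return sum(weights[name] * value for name, value in scores.items()) / total

# Hypothetical design: each factor scored 0..1, weighted by how much it matters.
scores = {"components": 0.7, "connections": 0.9, "software_volume": 0.4}
weights = {"components": 2.0, "connections": 3.0, "software_volume": 1.0}
print(round(composite_complexity(scores, weights), 3))  # 0.75
```

The normalization step is where the real difficulty hides: putting price, failure rate and dimensionality on one 0-to-1 scale is itself a modeling decision.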

Recent Links (of about 27): visualization: latency heatmaps, semantic web: Saplo, crowdsource: Fluidinfo, local events, robotics: mind over machine, security: CERT fuzzing, marketplace: Facebook AppBistro, art.sy, disaster recovery: oil reporter, finance: bonds, technology: NanoProfessor, STS OCW, future cities, synthetic cell, space: duality of gravity, WISE star formation, tilted orbit, photography: Stock Photos, music: UJAM.

Book Reviews:

The Power of the Semantic Web to Transform Your Business, David Siegel, 2009

The author has addressed this book to an audience of business customers to acquaint them with who’s doing what in semantic web applications. The transition is expected to complete in a decade. Trends in digitization, availability, managing metadata, synchronization, syndication, and scalability lead to online data lockers which replace silos. Users specify a want which filters semantic queries. The data, products and services are then available anywhere, e.g. via cellphone screen login and browser. This leads to a conversion from current advertising. Realtime pricing is based on metadata. Fair tax legislation is recommended to eliminate income tax and simplify sales tax structures within transactions. Autonomy is expected to automate processes and increase efficiency. Collaborative design spaces replace data repositories, e.g. building information management (BIM) for 3D models, or the science commons for protein designs.
The style is optimistic, visionary and direct, with short paragraphs and lots of bold emphasis. There are three parts for seventeen chapters. A book ontology concept map shows how the pieces fit together. It has plenty of URLs to demonstrate existing businesses. Evri has since acquired Twine. It may be so wide-ranging that it could use a filter to find what fits reader requirements.
Other issues would require additional sources, e.g. dominant vendors or monopolies compared to opensource, closed data (Newscorp blocking Google), closed ecosystem (Apple), sustainability, security, curated data (Wolframalpha), or privacy (Facebook).
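The want-driven filtering of semantic queries described above can be caricatured as pattern matching over subject-predicate-object triples. This Python sketch uses invented camera data and is not code from the book.

```python
# A toy in-memory triple store; the data and the "want" are made up for illustration.
triples = [
    ("camera_a", "price", 300),
    ("camera_a", "zoom", 10),
    ("camera_b", "price", 450),
    ("camera_b", "zoom", 30),
]

def match(triples, predicate, test):
    """Return the subjects whose object for `predicate` satisfies the test."""
    return {s for s, p, o in triples if p == predicate and test(o)}

# The "want": anything priced under 400. The query filters data, not web pages.
want = match(triples, "price", lambda price: price < 400)
print(want)  # {'camera_a'}
```

Real semantic web stacks express the same idea as SPARQL patterns over RDF graphs; the point is that the customer states the constraint once and the data comes to them.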

The Philosophy of Science and Technology Studies, Steve Fuller, 2006

Science’s place in society is not as well understood by the public as other subjects. The author is a social constructivist; humanity is a goal. STS applies theories of the humanities and social sciences to science and technology. This book covers the European and American history. The field is descended from positivism and uses field methods of research. Actor-network theory is the main one, e.g. Isabelle Stengers or Bruno Latour. This is a cross-disciplinary academic field sometimes likened to a scaffold for specialties. Technology studies is a subfield added in the mid-80s. The public is taught citizen science and ways to consider technical issues of consequence. Technoscience is an undifferentiated combination of science and technology. Scientific authorities are respected for extreme rationality and have their own interpretations of STS and its research. The Science Wars of the 90s pitted realists and critics of scientific theory against each other. Science has social organization, politics, and control over public relations. Quality is affected by whom researchers are accountable to. Institutional science followed a strategy of colonization. Thought collectives were recognized in the 1930s. Kuhn described paradigms in the 70s; the philosophy of rational or irrational became normal or revolutionary phases of science. There is a Heraclitean dualism between constructivism and relativism. Ideological framework presuppositions can be disintegrated and reconsidered in a newer context.
As for style, there are thirty-three chapters in six parts. Technology follows science models here; utilitarianism is not distinguished. The term artifact is used for an effect rather than in a design sense; design itself is applied to theoretical policy and institutions rather than engineering. The internet and web are sources, not subjects, though there is a recommendation for an online futures market for science-based proposals. The computation trend is not discussed. Where the frameworks are tabular in two dimensions, Eastern formulations might add spirals or helices, e.g. to show processes, scale of abstraction or detail, and relations, e.g. counterclockwise between actor-network theory in constructivism on the left and theory in relativism on the right. Exception cases for each category could be expanded upon. Scifi is not considered a factor, currently subsumed in techno-adventure. There is an extensive bibliography.

The Best Science Fiction and Fantasy of the Year, volume four, ed. Jonathan Strahan, 2010

Scifi anthologies are still published because editors have favorites; editors still exist because publishers do marketing. In this book, an intro describes the status of the industry. There are twenty-nine stories, each with its own intro. A section of recommended readings closes the volume. Kij Johnson has two entries. Some of the authors appear in other anthologies, such as Dozois 2009, including Stephen Baxter, Michael Swanwick, Sarah Monette and Elizabeth Bear. In terms of perspective, the ratio is about 2:1 third person to first. Cover art is by John Berkey.
Specialized e-readers have changed things a bit, though they may be peaking as tablets become popular. There are other references: Wikipedia has author info, and authors often have their own websites. There is not yet a last.fm equivalent for scifi stories or their soundtracks. Some sites have all-time rankings, and there are lots of separate blog reviews. There might be demand for a personalized recommendation system as an improvement over Amazon. Readers can stream actual science reporting and real summaries. Writers can complement metadata with imaginary data, which may also find its way into art or other media, e.g. on youtube or spacecollective. DIY symphonic soundtracks may be around the corner, e.g. ujam. Study or creation of space art and matte painting are ways to develop sensitivity for these types of images, e.g. deviantart. It is hard to do something outlandish enough, yet true to the collective unconscious, that a real discovery won’t soon be reported in the news.

The Year’s Best Science Fiction, 26th Annual, ed. Gardner Dozois, 2009

Scifi is being used more to get ideas across, scientific or futurist, where nonfiction may not reach the audience. Short stories are less of a stretch for twitter-types than novels. These readers are accustomed to extreme conditions or moods. Education and energy are an invention away. The editor of this book was originally from Massachusetts, and the genre was a way to escape the local confines. This text includes a general summation of publishing for the previous year. There are thirty short stories, each with an intro, followed by honorable mentions. Paul McAuley and Ian McDonald each have a couple of entries. Perspective is about even, first person having thirteen stories and third seventeen. Cover art is by Chad Beatty.
Scifi is the fantasy side of technology, where scenarios can be explored that sometimes turn into real movies or games. Of course, we haven’t yet seen a Blade Runner, but animations are very realistic, and originality may incrementally develop from remixing and adapting existing content, as in the machinima sphere, simplified for human enjoyment. Faulty industrial designs that almost made it to production may be the equivalent of cliffhangers or detective yarns in worlds of ruthless environments, corporations and governments. Globalized sources may reflect the local engineering mythos.

Blogs of interest:

Haque on institutional innovation


Marcus du Sautoy on the mathematics of symmetry

Lee Rainie on internet usage changes

Documents of interest:

Measuring Test Execution Complexity (PDF)
Measuring model complexity with the prior predictive, Vanpaemel (PDF)

Seerability scifi by yours truly

Posted in Uncategorized | Leave a Comment »