Pierre Schwob
  • Home
  • About
  • Solutions
  • Blog
  • Science & Culture Advocacy
    • Cosmic Reflection
  • Entrepreneurship
    • Entrepreneurship
    • Classical Archives
    • Program Research & Software
  • Patents
  • Contact

Reducing Two Stressors

5/2/2019

Can we mitigate two stressors (depressed economic circumstances and climate change) by leveraging solutions in one towards the other?

  • Many U.S. communities are economically distressed, generating terrible despair, addiction and suicide epidemics
  • The planet is stressed by climate change and fast action to transition to 100% green energy is crucial.
 
Let's consider the following assumptions:
  1. Climate change is a clear and present danger to us today, as well as to future generations
  2. Climate change, if not mitigated with great urgency, may become irreversible: the window for action is closing
  3. Energy generation conversion from fossil fuels to renewables is essential to combat climate change
  4. The real costs of oil, coal, and natural gas have not changed much in more than a century. Renewable energy sources such as solar photovoltaics (PV) and wind have experienced rapid, persistent cost declines. PV module costs have dropped on average 10%/year since 1990, while deployment has increased by 26%/year. [Science 12 April 2019, page 133]. Advances in PV (perovskites, tandem) continue to increase energy collection efficiency
  5. The federal government and many states offer tax incentives to deploy green technologies
  6. More than 50 million Americans live in economically distressed communities (one fifth of the ZIP codes reviewed in the EIG study). In these communities, 25% of adults lack a high school diploma and 55% are not working. (See citylab.com and eig.com)
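
Assumption 4's cost trend compounds quickly; a minimal sketch of how a constant percentage decline plays out (the starting cost is an illustrative placeholder, not a quoted market price):

```python
def pv_module_cost(initial_cost, annual_decline, years):
    """Project a module cost under a constant annual fractional decline."""
    return initial_cost * (1 - annual_decline) ** years

# At the cited ~10%/year decline, costs fall by roughly two-thirds per decade.
cost_now = 1.00  # illustrative $/W placeholder
cost_in_10y = pv_module_cost(cost_now, 0.10, 10)
print(round(cost_in_10y, 3))  # 0.9**10 -> 0.349
```

Sustained exponential declines like this, rather than any single breakthrough, are what make the economics of assumption 4 compelling.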
Here are some of the Energy Watch Group's key findings:

  1. Transition to 100% renewable energy requires comprehensive electrification in all energy sectors. 69% solar; 18% wind; 6% bio; 3% hydro; 2% geothermal
  2. By 2050, 96% of decentralized local and regional generation will be from solar and wind
  3. Transition will reduce greenhouse gas emissions from 30 GtCO2-eq in 2015 to zero by 2050
 
The estimated global spending needed by 2050 on renewable sources and electrification is $110 trillion (2% of global GDP during that period), of which $95 trillion is already committed. A 100% renewable energy sector will employ 35 million people worldwide, up from 9.8 million today. [Source: IRENA]
 
 
The questions we need to address:

  1. Can we select economically-effective solar energy collection/storage/distribution technologies?
  2. Can we identify opportunities and develop profitable models to employ some people in a depressed community to build/run/maintain a pilot PV installation(s)?
  3. Can we model the dissemination of that effort by members of the initial pilot, dispatched to teach/evangelize and form new teams in other, like communities?
  4. Can we identify initial funding sources?
  5. Can we neutralize resistance and develop/leverage political mobilization and support?
  6. Can we scale this to influence the crossing of a critical threshold in a post-carbon transition—viewed as a non-linear complex adaptive system? Can we create a new basin of attraction?
 
A latent majority (61% in the U.S. [ABC News/Stanford 2018]) supports climate change mitigation, but the size of the committed minority (e.g. the Greta Thunberg school strikes, Green New Deal proponents) needs to cross a critical threshold to have a marked effect on policy makers. Are socioeconomically depressed areas the right targets to reach a tipping point? Do we have access to economically feasible technologies? How do we amplify success and develop a bandwagon effect with positive feedback?
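
The tipping-point framing can be made concrete with a toy Granovetter-style threshold model (purely illustrative: thresholds are random, the population size and seed are arbitrary, and this is not a calibrated model of public opinion):

```python
import random

def final_adoption(n, committed_frac, seed=1):
    """Toy Granovetter-style cascade: each agent adopts once the adopting
    fraction of the population exceeds its personal threshold. A committed
    minority (threshold forced to zero) seeds the cascade."""
    rng = random.Random(seed)
    thresholds = [rng.random() for _ in range(n)]
    for i in range(int(n * committed_frac)):
        thresholds[i] = 0.0  # the committed minority adopts unconditionally
    adopters = sum(1 for t in thresholds if t == 0.0)
    while True:
        frac = adopters / n
        new = sum(1 for t in thresholds if t <= frac)
        if new == adopters:  # fixed point reached
            return frac
        adopters = new

# A larger committed minority can flip the population toward full adoption.
small = final_adoption(1000, 0.05)
large = final_adoption(1000, 0.30)
```

The outcome is seed-dependent; the point is only that final adoption is a non-linear function of the committed fraction, echoing the basin-of-attraction language above.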
 
Answers to these questions require an integrated, multidisciplinary approach: engineering for technology selection and complex-systems analysis, design expertise for packaging, logos, and instruction manuals, business/finance for financial modeling, social/political sciences for community-approach programming, and education specialists for education modeling.

A fascinating and worthwhile project.

Neural Networks Meet Light

9/4/2017


 
Image credit: Yashar Hezaveh/Laurence Perreault Levasseur/Phil Marshall/Stanford/SLAC National Accelerator Laboratory; NASA/ESA
Two recent articles, “Neural Networks Meet Space” from Symmetry Magazine (published jointly by SLAC and Fermilab) and “A Deep Neural Network Of Light” from Physics Today (American Institute of Physics), read one after the other, provide a perspective on current machine-learning developments. Together they point to major advances in how we will be able to mine very large data sets: several orders of magnitude faster than with traditional methods, and potentially an additional two orders of magnitude faster than with conventional electronics.
 
Neural Networks Meet Space relates the extraordinary research done by Yashar Hezaveh, Laurence Perreault Levasseur and Phil Marshall at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) at Stanford/SLAC, recently published in Nature, in which strong gravitational lenses are analyzed using a convolutional neural network. Gravitational lenses are complex distortions of spacetime—predicted by Einstein—produced by the gravity of foreground massive galaxies or galaxy clusters that affect the path of the light reaching us from background galaxies. These distortions allow astrophysicists to quantify, and develop a history of, the dark matter that makes up 85% of matter in the universe and the dark energy driving the acceleration of its expansion.

Traditionally, such analyses were done by comparing computationally intensive simulated lensing models with actual images, which could take weeks to months. Using a neural network, however, allows the same analysis to be done in seconds, once the network has been “trained” for about one day by presenting roughly half a million telescope images of gravitational lenses to the system.

Remarkably, in addition to automatically identifying a strong gravitational lens, the neural network was able to elucidate the properties of each lens (the mass distribution and the magnification of the background object).

As the article explains “Neural networks are inspired by the architecture of the human brain, in which a dense network of neurons quickly processes and analyzes information. In the artificial version, the ‘neurons’ are single computational units that are associated with the pixels of the image being analyzed. The neurons are organized into layers, up to hundreds of layers deep. Each layer searches for features in the image. Once the first layer has found a certain feature, it transmits the information to the next layer, which then searches for another feature within that feature, and so on.”
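
The layered “features of features” idea the article describes can be sketched in a few lines of NumPy (a toy illustration, unrelated to the actual KIPAC network):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: one "neuron" per output pixel."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0)

# Layer 1 finds a simple feature (a vertical edge); layer 2 then looks for
# a feature within that feature map, exactly as the quoted passage describes.
image = np.zeros((8, 8))
image[:, 4:] = 1.0                                     # dark left, bright right
layer1 = relu(conv2d(image, np.array([[-1.0, 1.0]])))  # edge detector
layer2 = relu(conv2d(layer1, np.ones((3, 1)) / 3))     # vertical run of edges
```

Real networks learn their kernels from training data rather than using hand-written ones, and stack hundreds of such layers, but the feature-within-a-feature mechanism is the same.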

And now, it seems that another advance could make this type of work even more efficient!

As related in AIP’s Physics Today, Marin Soljačić, Dirk Englund (both at MIT), and colleagues developed a proof-of-concept photonic circuit to perform the operations underlying neural networks, which may offer two orders of magnitude faster operation compared to their traditional electronic counterparts.

As Naisbitt said in Megatrends, “We are drowning in information but starved for knowledge.” Given the remarkable advances outlined above, we may yet be able to develop solutions for the ingestion and useful metabolism of the coming data deluge.

Ashley G. Smart; Physics Today 2017, 70, 24-24. DOI: 10.1063/PT.3.3654

Protect the Goose, Protect Science Funding

7/10/2017


 
Cartoon © ScienceCartoonsPlus.com, by permission
A distressing dissonance exists between some influential elements of our government and the scientific establishment. Whether evident in the recently proposed presidential budget or in the pronouncements from some members of Congress, one cannot but observe that science is under siege.
 
One may debate the motivations behind this (e.g. national debt concerns, presumptions–well founded or not–in future economic growth, skepticism–whether in good or bad faith–towards well-established climate science, etc.), but it is unreasonable not to understand, appreciate and internalize the benefits of scientific enquiry and of curiosity-driven research. One can argue that our quest for, and promotion of, rational thinking have rarely been more salient.
 
Federally funded organizations that have delivered so many discoveries are now facing dramatic cuts that would hobble our competitiveness, and I dare say, perhaps even our civilization.
 
The FY18 budget sent to Congress would cut federal spending on basic research by 13-17%.
 
The following funding level reductions have been proposed (in addition to a reportedly very low ceiling rate of 10% for permitted indirect costs):
 
National Institutes of Health               -22%
DOE Office of Science                       -17% (incl. -43% for bio/environment research)
NIST                                        -23% (incl. -13% for research)
NOAA                                        -16% (incl. -32% for weather & climate research)
U.S. Geological Survey                      -15% (incl. -24% for the land resources mission area)
National Science Foundation                 -11% (incl. -14% for education)

The budget is partially based on an overly optimistic expectation that economic growth will generate enough revenue to eliminate the US deficit in 10 years. However, the research done in our labs is the principal engine of that growth.

Too many ignore or misunderstand the benefits of the discoveries basic research has produced. One recalls Michael Faraday’s rejoinder to Sir William Gladstone (British Chancellor of the Exchequer) who, when questioning the value of Faraday’s experiments on electricity, was told: “Why, Sir, there is every possibility that you will soon be able to tax it!”
 
Similarly, most iPhone owners have no concept of the fact that this device would not exist without the fundamental and applied research that produced transistors, integrated circuits, cellular communications, GPS, LEDs, and a host of other technologies we now take for granted.
 
Our economic growth depends on a science and technology pipeline that starts with curiosity-driven research with no immediately discernible applications, followed by development and industry-ready maturation, and ending in entirely new products and services that can rarely be forecast at the outset. Cutting basic research funding will inexorably dry this pipeline up and severely damage future growth.
 
Most of that blue-sky research must be supported by our tax dollars, and just as we must press our government to defend this essential endeavor, we must strive to explain these pursuits to all stakeholders, including the voters and the press.


Benefits of Technology Transfer

1/2/2017


 

U.S. patent law is addressed in the U.S. Constitution: Article I, Section 8. [link] In addition, one remembers that the Bayh-Dole Act (1980) gives inventors funded by government research contracts or grants the freedom to exploit their inventions. [link]
 
More recently the federal government has clearly indicated its desire to see lab-to-market commercialization of technologies.

See, in particular:
  • The American Innovation and Competitiveness Act (AICA) signed into law 1/6/2017. [link]
  • The DOE Research and Innovation Act (RIA) passed by the House and received in the Senate 1/30/2017. [link]
  • The DOE Technology Transfer Execution Plan 2016-2018 (TTEP). [link]
 
Given the current assault on science research funding, one imagines that these institutions would welcome the ability to create a supplementary income stream for their general funds, as well as to reward their departments and inventors. Being a close admirer of SLAC National Accelerator Laboratory, a DOE federal lab, I looked in particular at the DOE’s Technology Transfer Execution Plan listed above. I focus on its two outlined objectives:

  1. “Increase the commercial impact of DOE investments through the transition of national laboratory-developed technologies into the private sector.”
  2. “Increase the commercial impact of DOE investments through private sector utilization of national laboratory facilities and expertise.”
 
It is important, in my view, to recognize that, in addition to the potential financial returns to a general fund, a department lab, and the inventor(s), one should also consider the following benefits, particularly given some labs’ ethos of freely sharing their intellectual capital:

  • Increases US competitiveness
  • Creates jobs
  • Makes industry and funding agencies aware of what we do
  • Helps attract talent
  • Helps research funding
  • Brings the technology to industry
  • Showcases the science and technology

Continued in the next article at www.pierreschwob.com/blog/archives/12-2016


Technology Transfer – Methodology

12/10/2016


 
(Cont'd from benefits-of-technology-transfer.html)

Here is an outline of the methodology for the development and functioning of an Office of Technology Transfer at a research university or a lab.

Note that the choice of the name “Office of Technology Transfer” (OTT) is deliberate. One could instead propose “Office of Technology Licensing,” as used in some universities. The fact is that many institutions have an ethos in which researchers believe their work should be freely available to all. As I wrote in a previous post, technology transfer has many important benefits other than financial returns.

A central thrust of the OTT is that it needs to be entrepreneurial, collaborative, respectful of the lab or university’s ethos, and marketing-oriented. This means:
  • Enterprising marketing people and analytical scientists and engineers should run the effort. Lawyers are there to support.
  • Premium on collaboration between OTT, the inventors, and potential licensees or other beneficiaries.
  • Vanity patents should not be allowed: only marketable inventions justify filing costs.
  • Licensing is not the only way to leverage intellectual capital. Other forms of TT should always be contemplated.
 
Buy-in from the labs and their staff is critical:
  • Lab/university statesmen need to be involved to support the OTT mission. They can help with conflict of interest issues between licensees and the obligations of the inventor to the institution.
  • A strong internal PR campaign and communications are essential.
 
Methodology for assessing and developing an invention towards a marketable patent:
  • OTT confidentiality agreements are offered to potential inventors so OTT can discuss emerging/potential inventions, estimate merits and filing costs, and encourage marketable ideas.
  • A simple one page disclosure form (with attachments for details) to provide the essence of the invention.
  • As soon as a disclosure is received, OTT must go back to the lab to discuss it with the inventor (and possibly colleagues) to
    • Get their own take on its marketability and usefulness to the lab’s mission
    • Ask who is doing research in similar areas and get leads on potential licensees (the people working on an invention are often the best placed to know who is doing what and whom it might interest)
  • An assessment is done within OTT (and possibly with an internal board of experts) to determine marketability and get a first bead on potential revenue levels
  • A cost analysis is performed to determine filing, marketing, and potential maturation costs. Most lab inventions are not market-ready: either royalty levels will be lower, since the licensee needs to invest to further develop the product or method, or funds must be invested in maturation
  • Evaluation is critical: out of 100 inventions, 25 are filed, 12 are licensed, and only 2-3 create revenues! (Probably fewer for fundamental research/basic science organizations.)
  • An initial measure of returns is made for
    • Up-front payments
    • Royalty rates
    • Minimums
    • Potential for sublicensing shares
    • Exclusive vs non-exclusive (sometimes mixed, for international applications)
    • Non-financial benefits to the lab (see above)
  • Once approved, the provisional application is written and filed (given the "first to file" doctrine, time is of the essence)
  • Writing the claims is an art: too narrow, and claims invite design-around copycats; too broad, and they may bump against prior art
  • OTT gets on the phone and contacts licensors/licensees in comparable fields and revises, as needed, the initial revenue estimates, asking what a reasonable royalty level is for something similar
  • Application is finalized within one year of the provisional (careful to stay within disclosure)
  • Begin marketing, starting with the potential targets identified by the inventor and colleagues
  • Once an agreement is reached, allow the inventor to enter into a separate, bilateral consulting agreement with the licensee (“show-how” consulting). It should contain a clause stating that, if a conflict arises, the obligations of the inventor to the institution govern. These consulting agreements are, however, not handled by OTT, which should not act as the inventor's representative in that respect. That said, OTT must maintain a collaborative stance with the licensee: it is good for business
 
A possible breakdown of the proceeds of license agreements:
  • 15% of gross is taken off the top to pay OTT
  • Out-of-pocket expenses are reimbursed to OTT
  • Of the net remainder: 1/3 to the inventor, 1/3 to the department/lab, 1/3 to general fund
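
The split above is easy to make explicit (a minimal sketch; the function name and the dollar figures are illustrative):

```python
def split_license_proceeds(gross, expenses):
    """Split license proceeds per the breakdown above: 15% of gross to OTT,
    out-of-pocket expenses reimbursed to OTT, then the net remainder split
    equally among inventor, department/lab, and general fund."""
    ott_fee = 0.15 * gross
    net = gross - ott_fee - expenses
    return {
        "ott": ott_fee + expenses,
        "inventor": net / 3,
        "department": net / 3,
        "general_fund": net / 3,
    }

breakdown = split_license_proceeds(100_000, 10_000)
# OTT receives 25,000 (fee plus reimbursed expenses);
# the other three parties receive 25,000 each.
```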
 
OTT should consider allowing equity participation instead of cash when dealing with start-ups (where cash is precious).

In addition to the internal outreach mentioned above, a concerted external PR effort should prepare the field. It is critical to the success of the operation that the lab’s unique assets be leveraged for strong OTT returns and maximize the other benefits of successful technology transfer activities.


Chance Encounters and Cross-Silos Fertilization

9/13/2016


 
A chance encounter billions of years ago led to the explosion of life on Earth. An amoeba-like organism absorbed a bacterium that had harnessed sunlight to separate oxygen from water molecules. The descendants of that ancestor of all plants and trees transformed our atmosphere, allowing all animals, including us, to evolve on Earth.
 
Chance encounters between moving objects (remember the end of the dinosaurs?) or between interacting sentient beings can also have huge consequences. Just as we must avoid disastrous results (check for Earth-crossing asteroids!) we should foster positive outcomes and provide stages where the latter can occur.
 
Many notable advances have been the result of chance encounters, sometimes between experts from different disciplines. Ed Catmull, the computer scientist who heads Pixar and Walt Disney Animation Studios writes that the best ideas emerge when talented people from different disciplines work together.
 
Two Bell Labs radio astronomers, Robert Wilson and Arno Penzias, were racking their brains in 1964 to explain a persistent noise they observed with their radio antenna. A chance meeting with an MIT physicist who mentioned a pre-print authored by three Princeton physicists, led them to understand that they had discovered the Cosmic Microwave Background, a predicted radiation left over from the early universe, only 380,000 years after the Big Bang. This earned them the Nobel Prize in 1978.
 
Convinced, as an article of faith, that cross-pollination was essential to the furtherance of its objectives, Jonathan Dorfan, founding director of the Okinawa Institute of Science and Technology (OIST), institutionalized this concept through the development of work areas with no boundaries between the various disciplines. Indeed, all meeting and resting areas were designed to force experts to mingle.
 
I used to host Friday lunches at the Stanford Faculty Club and made it a habit to invite each week folks from different departments. It was always a joy to hear “Oh, you are working on this? Did you know that…?” Some of these conversations led to active and fruitful cooperation.
 
Since most universities and labs cannot transform their existing physical layouts, they should make every effort to promote exchanges across their silos in other ways. This can be as simple as running a weekly random drawing and inviting those selected to share a meal. In my experience, 6 to 8 participants is an ideal number: it allows everyone to participate in one conversation while still permitting one-to-one exchanges.
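
A weekly drawing along these lines can be sketched in a few lines (an illustrative sketch: the round-robin mixing strategy and names are my own choices, not a prescribed procedure):

```python
import random

def lunch_tables(people, table_size=7, seed=None):
    """Draw lunch tables that mix departments: shuffle within each
    department, interleave departments, then deal into tables.
    `people` is a list of (name, department) pairs."""
    rng = random.Random(seed)
    by_dept = {}
    for name, dept in people:
        by_dept.setdefault(dept, []).append(name)
    for names in by_dept.values():
        rng.shuffle(names)
    # Interleave departments so adjacent people differ, then deal into tables.
    ordered = []
    pools = list(by_dept.values())
    while any(pools):
        for pool in pools:
            if pool:
                ordered.append(pool.pop())
    n_tables = max(1, round(len(ordered) / table_size))
    return [ordered[i::n_tables] for i in range(n_tables)]
```

Interleaving before dealing ensures each table spans as many departments as the head count allows, rather than leaving a table dominated by one group.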
 
If you know of an important advance resulting from a serendipitous meeting, please share a comment.
 
“Did you ever observe to whom the accidents happen? Chance favors only the prepared mind.” -- Louis Pasteur
 
 


The Open Ontology Project

3/24/2016


 
Our civilization progresses through intellectual and technological revolutions (often called paradigm shifts). We began with the invention of language, tools, agriculture, and writing. We drove through the age of exploration, the invention of the printing press, the Renaissance, the Enlightenment, and the Industrial Revolution. We are now fully engaged in the Information Age, with broad access to computers, lightning-fast communications, intelligent software, and soon AI.
However, the promises of the Digital Age call for another paradigm shift—a phase transition (to borrow from thermodynamics)—to solve the issue famously evoked by Naisbitt in Megatrends:
“We are drowning in information but starved for knowledge.”

We believe that context is essential in transforming information into knowledge, and we are laying the groundwork for an ambitious project to benefit everyone: a platform upon which a contextual reference tool will be built to usher the Information Age into the Knowledge Age.
This platform is the Open Ontology Project. The Ontology is a scalable, peer-reviewed undertaking to develop a massive hierarchical organization (ontology) of human knowledge.

The Ontology is an essential tool in its own right. Such an organization of knowledge helps frame a subject matter within the domain or domains it belongs to. This helps students understand the “belong to” relationships between various concepts they are exposed to in the classroom. The Ontology creates a framework, or mental matrix, to help metabolize information into knowledge.
 
The Ontology can provide lists or collections that are otherwise difficult to find. Whether you are interested in historical events, wines, dogs, galaxies, plumbing, or opera, the Ontology provides an easily navigable landscape and a coherent index of all that we know, to be mined by anyone for unlimited applications. In the near term, the Ontology can power an AI/machine-learning-based recommendation engine for scholars and researchers, providing links to the peer-reviewed literature. Conversely, editors can use it as a specialist and expert-reviewer recommendation engine. More generally, it will offer a new methodical system to explore the web, to learn, to shop, to work, to play, and myriad other applications others may think of and develop on top of the Ontology.
 
The Ontology is different from Wikipedia in that the Ontology is fundamentally based on relations. With its top-down organization, it allows an intuitive navigation between related concepts. It is also text-sparse: Ontology descriptions are limited to short, easily digestible 150-word introductions, augmented by links to external references. As such the Ontology can be viewed as a reasoned index to references such as Wikipedia and online education assets such as Khan Academy, Coursera, edX, etc.
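
The “belongs to” hierarchy described above can be sketched as a simple tree (the class and field names are illustrative, not the project's actual schema):

```python
class OntologyNode:
    """Minimal sketch of one node in a hierarchical ontology: a short
    description plus "belongs to" (parent) and "contains" (children)."""
    def __init__(self, name, description=""):
        self.name = name
        self.description = description  # kept short, per the 150-word limit
        self.parent = None
        self.children = {}

    def add(self, child):
        child.parent = self
        self.children[child.name] = child
        return child

    def path(self):
        """The chain of domains a concept belongs to, root first."""
        node, chain = self, []
        while node is not None:
            chain.append(node.name)
            node = node.parent
        return list(reversed(chain))

root = OntologyNode("Knowledge")
arts = root.add(OntologyNode("Arts"))
music = arts.add(OntologyNode("Music"))
opera = music.add(OntologyNode("Opera"))
# opera.path() -> ["Knowledge", "Arts", "Music", "Opera"]
```

The `path()` chain is exactly the “belongs to” framing a student would navigate, and the short description field mirrors the text-sparse design contrasted with Wikipedia above.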



Copyright © 2015 - 2018 Pierre Schwob all rights reserved
All logos and trademarks are property of their owners