Capitalists put profit ahead of attempts to save the planet
Jeff Rudin
Despite the effects of climate change, such as the melting of glaciers in Greenland, companies in India and China still abuse the carbon credits system put in place by the United Nations. (Michael Kappeler, AFP)
Others are now hailing the green economy as capitalism's opportunity to save the world while still maximising profit.
Powerful support for using a profit-based green economy as the bulwark against the catastrophe of climate change comes from an impeccable source: James Hansen, the very person credited with having first forced the United States government to take global warming seriously. Hansen, the head of the Nasa Goddard Institute for Space Studies since 1981 and adjunct professor in the department of earth and environmental sciences at Columbia University, evokes the virtues of the market at the same time as warning us that climate change is worse than even he expected.
Writing in the Washington Post on August 6, he confessed to having made a serious mistake when, in 1988, he first alerted the United States Senate to the fact and implications of human-caused global warming.
His warning of 24 years ago has turned out to be far too optimistic. The speed of global warming, leading to what he describes as "a stunning increase in the frequency of extremely hot summers", has surprised him.
These extreme weather events are the key finding of a new analysis of the past six decades of average global temperatures. This peer-reviewed study, published by the US National Academy of Sciences, shows that it is no longer enough to "repeat the caveat that no individual weather event can be directly linked to climate change". The study instead shows that "there is virtually no explanation other than climate change" for such recent events as the European heat wave of 2003 (which killed 50 000 people), the Russian heat wave of 2010, the 2011 drought in Texas and Oklahoma (which had a damage bill of more than $5-billion in Texas alone), or the extremely hot summer the US has just experienced, which is alarming the world because of its effect on food prices.
"The odds that natural variability created these extremes are minuscule, vanishingly small," according to Hansen, for whom, as a scientist, caveats are a stock in trade. "To count on those odds," he warns, would be "like quitting your job and playing the lottery every morning to pay the bills".
When someone like Hansen speaks with such definiteness, it behoves all of us to listen – and to act with the urgency that his dire warnings demand: "The future is now. And it is hot."
Hansen offers what he sees as a "simple, honest and effective solution": a carbon tax on all fossil fuel companies, with the money collected being distributed on a per-capita basis to all legal residents. This, he assures us, will "stimulate innovations to create a clean-energy economy".
Alas, for all of us, he is wrong on this score. A New York Times article on August 9 blows away the "simple solution" Hansen offered only a few days earlier. Marx seems to be right after all: the pursuit of immediate profit maximisation takes precedence over life itself, including the lives of those making the profit. Capitalism has little, if any, place for Hansen's good intentions.
One year before Hansen began his often lonely campaign to warn the world about climate change, the Montreal Protocol on substances that deplete the ozone layer was adopted (in September 1987). The European Union plus 197 other countries have ratified the protocol, making it the most widely supported treaty in the history of the United Nations. Unlike the Kyoto Protocol, which seeks to limit the greenhouse gases that cause climate change and which the US has rejected, the ozone-specific Montreal Protocol has the full support of the US, which is probably why, in the words of Kofi Annan, then UN secretary general, it is "perhaps the single most successful international agreement".
By the mid-1980s, the science of the "ozone hole" had been sufficiently established for the world's leaders to take the problem seriously. Chlorofluorocarbons (CFCs) – the main coolant used in refrigerators and air conditioners as well as in the manufacture of solvents and blowing agents for plastic-foam manufacture, fire extinguishers and aerosol cans – were causing the depletion of the ozone layer. The resulting hole was leading to an increase in ultraviolet-B radiation, with more skin cancer and damage to crops and marine phytoplankton as the outcome. The Montreal Protocol binds all parties to phasing out the use of CFCs, despite the strenuous objections of manufacturers and users.
Hydrochlorofluorocarbons (HCFC) are acceptable transitional CFC replacements and the basis of the current problem.
The Montreal Protocol relies on market mechanisms to repair the ozone hole. Embracing the principle of all countries having a common but differentiated responsibility to protect and manage the global commons – a cardinal principle in all climate-change negotiations – the protocol restricts the use of HCFC in developed countries, but not in developing ones. Moreover, it allows for companies in developing countries to be compensated for not using HCFC.
The UN, under whose auspices this takes place, assigned the value of one to carbon dioxide (CO2), the most common of the greenhouse gases. Other industrial gases are assigned values relative to that, based on their warming effect and how long they linger in the atmosphere. Methane is valued at 21, nitrous oxide at 310 and HCFC-22, the waste gas produced while making HCFC, is near the top of the list at 11 700.
And there is the rub. These values are used to calibrate exchange rates for the carbon "credits" the UN began issuing in 2005. Companies awarded these credits under the UN's clean development mechanism are free to sell them on global trading markets. Companies in the developed world that exceed their legal carbon emission limits under various international agreements buy these credits. Profit maximisation, the engine of capitalism, has produced a perverse – though highly profitable – outcome. Rather than eliminating one tonne of carbon dioxide and earning one credit, entrepreneurs quickly realised they could destroy the HCFC-22 produced as a by-product of HCFC (the transitional coolant pending the total outlawing of all CFC) and earn 11 700 credits.
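The arithmetic behind that perverse incentive is easy to make explicit. The sketch below simply multiplies tonnes destroyed by the UN values quoted above; the function and the one-tonne figures are illustrative, not data from any actual plant.

```python
# Rough sketch of the carbon-credit arithmetic described above, using the UN
# warming values quoted in the article (CO2 = 1, HCFC-22 waste gas = 11,700).
# Tonnage figures are illustrative only.
GWP = {"CO2": 1, "methane": 21, "nitrous oxide": 310, "HCFC-22": 11_700}

def credits_earned(gas: str, tonnes_destroyed: float) -> float:
    """One credit per tonne of CO2-equivalent eliminated."""
    return GWP[gas] * tonnes_destroyed

print(credits_earned("CO2", 1))       # 1 credit
print(credits_earned("HCFC-22", 1))   # 11,700 credits for a single tonne
```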
Nineteen plants, mainly in India and China, now legally produce as much HCFC as they can in order to maximise the carbon credits they get for destroying its waste product. The result has been that 46% of all carbon credits awarded worldwide have gone to these 19 companies. Worse still, their production of HCFC has been so enormous that it has brought the cost of the gas down so far that companies in the developed world no longer have any economic incentive to stop using it. Long live profit. To hell with the ozone hole and climate change.
Damn Marx for being right. It would be so much easier if market realities had not turned Hansen's "simple, honest and effective solution" on its head.
Dr Jeff Rudin is on the board of the Alternative Information and Development Centre
Big Data, Copyright Infringement and Internet Radio News, November 3, 2013
November 3, 2013 / by Julia Rogers in Music News 2013
Technology and music industry experts analyzed how “big data” continues to change the way artists’ royalties are collected and distributed. Also, record labels in the UK were allowed to block 21 file-sharing websites courtesy of a court order. And hip-hop artist Talib Kweli launched his own signature station on the user-generated radio platform Radionomy.
“Big Data” Continuing to Change the Music Landscape
The existence of “Big Data” and indie and major label artists’ willingness to speak up about their challenges in the music industry have significantly changed the way labels and PROs worldwide deal with collecting and distributing royalties, according to Australian professor and music technologist Axel Bruns, who wrote an article last week for The Conversation. Throughout their careers, artists like Robert Fripp, Courtney Love and Prince have all documented and vocalized their on-going struggles with collecting societies and labels as they try to make a living making music amidst a rapidly-changing landscape.
The recent growth of digital music and specifically streaming music services has allowed artists, labels and others in the music industry to minutely track exactly how much an artist’s music is being played. And because artists can also add music to streaming services on their own, labels and PROs worldwide are starting to make music and royalty deals directly with streaming services instead of negotiating and renegotiating contracts with bands and artists.
PROs, labels and artists may finally be accepting the power of Spotify and other significant streaming services, whose global market share is giving them major bargaining power with right holders. Bruns added, “In principle, Spotify or iTunes could bypass the conventional royalty collection and distribution frameworks altogether. Labels, and even individual artists, could upload their releases to these services directly, and receive an agreed fee per play or download in return.”
These direct deals are gaining popularity because every stream and download can be monitored, tracked and recorded, allowing for a “negotiated value” to be assigned to each stream and download. Radio broadcasts and physical album sales were previously very difficult to quantify.
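As a rough sketch of what a "negotiated value" per stream or download means in practice, the example below multiplies reported play counts by per-unit rates. Every rate and count is invented for illustration; real per-stream values are negotiated privately and vary by service and deal.

```python
# Illustrative per-play royalty accounting under a "negotiated value" model.
# Rates and play counts are hypothetical, not real industry figures.
NEGOTIATED_RATE = {"stream": 0.004, "download": 0.70}  # dollars per unit (assumed)

def payout(report: dict) -> float:
    """Total owed to a rights holder for one reporting period."""
    return sum(NEGOTIATED_RATE[kind] * count for kind, count in report.items())

monthly_report = {"stream": 1_250_000, "download": 3_400}
print(f"${payout(monthly_report):,.2f}")  # every unit is individually countable
```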
However, as Bruns pointed out, the accessibility of Big Data “does little to improve the lot of artists or consumers. It really only facilitates the flow of money between providers and labels.”
Similarly, proposed increases in download costs in Australia and other countries and regions, intended to raise artists' royalties, would not necessarily bring individual artists more money, as many major-label artists continue to depend upon royalty reports provided to them by their labels. Big data on streams, downloads, plays and sales, however, could change the situation, since such data are starting to be gathered in greater detail for sales, radio broadcasts, plays in public venues and other types of offline music listening. And music fingerprinting technology can now track individual radio plays.
Bruns asserted that he believes the problem is not tied to the availability of data, rather transparency: “Many such data sources are fiercely protected by their owners, and little effort has been made to combine them or to determine the distribution of royalties, not just to labels but also onwards to the artists themselves … There is a significant opportunity here for collecting agencies … or a new statutory body … to take leadership of this process … This could throw open the curtains and shine a new light on how recorded music performance is accounted for, valued, and rewarded in the Australian and international industry.”
Court Order Served to Block 21 Additional File-Sharing Sites in the UK
The British music industry trade organization the BPI won a case against 21 copyright-infringing file-sharing sites on Wednesday, which will force the UK’s six largest ISPs to block nine BitTorrent sites and 12 music “aggregator” sites, including the popular site mp3skull. The Guardian reported that this brings the total number of websites being blocked in Great Britain to 25, including Pirate Bay.
The BPI stated it had to pursue a high court order after the websites “declined to co-operate in any meaningful way.”
Chief executive of the BPI Geoff Taylor outlined, “We asked the sites to stop infringing copyright, but unfortunately they did not and we were left with little choice but to apply to the court … The judge considered the evidence and declared that ISPs should not serve access to them.”
Taylor also revealed that the previous court orders, which blocked four BitTorrent websites that were allegedly committing copyright infringement online, created a “significant reduction” in visits by UK users.
He added, “Music companies are working hard to build a thriving digital music sector in the UK, offering fans great convenience, choice and value, but these efforts are undermined by illegal sites which rip off artists and contribute nothing to Britain’s vibrant music scene.”
Talib Kweli Launching New Signature Station on Radionomy
Renowned hip-hop artist Talib Kweli announced he would be launching his own free radio station on Radionomy, one of the world’s largest user-generated radio station platforms for producers, listeners and broadcasters. The station will offer music from Kweli and artists from his label, Javotti Media, and will also feature interviews, news and commentary on politics, culture and entertainment related to hip-hop. The announcement was made via XXL and an official press release.
Kweli’s station is evidence of a growing trend in entertainment, which allows fans to connect personally to artists. Radionomy said that while a signature radio station might seem like the best way for artists to connect with fans and expand their audience, the technology and business challenges attached to running a station have often been prohibitive to even very well-known artists.
U.S. Manager of Radionomy Thierry Azcarez said, “We offer the ability to create and run a fully functioning radio station from any web-enabled device … This gives artists like Kweli a perfect platform to communicate with their audiences without having to deal with the business or technology hassles of radio.”
And platforms like Radionomy are removing some of the barriers typically attached to running a radio station while on the road touring. The web-based platform enables artists to run their stations from mobile devices in any location, which also fosters collaboration between producers, artists and others.
Donna Dragotta, A&R Admin and Label Manager of Javotti Media pointed out, “We don’t need a studio or any equipment other than a web-enabled device and a microphone to run this station … It gives us flexibility — that’s crucial for our team, which is constantly on the move. We can broadcast live from virtually anywhere.”
Javotti Media is an addition to Radionomy’s Premium Content program, allowing artists, individuals and organizations to promote their brand with content, live event streaming, embeddable players and worldwide broadcasting on hundreds of mobile apps and devices. The service is free, with Radionomy covering streaming and licensing costs through four minutes of advertisements per hour. The Radionomy Network averages 40 million listening hours each month across 7,000 global stations.
2017-09/1580/en_head.json.gz/8724 | Anmelden Nanotechnology and Global Sustainability
Nanotechnology and Global Sustainability
Donald Maclurcan [0] Lese ich gerade
Verbergen Wo zu kaufen
The rise of collaborative consumption, peer-to-peer systems, and not-for-profit social enterprise heralds the emergence of a new era of human collectivity. Increasingly, this consolidation stems from an understanding that big-banner issues – such as climate change – are not the root causes of our present global predicament. There is a growing and collective view that issues such as this are actually symptoms of a much more vicious, seemingly insurmountable condition: our addiction to economic, consumption, and population growth in a world of finite resources.
Nanotechnology and Global Sustainability uses nanotechnology – the product of applied scientific knowledge to control and utilize matter at atomic and molecular scales – as a lens through which to explore the interrelationship between innovation, politics, economy, and sustainability. This groundbreaking book addresses how stakeholders can actively reshape agendas to create positive and sustainable futures through this latest controversial, cross-sectoral technology. It moves beyond issues of efficiency, productivity, and utility, exploring the insights of 22 contributors from around the world, whose work spans the disciplines of science and the humanities. Their combined knowledge, reinforced with various case studies, introduces an exciting prospect – how we can innovate without economic growth.
This new volume in the Perspectives in Nanotechnology series is edited by Dr. Donald Maclurcan and Dr. Natalia Radywyl. Dr. Maclurcan is a social innovator and Honorary Research Fellow with the Institute for Nanoscale Technology at the University of Technology Sydney, Australia. Dr. Radywyl is a social researcher and Honorary Research Fellow in the School of Culture and Communication at the University of Melbourne, Australia. She is also an Adjunct Research Fellow in the Faculty of Life and Social Sciences at Swinburne University of Technology, Melbourne. This book is written for a wide audience and will be of particular interest to activists, scholars, policy makers, scientists, business professionals, and others who seek an understanding of how we might justly transition to sustainable societies.
New cosmological clue
Scientists have detected deuterium's radio signature, which will help them understand early particle formation in the universe.
By Liz Kruesi | Published: Friday, September 30, 2005
The array at Haystack Observatory consists of 24 stations, each with 24 crossed antennae. (Bruce Whittier, Haystack Observatory)
Astrophysicists at Massachusetts Institute of Technology's (MIT) Haystack Observatory in Westford, Massachusetts, have detected deuterium's radio signature for the first time. They measured the deuterium-to-hydrogen ratio as 23 parts per million, which is close to the Wilkinson Microwave Anisotropy Probe prediction of 25 parts per million.
The amount of deuterium detected puts a constraint on the ratio of radiation to ordinary matter. This ratio is related to the amount of matter in the early universe, therefore giving cosmologists clues about the nature of dark matter.
Deuterium is also known as "heavy hydrogen." While hydrogen is composed of one proton and one electron, deuterium has one proton, one electron, and one neutron.
Deuterium's signal is difficult to detect because there is only about one deuterium atom for every 100,000 hydrogen atoms, and deuterium's spectral lines lie close to hydrogen's in the optical spectrum. However, the elements' emission lines can be far enough apart in the radio spectrum to distinguish between the two. The astrophysicists, led by Alan E. E. Rogers of MIT, detected the signal at a frequency of 327 megahertz, which corresponds to a wavelength of 92 centimeters. Deuterium's 92-centimeter signature emission line is similar to hydrogen's 21-centimeter line, whose detection has helped astronomers map the structure of the Milky Way. Because deuterium's radio signal is so weak, the scientists had to block out radio interference — and, in a few cases, ask Haystack Observatory neighbors to turn off their radio signal sources to accommodate the experiment.
The array at Haystack Observatory, completed in June 2004, has 1,152 elements: 24 stations, each with 24 crossed dipoles. The scientists observed in the direction opposite the galactic center for 10 months — from June 29, 2004, to April 29, 2005. Their findings were published in The Astrophysical Journal Letters (September 1, 2005).
Honeywell and Inmarsat Pair to Move Aviation Internet into the Next Century
By Juliet Van Wagenen | March 26, 2015
The Inmarsat satellite network that will deliver aviation broadband. Photo: Inmarsat
[Avionics Today 03-25-2015] Honeywell and Inmarsat’s JetWave MCS 8200 completed over-the-air testing of the onboard datacom system earlier this month, possibly bringing the companies one step closer to delivering the global in-flight broadband that the industry is striving toward and passengers are hungry for. The new system will deliver satellite-based In-Flight Connectivity (IFC) via the satellite operator’s Global Xpress network, and the companies claim it can enable speeds at 40,000 feet similar to those we’re used to on the ground.
“The system will provide up to 33 Mbps to the business and general aviation aircraft and up to 49 Mbps for the air transport and regional aircraft,” Joseph Palovick, director of broadband cabin solutions and product management at Honeywell Aerospace, told Avionics Magazine. This should, Palovick claims, allow business and commercial passengers to use the Internet in ways customers have come to expect but haven’t yet seen. “Passengers will be able to do everything from real-time social media updates and emails, to live-streaming TV while in flight, and from virtually anywhere in the world with an experience similar to being at home or in the office,” he added.
The new system is the result of a 2012 agreement between the two companies, and aircraft manufacturers and airlines are already showing confidence in it. Bombardier Business Aircraft announced its decision in September to be the launch Original Equipment Manufacturer (OEM), with plans to equip the system aboard the Global 5000 through 8000 aircraft platforms alongside a retrofit option for aircraft already in service.
“Our customers want to be online everywhere they go,” said Eric Martel, president of Bombardier Business Aircraft at the time of the announcement, noting that passengers expect to “experience the same level of connectivity in the air that they have come to expect on the ground without a drop in connectivity performance once they leave their homes or offices.”
The companies hope to deliver this kind of connectivity over the Ka-band network that Inmarsat’s satellite constellation currently provides over the Indian Ocean, the Americas and the Atlantic Ocean. With the launch of the next satellite in the second quarter of this year, the band will become available over the Atlantic Ocean and deliver complete coverage. After the last satellite is launched, Palovick believes the Ka coverage it provides will offer a much higher speed than the Ku-band offerings that are on the market today. “It will offer coverage on international flights where Ku coverage is not available and offer significant coverage advantages over other Ka-based solutions,” said Palovick. “It is the answer to limited bandwidth on flights, by providing the same high-speed connection in mid-air that you would get on the ground.”
While Honeywell and Inmarsat’s Ka-based JetWave hardware will offer coverage advantages, the industry still has shown a healthy demand for Ku-based solutions as well. For example, Global Eagle Entertainment (GEE) recently announced a Supplemental Type Certificate (STC) for its Ku-band satellite connectivity system on the Boeing 777. GEE has also made its Ku-band system available on new Boeing 737s, giving airlines ordering 737s the option of including pre-installed satellite connectivity.
Palovick admits that the system is still susceptible to a certain amount of atmospheric or rain fade, but says there is some margin built into the link budget to account for it.
The system uses Honeywell’s onboard equipment to enable connectivity in two configurations: one for large air transport-grade aircraft in which it uses a steerable directional antenna system mounted on the fuselage, and the other for corporate business aircraft in which the antenna is mounted on the tail. The antenna is then combined with a number of components within the aircraft skin that function together to provide the connectivity system.
The hardware is currently in full certification testing and entering DO-160 environmental and electromagnetic interference certifications, after which the companies will look to prove the system in flight tests in the second quarter of 2015. If all goes as planned, the system should be available to connect to Inmarsat’s Global Xpress constellation network later this year for the cabin and, eventually, for the cockpit, should regulation allow.
Intel's Ivy Bridge chips launch using '3D transistors'
By Leo Kelion
Intel's new Ivy Bridge processors use a new tri-gate transistor technology to boost processing power while reducing the amount of energy needed
Intel is launching its Ivy Bridge family of processors - the first to feature what it describes as a "3D transistor".
The American firm says the innovation allows it to offer more computational power while using less energy.
The initial release includes 13 quad-core processors, most of which will be targeted at desktop computers.
Further dual-core processors, suitable for ultrabooks - thin laptops - will be announced "later this spring".
Intel and PC manufacturers expect the release to drive a wave of new sales.
"The momentum around the system design is pretty astonishing," Intel's PC business chief, Kirk Skaugen, who is spearheading the launch, told the BBC."There are more than 300 mobile products in development and more than 270 different desktops, many of which are all-in-one designs."This is the world's first 22 nanometre product and we'll be delivering about 20% more processor performance using 20% less average power."
The firm has already built three factories to fabricate the new chips and a fourth will come online later this year. "This is Intel's fastest ramp ever," Mr Skaugen added.
"There will be 50% more supply than we had early in the product cycle of our last generation, Sandy Bridge, a year ago. And we're still constrained based on the amount of demand we're seeing in the marketplace."
Low power
The fact that Intel's new transistor technology - the on/off switches at the heart of its chips - is more power-efficient could be crucial to its future success.
To date it has been largely shut out of the smartphone and tablet markets, where devices are most commonly powered by chips based on designs by Britain's Arm Holdings.
Arm now threatens to encroach on Intel's core market with the release of Windows 8 later this year. Microsoft has decided to let one variant of its operating system work on Arm's architecture, paving the way for manufacturers to build laptops targeted at users who prioritise battery life over processing speeds.
Tri-gate transistors
Intel hopes a new transistor technology, in development for 11 years, will help it challenge Arm's reputation for energy efficiency.
Bell Labs created the first transistor in 1947, and it was about a quarter of the size of an American penny. Since then, engineers have radically shrunk them in size - so there are now more than one billion fitted inside a single processor.
Moore's law - named after Intel's co-founder Gordon Moore - stated that the number of transistors that could be placed on an integrated circuit should double roughly every two years without a big leap in cost. However, transistors had become so small that there were fears they would become unreliable if they were shrunk much further.
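As a toy illustration of the doubling rule just described, the sketch below projects a transistor count forward two years at a time, starting from the "more than one billion" figure mentioned above. It is purely illustrative and uses no actual Intel roadmap data.

```python
# Toy illustration of Moore's law as described above: a rough doubling of the
# transistor count every two years. The starting figure is the ~1 billion
# transistors mentioned in the article; projections are illustrative only.
def projected_transistors(start_count: int, years_elapsed: int) -> int:
    doublings = years_elapsed // 2     # one doubling roughly every two years
    return start_count * (2 ** doublings)

start = 1_000_000_000
for years in (0, 2, 4, 6):
    print(years, projected_transistors(start, years))   # 1bn, 2bn, 4bn, 8bn
```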
Intel's 3D tri-gate transistors
Traditionally transistors have used "flat" planar gates designed to switch on and off as quickly as possible, letting the maximum amount of current flow when they are switched on, and minimum when they are switched off.
The transistors' gates in Ivy Bridge chips are just 22nm long (1nm = 1 billionth of a metre), meaning you could fit more than 4,000 of them across the width of a human hair.
Intel plans to incorporate 14nm transistors by 2013 and 10nm by 2015.
The problem is that the smaller that planar gates become, the more energy leakage occurs unless their switching speed is compromised.
Intel's solution has been to make the transistors "3D" - replacing the "2D" gates with super-thin fins that rise up from the silicon base. Three gates are wrapped around each fin - two on each side and the other across the top.
There are several advantages beyond the fact that more transistors can be packed into the same space.
Current leakage is reduced to near zero while the gates can still switch on and off more than 100 billion times per second.
Less power is needed to carry out the same action.
The innovation only adds 2-3% to the cost of making a chip.
Traditional planar chip design (left) and Intel's new Tri-Gate technology (right).
"A lot of people had thought that Moore's law was coming to an end," said Mr Skaugen. "What Intel has been able to do is instead of just shrinking the transistor in two dimensions, we have been able to create a three-dimensional transistor for the first time."For the user, that means the benefits of better performance and energy use will continue for as far as Intel sees on the road map."Graphics gainsMr Skaugen said that those who use the integrated GPU (graphics processing unit) on the chips, rather than a separate graphics card, would see some of the biggest gains.He said the processing speed had been significantly boosted since Sandy Bridge, meaning devices would be capable of handling high-definition video conferences and the 4K resolution offered by top-end video cameras. The GPU's transcoding rate also benefits from the upgrade, allowing users to recode video more quickly if they want to send clips via email or put them on a smartphone.The chips also offer new hardware-based security facilities as well as built-in USB 3.0 support. This should make it cheaper for manufacturers to offer the standard which allows quicker data transfers to hard disks, cameras and other peripherals.Chip challengeIt all poses quite a challenge to Intel's main competitor in the PC processor market - Advanced Micro Devices. AMD plans to reduce the amount of power its upcoming Piledriver chips consume by using "resonant clock mesh technology" - a new process which recycles the energy used by the processor. However, full details about how it will work and a release date are yet to be announced.One industry analyst told the BBC that Intel was expected to retain its lead."AMD did briefly nudge ahead of Intel in the consumer space in the early 2000s at the time of Windows XP, but since then Intel has been putting in double shifts to break away from its rival," said Chris Green, principal technology analyst at the consultants Davies Murphy Group Europe."Intel is making leaps ahead using proven technology, while AMD is trying to use drawing board stuff. So there's less certainty AMD will succeed, and PC manufacturers may not want to adopt its technology in any volume, at least initially."As advanced as Ivy Bridge sounds, the one thing it is not is future-proof. Intel has already begun to discuss its successor, dubbed Haswell."We are targeting 20 times better battery life on standby - always on, always connected," Mr Skaugen said about the update, due for release in 2013."So you can get all your files and emails downloaded onto your PC while it's in your bag, and still get more than 10 days of standby and all-day battery life."
UBC researcher helping turn rehab into a game
PAMELA FAYERMAN, Vancouver Sun 05.18.2012
Rod Cebuliak of Edmonton was one of the first patients to use ReJoyce. (Vancouver Sun)
Jen Gabrysh participated in one of the trials on the ReJoyce work station. Her progress was supervised by webcams over the Internet. (Vancouver Sun)
Rod Cebuliak was one of the first patients to use ReJoyce when he took part in a study about four years ago using an early prototype. (Vancouver Sun)
In a Vancouver experiment over the past year and a half, seven paralyzed patients played computer games at home while electrodes in a wrist cuff sent electrical currents to paralyzed muscles so they could contract, allowing users to grasp and move a joystick.
The wrist stimulator is controlled by users when they click their teeth to trigger hand opening or closing. Every tooth click generates a vibration in the jaw and temporal bones that is detected by a sensor in an earpiece, similar to a Bluetooth device.
While electrical stimulation is now broadly used in rehab of such patients, previous work has shown that electrical stimulation triggered by voluntary (controlled) movements produces better results than when non-triggered stimulation is used.
The technologically advanced exercise therapy trial was funded by a $360,000 grant from the Rick Hansen Institute.
The study utilized what is called a ReJoyce workstation, a system invented by University of Alberta biomedical engineers Jan Kowalczewski and Arthur Prochazka. It helps build function and strength in the hands of those who have lost both because of stroke or spinal cord injuries.
In Montreal and Toronto, the same experiments using the ReJoyce (Rehabilitation Joystick for Computer Exercise) were repeated in another 10 study participants, all of whom still had intact brain cognition but a spinal cord injury resulting in limb paralysis.
The purpose of the study was to evaluate improvements in hand function and to be able to predict how often the therapy would have to be used to attain benefits, so training once a week was compared with training five times a week.
Study results are now being analyzed and will eventually be published in a medical journal. But based on several previous studies and analysis of the current study so far, researchers expect the benefits of ReJoyce therapy will be confirmed as a rehabilitation model for upper limb strength and dexterity; that would give those with paralysis more independence.
Worthwhile effort
Tania Lam, a University of B.C. researcher who led the Vancouver arm of the study, said it involved tremendous effort on the part of study coordinators and participants, but it was worth it.
“This was a great study to be involved in. The participants who volunteered their time to participate in this study were very dedicated and committed to the research, allowing us into their lives and homes to install the ReJoyce workstation and patiently working with us through the demands of the study — multiple testing sessions, weekly training sessions for two months and followup testing over 12 months after the training ended,” she said.
“We [saw] from our own [preliminary] data in Vancouver that participants really benefited from this type of training, achieving abilities with their hand function that they had not been able to do [since their injury],” added Lam, an associate professor of kinesiology who is affiliated with the International Collaboration on Repair Discoveries (ICORD).
“In consideration of the demands on people’s everyday lives, if this type of therapy is going to be widely used, it’s important to understand whether it ends up needing to be an intensive, daily commitment or whether once per week could be adequate,” she said.
Since users may not have any hand function when they start to use the technology, they may require a caregiver to put the wristlet on for them.
“Tetraplegic people can do more with their hands, arms and teeth than you might imagine,” Prochazka said.
“The earpiece has a small radio transmitter that sends a packet of coded information to a receiver in the stimulator, which is the size of an iPod mini and is located within the wristlet. In some cases, a caregiver puts it on for them. In other cases, they manage themselves,” said Prochazka, who coordinated the study and has a vested interest in ReJoyce since he is involved in a private company called Hometelemed that is already offering in-home rehab therapy using the ReJoyce system.
ReJoyce exercises are meant to help patients perform basic functions of everyday life. If the rehabilitation activity is proven to help improve hand function, then patients could again do things such as open doors, turn handles, pick up items and move them.
The ReJoyce workstation consists of a laptop computer loaded with at least eight games requiring numerous hand actions on the joystick, such as grasping, gripping, squeezing, pinching and lifting.
Although Nintendo’s Wii games are sometimes used for physical or occupational therapy, Prochazka said they aren’t intended to be used as clinical devices. They may be beneficial for whole limb range of motion, but not so much for fine motor control, strength and dexterity.
Tele-rehabilitation
Throughout the one-hour sessions over eight weeks, trial participants were being supervised and watched by webcams over the Internet. Study researchers in remote locations could observe and interact with the study subjects in what’s been called the world’s first multi-centre trial of in-home tele-rehabilitation.
The tele-supervisor’s role was to watch a webcam image of the patient and remotely control the games and sensors. At the same time, they downloaded performance data generated by the hand function test software.
Prochazka said the ReJoyce hand exercise workstations have gone through several iterations for fine tuning and are now being manufactured for sale. Up to now, it has largely been a research tool in North America, but a rehab centre in Edmonton — Glenrose Rehabilitation Hospital — was the first to start using it on patients.
It has also been commercialized for sale, at $8,000, in some places around the world.
Hometelemed, the private company in which Prochazka and Kowalczewski both have a vested interest, has been set up to provide supervised ReJoyce rehab over the Internet to users in their homes.
But unless patients are covered for such treatment by extended health insurance plans, the daily therapy may cost about $2,300 over six weeks.
Rod Cebuliak was one of the first patients to use ReJoyce when he took part in a study about four years ago using an early prototype. The Edmonton resident became a quadriplegic in 2006 when he broke his neck in an extremely rare event — and cruel twist of fate — while doing nothing more than bending over.
“My hands and fingers are very badly atrophied so that’s the primary reason I was motivated to be a guinea pig with this kind of research, to help myself and others,” Cebuliak said.
“The ReJoyce system certainly helped me try to do more things and improve my fine motor function and strength. It’s the kind of system anyone can use and it’s not boring, redundant or repetitive because of the game format,” he said.
Sun Health Issues Reporter
pfayerman@vancouversun.com
LONG-WEEKEND RELAY EVENTS:
Saturday, Day 270: Burnaby to White Rock
5 p.m.: End-of-day celebration at White Rock Community Centre Square, 15154 Russell Ave. While the event begins at 5 p.m., the relay won’t arrive until 7 p.m.
Sunday, Day 271: White Rock to Richmond
11:45 a.m.: World figure skating champion Patrick Chan joins the relay at McDonald’s Restaurant, 400-11668 Steveston Hwy., Richmond.
4 p.m.: Wheelchair Basketball Championship final game, Richmond Olympic Oval, 6111 River Rd., Richmond. Hansen will speak at the closing ceremony and present the gold medals.
4:30 p.m.: End of day celebration at the Richmond Olympic Oval Legacy Plaza. While the event begins at 4:30 p.m., the relay won’t arrive until 5 p.m.
Monday, Day 272: Richmond to Vancouver
10:29 a.m.: Radio host Vicki Gabereau carries the medal along Oak Street from 33rd Avenue to the Van Dusen Gardens Visitor Centre.
11:30 a.m.: The relay visits the Musqueam First Nation, Musqueam gymnasium, 6735 Salish Dr., Vancouver.
2:41 p.m.: Lululemon CEO Christine Day joins the relay on West Broadway from Angus Bakery to 3466 W. Broadway.
Source: Rick Hansen Foundation
Fiscal Cliff Would Impose 'Major Trauma' on Tech Industry
November 14, 2012 - News
If Congress and the White House don't resolve the looming "fiscal cliff" crisis, the middle class is not the only sector that stands to suffer. According to a new CompTIA white paper, the tech industry, too, will experience "major trauma."
The association noted this week that mandatory spending cuts would harm the United States' small and medium IT businesses, which employ more than 2 million people and contribute $110 billion to the nation's payrolls. There are a number of tax policies that have lapsed or that will expire by Dec. 31, CompTIA said, and that threaten to hit technology businesses. Those areas are:
Section 179 expensing: Small business expensing goes from $139,000 in 2012 to $25,000 in 2013. Upgrading to new technology equipment would be greatly affected, CompTIA said.
R&D: The research and experimentation tax credit, which encourages the development or improvement of new products and technologies, faces its end.
Government purchasing: The U.S. government buys a significant amount of IT services, through the Department of Defense and other agencies. CompTIA said reduced government spending on research will have a trickle-down effect on the entire technology sector.
Access to capital: If the fiscal cliff is not averted, spending cuts will apply to many programs, including those that increase small business lending and access to funding.
Access to education: CompTIA said the Department of Education, the Trade Adjustment Assistance Community College and Career Training Grant, and Department of Labor’s Veterans Employment and Training are among the agencies and programs that would be affected. The IT sector makes use of each of these programs, according to CompTIA.
CompTIA surveyed IT and business executives and found that about two-thirds (65 percent) supported a degree of balance in addressing the fiscal cliff. Twenty-six percent said Congress should implement spending cuts and revenue increases equally, while 23 percent favored slightly greater spending cuts. And 16 percent supported slightly greater revenue increases.
“As President Obama works to strengthen the economy, it is critical that he supports the efforts of small businesses to grow and innovate. It is equally important that his administration act to help displaced and unemployed workers who, with training and assistance, can find meaningful careers in IT, and contribute to U.S. innovation and global leadership," Todd Thibodeaux, president and CEO of CompTIA, said in a prepared statement.
USPS Wants to be More Digital-Friendly
Tiffany Kaiser - January 16, 2013 9:52 AM
But obstacles like financial cuts and legislation are not easy to overcome
The United States Postal Service (USPS) could use a digital boost to bring it up to current times, and it has ideas, but executing those plans is another story.
USPS is, for the most part, thought of as a physical mail-only type of service. About 160 billion pieces of mail go through USPS, and none of it is digital. It doesn't offer much in the way of technology -- with a few exceptions like tracking packages online -- but USPS wants to offer customers more.
Paul Vogel is the man behind USPS' tech efforts. His official title is the president of digital solutions, and right now, he doesn't have much to work with. He's got an office like "a San Jose startup," with only 15 Android/Apple developers, consultants coming and going, one computer and his BlackBerry smartphone.
But Vogel's lack of access to resources isn't the only reason for USPS' restricted technology. A major obstacle is the legislation needed to get permission for new digital products. Also, USPS has a huge instruction manual for just a handful of current products. Adding digital products and security certifications would turn that 1,500-page book into something unimaginable. Furthermore, USPS had losses that amounted to $16 billion last year, and there are legislative proposals to keep making cuts. Hence, more digital tech may not be in the cards right now.
But despite these roadblocks, Vogel and his team are in the midst of making a digital push for USPS. They're currently working on a digital platform called MyPost, which will allow customers to log in and view all packages that they'll be receiving as well as those they've already received instead of searching several different sites that the packages may be coming from.
This is a start, but there are many other pie-in-the-sky ideas USPS would like to take on, such as smartphone apps that would allow customers to scan QR codes on all mail for offers in catalogues or 3D printing at nearby facilities to cut on delivery costs.
Vogel said one of the most important steps forward would be to get more people in his office who could launch these ideas into reality.
Source: The Guardian
2017-09/1580/en_head.json.gz/9136 | 'What' to Precede 'How' in Design'What' to Precede 'How' in Design
Richard Goering, 12/28/2000 07:25 PM EST
Chip and system design environments will undergo a radical restructuring during the next several years, if Richard Newton, dean of engineering at the University of California at Berkeley, is correct. Newton's attitude toward EDA these days is let's figure out what we're designing first and the "how" of the tools will follow.
Newton believes that new silicon methodologies, such as platform-based design, will cause a complete overhaul of the RTL-based ASIC design flow that dominates the EDA industry today.
As former head of Berkeley's Electrical Engineering and Computer Science Department, Newton has helped shape the EDA industry during the past two decades. During that time, a well-worn design flow has developed – one that encompasses RTL synthesis, simulation and physical layout.
Now, Newton believes, we have reached the end of a "methodology" cycle and are no longer sure just what we'll be designing several years from now. We can't predict what tools will look like until we figure out the methodology, he argues.
Still, he said, some general conclusions can be drawn. Newton predicts that hardware and software design will move much closer together, silicon sign-off will occur at a very high level and designers will work in the "language of the problem," not Verilog or VHDL.
"At a certain point, Moore's Law creates enough change that the fundamental problem is methodology," Newton said. "We're just now sorting out what the methodology will be. Change will be driven by methodology and the tools will follow."
One promising new methodology, Newton believes, is platform-based system-on-chip design. Here, hardware architectures are largely predefined and designers create differentiation through programmable logic and embedded software. "I think large markets will be addressed by programmable substrates," Newton said. "They'll be domain-specific platforms."
Another candidate is what Newton calls the 24-hour chip, an idea now being developed by Bob Brodersen, professor of engineering at Berkeley and chair of the Berkeley Wireless Research Center. The goal, said Newton, is to start with a Matlab description, go through automatic synthesis and get a power- and performance-optimized layout in 24 hours.
Either concept could change the world, Newton said. "Both will probably happen, for different market segments and different problems."
At Berkeley, Richard Newton, dean of engineering, is taking a new look at what it means to be an engineer: 'Everyone is working across boundaries.'
Whatever methodologies prevail, Newton believes there are several key challenges facing designers in the 21st century. One such challenge is managing concurrency, which is essential in both hardware and software to overcome latency in high-performance systems.
"Humans can't visualize concurrent systems well and come up with reliable solutions," said Newton. "That is a key research problem we have to solve and it will apply to both hardware and software."
Another big challenge, Newton said, is "building hardware with a deeper understanding of the software context and building software with a deeper understanding of the hardware context."
Looking toward software
Newton believes, in fact, that the next big market for the EDA industry is embedded software development. Here, he sees a need for a new generation of verification tools. "The challenge is how to verify the correctness of systems implemented as software with a lot of concurrency," he said.
One might conclude that Newton is an advocate of hardware/software codesign, but he's never liked that term. "It casts the problem as hardware vs. software, as opposed to a functional problem that gets decomposed into hardware and software as a process of implementation and evaluation," he said.
Newton also said that "system-level design" is a term that means too many different things to too many people. The real issue, Newton said, is the ability of designers to accurately predict power, performance, throughput and application efficiency for a very complex SoC, at a much higher level of abstraction than is possible today.
"Speaking the language of the problem, you need to reliably predict the performance of the silicon," he said. "That's a major research problem we have to solve."
The days of RTL coding are numbered, Newton believes. Instead, he said, designers will write "in a language more specific to the domain at hand." This language could be a set of C language packages and routines, or something else. But it needs to support concurrency and manage memory automatically, he said.
As for silicon signoff, Newton believes it must move well beyond the gate level. Reiterating the views of the Gigascale Silicon Research Consortium (GSRC), a multiuniversity research project that Newton heads, Newton said that signoff should move to "the boundary between architecture and microarchitecture."
In GSRC's lexicon, an architecture is an abstraction that's independent of implementation, while a microarchitecture has an implementation and an instruction set. Or, as Newton puts it, a microarchitecture is the point at which "you have a block diagram that looks like the chip."
Supporting this level of signoff requires incredibly accurate prediction and estimation tools. "We're talking about being able to make final design decisions at the architecture-microarchitecture boundary and never stopping to think twice," Newton said.
Interdisciplinary skills
What about the skill set that will be needed for next-generation electronic design? Newton is thinking about that a lot in his new position as dean of engineering. He is, in fact, working on what he describes as the first major overhaul of how universities teach engineering since the Second World War.
Newton is overseeing top-rated departments in electrical, mechanical and civil engineering. One big problem faced by the School of Engineering, he said, is that two-thirds of the most qualified students want to go into the Department of Electrical Engineering and Computer Science.
Mechanical and civil engineering departments aren't getting the students they need, even though some very interesting work is going on in these areas. For example, Newton noted, much of Berkeley's microelectromechanical systems research takes place in mechanical engineering, and civil engineers are trying to wire every bridge in California to help predict earthquakes.
"We're starting to take a new look at what it means to be an engineer," Newton said. "It's not about civil or mechanical. Everyone is working across boundaries."
At Berkeley, Newton hopes to launch an ambitious program that would erase some of those boundaries and put engineering students into multidisciplinary research groups. This might entail a common engineering program for the first year or two, with specialization afterwards. "We've lost the binding between the names of the departments and the types of problems people solve," Newton said. "We need to morph the structure into one where kids will identify more with what they're doing. The 'how' will follow."
Relativity
Evolution from Classical Theory
The modern theory is an extension of the simpler Galilean or Newtonian concept of relativity, which holds that the laws of mechanics are the same in one system as in another system in uniform motion relative to it. Thus, it is impossible to detect the motion of a system by measurements made within the system, and such motion can be observed only in relation to other systems in uniform motion. The older concept of relativity assumes that space and time are correctly measured separately and regards them as absolute and independent realities. The system of relativity and mechanics of Galileo and Newton is perfectly self-consistent, but the addition of Maxwell's theory of electricity and magnetism to the system leads to fundamental theoretical difficulties related to the problem of absolute motion. It seemed for a time that the ether, an elastic medium thought to be present throughout space, would provide a method for the measurement of absolute motion, but certain experiments in the late 19th cent. gave results unexplained by or contradicting Newtonian physics. Notable among these were the attempts of A. A. Michelson and E. W. Morley (1887) to measure the velocity of the earth through the supposed ether as one might measure the speed of a ship through the sea. The null result of this measurement caused great confusion among physicists, who made various unsuccessful attempts to explain the result within the context of classical theory.
The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2012, Columbia University Press. All rights reserved.
Fujitsu and DERI Revolutionize Access to Open Data by Jointly Developing Technology for Linked Open Data
Promoting open data usage with world’s first freely available storage and query platform that utilizes Linked Open Data
Fujitsu Laboratories of Europe Ltd.
Kawasaki, Japan, Galway, Ireland, and London, England, April 03, 2013
Fujitsu Laboratories Limited, the Digital Enterprise Research Institute (DERI) of the National University of Ireland Galway, and Fujitsu Laboratories of Europe Limited have jointly announced a revolutionary new data storage technology that stores and queries interconnected Linked Open Data (LOD *1), available globally. The technology will be made available free of charge on a cloud-based platform that utilizes LOD, thereby promoting open data usage and enabling the easy utilization of enormous volumes of LOD.
The joint development program is focused on overcoming the challenges presented by the huge quantity of LOD available via the Internet, as well as the difficulties in using and processing LOD effectively. Fujitsu Laboratories, DERI and Fujitsu Laboratories of Europe have developed an LOD-utilizing platform that, using a standard Application Programming Interface (API), is capable of batch searches of billions of pieces of stored LOD data via high-speed search algorithms that are five to ten times faster than before.
With much of the available information in LOD format coming from academic and government institutions, together with individual data sets only being available from the respective organizations’ websites, it has been difficult in the past to determine the data type and location. The new technology overcomes this, incorporating a function to enable visual searches of data required by applications, using a search interface that visualizes data together with its linked information. As a result, users can instantly access the data they need, without requiring application developers to search through individual websites and process the underlying data.
Fujitsu Laboratories Ltd. plans to promote the use of open data, and will be the first in the world to make the newly developed technology freely available to the public, on a cloud-based platform that utilizes LOD. (Limited availability planned from 2013)
The technology will be introduced at the international conference XBRL26, from April 16-18 in Dublin, Ireland in the form of a case study in open data usage. Full details of the technology, along with an enterprise analysis application that employs the technology, will be presented at the conference.
In the US, the public data website Data.gov was launched as part of an open government initiative, and as of March 2013, the national governments of 30 countries have released government data through websites intended specifically for that purpose. In Japan, the Cabinet’s IT Strategic Headquarters promulgated an e-gov open data strategy in July 2012 and has been gradually rolling out a legal framework for open data and the disclosure of public data.
To ensure that public data can be easily processed by machines without relying on proprietary applications, the World Wide Web Consortium (W3C), the main international standards organization for the World Wide Web, recommends Linked Data, a Resource Description Framework (RDF)-based format in which data links are used to connect one piece of data with others, making it easier for data to be discovered and used. The UK's Data.gov.uk public data site also employs the Linked Data format, and as of March 2013, a total of more than 40 billion pieces of data worldwide have been published as Linked Data. The network of data formed by these links is known as Linked Open Data.
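To make the Linked Data idea concrete, the short Python sketch below uses the open-source rdflib library to express a few facts as RDF triples and link them to an external dataset; the resource names and values are invented for illustration and are not taken from the release.

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import XSD

EX = Namespace("http://example.org/")  # hypothetical namespace for the example

g = Graph()
company = EX.AcmeCorp
g.add((company, EX.name, Literal("Acme Corp")))
g.add((company, EX.numberOfEmployees, Literal(250, datatype=XSD.integer)))
# Linking to a resource on another site (DBpedia here) is what turns
# isolated RDF descriptions into Linked Open Data.
g.add((company, EX.industry, URIRef("http://dbpedia.org/resource/Manufacturing")))

# Turtle is a common publication format for Linked Data.
print(g.serialize(format="turtle"))

Each triple is a (subject, predicate, object) statement, and the URIs double as the links that let one dataset point into another.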
Technological Issues
LOD is currently published through websites operated by individual data providers. These data are available through a variety of different access methods, and many of the sites provide their own search functions to find data on their sites. From the perspective of an application developer, however, these individually developed search functions present a number of challenges: 1) It is impossible to easily determine on which site the desired data is located; 2) complex application-side processing is required to combine and process multiple kinds of data; and 3) if a site does not provide its own search function, data cannot be searched. Until now, these problems have been left up to application developers to address, which has become a roadblock to making effective use of LOD.
Newly Developed Technology
Fujitsu Laboratories Ltd. has developed a data store technology that collects and stores LOD published throughout the world and can perform batch searches on multiple kinds of data. This technology provides a search interface for users and features a standard API (SPARQL *2).
Figure 1: Overview of LOD data store technology
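The release does not document the platform's own endpoint, so the sketch below only illustrates what a query against such a standard SPARQL API typically looks like, using Python's SPARQLWrapper library and DBpedia's public endpoint as a stand-in.

from SPARQLWrapper import SPARQLWrapper, JSON

# DBpedia's endpoint stands in for the Fujitsu platform, which is assumed
# to accept the same kind of SPARQL request.
endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
endpoint.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?company ?employees WHERE {
        ?company a dbo:Company ;
                 dbo:numberOfEmployees ?employees .
    }
    LIMIT 10
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()

for row in results["results"]["bindings"]:
    print(row["company"]["value"], row["employees"]["value"])

Because the query language is standardized, an application written this way needs no site-specific search code for each data provider, which is exactly the problem described under "Technological Issues" above.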
Key features of the technology are as follows.
1. Distributed search technology adapted for an LOD link structure
When collecting data in a single place, the massive data structure created by the links between pieces of data requires special handling. Not only is it difficult simply to handle the larger data volumes, but developing a technology for rapidly searching complex data link structures has also proved to be a challenge. In particular, when searching for common elements that are linked together within data, it is necessary to perform comprehensive matching (cross-referencing) of massive data sets, which can lead to performance deterioration.
To facilitate search processing that requires this kind of cross-referencing, Fujitsu Laboratories Ltd. has developed a caching structure that is specifically adapted to LOD. This is employed in combination with distributed processing, thereby enabling a five- to tenfold speed improvement. More specifically, each distributed server can perform cross-referencing processing by adjusting search conditions to reduce the workload in the master server and hence shorten overall processing time. Based on the fact that links in LOD link structures are typically concentrated in only a portion of the nodes, and by taking advantage of past usage frequency, Fujitsu Laboratories Ltd. has also developed an algorithm that efficiently caches only the data that is heavily accessed in cross-referencing. The new algorithm has made it possible to reduce disk accesses and thereby accelerate searching.
Figure 2: Overview of search algorithm
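The release does not spell out the caching algorithm, so the Python sketch below is only a rough, generic illustration of the idea it describes: keep the link data that is accessed most often in memory, and fall back to the slower store for everything else.

from collections import Counter

class FrequencyCache:
    """Minimal frequency-based cache: evicts the least-used entry when full."""

    def __init__(self, capacity, backing_store):
        self.capacity = capacity
        self.backing_store = backing_store  # callable standing in for the on-disk store
        self.cache = {}
        self.hits = Counter()

    def get(self, key):
        self.hits[key] += 1
        if key in self.cache:
            return self.cache[key]        # hot data served from memory, no disk access
        value = self.backing_store(key)   # cold data read from the slow store
        if len(self.cache) >= self.capacity:
            coldest = min(self.cache, key=lambda k: self.hits[k])
            del self.cache[coldest]       # evict the least frequently used entry
        self.cache[key] = value
        return value

A real implementation would also have to coordinate the cache across the distributed servers mentioned above and account for the skewed link structure of LOD, where a small fraction of nodes attract most of the references.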
2. Search interface enables bird’s-eye view of LOD
The Search interface is designed to give application developers a better grasp of what kinds of data are located where. In addition to enabling searches across all data, the interface also allows searches based on statistical data that expresses the usage frequency and pervasiveness of data, as well as searches using the licenses that are assigned to data. This, in turn, makes it far easier to home in on desired data. Search results are displayed in visual form, together with the links that connect data (Figure 3), enabling application developers to visually ascertain the information they need.
Figure 3: Sample search interface
By employing the newly developed technology, it is possible for application developers to obtain the information they need in a single location, without having to search across multiple public data websites. By using a standard API, developers can easily develop applications that freely combine a variety of data published through LOD.
As a case study, Fujitsu Laboratories Ltd. has developed an enterprise analysis application that employs the new technology. The application combines several datasets published through LOD, such as basic company information (industry category, number of employees, etc.), public financial information (revenues, profit, etc.), and stock prices, making it possible to analyze a company's performance instantly from multiple angles.
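A rough sketch of the kind of aggregation such an application performs is shown below; each dictionary stands in for the result of a separate query against the platform, and all names and figures are invented.

# Hypothetical query results keyed by a company identifier.
basic_info = {"ex:AcmeCorp": {"industry": "Manufacturing", "employees": 250}}
financials = {"ex:AcmeCorp": {"revenue": 1.2e9, "profit": 9.0e7}}
stock_data = {"ex:AcmeCorp": {"share_price": 42.10}}

def company_profile(company_id):
    """Merge the three LOD-derived views into a single record for analysis."""
    profile = {"company": company_id}
    for source in (basic_info, financials, stock_data):
        profile.update(source.get(company_id, {}))
    return profile

print(company_profile("ex:AcmeCorp"))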
From 2013, Fujitsu plans to incorporate the newly developed technology, free of charge, on a cloud-based platform that utilizes LOD, as part of its efforts to promote open data usage. In addition, Fujitsu will move to apply this technology to its business divisions involved in data utilization, leveraging it across an array of fields.
[1] Linked Open Data
A dataset published in the Linked Data format. As of March 2013, some 340 public-data sites used it to publish a total of 40 billion pieces of data. A typical example is DBpedia, which converts information from Wikipedia into the Linked Data format.
[2] SPARQL
A query language for RDF established by W3C. It is also used for searches of Linked Data.
About Fujitsu Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012.
For more information, please see: http://www.fujitsu.com
About Fujitsu Laboratories Founded in 1968 as a wholly owned subsidiary of Fujitsu Limited, Fujitsu Laboratories Limited is one of the premier research centers in the world. With a global network of laboratories in Japan, China, the United States and Europe, the organization conducts a wide range of basic and applied research in the areas of Next-generation Services, Computer Servers, Networks, Electronic Devices and Advanced Materials.
For more information, please see: http://jp.fujitsu.com/labs/en
About DERI With over 140 researchers, DERI is one of the world's leading international web science research institutes, and has a specific focus on the Semantic Web and Networked Knowledge. DERI is a Centre for Science, Engineering and Technology (CSET) established in 2003 with funding from Science Foundation Ireland (SFI). As a CSET, DERI brings together academic and industrial partners to boost innovation in science and technology, with its research focused on the SemanticWeb. DERI has leveraged its SFI CSET funding to add significant additional research funding from the European Union, Enterprise Ireland, and industry sources.
For more information, please see: http://www.deri.ie
About Fujitsu Laboratories of Europe Limited Fujitsu Laboratories Limited has had an active presence in Europe since 1990, forming Fujitsu Laboratories of Europe Limited in 2001. The company's groundbreaking work is closely aligned to the future needs of the business community, focused on making future technologies a reality for today's businesses. Fujitsu Laboratories of Europe aims to shorten the R&D cycle to put cutting edge technologies into customers' hands as quickly as possible, enabling businesses to gain a tangible competitive advantage. Close collaboration with leading academics and experts Europe-wide forms a central element of Fujitsu Laboratories of Europe's approach, ensuring the effective pooling of expertise with other pioneers in any given field of research. Fujitsu Laboratories of Europe also participates in a number of EU research initiatives, bringing together the joint expertise of industry and academia to accelerate the development and use of new technologies on a pan-European basis.
For more information, please see: http://www.fujitsu.com/emea/about/fle/
Public and Investor Relations Division Website: https://www-s.fujitsu.com/global/news/contacts/inquiries/index.html Company: Fujitsu Limited
Software Systems Laboratories, Intelligent Technology Lab.
E-mail: lod@ml.labs.fujitsu.com
Company: Fujitsu Laboratories Ltd.
All company or product names mentioned herein are trademarks or registered trademarks of their respective owners. Information provided in this press release is accurate at time of publication and is subject to change without advance notice.
Date: 03 April, 2013
City: Kawasaki, Japan, Galway, Ireland, and London, England
Fujitsu Laboratories Ltd., Digital Enterprise Research Institute, Fujitsu Laboratories of Europe Ltd.
This Cambridge Chapel's Light Display Is a Glorious Ode to Knowledge
By Jennifer Ouellette
The interior of a Cambridge chapel comes alive with light in an eye-popping immersive projection art installation. It's the creation of artist Miguel Chevalier, and the installation was commissioned by the University of Cambridge for a fundraiser last month — the first time the university has invited an artist to do such a project in its famed King's College Chapel. It was a one-time event, with speeches by such famous Cambridge alumni as Sir Ian McKellen and Sir David Attenborough, among other luminaries. But fortunately filmmaker Claude Mossessian was there to record it for posterity.
In the short film below, we are introduced to the cathedral's glorious late Gothic architecture, both inside and out. And then we see the cathedral's interior transformed by multi-coloured light playing off all those period details. The imagery evokes the many fields of human knowledge pursued at Cambridge (and around the world): history, literature, botany, biology, neuroscience, physics, astronomy, and finally, cosmology.
As Chevalier's ode to the cosmos flickers on the ceiling and walls that surround the audience, Cambridge's most famous scholar, Stephen Hawking, gives a short homage to all that the university has given to him — and to the world:
"I was lucky to be given the space to think and I chose to think about space. I was lucky to find a place in the world where I could make my mark. But neither Cambridge nor I have finished yet. To transform all of our futures, we can and we must do more, because we have many more world-changing ideas to give. But Dear World — we cannot do this without you."
[Via Lost at E Minor]
A sense of self
Our sense of smell is directly connected to the brain's limbic system which stores our memories and emotions. It is this direct link, via the olfactory bulb connecting nose to brain, which enables fragrances to trigger our memories and stir our emotions in a way that nothing else can.
Science has proved the so-called Proustian Phenomenon: memory triggered by familiar smells, typically those learned in childhood, activate memories that are emotional, vivid and evocative.
In a smell survey of over 26,000 people, 50 per cent of the respondents in their 20s reported that a vivid memory was cued by one of the odours given to them.
Our memories can fail us in other ways, yet smell-linked memories – and reactions - may be longer-lasting than other aspects of the same experience. Smell therefore contributes to our unique sense of self.
Fragrances can communicate cleanliness, freshness and softness; they can affect mood; they can trigger allure and attraction, enhancing our unique, natural body odours; they can enhance our sense of taste; and they can communicate clear messages in a way that nothing else can.
Fragrances are powerful and without fragrance, our connection with the world around us would be diminished.
A sense of understanding
IFRA UK promotes understanding of the importance of the industry's Standards by making these freely available to all – members and non-members alike. This transparent approach to promote a sense of understanding is something of which the sector is proud.
IFRA UK also supports understanding of the creativity, safety methods and adherence to sustainability targets amongst those who create fragrances, those who apply them to the millions of products in which fragrance is used, and to people who enjoy using those products.
Finally, fragrances often help not only to enhance lives through the enjoyment of perfumed products but can even directly help develop a better understanding of the world for those people with mental health problems or other impairments, because scents provide a direct conduit between the brain and the outside world. Children with serious multiple sensory impairments, for example, or elderly people with dementia, have been able to lead fuller lives as a result of using fragrance to improve their connection with the world.
A sense of creativity
The fragrance sector is a creative industry combining both art and science.
The creative innovation of IFRA UK’s members and IFRA members worldwide is the bedrock of a value chain encompassing the growing, gathering and processing of raw ingredients; the creation of proprietary fragrance blends; and the manufacturing, marketing, distribution and sale of consumer and other products containing fragrances. Indeed, for many products, the fragrance they contain is the main factor differentiating this one item from its competitors.
The industry’s success in satisfying our needs, wants and desires depends on the individual inspiration of just 900 of its 32,000 employees: the expert, qualified perfumers.
Creativity may imply immediate competency, but this is far from being the case. Becoming a perfumer takes many years of study, training and practice: each 'nose' requires the knowledge and discipline gained from an investment of over seven years of education and training.
Each of the 60,000 to 80,000 unique proprietary fragrance blends that are created and sold world-wide each year is typically made up of between 50 and 250 ingredients drawn from a palette of 3,000 essential oils, natural aromatics and the complex molecules that are the products of the industry’s extensive research and development programme.
The significant investment in human artistic and scientific creativity is supported by expert software and knowledge codification.
At the heart of these investments is expenditure on perfumers: a select group of experts skilled in the creation of unique smell experiences. Of the 900 or so fully qualified perfumers, in the world around two-thirds work in Europe.
A sense of responsibility
The fragrance industry takes its environmental, social and financial responsibilities very seriously. This sense of responsibility encompasses human health and well-being as well as sustainability issues.
All the ingredients and compounds used by the fragrance sector are rigorously assessed for safety; always at the forefront of our work are safety controls and research to avoid toxicity and allergen issues. IFRA works closely with regulators, customers and others to issue and update comprehensive safety standards which are made freely available to all.
IFRA’s Code of Conduct prescribes best practice and compliance and these demanding standards must be adhered to by all members.
Sustainability is of prime importance to those working in the fragrance industry, with attention paid throughout to sustainable harvesting of natural ingredients, investment in sound harvesting practices, the use of sound science, sustainable processing and production and in the manufacture and distribution of fragrance blends.
For a summary of some of the sustainability activities of IFRA UK and its members, see the article in SPC magazine April 2013, a link to which is provided here…
A sense of reassurance
By using an IFRA UK member, companies whose products employ fragrance as a key ingredient have the reassurance that they are dealing with professionals for whom quality, safety, compliance and excellence are considered essential.
IFRA UK members benefit from being part of a global network which directly sustains some 32,000 jobs and creates $5.2 billion of gross value added annually across the world.
Fragrance is an important ‘platform technology’ that underpins a wider economic value chain which is estimated to support 778,000 jobs worldwide and create a further $33.6 billion of gross value added value.
The sector produces between 60,000 and 80,000 unique proprietary fragrance blends each year, representing a lasting economic and cultural resource in their own right.
The fragrance industry’s contribution to society via its responsible approach to every aspect of fragrance sourcing, creation and distribution is hugely enhanced by the reassurance that is provided by IFRA membership.
A sense of prosperity
Fragrance blends for personal care products account for half of the industry’s business while household care products and fine fragrances make up a quarter each. But as well as the consumer products that answer many of our functional and emotional needs, wants and desires, the fragrance industry also creates employment and prosperity. Its continuing investment in knowledge based innovation, understanding and creativity returns a substantial economic contribution to society, directly sustaining some 32,000 jobs and creating $5.2 billion of Gross Value Added annually. And as the platform technology that underpins a uniquely durable value chain, its wider economic value is hard to overestimate. In the manufacturing and retail of fragrance products the industry is estimated to support an additional 778,000 jobs worldwide and create a further $33.6 billion of Gross Value Added. In the European Union alone, which accounts for 37% of the industry’s sales, its overall contribution to GDP has been estimated at 1.2%.
And beyond manufacturing and retail, of course, the fragrance industry’s value chain also supports the transport and logistics industries as well as marketing services and creative industries such as research, graphic design, advertising and magazine publishing.
Meanwhile, the trade secrets that protect the intellectual property vested in the 60,000 to 80,000 unique proprietary fragrance blends the industry creates each year represent a lasting economic and cultural resource in their own right.
Source: The Huggard Consulting Group: ‘The Socio-Economic Impact Of Fragrance Technologies’ : July 2012.
The International Fragrance Association UK promotes the safe creation, development and enjoyment of fragrance on behalf of its members.
IFRA UK is part of the global federation, IFRA, which ensures consistent standards worldwide.
Fragrance contributes to our sense of well-being. Our sense of smell and the fragrances designed by our members can affect our moods and emotions in a way that nothing else does.
Our sense of smell is a unique door to the way we interpret and enjoy the world around us. IFRA UK represents those who play an important role by providing the key to that door.
AT&T lands exclusive access to 64GB HTC One
updated 10:46 pm EDT, Sat March 30, 2013
High-capacity HTC One to be AT&T exclusive
HTC One enthusiasts looking for a higher-capacity model can now start planning their upgrades or carrier hops, as AT&T has announced that it will exclusively carry the 64GB model of HTC's forthcoming flagship. The announcement came Saturday in a video posted to the carrier's YouTube page. AT&T will also carry the 32GB version of the One. AT&T's video gives no details on an exact launch date for the forthcoming HTC flagship. The most specific details available so far (via Engadget) give "April" as a launch window, which conforms to prior reports regarding the US launch for the smartphone.
The One's arrival on American shores will come after a substantial delay brought on by the parts and component sourcing necessary for the device. The delay is said to stem in part from HTC's choice to go with a premium build quality for the One, which is thought to be a considerable bet for the company. iFixit subjected the One to a teardown, finding that it is very difficult to disassemble for repair due to its construction.
Electronista went hands-on with the One at its late-February unveiling. We found the build quality to be superb, with the One feeling like a high-end smartphone should. We were also impressed with the quality of the One's built-in camera, though we would have liked to have seen a pure Android experience instead of HTC's Sense overlay. | 科技 |
Provided by Games Press Tuesday, April 20th 2010 at 6:15PM BST
Having become one of the best-selling games on the Xbox Live Indie Game store, The Impossible Game is now available on iPhone, iPod Touch and iPad! The Impossible Game is a minimalist platformer with only one button, and has been described as "platforming taken back to its roots" by GameTrailers.com.
Jumping over a series of spikes, pits and blocks may sound easy, but The Impossible Game, as its name implies, could possibly be the hardest game ever made. Set to a pounding techno beat, the game is absolutely unrelenting in its difficulty and a quick search on Twitter will find hundreds of players putting their gaming skills to the ultimate test. No wonder Eurogamer.net described it as "bafflingly addictive".
New features for the iPhone/iPod Touch version include: a new, improved level layout; a statistics page listing (among other things) how many times you've died; and Twitter and Facebook integration, so players can boast to their friends about how far they can get! A free browser-based demo is available now at http://www.flukedude.com/theimpossiblegame and a "lite" version of the game will be released onto the App Store shortly.
The Impossible Game is the creation of British indie game developer FlukeDude. His forthcoming projects include "Sci-Fighters", a four-player party game with graphics by Dim of the Super Flash Bros and voice contributions from Egoraptor, author of the "Awesome" flash animations. The Impossible Game is available now on the App Store for the bargain price of 99 cents (£0.59), as well as on the Xbox Live Marketplace for 80 Microsoft points.
Games Press is the leading online resource for games journalists. Used daily by magazines, newspapers, TV, radio, online media and retailers worldwide, it offers a vast, constantly updated archive of press releases and assets, and is the simplest and most cost-effective way for PR professionals to reach the widest possible audience. Registration for the site and the Games Press email digest is available, to the trade only, at www.gamespress.com.
May 22 2014, 3:06 pm ET
Global Warming Linked to Frigid U.S. Winter, Scientist Says
by John Roach
A person walks on a deserted pedestrian walkway on a snowy morning in New York, on Dec. 10, 2013. Seth Wenig / AP file
The extreme cold and snow across the eastern half of the United States this past winter makes global warming seem laughable. But, paradoxically, the blasts of polar air were fueled in part by planet-warming gases, according to a new paper. In particular, the gases helped plow heat into the tropical western Pacific Ocean that, in turn, drove the jet stream further north toward the Arctic before it funneled cold, snowy weather over the Midwest and East Coast, explained Tim Palmer, a climate physicist at the University of Oxford in the United Kingdom.
The sea surface temperatures in the western Pacific were "probably the warmest ever recorded this past year," he told NBC News. "In fact, consistent with that, we had these fantastically strong tropical typhoons in the western Pacific, not least Haiyan which broke all records of wind strength."
In addition to pounding the sea and land surface with wind and rain, typhoons and other tropical storms also release energy into the upper atmosphere. This energy, in turn, excites so-called Rossby waves in jet streams, the rivers of air that snake around the globe and produce weather patterns at the surface. The extra stormy weather in the tropical western Pacific generated extra-large waves in the Northern Hemisphere jet stream, positioning it to send cold weather to the eastern United States, according to Palmer.
The theory, outlined today in the journal Science, builds on research Palmer conducted in the 1980s and resonates with more recent work led by Kevin Trenberth, a climate scientist at the National Center for Atmospheric Research in Boulder, Colo. "Everything he has got here is fine," Trenberth told NBC News after reviewing the paper, adding that the connection between the cold winter and climate change can be made even stronger by quantifying the warming waters and storminess in the tropical western Pacific.
Warm waters pile up
The build-up of heat in the tropical western Pacific has also been tied to another head scratcher in climate science: the apparent pause, or hiatus, in surface temperature warming since the major El Niño winter of 1997-98. During an El Niño, east-to-west trade winds collapse or reverse, allowing warm waters from the tropical western Pacific to dissipate along the equator toward South America, which influences weather patterns around the world. For example, heavy rains often fall on California and drought hits Australia.
Since the late 1990s, the east-to-west trade winds have "gotten stronger and more persistent than normal and what this has been doing is piling warm water up in the western Pacific," Palmer said. These winds have essentially blown the excess heat generated under continued greenhouse warming into the deep layers of the ocean, according to recent research on the surface warming hiatus.
"My speculation is that that has led to a couple tenths of a degree on top of this already very, very warm water that is being piled up by the trade winds," Palmer said. "So, we've gotten, in my view, a kind of subtle interaction between natural climate variability and man-made climate change." This interaction led to locally devastating tropical cyclones and amplified waves in the jet stream enough to account for this past winter's record cold in the United States and record wet weather in the United Kingdom, he added.
Arctic influence?
Pinning the cold weather on interactions between global warming and natural variability in the tropical western Pacific carries more clout than a controversial hypothesis that the winter chill was driven by rapid warming in the Arctic, Trenberth noted.
According to that theory, advanced by Jennifer Francis at Rutgers University in New Brunswick, New Jersey, the Arctic is warming faster than the rest of the globe, which in turn has caused the northern polar jet stream to become wavier. Some of those bigger bends in the jet stream bring cold Arctic air further south, which could also explain the winter chill in the eastern United States.
That theory is controversial, explained Palmer, mostly because the Arctic is considered too small of an area to have a major influence on global weather patterns. In addition, what happens at the surface in the Arctic — which is definitely warming — stays at the surface, unlike the western Pacific where heat from tropical storms is released to the upper atmosphere where it influences the jet stream.
Francis noted in an email to NBC News that precipitation in the tropical western Pacific has been consistently strong for the past seven years, including the winter of 2011-2012 when the eastern United States had an exceptionally warm winter. "Because the tropical rainfall patterns in these two years were very similar but the temperature extremes in the two winters in the U.S. were opposite, the tropical pattern cannot be the only reason for the persistent cold spell this winter," she said.
"I cannot say that rapid Arctic warming played a role in either of these unusual winters in the U.S.," she added, "but the very amplified (wavy) jet stream during both years is consistent with the sorts of patterns we expect to occur more frequently as the Arctic continues to warm."
Cold winters on their way out?
According to Palmer's research, cold winters will overall become rarer as the concentration of greenhouse gases increases in the atmosphere and the planet continues to warm. "In fact, I'm sort of expecting in the near future they will probably become much less common because all evidence suggests we are heading into an El Niño year when all of this warm water that was previously piled up in the western Pacific will flow back across the Pacific," he said.
If that is indeed the case, he added, the hiatus too will likely come to an end and "global mean temperatures will start to rise at the rate that they were doing back in the early 1990s."
First Published May 22 2014, 12:25 am ET
Written by / Agency / Source: Infosys Technologies Ltd
Anglian Water Taps Infosys to Support Collaboration Drive
London, United Kingdom, 2012/12/14 - Refreshed 'Hawk' platform helps over 4,500 staff manage every drop of data - Infosys.com. NYSE: INFY
Anglian Water aims to make the most of every drop of water. Now Infosys, its new consulting and technology partner, is helping the water utility make the most of every drop of data as well. Anglian Water is working with Infosys to transform its knowledge management systems as part of a data management overhaul to improve employee and customer communications and collaboration.
Infosys has designed and deployed a new Enterprise Content Management system that will improve workflow management by ensuring staff access to up-to-date information and removing duplication of materials. The system will power Anglian Water’s intranet portal - Hawk (Harnessing Anglian Water Knowledge) - in addition to supporting the creation of reliable reports and submissions for the water industry's regulators. By integrating structured and unstructured data sources into one user-friendly portal, Anglian Water will have a unified view of information and be able to respond faster to business and customer needs.
The system uses a unique combination of Microsoft SharePoint and Open Text Livelink® technologies, the first implementation of its kind in the UK. Bringing together these complementary tools lets Anglian Water create a new corporate information structure (or taxonomy) to modernize the way its employees store and share information.
For example, it is now possible for field engineers to record, upload and download photos and documents from their laptops directly and in real-time so that a complete and up-to-date record of an incident and associated documents are instantly available companywide.
The program includes making the refreshed Hawk portal available to over 4,500 Anglian Water employees on a variety of devices, and the migration of over one million documents. Anglian Water and Infosys will continue to develop the Hawk system by bringing in additional content and data feeds which will also include social media, GIS and SAP integration.
Richard Boucher, Business Change and Strategy Director at Anglian Water said: "The Enterprise Content Management project is an important step toward our goal of being a frontier performer in our industry, and improving the service we provide our customers. Infosys has combined a thorough understanding of our industry with the technical skills and user experience design to build a system that creates a new way of working. It makes our business more efficient by giving our people corporate information they need at their fingertips. We're really excited to have this new platform and we are already looking at ways we can further benefit from it."
Mukul Gupta, Vice President and Head of Energy, Utility and Services in Europe at Infosys said: "By creating a flexible architecture we have helped Anglian Water focus on running an information management system that meets the needs of their business rather than be dictated by the underlying technology architecture. This will become even more important as unstructured data sources from social media become integrated into the system. While Anglian Water aims to make to most of every drop of water, we will help them make the most of every drop of data."
The project has been selected as an IT Initiative of the Year finalist in the Utility Week Achievement Awards, 2012.
About Anglian Water
Anglian Water supplies water and wastewater services to more than six million domestic and business customers in the east of England. We look after 20% more people than we did 20 years ago, but we still supply just the same amount of water today as we did in 1990 – almost 1.2 billion litres every single day. That's because we've invested heavily in improving our network, and because we've helped our customers become more water efficient.
Our huge region stretches from the Humber north of Grimsby to the Thames estuary in the south, and from Buckinghamshire in the west to Lowestoft on the east coast. Our 112,833 km of water and wastewater pipes could take us a quarter of the way to the moon. They supply and transport water across an area of 27,500 square km – geographically the largest area covered by any water company in England and Wales. Because of the size of our region, we operate 1,257 water and wastewater treatment works. This is around a quarter of all those in England and Wales.
Our region is one of the driest regions in the country, with an average of just 600 millimetres of rain falling each year – a third less than the rest of England. Approximately one quarter of the area we serve is below sea level. This means we have to be aware of the risk of flooding, as well as the threats posed by a shortage of water.
About Infosys
Infosys (infosys.com) partners with global enterprises to drive their innovation-led growth. That's why Forbes ranked Infosys 19th among the top 100 most innovative companies. As a leading provider of next-generation consulting technology and outsourcing solutions, Infosys helps clients in more than 30 countries realize their goals. Visit infosys.com and see how Infosys (NYSE: INFY), with its 150,000+ people, is Building Tomorrow's Enterprise® today.
Certain statements in this release concerning our future growth prospects are forward-looking statements, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, and unauthorized use of our intellectual property and general economic conditions affecting our industry. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2012 and on Form 6-K for the quarters ended December 31, 2011, June 30, 2012 and September 30, 2012.These filings are available at sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the company's filings with the Securities and Exchange Commission and our reports to shareholders. The company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the company.
Publisher Contact: Paul de Lara - Infosys.com +44(0)20 7516 2748 Paul_delara[.]infosys.com
IMPORTANT INFORMATION: Issuance, publication or distribution of this press release in certain jurisdictions could be subject to restrictions. The recipient of this press release is responsible for using this press release and the information herein in accordance with the applicable rules and regulations in the particular jurisdiction. This press release does not constitute an offer or an offering to acquire or subscribe for any Infosys Technologies Ltd securities in any jurisdiction including any other companies listed or named in this release.
Lake-Effect Snow Sometimes Needs Mountains
Released: 18-Feb-2013 11:00 PM EST
Source Newsroom: University of Utah
Monthly Weather Review
Credit: Jim Steenburgh, University of Utah.
Erik Steenburgh of Salt Lake City skis deep powder during a lake-effect snowstorm at Alta Ski Area in Utah's Wasatch Range in April 2011. Steenburgh is the son of Jim Steenburgh, a University of Utah atmospheric scientist. The elder Steenburgh and Trevor Alcott of the National Weather Service have published a new study showing that mountains surrounding the Great Salt Lake play an important role in triggering some lake-effect snowstorms.
This map shows how mountains surrounding Utah's Great Salt Lake interact with the lake to cause some "lake-effect" snowstorms. Air masses from the north-northwest are channeled by mountains north of the lake so they converge above the lake. The air picks up heat and moisture from the lake, so it rises, cools and produces snow as it is funneled into the Salt Lake Valley by surrounding mountains.
Newswise — SALT LAKE CITY, Feb. 19, 2013 – University of Utah researchers ran computer simulations to show that the snow-producing "lake effect" isn't always enough to cause heavy snowfall, but that mountains or other surrounding topography sometimes are necessary too.
The study is relevant not only to forecasting lake-effect storms near the Great Salt Lake, Sea of Japan, Black Sea and other mountainous regions, but also sheds light on how even gentle topography near the Great Lakes helps enhance lake-effect snowstorms, says the study's senior author, Jim Steenburgh, a professor of atmospheric sciences.
"It is going to help us with weather prediction – helping forecasters recognize that in some lake-effect events, the mountains or hills can play an important role in triggering lake-effect snow bands" over large bodies of water, Steenburgh says.
The new study was published today in the American Meteorological Society journal Monthly Weather Review. Steenburgh co-authored the study with first author and University of Utah Ph.D. student Trevor Alcott, now at the National Weather Service in Salt Lake City. The research was funded by the National Science Foundation and the National Oceanic and Atmospheric Administration's National Weather Service.
Lake-effect snow's "negative impacts are in terms of snarling traffic and potentially closing schools," Steenburgh says, "And then there is the positive benefit for Utah skiers, which is a deep powder day that boosts our winter sports economy."
The lake effect occurs when a cold mass of air moves over a large body of warmer water, picking up moisture and heat to make the air mass rise and cool, ultimately dumping snow downwind. It already was known that lake-effect snowfall increases as the moist air rises over mountains. But the new study shows something new and different: mountains sometimes are essential to triggering the lake-effect over the lakes themselves.
"Most people recognize that mountains get more precipitation than lowlands because of moist air being lifted over the mountains," Steenburgh says. "Everybody recognizes that it plays a role in lake-effect storms. What we're showing here is a situation where the terrain is complicated – there are multiple mountain barriers, not just one, and they affect the air flow in a way that influences the development of the lake-effect storm over the lake and lowlands, rather than just over the mountains."
He says weather forecast models now fail to adequately include the Wasatch Range, which runs north-to-south directly east of the Great Salt Lake and the Ogden-Salt Lake City-Provo metropolitan area. Forecast models also don't include northern mountains along the Nevada-Idaho-Utah border, located northwest and north of the Great Salt Lake, Salt Lake metropolitan area and Wasatch Range. "That may be one of the reasons we struggle" in forecasting lake-effect storms in Utah's major cities, Steenburgh says.
Indeed, the new study shows that without the northern mountains, more cold and moist air would flow south from Idaho's Snake River Plain, pass over Utah's Great Salt Lake and drop much more snow on the Salt Lake City area and Wasatch Range than real lake-effect storms.
How Mountains Can Contribute to the Lake Effect
The new study examined a moderate lake-effect snowstorm that hit metropolitan Salt Lake City and the Wasatch Range on Oct. 26-27, 2010. Some Salt Lake Valley cities had no snow, while others got up to 6 inches.
The Alta Ski Area in the Wasatch Range got 13 inches of fresh snow, although it hadn't yet opened for the season.
The researchers found three key mountain-related or "orographic" factors were necessary to produce the October 2010 lake-effect storm, and that in this storm, the lake effect occurred only because of the interaction between the Great Salt Lake and surrounding mountains:
-- Air moving over northern-northwestern Utah's mountains warms and dries as it flows downslope to the south and southeast toward the Great Salt Lake. Without this warming and drying, the lake-effect from the Great Salt Lake would be even stronger, the study found. This warm, dry downslope flow is similar to what are known as Chinook winds on the east slope of the Rocky Mountains or foehn winds flowing off the Alps.
-- The northern mountain ranges deflect the south- and southeast-flowing cold air masses so they converge over the Great Salt Lake. As the air picks up heat and moisture from the lake, it warms, making land breezes blow onto the lake from its western and eastern shores. The convergence of air from the north, east and west makes the air rise and cool, producing lake-effect snow bands over the lake.
-- The Wasatch and Oquirrh ranges, which respectively form the east and west boundaries of the Salt Lake Valley, act as a big funnel, forcing air flowing south off the Great Salt Lake to move directly into the valley, further enhancing air convergence and snowfall.
Simulating the Lake Effect
Since 1998, the Great Salt Lake has helped generate from three to 20 lake-effect snowstorms each winter, most of them relatively small and affecting some areas but not others. The average is about a dozen lake-effect storms each winter, Steenburgh says. Lake-effect storms account for about 5 percent to 8 percent of the precipitation south and east of the Great Salt Lake from mid-September to mid-May.
"There is a rich spectrum of lake-effect snowstorms," Steenburgh says. "Some cover a wide area with light snowfall. Some organize into more intense snow bands that produce heavier snow over a smaller area. Over the Great Salt Lake, 20 percent of our lake-effect events are these narrow intense bands."
The Oct. 26-27, 2010 weather event "wasn't a huge snowstorm," but fell between the two extremes, he adds.
An earlier study by Steenburgh showed that a Great Salt Lake-effect storm on Dec. 7, 1998 really was the result of the lake effect alone, like many storms from the Great Lakes. But the new study showed that during the lake-effect storm on Oct. 26-27, 2010, something quite different happened: Lake-effect snows fell only because of a synergistic interaction among the Great Salt Lake and mountains downstream.
Alcott and Steenburgh used weather data and existing weather research software to construct a "control" computer simulation that closely mimicked the real 2010 storm by incorporating the effects of the Great Salt Lake and surrounding mountains. Then they reran the simulation multiple times, but with various aspects missing. "We can play God with this model," Steenburgh says. "We can see what happens if the upstream terrain wasn't there, if the lake wasn't there, if the Wasatch Range wasn't there."
The simulations showed that the lake effect would not have happened in the storm were it not for both the lake and downstream mountains (the five runs are also summarized in the sketch after this list):
-- A "flat, no-lake" simulation removed the lake and all mountains, as if all the landscape around Salt Lake City was flat and empty. The result was no snowfall.
-- The "no-lake" simulation removed the Great Salt Lake, but included all the surrounding mountains. This produced 90 percent less snow than the simulation of the real storm.
-- A "flat" simulation included the Great Salt Lake but no mountains. The result was 94 percent less snow than the simulated actual storm.
-- The "Wasatch only" simulation included the lake and Wasatch Range, but not the northern Utah mountains or the Oquirrh and Stansbury ranges west of the Salt Lake Valley. It produced 27 percent less snow than the simulation of the real storm.
-- A "downstream only" simulation, which included the lake and the Wasatch and Oquirrh ranges downstream from the lake, but not the upstream mountains in northern Utah, produced 61 percent more snowfall than the simulated real storm because, without the northern mountains, more moist and cold air moves south from Idaho directly to the Great Salt Lake.
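To see the scenario comparison at a glance, the short Python sketch below simply tabulates the reported results relative to the control run; it is an illustration of the published figures, not the researchers' model code.

# Snowfall in each simulation relative to the control run, which included
# both the Great Salt Lake and all surrounding mountains.
scenarios = {
    "flat, no-lake":   None,   # no snowfall at all
    "no-lake":        -0.90,   # 90 percent less snow than the control
    "flat":           -0.94,
    "Wasatch only":   -0.27,
    "downstream only": 0.61,   # 61 percent MORE snow than the control
}

for name, change in scenarios.items():
    if change is None:
        print(f"{name:>16}: no snowfall")
    else:
        print(f"{name:>16}: {change:+.0%} vs. control")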
FDA says fast-growing salmon would not harm nature
The FDA releases its environmental assessment of the AquaAdvantage salmon, which has been subject to a contentious, yearslong debate at the agency. By The Associated Press
WASHINGTON — Federal health regulators say a genetically modified salmon that grows twice as fast as normal is unlikely to harm the environment, clearing the way for the first approval of a scientifically engineered animal for human consumption.
The Food and Drug Administration on Friday released its environmental assessment of the AquaAdvantage salmon, a faster-growing fish which has been subject to a contentious, yearslong debate at the agency. The document concludes that the fish “will not have any significant impacts on the quality of the human environment of the United States.” Regulators also said that the fish is unlikely to harm populations of natural salmon, a key concern for environmental activists.
This AquaBounty Technologies photo shows two same-age salmon, a genetically modified salmon, rear, and a non-modified salmon, foreground. AP
The FDA will take comments from the public on its report for 60 days before making it final.
The FDA said more than two years ago that the fish appears to be safe to eat, but the agency had taken no public action since then. Executives for the company behind the fish, Maynard, Mass.-based Aquabounty, speculated that the government was delaying action on their application due to push-back from groups who oppose genetically modified food animals.
Experts view the release of the environmental report as the final step before approval.
If FDA regulators clear the salmon, as expected, it would be the first scientifically altered animal approved for food anywhere in the world.
Critics call the modified salmon a “frankenfish.” They worry that it could cause human allergies and the eventual decimation of the natural salmon population if it escapes and breeds in the wild.
AquaBounty has maintained that the fish is safe and that there are several safeguards against environmental problems. The fish would be bred female and sterile, though a very small percentage might still be able to breed. The company said the potential for escape is low. The FDA backed these assertions in documents released in 2010.
Since its founding in 1991, Aquabounty has burned through more than $67 million developing the fast-growing fish. According to its midyear financial report, the company had less than $1.5 million in cash and stock left. It has no other products in development.
Genetically engineered animals contain DNA that has been altered to produce a desirable trait.
The AquaAdvantage salmon has an added growth hormone from the Pacific Chinook salmon that allows the fish to produce growth hormone all year long. The engineers were able to keep the hormone active by using another gene from an eel-like fish called an ocean pout that acts like an “on” switch for the hormone. Typical Atlantic salmon produce the growth hormone for only part of the year. | 科技 |
Extreme Networks Appoints Gary Newbold as Vice President of Asia-Pacific Sales
Veteran sales leader relocates to Singapore to enrich the experience of customers and address growth opportunities in Asia's emerging countries
from Extreme Networks, Inc. SINGAPORE, Aug. 9, 2013 /PRNewswire/ -- Extreme Networks, Inc. (Nasdaq: EXTR), a leader in high performance networking from campus to data centers, today announced the appointment of Gary Newbold as Vice President of Sales for Asia-Pacific. Based in Singapore, Newbold will oversee the entire region highlighted by China, Korea, Australia, Japan, and India. Newbold is an experienced sales leader with a track record of success in the network market, having joined Extreme Networks from 3COM in 2009. Newbold assumes his new role as Extreme Networks kicks off its Asia-Pacific Channel Partner conference, the week of August 12th in Ho Chi Minh City, Vietnam. Hundreds of partners will attend the event to enrich their experience and be addressed by executive leadership, including Newbold, Extreme Networks' CEO Chuck Berger and Nancy Shemwell, EVP of Global Sales.
Commented Shemwell: "Gary's appointment is a big step for Asia-Pacific, as he has demonstrated consistent results and achieved excellent levels of customer satisfaction for the UK, where tremendous achievements came in growing Extreme's presence both with the channel (CRN UK named Extreme Channel "Vendor of the Year" for 2012) and our customers from education, retail, healthcare, and various service providers and Internet Exchange Connection points (IXCs) including LINX of London."
About Extreme Networks
Extreme Networks is a leader in high performance Ethernet switching for cloud, data center and mobile networks. Based in San Jose, CA, Extreme Networks has more than 6,000 customers in more than 50 countries. For more information, visit the company's website at http://www.extremenetworks.com
Extreme Networks, the Extreme Networks logo and ExtremeXOS are trademarks or registered trademarks of Extreme Networks, Inc. in the United States and/or other countries. All other names are the property of their respective owners. Except for the historical information contained herein, the matters set forth in this press release, including without limitation statements as to features, performance, benefits, and integration of the products or the combined solution are forward-looking statements within the meaning of the "safe harbor" provisions of the Private Securities Litigation Reform Act of 1995. These forward-looking statements speak only as of the date. Because such statements deal with future events, they are subject to risks and uncertainties, including network design and actual results of use of the product in different environments. We undertake no obligation to update the forward-looking information in this release. Other important factors which could cause actual results to differ materially are contained in the Company's 10-Qs and 10-Ks which are on file with the Securities and Exchange Commission (http://www.sec.gov). SOURCE Extreme Networks, Inc. RELATED LINKS
TetraLogic Pharmaceuticals Announces New Executives
Company names J. Kevin Buchi President and Chief Executive Officer, Pete A. Meyers Chief Financial Officer, and Lesley Russell, MBChB, MRCP, Chief Operating Officer
from TetraLogic Pharmaceuticals MALVERN, Pa., Aug. 19, 2013 /PRNewswire/ -- TetraLogic Pharmaceuticals, a biopharmaceutical company developing novel small molecule SMAC mimetic drugs to treat cancer, today announced the appointment of a new senior executive leadership team with the goal of accelerating the development of the company and its lead product candidate birinapant. The new TetraLogic senior executives are J. Kevin Buchi, President and Chief Executive Officer, Pete A. Meyers, Chief Financial Officer, and Lesley Russell, MBChB, MRCP, Chief Operating Officer. John M. Gill, the departing President, Chief Executive Officer and Co-founder of TetraLogic, will continue to serve on the company's Board of Directors. "I am pleased to hand over the reins to Kevin who is a highly regarded leader within our industry," stated Mr. Gill. "Together with Pete Meyers and Lesley Russell, this team represents the ideal complement to the existing team at TetraLogic, bringing the expertise and experience we need to maximize the value of our lead SMAC mimetic drug candidate birinapant, grow the company, and realize the full economic potential of our commercial opportunities for our shareholders."
"I am honored to be chosen to lead TetraLogic at this very exciting time for the company," stated Mr. Buchi. "With a strong team and clear leadership position in SMAC mimetics, we have a tremendous opportunity to continue the momentum established for birinapant as a potential treatment for both solid tumors and hematological malignancies. Having spent most of my career in the biotech industry, I have a strong sense of the opportunities for success at TetraLogic and look forward to leading the company into what promises to be an exciting future."
Mr. Buchi was most recently Corporate Vice President, Global Branded Products, at Teva Pharmaceutical Industries Ltd. Prior to joining Teva, he was Chief Executive Officer of Cephalon, Inc., which was acquired by Teva for $8 billion in October 2011. Mr. Buchi joined Cephalon in 1991 and held various positions, including Chief Financial Officer and Chief Operating Officer, before becoming Cephalon's Chief Executive Officer in December 2010. He graduated from Cornell University with a Bachelor of Arts degree in chemistry and received a Master of Management degree from the J.L. Kellogg Graduate School of Management at Northwestern University. Mr. Buchi is a Certified Public Accountant.
Mr. Meyers was most recently Managing Director, Co-Head of Global Health Care Investment Banking at Deutsche Bank Securities Inc. He joined Deutsche Bank in 2005 after serving six years with Credit Suisse First Boston LLC where he was a Managing Director in Health Care Investment Banking specializing in the biotechnology and pharmaceutical sectors. Prior to that, he worked at Dillon, Read & Co., specializing in health care mergers and acquisitions. Mr. Meyers earned a Bachelor of Science degree in finance from Boston College and a Master of Business Administration degree from Columbia Business School.
Dr. Russell is a hematologist/oncologist with more than 20 years of international pharmaceutical industry experience and leadership in the therapeutic areas of hematology/oncology, neurology, psychiatry, pain and inflammation, respiratory medicine, cardiovascular medicine and stem cell therapy. Dr. Russell was most recently Senior Vice President and Global Head, Research and Development, Global Branded Products for Teva Pharmaceuticals USA. She was appointed to this role upon Teva's acquisition of Cephalon, Inc., where she served as Executive Vice President and Chief Medical Officer from 2006 to 2011. She joined Cephalon in 2000 as Vice President, Worldwide Clinical Research. Prior to Cephalon, she served as Vice President, Clinical Research at US Bioscience Inc. and held positions of increasing responsibility within the company from 1996 to 1999. From 1995 to 1996, she was a clinical research physician at Eli Lilly U.K. and a medical director at Amgen U.K. from 1992 to 1995. Dr. Russell was trained in hematology/oncology at Royal Infirmary of Edinburgh and at Royal Hospital for Sick Children, Edinburgh. She received an MBChB from the University of Edinburgh, Scotland, is a member of the Royal College of Physicians, United Kingdom and is registered with the General Medical Council, United Kingdom.

About Birinapant

Birinapant (formerly TL32711) is a small molecule peptidomimetic of SMAC (second mitochondrial-derived activator of caspases), an endogenous regulator of apoptotic cell death, that antagonizes the inhibitor of apoptosis proteins (IAPs). Birinapant is in clinical development for the treatment of solid tumors and hematological malignancies. In clinical studies to date, birinapant has been well-tolerated and exhibited suppression of IAPs and antitumor activity.
About TetraLogic Pharmaceuticals

TetraLogic Pharmaceuticals is a privately held biopharmaceutical company that focuses on the discovery and development of second mitochondrial-derived activator of caspases (SMAC) mimetics, small molecule drugs that mimic SMAC for the treatment of cancers. The company's institutional investors include Clarus Ventures, HealthCare Ventures, Quaker Partners, Novitas Capital, Nextech Invest Ltd, Hatteras Venture Partners, Pfizer Ventures, Latterell Venture Partners, the Vertical Group, Amgen Ventures, and Kammerer Associates. For additional information, please refer to the company's Web site at www.tetralogicpharma.com.
SOURCE TetraLogic Pharmaceuticals
What are the different Types of Precipitation?
Article shared by Prakriti Sharma
All precipitation occurs from clouds, and by far the most important cause of clouds is the adiabatic cooling that results from the upward movement of air. Precipitation is therefore classified on the basis of the conditions under which large masses of moist air are actually induced to rise to higher elevations. There are three possible ways in which an air mass may be forced to rise, and each of these produces its own characteristic type of precipitation. Thus, the following three types of precipitation are based on the type of ascent and the precipitation characteristics:

(1) Convectional precipitation
(2) Orographic precipitation
(3) Cyclonic or Frontal precipitation

(1) Convectional precipitation:

In this type of precipitation, the actuating force is the thermal convection of warm and moist air masses. Two conditions are therefore necessary to cause precipitation: (1) intense heating of the surface so as to expand and raise the lower layer of the atmosphere, and (2) an abundant supply of moisture in the air to provide it with a high relative humidity. Solar radiation is the main source of heat to produce convection currents in the air.

Since convectional precipitation is a warm weather phenomenon, it is generally accompanied by thunder, lightning and local winds. Convectional precipitation is entirely in the form of rain, though occasional hail may be associated with it. Under favourable conditions it occurs in the low latitudes and in the temperate zones. The doldrums invariably get this type of precipitation. In this belt of calms lying between the north and south trade winds, midday witnesses the formation of clouds, followed in the afternoon or evening by showery rainfall. The clouds dissolve away late in the night, and the morning sky is clear. According to W.M. Davis, the large amount of equatorial precipitation is due not only to the activity of the convectional processes on which it depends, but also and largely to the rapid decrease of the capacity for water vapour when air cools at the high temperatures prevailing round the equator.
Clouds involved in this type of precipitation are generally cumulo-nimbus, or clouds with great vertical development.

(2) Orographic precipitation:

When mountains or highlands acting as barriers to the flow of air force it to rise, the air cools adiabatically and clouds and precipitation may result. The precipitation thus obtained is referred to as orographic (from Greek: oros = a mountain). According to Foster, orographic precipitation is that which results from the cooling of moisture-laden air masses which are lifted by contact with elevated land masses. This type of precipitation is commonly found on the windward sides of mountain ranges lying across the path of prevailing terrestrial winds, where those winds pass from the relatively warmer ocean to the land. After striking the high land, the air is forced to rise and thereby cooled. The moisture, therefore, is condensed and precipitated as rain or snow.

However, the process of orographic precipitation is not that simple. Once the air has been initially pushed upward and condensation starts, the stage has been set for the origin of convection currents. Beyond the condensation level, the latent heat of condensation reduces the adiabatic lapse rate, and the ascending air becomes unstable and continues its ascent until its temperature equals that of the surrounding air. The mountain barriers produce only the trigger effect.

Orographic precipitation also occurs far inland, where elevated land masses rise above the surrounding country in the path of moisture-bearing air masses. It usually takes the form of either rain or snow. Wherever mountain ranges obstruct the path of moisture-bearing winds and force them to ascend, the maximum precipitation always occurs on the windward slope. On the other side of these physical barriers, the amount of precipitation abruptly decreases. Thus, on the leeward slopes of these mountain ranges there always exists a relatively dry area, which is known as the rain shadow. Many extensive regions are found in rain shadows. The cause of these rain shadows is easily explained: the moist air ascends on the windward side and its moisture is precipitated, but on crossing the peak of the range no lifting occurs, hence there is only a little rainfall, residual from the previous condensation. Another reason for the existence of rain-shadow areas is that the descending wind is heated by compression and becomes more unfavourable for precipitation. In India the south-west monsoon gives copious rainfall on the windward slope of the Western Ghats, whereas on the leeward side there are extensive rain-shadow areas.

Another salient feature of orographic precipitation is the inversion of rainfall. An air stream approaching the mountain ranges is given uplift by the air masses lying close to them, so the amount of precipitation starts increasing some distance away from the mountains. There is a continuous increase in precipitation on the windward slope up to a certain height, beyond which it starts diminishing. This is called the ‘inversion of rainfall’. The cause is that a larger fraction of the moisture of the ascending air mass is precipitated up to a certain altitude, so that by the time the air currents reach the peak, the moisture content is completely depleted. In mountainous regions, precipitation is not entirely due to the direct effect of uplift; there are indirect effects as well.
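As a rough quantitative aside (standard textbook values, not figures taken from this article), the adiabatic cooling referred to above proceeds at the dry adiabatic lapse rate until condensation begins, after which the release of latent heat reduces the cooling rate to the smaller saturated value:

```latex
\Gamma_d = \frac{g}{c_p} \approx \frac{9.8\ \mathrm{m\,s^{-2}}}{1004\ \mathrm{J\,kg^{-1}\,K^{-1}}} \approx 9.8\ ^{\circ}\mathrm{C\ per\ km},
\qquad
\Gamma_s \approx 4\text{--}7\ ^{\circ}\mathrm{C\ per\ km}.
```

This difference is what the passage means by the latent heat of condensation "reducing the adiabatic lapse rate" and destabilizing the ascending air.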
In the daytime, convectional currents are set up in the air because of the heating of mountain slopes and valleys. Besides, the belt of heaviest precipitation is determined by the latitude, season and exposure. In the Himalayan ranges the elevation at which maximum condensation takes place is estimated to be about 1200 meters. Because of their location in the higher latitudes, the maximum condensation in the Alps occurs at about 2000 meters. It may also be noted that the effect of orographic uplift is felt some distance away from the physical barrier, such as a mountain range or a steep escarpment of a plateau. This is so because the mass of stagnant air in front of the barrier has a blocking effect, and the rain-bearing wind has to ascend over this blocking air.

(3) Cyclonic or frontal precipitation:

Cyclonic or frontal precipitation occurs when deep and extensive air masses are made to converge and move upward so that adiabatic cooling results. Whenever there is lifting of air masses with entirely different physical properties, the atmosphere becomes unstable. When this happens, the stage for large-scale condensation and precipitation has been set. If an additional process is in operation so that rain drops of the required size are formed, precipitation results.
Moscow, Russia (SPX) Feb 15, 2017
These are photon beams. Photo was taken by CCD-matrix. Image courtesy Egor Kovlakov.
Members of the Faculty of Physics, the Lomonosov Moscow State University have elaborated a new technique for creation of entangled photon states, exhibiting photon pairs, which get correlated (interrelated) with each other. Scientists have described their research in an article, published in the journal Physical Review Letters.
Physicists from the Lomonosov Moscow State University have studied an entangled photon state, in which the state is determined only for the whole system and not for each separate particle.
Stanislav Straupe, Doctor of Sciences in Physics and Mathematics, a member of the Quantum Electronics Department and Quantum Optical Technologies Laboratory at the Faculty of Physics, the Lomonosov Moscow State University, and one of the article co-authors says the following.
He explains: "Entangled states are typical and general. The only problem is in the point that for the majority of particles interaction with the environment destroys the entanglement. And photons hardly ever interact with other particles, thus they are a very convenient object for experiments in this sphere. The largest part of light sources we face in our life is a classical one - for instance, the Sun, stars, incandescent lamps and so on. Coherent laser radiation also belongs to the classical part. To create nonclassical light isn't an easy thing. You could, for instance, isolate a single atom or an artificial structure like a quantum dot and detect its radiation - this is the way for single photons obtaining."
An effect of spontaneous parametric down-conversion in nonlinear crystal is most commonly used for obtaining of entangled photon states. In this process a laser pumping photon splits into two.
As this takes place photon states get correlated, entangled due to the conservation laws. Egor Kovlakov, a doctoral student from the Quantum Electronics Department at the Radio Physics Division of the Faculty of Physics, the Lomonosov Moscow State University and an article co-author shares: "In our project we've offered and tested a new technique of the spatial entanglement creation. Photon pairs, generated in our experiment, propagate by beams, which get correlated in "spatial profile". Efficiency is the key peculiarity of our technique in comparison with the previously known ones."
Studies of entangled photon states started in 1970-s years and nowadays they are most actively used in quantum cryptography, an area relating to quantum information transfer and quantum communication.
Stanislav Straupe notices: "Quantum cryptography is not the only one of the possible applications, but at the moment it is the most developed one. Unlike classical communication, where it's not important which alphabet is used for message coding and it's enough to use a binary one (0 or 1), everything is more complicated in quantum communication.
"It turns out that enhancement of alphabet dimension not only increases amount of information coded in one photon, but also strengthens communication security. That's why it'd be interesting to develop quantum communication systems, based also on information coding in spatial profile of photons."
The scientists suppose that in the future their solution will be applied for creation of an optical channel with a satellite, where you can't install optical fiber (an optical fiber guide) - a basis for fiber-optic communication.
Lomonosov Moscow State University
Understanding Time and Space
Perimeter Institute researchers apply machine learning to condensed matter physics
Waterloo, Canada (SPX) Feb 14, 2017
A machine learning algorithm designed to teach computers how to recognize photos, speech patterns, and hand-written digits has now been applied to a vastly different set of data: identifying phase transitions between states of matter. This new research, published in Nature Physics by two Perimeter Institute researchers, was built on a simple question: could industry-standard machine learni ... read more | 科技 |
New study explains decade of glacial growth in New Zealand
by Brooks Hays
Victoria, New Zealand (UPI) Feb 15, 2017
Globally, glaciers have been on the retreat for several decades. Between 1983 and 2008, however, at least 58 New Zealand glaciers grew in size.
Scientists have struggled to explain their advance, but a new analysis suggests a regional climate anomaly, a period of unusually cold temperatures, encouraged their growth.
"Glaciers advancing is very unusual -- especially in this period when the vast majority of glaciers worldwide shrank in size as a result of our warming world," Andrew Mackintosh, a climate scientist at Victoria University of Wellington's Antarctic Research Centre, said in a news release. "This anomaly hadn't been satisfactorily explained, so this physics-based study used computer models for the first time to look into it in detail."
Mackintosh and his colleagues built a climate model -- populated with data from field observations in New Zealand -- to illuminate the drivers of glacial growth. Their findings, detailed in the journal Nature Communications, suggest that a prolonged period of low temperatures, not precipitation, explains the advancing glaciers.
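For readers unfamiliar with how such models weigh temperature against precipitation, here is a minimal, purely illustrative degree-day mass-balance sketch in Python — a generic toy model with invented parameter values and station data, not the model used in the Nature Communications study:

```python
# Toy degree-day glacier mass-balance model (illustrative only; not the
# model used in the study). Accumulation comes from precipitation falling
# as snow; ablation is proportional to positive degree-days.

def annual_mass_balance(monthly_temp_c, monthly_precip_mm,
                        snow_threshold_c=1.0, ddf_mm_per_degc_day=4.0):
    """Return net specific mass balance in mm water equivalent per year."""
    accumulation = sum(p for t, p in zip(monthly_temp_c, monthly_precip_mm)
                       if t <= snow_threshold_c)           # snowfall months
    positive_degree_days = sum(max(t, 0.0) * 30 for t in monthly_temp_c)
    ablation = ddf_mm_per_degc_day * positive_degree_days  # melt
    return accumulation - ablation

# Cooling every month by 0.5 deg C raises the balance even with unchanged
# precipitation -- the temperature-driven effect the study points to.
temps = [2.0, 1.5, 0.0, -2.0, -4.0, -6.0, -6.5, -5.0, -3.0, -1.0, 0.5, 1.5]
precip = [150] * 12
print(annual_mass_balance(temps, precip))
print(annual_mass_balance([t - 0.5 for t in temps], precip))
```

In this toy setup, a uniform half-degree cooling pushes the balance toward growth even when precipitation is unchanged, which is the qualitative point the study makes.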
Researchers say heightened regional climate variability is one of the byproducts of man-made climate change.
"New Zealand sits in a region where there's significant variability in the oceans and the atmosphere -- much more than many parts of the world," Mackintosh said. "The climate variability that we identified was also responsible for changes in the Antarctic ice sheet and sea ice during this period."
The period of cooling and glacial growth appears to now be over. New Zealand's largest glacier, Franz Josef Glacier, has retreated almost a mile since 2008.
"New Zealand's glaciers are very sensitive to temperature change," Mackintosh said. "If we get the two to four degrees of warming expected by the end of the century, our glaciers are going to mostly disappear. Some may experience small-scale advance over that time due to the regional climate variability, but overall they will retreat."
Without Delay: Congress to Fast-Track Climate Legislation
Posted January 16th, 2009, 4:01 PM by SundanceTV
WASHINGTON, DC, January 15, 2009 (ENS) – The heads of some of America’s largest corporations together with the leaders of five of the country’s largest environmental groups today presented a joint plan to Congress for climate protection legislation. Congressional Democrats met their call for immediate action with assurances that they agree – there is no time for delay.
Testifying before the House Committee on Energy and Commerce in the first congressional hearing of 2009 on climate change, members of the U.S. Climate Action Partnership called for a reduction in U.S. greenhouse gas emissions by 80 percent of 2005 levels by 2050 through an economy-wide cap-and-trade program.
“In the past, the U.S. has proven that we have the will, the capabilities and the courage to invest in innovation – even in difficult times,” said Jeff Immelt, chairman and chief executive of General Electric, one of the USCAP partners.
“Today, cap-and-trade legislation is a crucial component in fueling the bold clean energy investments necessary to catapult the U.S. again to preeminence in global energy and environmental policy, strengthen the country’s international competitiveness, and create millions of rewarding new American jobs,” Immelt said.
“The health of our economy and the safety of our climate are inextricably linked, except nature doesn’t do bail-outs,” said Jonathan Lash, president of the World Resources Institute.
“USCAP has redefined what is possible,” said Lash. “If the diverse membership of USCAP can find common ground, Congress can agree on effective legislation.”
Committee chair Congressman Henry Waxman of California said his goal is to pass comprehensive climate and energy legislation in the committee before the Memorial Day recess.
“That is an ambitious schedule, but it is an achievable one,” said Waxman who is new to the committee chairmanship. “We cannot afford another year of delay. As today’s hearing will show, a consensus is developing that our nation needs climate legislation. Our job is to transform this consensus into effective legislation. The legislation must be based on the science and meet the very serious threats we face.”
House Speaker Nancy Pelosi said, “Chairman Waxman has set an aggressive timetable for action to reduce global warming and our dependence on foreign oil. I share his sense of urgency and his belief that we cannot afford another year of delay.”
The Houston Ship Channel hosts 25 percent of the United States’ oil refining capacity. Refineries would have to control their greenhouse gas emissions under a national carbon cap-and-trade program. (Photo by Roy Luck)
Developed through two years of intensive analysis and consensus-building among USCAP’s 26 corporations and five environmental groups, the “Blueprint for Legislative Action” aired before the committee today sets forth steps for creating a mandatory, economy-wide cap-and-trade program for the main greenhouse gas carbon dioxide.
Under a cap-and-trade system, a government authority first sets a cap, deciding how much pollution in total will be allowed. Next, companies are issued credits, essentially licenses to pollute. If a company comes in below its cap, it has extra credits which it can trade with other companies.
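A minimal sketch of those mechanics (a generic illustration with invented firm names, allocations and prices — not the USCAP proposal itself):

```python
# Minimal cap-and-trade ledger: the regulator sets a cap, allocates
# allowances, and firms that emit less than their allocation can sell
# the surplus to firms that emit more. All numbers are invented.

cap_tonnes = 1_000_000
allowances = {"PowerCo": 600_000, "SteelCo": 400_000}   # initial allocation
emissions  = {"PowerCo": 550_000, "SteelCo": 470_000}   # actual emissions

assert sum(allowances.values()) == cap_tonnes            # cap is binding

price_per_tonne = 20.0  # assumed market-clearing price

for firm in allowances:
    balance = allowances[firm] - emissions[firm]
    if balance >= 0:
        print(f"{firm}: {balance} surplus allowances to sell "
              f"(worth ${balance * price_per_tonne:,.0f})")
    else:
        print(f"{firm}: must buy {-balance} allowances "
              f"(cost ${-balance * price_per_tonne:,.0f})")
```

Total emissions can never exceed the cap, while the trading step lets reductions happen wherever they are cheapest.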
USCAP’s plan couples the cap-and-trade program with cost containment measures and complementary policies addressing a federal technology research development and deployment program, coal technology, and transportation, as well as building and energy efficiency.
Jim Mulva, chairman and chief executive of the oil company ConocoPhillips, the nation’s second largest refiner, told the committee, “We believe we must act now in a united effort to slow, stop and reverse the growth of greenhouse gas emissions.”
Mulva agreed that quick action is imperative to curb climate change. “Each year the United States delays enacting a federal framework to control its emissions, the greater the future risk.”
“From an oil and gas perspective,” he said, “we understand that this means fundamental changes in the way we operate and in the fuels we produce.”
“ConocoPhilips is ready to meet the challenge,” Mulva said, “but we and others need an effective, efficient and equitable federal program in place to establish the rules and to encourage the technology development and investments necessary for change.”
Frances Beinecke, president of the Natural Resources Defense Council, told the committee, “The time for action on global warming has already been delayed too long. Every day we learn more about the ways in which global warming is already affecting our planet.”
“A growing body of scientific opinion has formed that we face extreme dangers if global average temperatures are allowed to increase by more than 2 degrees Fahrenheit from today’s levels,” Beinecke said.
She said that the NRDC believes we may be able to stay below this temperature increase if atmospheric concentrations of carbon dioxide and other global warming gases are kept from exceeding 450 parts per million of CO2-equivalent and then rapidly reduced.
“This will require us to halt U.S. emissions growth within the next few years and then achieve significant cuts in emissions in the next decade,” she said, “progressing to an approximately 80 percent cut by 2050.”
The targets and timetables in the USCAP legislative proposal are consistent with the schedule proposed by President-elect Barack Obama.
In his November 18 address to a bipartisan conference of governors, Obama said, “Now is the time to confront this challenge once and for all. Delay is no longer an option.”
In December, in a video address to the United Nations climate conference in Poland, Obama said he would open a “new chapter” on climate change, starting with a national cap-and-trade system.
Tags: climate action / committee chairmanship / congressman henry waxman / Ecommunity News / energy investments / greenhouse gas emissions / henry waxman / house committee on energy and commerce / international competitiveness / jeff immelt / world resources institute
Shale Gas – Shall China?
We kept the April 12, 2013 New York Times article as a draft because we basically found it very one-sided and know very little about “Berkeley Earth” or Elizabeth Muller (*), but it comes with information about the Koch Brothers’ professed skepticism of progressive ideas.
Now we have decided to post it because Fareed Zakaria, someone we hold in high esteem, said this Sunday on CNN/GPS that, in order to start putting a limit on global CO2 emissions, the best step for the US would be to share what he called safe technologies of shale fracking and gas production, in order to replace the reliance on burning coal as is done now in China. We know this to be the wrong advice:
(1) there is no technology of “fracking the shale” that is safe for groundwater reservoirs;

(2) fracking and shale gas will slow down the commercialization of truly positive renewable energy technologies;

and (3) worst of all – it starts looking like Eugène Ionesco’s “Rhinoceros” – the slow development of a takeover by an aggressive, wrong and obnoxious ideology – and the Koch Brothers are versed in technologies in this respect. So – let us say: Fareed Zakaria expressed the idea that shale gas is a step in the right direction, but we do not think so – and thousands of scientists agree with us but have suspicions about the proponents of the fracking myth.
China Must Exploit Its Shale Gas.
By ELIZABETH MULLER
Published in The New York Times online: April 12, 2013
IF the Senate confirms the nomination of the M.I.T. scientist Ernest J. Moniz as the next energy secretary, as expected, he must use his new position to consider the energy situation not only in the United States, but in China as well.
Mr. Moniz, a professor of physics and engineering systems and the director of M.I.T.’s Energy Initiative, sailed through a confirmation hearing Tuesday before the Senate Energy and Natural Resources Committee.
But some environmentalists are skeptical of Mr. Moniz. He is known for advocating natural gas and nuclear power as cleaner sources of energy than coal and for his support of hydraulic fracturing to extract natural gas from shale deposits. The environmental group Food and Water Watch has warned that as energy secretary, he “could set renewable energy development back years.”
The criticism is misplaced. Instead of fighting hydraulic fracturing, environmental activists should recognize that the technique is vital to the broader effort to contain climate change and should be pushing for stronger standards and controls over the process.
Nowhere is this challenge and opportunity more pressing than in China. Exploiting its vast resources of shale gas is the only short-term way for China, the world’s second-largest economy, to avoid huge increases in greenhouse gas emissions from burning coal.
China’s greenhouse gas emissions are twice those of the United States and growing at 8 percent to 10 percent per year. Last year, China increased its coal-fired generating capacity by 50 gigawatts, enough to power a city that uses seven times the energy of New York City. By 2020, an analysis by Berkeley Earth shows, China will emit greenhouse gases at four times the rate of the United States, and even if American emissions were to suddenly disappear tomorrow, world emissions would be back at the same level within four years as a result of China’s growth alone.
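As a rough arithmetic check of that projection (using only the figures quoted in this article, and treating U.S. emissions as fixed at half of China's): replacing the entire U.S. share would require China's emissions to grow by about 50 percent, which at the stated growth rates takes

```latex
t = \frac{\ln 1.5}{\ln(1+g)} \;\approx\; 5.3\ \text{years at } g = 8\%,
\qquad
4.3\ \text{years at } g = 10\%,
```

broadly consistent with the "within four years" claim.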
The only way to offset such an enormous increase in energy use is to help China switch from coal to natural gas. A modern natural gas plant emits between one-third and one-half of the carbon dioxide released by coal for the same amount of electric energy produced. China has the potential to unearth large amounts of shale gas through hydraulic fracturing. In 2011, the United States Energy Information Administration estimated that China had “technically recoverable” reserves of 1.3 quadrillion cubic feet, nearly 50 percent more than the United States.
The risk is that what is now a nascent Chinese shale gas industry may take off in a way that leads to ecological disaster. Many of the purchasers of drilling rights in recent Chinese auctions are inexperienced.
Opponents of this drilling method point to cases in which gas wells have polluted groundwater or released “fugitive” methane gas emissions. The groundwater issue is worrisome, of course, and weight for weight, methane has a global warming potential 25 to 70 times higher than carbon dioxide, the principal greenhouse gas that results from the burning of coal.
Moving away from fossil fuels entirely may make sense in the United States, where we can potentially afford to pay for more expensive renewable sources of energy. But developing countries have other priorities, like improving the education and health of their people. Given the dangers that hydraulic fracturing poses for groundwater pollution and gas leaks, we must help China develop an approach that is environmentally sound.
Mr. Moniz has warned of the need to curb environmental damage from the process. But he has also stressed the value of natural gas as a “bridging” source of energy as we strive to move from largely dirty energy to clean energy. Extracting shale gas in an environmentally responsible way is technically achievable, according to engineering experts. Accomplishing that goal is primarily a matter of engineering and regulation.
That is where we need the engagement of environmental activists. At home, they can push the United States to set verifiable standards for clean hydraulic fracturing and enforce those standards through careful monitoring. Internationally, American industry can lead by showing that clean production can be profitable.
We need a solution for energy production that can displace the rapid growth of coal use today. Switching from coal to natural gas could reduce the growth of China’s emissions by more than 50 percent and give the world more time to bring down the cost of solar and wind energy to levels that are affordable for poorer countries.
* Elizabeth Muller is the co-founder and executive director of Berkeley Earth, a nonprofit research organization focused on climate change.
Elizabeth Muller Elizabeth is the co-founder and Executive Director of Berkeley Earth Surface Temperature, and CEO of Muller & Associates LLC. Previously, she was Director at Gov3 (now CS Transform) and Executive Director of the Gov3 Foundation. From 2000 to 2005 she was a policy advisor at the Organization for Economic Cooperation and Development (OECD).
Elizabeth has advised governments in over 30 countries, in both the developed and developing world. She has extensive experience with stakeholder engagement and communications, especially with regard to technical issues. She developed numerous techniques for bringing government and private actors together to build consensus and implement action plans, and has a proven ability to deliver sustainable change. She has also designed and implemented projects for public sector clients, helping them to build new policies and strategies for government reform and modernization, collaboration across government ministries and agencies, and strategies for the information society.
Elizabeth holds a Bachelors Degree from the University of California with a double major in Mathematics and Literature, and a Masters Degree in International Management from the École Supérieure de Commerce de Paris.
Email: liz berkeleyearth.org
The Berkeley Earth Surface Temperature team includes statisticians, physicists, climate experts and others with experience analyzing large and complex data sets. They say:
Our main scientific effort continues to be the study and exploration of our huge database and the results of our temperature analysis. Because the oceans exert a moderating effect, their inclusion is important for estimating the long-term impact of human-caused climate change.
We have begun a study of the variability of temperature, and the rate of occurrence of extreme events. Extreme events include heat waves, which are expected to become more frequent due both to global warming and to the urban heat island effects. Such an event occurred in Chicago in 1995 and led to an excess of about 750 heat-wave related deaths. Equally important may be the effects that global warming will have on cold waves. City planners need to understand what to expect at both extremes.
Although warming is expected to lead to more heat waves, it is not clear whether the variability – difference between high temperatures and low temperatures – will change. Although some prior studies have suggested that it does, our preliminary work shows that the range of temperature extremes (difference between hottest and coldest days) is remaining remarkably constant, even as the temperature rose over the past 50 years. Memos describing these preliminary results were posted on our website in early 2013. Additional analysis will test these initial conclusions and we expect to be able reduce the error uncertainties and reach stronger conclusions.
We will continue with exploratory data analysis (a statistical method developed by John Tukey), and we will share our results with the public in the forms of memos posted online and of papers submitted to peer reviewed journals.
The Berkeley Earth Surface Temperature (BEST) Study has created a preliminary merged data set by combining 1.6 billion temperature reports from 16 preexisting data archives.
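To make the merging task concrete, here is a heavily simplified sketch of combining overlapping temperature records into yearly anomalies against a common baseline — the archive names and readings are invented, and this is not the Berkeley Earth algorithm, which uses far more sophisticated statistical methods:

```python
# Toy merge of temperature records from several archives into yearly
# anomalies relative to a common baseline. Data are invented.
from collections import defaultdict

records = [  # (archive, year, temperature in deg C)
    ("ArchiveA", 1990, 14.1), ("ArchiveA", 1991, 14.0), ("ArchiveA", 1992, 14.2),
    ("ArchiveB", 1990, 14.2), ("ArchiveB", 1991, 14.1), ("ArchiveB", 1992, 14.3),
]

by_year = defaultdict(list)
for _archive, year, temp in records:
    by_year[year].append(temp)

yearly_mean = {y: sum(v) / len(v) for y, v in by_year.items()}
baseline = sum(yearly_mean.values()) / len(yearly_mean)   # common baseline

anomalies = {y: round(m - baseline, 3) for y, m in sorted(yearly_mean.items())}
print(anomalies)
```

The real analysis must also reconcile duplicate stations, gaps and changing instruments, which is where most of the statistical effort goes.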
IT ALSO SAYS SOMETHING THAT WORRIES US TREMENDOUSLY – THIS BECAUSE THE KOCH FAMILY IS BEING MENTIONED:
The Berkeley Earth Surface Temperature (BEST) project is an effort to resolve criticism of the current records of the Earth’s surface temperatures by preparing an open database and analysis of these temperatures and temperature trends, to be available online, with all calculations, methods and results also to be freely available online. BEST is a project conceived of and funded by the Novim group at University of California at Santa Barbara.[1] BEST’s stated aim is a “transparent approach, based on data analysis.”[1] “Our results will include not only our best estimate for the global temperature change, but estimates of the uncertainties in the record.”[2]
BEST founder Richard A. Muller told The Guardian “…we are bringing the spirit of science back to a subject that has become too argumentative and too contentious, ….we are an independent, non-political, non-partisan group. We will gather the data, do the analysis, present the results and make all of it available. There will be no spin, whatever we find. We are doing this because it is the most important project in the world today. Nothing else comes close.”[3]
The BEST project is funded by unrestricted educational grants totalling (as of March 2011) about $635,000. Large donors include Lawrence Berkeley National Laboratory, the Charles G. Koch Foundation, the Fund for Innovative Climate and Energy Research (FICER),[4] and the William K. Bowes, Jr. Foundation.[5] The donors have no control over how BEST conducts the research or what they publish.[6]
The team’s preliminary findings, data sets and programs were made available to the public in October 2011, and their first scientific paper was published in December 2012.[7] The study addressed scientific concerns raised by skeptics including urban heat island effect, poor station quality, and the risk of data selection bias. The Berkeley Earth group concluded that the warming trend is real, that over the past 50 years (between the decades of the 1950s and 2000s) the land surface warmed by 0.91±0.05°C, and their results mirrors those obtained from earlier studies carried out by the U.S. National Oceanic and Atmospheric Administration (NOAA), the Hadley Centre, NASA’s Goddard Institute for Space Studies (GISS) Surface Temperature Analysis, and the Climatic Research Unit (CRU) at the University of East Anglia. The study also found that the urban heat island effect and poor station quality did not bias the results obtained from these earlier studies.[8][9][10][11]
Berkeley Earth team members include:[12]
Richard A. Muller, founder and Scientific Director. Professor of Physics, UCB and Senior Scientist, Lawrence Berkeley National Laboratory (LBNL). Muller is a member of the JASON Defense Advisory Group who has been critical of other climate temperature studies before this project.[13][14]
Robert Rohde, lead scientist. Ph.D. in physics, University of California, Berkeley (UCB). Rohde’s scientific interests include earth sciences, climatology, and scientific graphics. Rohde is the founder of Global Warming Art.
David Brillinger, statistical scientist. Professor of Statistics at UCB. A contributor to the theory of time series analysis.
Judith Curry, climatologist and Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of Technology.
Robert Jacobsen, Professor of Physics at UCB and an expert in analyses of large data sets.
Saul Perlmutter, Nobel Prize-winning astrophysicist at Lawrence Berkeley National Laboratory and Professor of Physics at UCB.
Arthur H. Rosenfeld, Professor of Physics at UCB and former California Energy Commissioner. Research he directed at Lawrence Berkeley National Laboratory led to the development of compact fluorescent lamps.
Charlotte Wickham, Statistical Scientist.
Jonathan Wurtele, Professor of Physics at UCB and Senior Scientist, LBNL
Elizabeth Muller, founder and Executive Director
the Charles G. Koch Foundation, the Fund for Innovative Climate and Energy Research (FICER).
Charles de Ganahl Koch (pron.: /ˈkoʊk/; born November 1, 1935) is an American businessman and philanthropist. He is co-owner, chairman of the board, and chief executive officer of Koch Industries. His brother David H. Koch also owns 42% of Koch Industries and serves as Executive Vice President. The brothers inherited the business from their father, Fred C. Koch, and have since expanded the business to 2,600 times its inherited size.[citation needed] Originally involved exclusively in oil refining and chemicals, Koch Industries has expanded to include process and pollution control equipment and technologies; polymers and fibers; minerals; fertilizers; commodity trading and services; forest and consumer products; and ranching. The businesses produce a wide variety of well-known brands, such as Stainmaster carpet, Lycra fiber, Quilted Northern tissue and Dixie Cup. In 2007, Koch’s book The Science of Success was published. The book describes his management philosophy, referred to as “Market-Based Management”.[5]
Koch provides financial support for a number of public policy and charitable organizations, including the Institute for Humane Studies and the Mercatus Center at George Mason University. He co-founded the Washington, DC-based Cato Institute. Through the Koch Cultural Trust, founded by Charles Koch’s wife, Elizabeth, the Koch family has also funded artistic projects and creative artists.[6]
Koch Industries is the second-largest privately held company by revenue in the United States according to a 2010 Forbes survey,[7] and as of October 2012 Charles was ranked the 6th richest person in the world with an estimated net worth of $34 billion – according to the Bloomberg Billionaires Index –[8] and was ranked 18th on Forbes World’s Billionaires list of 2011 (and 4th on the Forbes 400), with an estimated net worth of $25 billion, deriving from his 42% stake in Koch Industries.[3]
Posted in Archives, Austria, Canada, China, Copenhagen COP15, Futurism, Global Warming issues, Policy Lessons from Mad Cow Disease, Real World's News
Dark Energy Survey Collaboration
World’s most powerful digital camera records first images
The Dark Energy Camera, a 570-megapixel camera mounted on a telescope in Chile, achieved first light on Sept. 12.
Eight billion years ago, rays of light from distant galaxies began their long journey to Earth. That ancient starlight has now found its way to a mountaintop in Chile, where the newly constructed Dark Energy Camera, the most powerful sky-mapping machine ever created, has captured and recorded it for the first time.
That light may hold within it the answer to one of the biggest mysteries in physics—why the expansion of the universe is speeding up.
Scientists in the international Dark Energy Survey collaboration announced this week that the Dark Energy Camera, the product of eight years of planning and construction by scientists, engineers, and technicians on three continents, has achieved first light. The first pictures of the southern sky were taken by the 570-megapixel camera on Sept. 12.
“The achievement of first light through the Dark Energy Camera begins a significant new era in our exploration of the cosmic frontier,” said James Siegrist, associate director of science for high energy physics with the U.S. Department of Energy. “The results of this survey will bring us closer to understanding the mystery of dark energy, and what it means for the universe.”
The Dark Energy Camera was constructed at the U.S. Department of Energy’s (DOE) Fermi National Accelerator Laboratory in Batavia, Illinois, and mounted on the Victor M. Blanco telescope at the National Science Foundation’s Cerro Tololo Inter-American Observatory (CTIO) in Chile, which is the southern branch of the U.S. National Optical Astronomy Observatory (NOAO). With this device, roughly the size of a phone booth, astronomers and physicists will probe the mystery of dark energy, the force they believe is causing the universe to expand faster and faster.
“The Dark Energy Survey will help us understand why the expansion of the universe is accelerating, rather than slowing due to gravity,” said Brenna Flaugher, project manager and scientist at Fermilab. “It is extremely satisfying to see the efforts of all the people involved in this project finally come together.”
The Dark Energy Camera is the most powerful survey instrument of its kind, able to see light from over 100,000 galaxies up to 8 billion light years away in each snapshot. The camera’s array of 62 charge-coupled devices has an unprecedented sensitivity to very red light, and along with the Blanco telescope’s large light-gathering mirror (which spans 13 feet across), will allow scientists from around the world to pursue investigations ranging from studies of asteroids in our own Solar System to the understanding of the origins and the fate of the universe.
“We’re very excited to bring the Dark Energy Camera online and make it available for the astronomical community through NOAO's open access telescope allocation,” said Chris Smith, director of the Cerro-Tololo Inter-American Observatory. “With it, we provide astronomers from all over the world a powerful new tool to explore the outstanding questions of our time, perhaps the most pressing of which is the nature of dark energy.”
Scientists in the Dark Energy Survey collaboration will use the new camera to carry out the largest galaxy survey ever undertaken, and will use that data to carry out four probes of dark energy, studying galaxy clusters, supernovae, the large-scale clumping of galaxies and weak gravitational lensing. This will be the first time all four of these methods will be possible in a single experiment.
The Dark Energy Survey is expected to begin in December, after the camera is fully tested, and will take advantage of the excellent atmospheric conditions in the Chilean Andes to deliver pictures with the sharpest resolution seen in such a wide-field astronomy survey. In just its first few nights of testing, the camera has already delivered images with excellent and nearly uniform spatial resolution.
Over five years, the survey will create detailed color images of one-eighth of the sky, or 5,000 square degrees, to discover and measure 300 million galaxies, 100,000 galaxy clusters and 4,000 supernovae.
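As a quick arithmetic check (standard spherical geometry, not a number taken from the press release), one-eighth of the celestial sphere is indeed roughly 5,000 square degrees:

```latex
A_{\mathrm{sky}} = 4\pi\ \mathrm{sr}\times\left(\tfrac{180}{\pi}\right)^{2} \approx 41{,}253\ \mathrm{deg}^{2},
\qquad
\tfrac{1}{8}A_{\mathrm{sky}} \approx 5{,}157\ \mathrm{deg}^{2}.
```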
Fermi National Accelerator Laboratory/U.S. National Optical Astronomy Observatory issued this press release on Monday, Sept. 17.
Pakistan unblocks YouTube after Google launches local version
YouTube was blocked in Pakistan from 2012
John Ribeiro (IDG News Service)
Pakistan has lifted a ban on YouTube in the country after Google offered a localized version, which the government claims will allow it to ask for the removal of material considered offensive from the website.

YouTube was ordered blocked in Pakistan in 2012 after a controversial video, called the "Innocence of Muslims," created a controversy in many countries for mocking the Prophet Muhammad.

Pakistan authorities told a court that they were blocking the whole domain because it was not technically feasible for them to block specific links to the video.

Google announced last week localized versions of YouTube in Pakistan, Sri Lanka and Nepal, suggesting that the company had arrived at a deal with Pakistan. Google did not comment on whether a deal had been made. Pakistan's telecom regulator, the Pakistan Telecommunication Authority (PTA), also did not comment on the new .pk version of YouTube.

On Monday, the country's ministry for communications and information said that Google has provided an online web process "through which requests for blocking access of offending material can be made by the PTA to Google directly and Google/YouTube will accordingly restrict access to the said offending material for users within Pakistan," according to reports.
The Internet service provider Pakistan Telecommunication Company welcomed YouTube on its Facebook page.

Civil rights groups in Pakistan have been concerned about a deal between Google and the Pakistan government, as it could aid censorship. Bolo Bhi has asked the government to provide details on the nature of any agreement. "Users have a right to know what terms have been agreed to and what it means for them," it added.
2017-09/1580/en_head.json.gz/9810 | Home » Technology » Resident Evil 7 Release Date & Updates: ‘Biohazard’ To Come With HDR and VR Support To PS4Resident Evil 7 Release Date & Updates: ‘Biohazard’ To Come With HDR and VR Support To PS4 September 18, 2016 By Rishabh Bansal 1 Comment Resident Evil 7 Release Date, Updates, Features: Another upcoming game in the market is Resident Evil 7. RE7 is said to be one of the most awaited survival horrors video game directed by Koshi Nakanishi and produced by Masachika Kawata. (check: LEGO Batman Movie Poster out on Batman Day.)
It is the 11th game in the Resident Evil series, developed and published by Capcom. It will be released on Windows, PlayStation 4 and Xbox One, and the PS4 version has VR support. It is a single-player game. The full name of the game is Resident Evil 7: Biohazard. RE7 will be released in Japan on 26th January 2017 and worldwide on 24th January 2017.
Resident Evil 7 Release Date, Updates, Features
The player will be playing as Ethan, the protagonist, and can use weapons like pistols, shotguns, flamethrowers, explosives and chainsaws in combat. ‘Biohazard’ will also contain puzzle solving, resource management, and healing herbs. The term Biohazard in the name reflects the plot of the game.
The first demo on the market is quite impressive. It is scary and has slow-paced action. “The developers have done a wonderful job” is the common reaction from players after trying the demo.
Instead of a fixed camera range, RE7 has a head-oriented camera. The camera can be toggled to the right or the left in increments, with about 15 to 30 degrees of adjustment.
Capcom’s words about it are: ”It will be compatible with the PlayStation 4 Pro, allowing the immersive gameplay to be experienced in high-resolution, luxuriant 4K with vibrant visuals only HDR can deliver,” further adding, “Similarly, with HDR support on the Xbox One S.” It means that RE7: Biohazard is going to have a strong opening in the market and is going to win many fans.
The game will be available for pre-order at 50 pounds and will come with a custom theme for the console. A special edition with added survival content is said to be priced at 53 pounds. This special edition, called the Deluxe Edition, will have extra content from the Survival Edition on top of the base game. A Japanese music band has contributed a song, “Don’t Be Afraid,” for it.
Recently, the demo was released, and it recorded more than 3 million downloads. The game will also come with the Twilight update. Stay tuned.
Filed Under: Technology | Tagged With: Games, Resident Evil, Resident Evil 7

Comments
Mike Chambers says (September 18, 2016 at 3:46 pm): Will they be releasing a version for the Xbox 360/PS3?
Climate change: the Zambian story
Jan 5, 2010
In Zambia, any change in climate can spell disaster. With a majority of Zambians depending on agriculture, even a slight change in temperature can affect crops like maize with catastrophic consequences for livelihoods.

In the village of Lusitu, in the south of Zambia, the returns from farming have diminished due to severe droughts. According to Eva Chipepo, a local villager, “rainfall is insufficient to give us a good crop yield” and “wild animals have started to wander in the fields”, further destroying crops. Another Lusitu resident confirms that life has become more difficult. “In the past”, he says, “we were able to find solutions to whatever challenges we were faced with. Rivers never ran dry.”

With more frequent droughts, but also floods, says Catherine Namugala, Minister of Tourism and Environment, “the government must look for resources to provide relief to the people”. The country was already struggling to achieve development, she says, and climate change is putting additional strain on that process.

Domiciano Mulenga, National Coordinator for Zambia’s Disaster Management and Mitigation Unit, confirms that the government is spending increased amounts of money on disaster response. “We are moving money and resources away from development programmes for disaster response”, he says.

Climate change has also begun to affect Zambia’s national tourism industry. If extreme weather changes continue, in about 50 years the Victoria Falls, known as the “7th wonder of the world”, could turn into an empty ravine. Since Zambia’s tourism industry rests on the country’s natural resources, this would have devastating economic effects. Extreme weather is affecting wildlife and flora alike. The lack of rain in Zambia’s South Luangwa region, in the east of the country, means that animals have had to scavenge for roots. Increased numbers of hippos are dying. Vegetation has also been affected, with landscapes marked by scores of petrified trees in dried-out areas.

Kenneth Kaunda, the first president of Zambia, says help from developed countries will be crucial in ensuring that countries like Zambia can cope with the effects of climate change. Tegegnework Gettu, the Director of UNDP’s Africa region, echoes that view. According to him, “developed countries must take responsibility and lead in helping Africans to develop good adaptation programmes”.

The government of Zambia has decided to tackle the problem seriously, launching a National Adaptation Plan of Action (NAPA), and it is in the process of preparing a national climate change response strategy.
UNDP has also helped Zambia to build an inventory of greenhouse gas emissions; assess the impacts of climate change on the most vulnerable sectors; analyze potential measures to limit greenhouse gas emissions; and develop capacities for reporting on climate change through the National Communication report to the UNFCCC.

UNDP is supporting processes to help enhance Zambia's chances of entering the international carbon market. One such process is the Clean Development Mechanism. UNDP is also supporting Zambia's readiness for emerging carbon markets in Reduced Emissions from Deforestation and forest Degradation (REDD). In collaboration with GEF's small grants programme, UNDP has also allocated grants of up to USD 50,000 to non-governmental and community-based organizations for climate change mitigation and adaptation, conservation of biodiversity, and related work.

In addition, throughout 2009 UNDP took part in awareness-raising activities, organizing discussion fora among journalists, government representatives, students, traditional leaders and members of parliament. UNDP also supported the production of the documentary film "Climate change: the Zambian story".

For more information on UNDP in Zambia, please visit http://www.undp.org.zm/
UNDP and climate change in Africa (.pdf)
Canadian Solar opposes protectionist tariff, ITC final trade ruling
Friday, Nov 09, 2012

ONTARIO, Nov. 8, 2012 /PRNewswire-FirstCall/ -- Canadian Solar Inc. (the "Company" or "Canadian Solar") (NASDAQ: CSIQ), one of the world's largest solar companies with a global supply chain and manufacturing facilities in Canada and China, today announced that while it is relieved an end has come to a lengthy and costly legal suit, it opposes the final ruling of the U.S. International Trade Commission (ITC) upholding the Department of Commerce's (DOC) earlier findings that Chinese pricing practices have harmed American industry. The ITC ruling will implement the DOC's antidumping duties (AD) ranging from 18.32% to 249.96% and countervailing duties (CVD) of 15.24% to 15.97% (depending on the importer of record) on crystalline silicon photovoltaic cells made in China, whether or not assembled into modules.

The ITC also had to determine whether imports covered by the previous DOC critical circumstances determinations are likely to seriously undermine the remedial effect of the AD and CVD orders the DOC will issue. By a vote of 4 to 2, a majority of the ITC found that critical circumstances do not exist. As a result of the ITC's negative determinations regarding critical circumstances, the AD and CVD orders concerning these imports will not apply retroactively to goods that entered the United States prior to the date of publication in the Federal Register of the DOC's affirmative preliminary determinations.

"As a global company serving customers in 50 countries, Canadian Solar has consistently adhered to fair trading practices around the world," said Dr. Shawn Qu, chairman and chief executive officer of Canadian Solar.

"U.S. exports to China have increased almost 400 percent in the last eight years, and according to the National Solar Jobs Census 2012, the solar industry employs nearly 120,000 Americans across 50 states, reflecting a 13.2 percent increase from 2011. China and the United States are major trade partners and our interests are interwoven. Earlier protectionist tariffs indicate this action simply displaces sales to alternate foreign exporters and is expected to cost U.S. jobs. The tariff is counterproductive to resolving issues on both sides, and we'd like to see both governments engage in discussions to find workable solutions."

Canadian Solar disagrees with the ITC final determination and maintains its position that the Company practices fair trade according to the Trade Act. As a result of this ruling, the Company will leverage its long-term cell supply partners located outside of China to continue to serve the growing demand from its U.S. customers. Canadian Solar is a fiscally strong company dedicated to delivering high quality solar solutions to its customers around the world, and will remain actively engaged in the pursuit of reducing the price of solar modules, enabling more families and businesses to access affordable solar energy, reducing our consumption of fossil fuels and increasing our energy independence.

About Canadian Solar

Canadian Solar Inc. (NASDAQ: CSIQ) is one of the world's largest solar companies. As a leading vertically integrated provider of ingots, wafers, solar cells, solar modules and other solar applications, Canadian Solar designs, manufactures and delivers solar products and solar system solutions for on-grid and off-grid use to customers worldwide.
With operations in North America, Europe, Africa, Asia and Australia, Canadian Solar provides premium quality, cost-effective and environmentally-friendly solar solutions to support global, sustainable development. For more information, please visit www.canadiansolar.com.

Safe Harbor/Forward-Looking Statements

Certain statements in this press release are forward-looking statements that involve a number of risks and uncertainties that could cause actual results to differ materially. These statements are made under the "Safe Harbor" provisions of the U.S. Private Securities Litigation Reform Act of 1995. In some cases, you can identify forward-looking statements by such terms as "believes," "expects," "anticipates," "intends," "estimates," the negative of these terms, or other comparable terminology. Factors that could cause actual results to differ include the risks regarding the previously disclosed SEC investigation as well as general business and economic conditions and the state of the solar industry; governmental support for the deployment of solar power; future available supplies of high-purity silicon; demand for end-use products by consumers and inventory levels of such products in the supply chain; changes in demand from significant customers; changes in demand from major markets such as Germany; changes in customer order patterns; changes in product mix; capacity utilization; level of competition; pricing pressure and declines in average selling prices; delays in new product introduction; continued success in technological innovations and delivery of products with the features customers demand; shortage in supply of materials or capacity requirements; availability of financing; exchange rate fluctuations; litigation and other risks as described in the Company's SEC filings, including its annual report on Form 20-F filed on April 27, 2012. Although the Company believes that the expectations reflected in the forward-looking statements are reasonable, it cannot guarantee future results, level of activity, performance, or achievements. You should not place undue reliance on these forward-looking statements. All information provided in this press release is as of today's date, unless otherwise stated, and Canadian Solar undertakes no duty to update such information, except as required under applicable law.

SOURCE Canadian Solar Inc.
New IPCC climate report projects significant threats to Australia
Photo: AP / Michail Michailidis
Fire seasons, particularly in southern Australia, will extend in high-risk areas. Australia's multibillion-dollar mining, farming and tourism industries face significant threats as worsening global warming causes more dangerous and extreme weather, the world's leading climate science body will warn. A final draft of...
Are Australia’s bushfire seasons getting longer?
The length of fire seasons has increased globally, which may have implications for firefighting operations in Australia @NickEvershed Increasingly long fire seasons will...
Australia prepares for 'dangerous' bushfire season
As Australia prepares for another horror bushfire season, experts are warning that some areas of the country are becoming uninhabitable because of the increased risk of...
Arson seen in Australia bushfires, bad season ahead
By Paul Tait SYDNEY (Reuters) - Some of the bushfires which destroyed seven homes near Sydney were likely acts of arson, authorities said on Monday amid warnings that...
'Catastrophic' bushfire season expected
THE bushfire season in south and eastern Australia will become longer and more extreme because of climate change, a new report says. It says a new rating system may also be...
Bushfires to become more intense: study
Australia's bush fire preparedness under threat as climate change kicks in
Australia risks being under-prepared for longer, drier and more severe bushfire seasons, a report from the Climate Council says. The national report found that record-breaking temperatures and hot winds will place unprecedented strain on firefighting resources, estimating that the number of professional firefighters across Australia will need to double by 2030. Australia's...
Scientists warn bushfire season getting longer
A bushfire season that starts in spring and stretches well into autumn will be the new norm for Australia's south-east, according to scientists. Melbourne University research fellow in climate science Sophie Lewis said catastrophic events such as the fires in NSW should come as no surprise, due to a dry winter and the ongoing effects of climate change. The past 12 months have...
Arson seen in Australia bushfires
By Paul Tait SYDNEY (Reuters) - Some of the bushfires which destroyed seven homes near Sydney were likely acts of arson, authorities said on Monday amid warnings that drought-hit Australia could be in for one of its worst fire seasons. About 50 separate fires, fanned by winds of up to 110 kph (66 mph), burnt around Sydney on a blistering hot day on Sunday, signalling an early...
Australia ( /əˈstreɪljə/), officially the Commonwealth of Australia, is a country in the Southern Hemisphere comprising the mainland of the Australian continent as well as the island of Tasmania and numerous smaller islands in the Indian and Pacific Oceans. It is the world's sixth-largest country by total area. Neighbouring countries include Indonesia, East Timor and Papua New Guinea to the north; the Solomon Islands, Vanuatu and New Caledonia to the north-east; and New Zealand to the south-east.
For at least 40,000 years before European settlement in the late 18th century, Australia was inhabited by indigenous Australians, who belonged to one or more of roughly 250 language groups. After discovery by Dutch explorers in 1606, Australia's eastern half was claimed by Great Britain in 1770 and settled through penal transportation to the colony of New South Wales from 26 January 1788. The population grew steadily in subsequent decades; the continent was explored and an additional five self-governing Crown Colonies were established.
http://en.wikipedia.org/wiki/Australia
News tagged with baikonur
Russia emergency teams look for debris of crashed spacecraft
Russian emergencies workers are combing the mountains near the border with Mongolia for the debris of a cargo spaceship that crashed minutes after its launch.
Image: Expedition 50 crew launches to the International Space Station
In this one second exposure photograph, the Soyuz MS-03 spacecraft is seen launching from the Baikonur Cosmodrome with Expedition 50 crewmembers NASA astronaut Peggy Whitson, Russian cosmonaut Oleg Novitskiy of Roscosmos, ...
Foie gras, saxophone blast into space with astronauts (Update)
A trio of astronauts soared into orbit Friday bound for the International Space Station, with their Soyuz spacecraft delivering some fancy French food, a saxophone and the future female commander.
Launch of three astronauts to ISS postponed
The launch next month of a Soyuz spacecraft carrying three astronauts to the International Space Station has been postponed by 48 hours, Russia's space agency said Friday, reportedly to ensure better docking conditions.
Image: Expedition 49 launch to the International Space Station
The Soyuz MS-02 rocket is launched with Expedition 49 Soyuz commander Sergey Ryzhikov of Roscosmos, flight engineer Shane Kimbrough of NASA, and flight engineer Andrey Borisenko of Roscosmos, Wednesday, Oct. 19, 2016, at ...
Two Russians, one American blast off to ISS
Two Russian cosmonauts and a NASA astronaut soared into orbit in a Soyuz spacecraft Wednesday at the start of a two-day journey to the International Space Station.
Russia cancels manned space launch over 'technical' issues
Russia on Saturday cancelled a planned manned space launch expected in one week due to "technical reasons," giving no explanation or a new launch date.
Russia launches ISS-bound cargo ship
A Russian rocket carrying an unmanned cargo ship blasted off for the International Space Station early Sunday from the Baikonur cosmodrome in Kazakhstan, the Russian space agency said.
Soyuz capsule docks with International Space Station
A Soyuz space capsule carrying astronauts from Russia, Japan and the United States has docked with the International Space Station after a two-day voyage.
The Soyuz MS-01 spacecraft launches from the Baikonur Cosmodrome with Expedition 48-49 crewmembers Kate Rubins of NASA, Anatoly Ivanishin of Roscosmos and Takuya Onishi of the Japan Aerospace Exploration Agency (JAXA) onboard, ...
Baikonur
Baikonur (Kazakh: Байқоңыр, Bayqoñır; Russian: Байконур, Baykonur), formerly known as Leninsk, is a city in Kyzylorda Province of Kazakhstan, rented and administered by the Russian Federation. It was constructed to service the Baikonur Cosmodrome and was officially renamed Baikonur by Russian president Boris Yeltsin on December 20, 1995. The shape of the rented area is an ellipse, measuring 90 kilometres east to west by 85 kilometres north to south, with the cosmodrome at the centre.

The original Baikonur (Kazakh for "wealthy brown", i.e. "fertile land with many herbs") is a mining town a few hundred kilometres northeast, near Dzhezkazgan in Kazakhstan's Karagandy Province. Starting with Vostok 1 in April 1961, the launch site was given this name to cause confusion and keep the location secret. (The original Baikonur's residents took advantage of the confusion by ordering and receiving many scarce materials before government officials discovered the deception.) The new Baikonur's railroad station predates the base and retains the old name of Tyuratam.

The fortunes of the city have varied according to those of the Soviet/Russian space program and its Baikonur Cosmodrome. The Soviet government established the Nauchno-Issledovatel'skii Ispytatel'nyi Poligon N.5 (NIIIP-5), or Scientific-Research Test Range N.5, by its decree of 12 February 1955. The U-2 high-altitude reconnaissance plane found and photographed the Tyuratam missile test range (cosmodrome Baikonur) for the first time on 5 August 1957.
Research Computing to play vital role in $9 million grant from the National Institute of Mental Health
November 17, 2011 | From the Harvard Gazette
By Peter Reuell, Harvard Staff Writer
Have you ever wondered why infants can learn foreign languages easily, while older children and parents struggle? Or why your third-grader can fix your computer, but you can barely check your email? The answer, scientists have long known, lies in the way brains develop. All children experience what researchers call "critical periods" — windows early in life when the wiring in their brains is more flexible than that in adults, meaning they are more able to learn new skills.

Led by Takao Hensch, professor of neurology and molecular and cellular biology, and funded by a $9 million grant from the National Institute of Mental Health (NIMH), a group of Harvard researchers from the Harvard Center for Brain Science (CBS) and the departments of Molecular and Cellular Biology and Chemistry and Chemical Biology is working to map how that brain wiring takes place in an effort to pinpoint the causes of — and potential treatments for — schizophrenia, autism, and a host of other disorders.
The work — aimed at giving scientists a first-of-its-kind look at how the brain’s wiring is created, how it may go awry, and how it might be corrected — will take advantage of Harvard’s wide range of specialists.
“I would not have pursued this work if it weren’t for the unique environment,” Hensch said. “I came to Harvard because I wanted to work with the exceptional investigators that are here. This research represents a dovetailing of our interests — an unusual opportunity where these cutting-edge methods are assembled in one place at one time — something that could only be found here.”
Although he’s been researching the role critical periods play in brain wiring for nearly two decades, Hensch said technology now exists that will allow scientists to answer a question that has thus far eluded them: Exactly what is happening in the brain during this re-wiring process?
To answer that question, Hensch will focus on a single type of cell, the parvalbumin (PV)-positive GABA neuron, in the brains of mice. Though relatively sparse, the neurons are believed to play a key role in triggering the start of critical periods early in life. Impairment of the normal development and function of these cells can contribute to a variety of mental disorders, including autism and schizophrenia.
To get the up-close-and-personal view of the neurons needed for deeper insight, Hensch will collaborate with a number of colleagues, including Catherine Dulac, chair of the Department of Molecular and Cellular Biology and Higgins Professor of Molecular and Cellular Biology.
In earlier research, Dulac identified as many as 1,300 genes affected by “genomic imprinting” — the phenomenon of a gene having different levels of expression depending on whether it was inherited from the father or mother — many of which appear to be linked to autism and other mental disorders.
Understanding the precise role those imprinted genes play in brain development, however, is tricky. The brain contains dozens of cell types, including many different types of neurons. To study the entire brain, or even a single region, would require sifting through a vast amount of data in search of the few truly important results. What was needed, Dulac said, was a focus on a single, critical cell type.
“As it turned out, Takao was working on this very interesting population of neurons, and he has demonstrated that the maturation of this cell type is particularly important, because abnormal development can lead to mental disorders,” Dulac said.
Using lab mice, Dulac will alter those imprinted genes and study how the changes affect brain development and the incidence of mental disorders.
To understand how those genetic changes alter the brain’s wiring, Hensch turned to Jeff Lichtman, Jeremy R. Knowles Professor of Molecular and Cellular Biology, and his “Brainbow” imaging technique. By tagging cells with fluorescent colors, the technique allows researchers to map the thousands of connections to and from each neuron. Lichtman is also developing a new electron microscopy method that will greatly speed the ability to draw detailed neuronal maps.
“What we are trying to do with this work is to better understand the way the central nervous system is organized,” Lichtman said. “We want to learn about the basic organization of these cells from a physiological, anatomical, and genetic perspective, and then compare normal and disordered brains to see if there is a difference.”
Also involved in the imaging is Xiaowei Zhuang, professor of physics and chemistry and chemical biology, who created STORM, a super high-resolution system of optical microscopy. Using the two imaging techniques, researchers will be able to study images of individual neurons, and see for the first time how the brain’s wiring changes in response to genetic alterations.
“We look forward to collaborating with Takao and applying new imaging capabilities to the study of parvalbumin-positive neurons and mental illnesses related to them,” Zhuang said. “This work provides an excellent opportunity for a number of exciting collaborations.”
Rounding out the team involved in the research is James Cuff, director of Research Computing for the Faculty of Arts and Sciences, who will help oversee efforts to process the vast amounts of data that will emerge from the project.
The result of work to study the genetics of the cells, visualizing their connections, and ultimately capturing the functionality of individual neurons, Hensch said, should be an understanding of how brains become incorrectly wired, and the development of therapies to reverse the damage.
“It’s a bit down the road, but we are optimistic that focusing on these cells will have an immediate payoff,” he said. “If we know how a cell’s connectivity starts out, and how it is supposed to end up, we can look at the disease model and trace it back to see where things went awry and how we might fix it.”
The grant also requires that researchers make a commitment to training the next generation of mental health researchers by involving students at many levels — even high school students – in the project. Building awareness of mental illness throughout the community is another aspect of the grant, one Hensch fulfills through his work with Harvard’s Center on the Developing Child.
“I’ve been fortunate to be part of the center, which brings together people from all the Schools to look at early childhood as both a window of opportunity as well as vulnerability,” Hensch said.
“I am convinced, as Takao is, that the development of an animal is a critical time for its brain,” Lichtman said. “In humans, it’s even more critical, because more than any animal our behavior is related to what we’ve learned. It’s a pretty safe bet that the thing that is changed by experience is the wiring diagram.
“If these were simple problems, we would already have solutions to most of the abnormalities of the brain,” he continued. “The fact that we have solutions to virtually none of them suggests how difficult this is.”
The Staley Lab, University of Chicago
Catalysis of pre-mRNA Splicing
Over three decades ago, the small nuclear RNA (snRNA) components of the spliceosome were proposed to mediate catalysis within the spliceosome. This hypothesis was motivated by the discovery of the first catalytic RNAs and bolstered by the identification of a particular class of self-splicing, intronic RNAs (group II introns) that, like nuclear pre-mRNA introns, spliced through a characteristic lariat intermediate. Similarities between the most conserved domains of the snRNAs and group II introns strengthened the hypothesis. Nevertheless, definitive proof that the snRNAs mediate catalysis has been lacking. Further, a conserved protein component of the spliceosome has recently been suggested to contribute to catalysis. While it has been unclear what component of the spliceosome mediates catalysis, work from the Piccirilli lab, also at the University of Chicago, has established that the spliceosome is a metalloenzyme, such that divalent metals stabilize the leaving groups during both of the two chemical steps in splicing.
To identify the spliceosomal ligands for these metals, we teamed up with Piccirilli, combining his expertise in chemical biology and our expertise in spliceosome function, especially our understanding of proofreading mechanisms that operate at the catalytic stage. Together, we showed that the catalytic metals are positioned by U6 snRNA and thereby provided definitive evidence that the spliceosome utilizes RNA to catalyze splicing. Further, our work revealed that the spliceosome and self-splicing group II RNAs share common catalytic mechanisms and likely evolutionary origins. These data support a hypothesis that key RNA-based machines in modern life, including the spliceosome and the ribosome, derived ultimately from a primordial “RNA world” (Fica et al., 2013, Nature). This work has raised new and fundamental questions regarding the mechanism of catalysis.
China's 360Buy picks up $400m Series D funding led by Twitter investor Alwaleed bin Talal
360Buy, one of China’s top e-commerce sites, has received a $400 million investment from a consortium led by Saudi prince Alwaleed bin Talal.
The prince’s Kingdom Holding Company (KHC) contributed roughly $125 million to purchase an undisclosed stake in the company. KHC had previously invested $300 million in Twitter.
“This deal confirms KHC’s expertise to seek rewarding global investment opportunities, and to capitalize on them in light of their potential IRR in order to extract value for KHC shareholders,” KHC Executive Director Eng. Ahmed Halawani said in a statement.
Alwaleed bin Talal highlighted the deal as strengthening the “strategic relationship” between Saudi Arabia and China.
The funding, which wraps up the company’s Series D round, follows an investment from the Ontario Teachers’ Pension Plan last November. The total funding for the round passed $700 million, as Ontario Teachers’ put in $300 million. At the time, 360Buy was reportedly valued at $7.25 billion.
Technode reports that 360Buy will use the money for new business development and improvement of its logistics infrastructure. Last November, 360Buy announced that it was opening up its nationwide last-mile delivery system to third-parties. The company has already invested heavily in its logistics operations, offering same-day delivery service in 30 cities throughout China.
KHC noted that its investment is in line with its practice of aligning with fast-growing companies that plan to list on an international capital market within three years. As such, an IPO from 360Buy could be on the cards for the company in the near future. Rumors have repeatedly suggested that 360Buy was looking to go public, but economic uncertainty and increased scrutiny of Chinese companies looking to list in the US have made things difficult.
All told, 360Buy has raised over $2 billion. In 2011, the company picked up $1.5 billion in its Series C round. Investors included Digital Sky Technologies and Tiger Fund.
360Buy boasted $9.7 billion in transactions last year and is aiming to top $16 billion in 2013. The company is not yet profitable, but it believes it will bring in its first profits by the fourth quarter of this year. Last December, it inked a crucial deal with Nokia to become an official reseller, committing to purchase up to $320 million worth of the handset maker’s devices.
Image credit: Fayez Nureldine / AFP / Getty Images
Russia plans test 'mission' to Mars
From Red Square to the Red Planet
The Russians are looking for six volunteers to be locked away for 500 days to learn more about how people would deal with the isolation of the journey to Mars.
In a set up that sounds (horribly) like the ultimate reality TV show, the volunteers will live in a mock up of a Mars space ship in Moscow, according to a BBC report, entirely isolated from the outside world. The ship will be something like a three bedroom flat, built on a sound stage.
Monitored on CCTV and microphone (see the reality set up?) the faux-cosmonauts will breathe recycled air and subsist on sterilised food packages. The resources they have when they are locked in will be all they will have to last them for 18 months: there will be no resupply missions.
Despite the constant monitoring, contact with the outside world will be limited to email, with long delays before replies are sent, as would be the case on a real journey to Mars.
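The delays the piece mentions are physics, not a quirk of the simulation: a radio signal or an email can only travel at the speed of light, so the lag grows with the Earth-Mars distance. The following is a rough, hedged sketch of that arithmetic in Python; the distances are typical orbital figures chosen for illustration, not numbers from the BBC report.

# Rough sketch: one-way signal delay between Earth and Mars.
# Distances are approximate and illustrative (closest approach vs. near-maximum separation).
SPEED_OF_LIGHT_KM_S = 299_792  # kilometres per second

def one_way_delay_minutes(distance_km: float) -> float:
    """One-way light-travel time in minutes for a given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60

for label, km in [("closest approach", 55e6), ("near-maximum separation", 400e6)]:
    print(f"{label}: about {one_way_delay_minutes(km):.1f} minutes one way")
# Roughly 3 minutes at closest approach and about 22 minutes at the far end,
# so a reply to an email could take anywhere from several minutes to the
# better part of an hour to arrive.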
The experiment is scheduled to start in late 2007, but the Russian Space Agency is seeking volunteers now to undergo a lengthy screening process.
Applicants must be relatively young, fit, speak fluent English, and be qualified to University diploma level. They will then be subject to stringent medical and psychological tests to make sure they have what it takes to live in a space ship for 18 months, and properly document the experience.
The BBC doesn't say whether the Russians are planning to screen the adventure, but we wouldn't be surprised if Big Brother's Endemol production company isn't already negotiating for the broadcasting rights. ® | 科技 |
Posted 10:37PM on Tuesday 10th July 2012
Global warming tied to risk of weather extremes
NEW YORK - Last year brought a record heat wave to Texas, massive floods in Bangkok and an unusually warm November in England. How much has global warming boosted the chances of events like that?
Quite a lot in Texas and England, but apparently not at all in Bangkok, say new analyses released Tuesday.
Scientists can't blame any single weather event on global warming, but they can assess how climate change has altered the odds of such events happening, Tom Peterson of the National Oceanic and Atmospheric Administration told reporters in a briefing. He's an editor of a report that includes the analyses published by the Bulletin of the American Meteorological Society.
In the Texas analysis, researchers at Oregon State University and in England noted that the state suffered through record heat last year. It happened during a La Nina weather pattern, the flip side of El Nino. Caused by the cooling of the central Pacific Ocean, La Nina generally cools global temperatures but would be expected to make the southern United States warmer and drier than usual. But beyond that, the scientists wondered, would global warming affect the chances of such an event happening?
To find out, they ran a lot of computer simulations of Texas climate during La Nina years. They compared the outcome of three such years in the 1960s with that of 2008, which was used as a stand-in for 2011 because they were unable to simulate last year. The idea, they said, was to check the likelihood of such a heat wave both before and after there was a lot of man-made climate change, which is primarily from burning fossil fuels like coal and oil.
Their conclusion: Global warming has made such a Texas heat wave about 20 times more likely to happen during a La Nina year.
Using a similar approach, scientists from Oxford University and the British government looked at temperatures in central England. Last November was the second warmest in that region in more than 300 years. And December 2010 was the second coldest in that time.
Their analysis concluded that global warming has made such a warm November about 62 times more likely, and such a cold December just half as likely.
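Figures like "20 times more likely" and "62 times more likely" are probability ratios: how often a fixed extreme is exceeded in simulations of the recent climate compared with simulations of an earlier one. The sketch below illustrates that style of calculation with made-up numbers; the distributions, shift and threshold are invented for the example and are not values from either study.

# Hedged illustration of an event-attribution style probability ratio.
# The normal distributions below stand in for large ensembles of simulated
# summer temperatures; they are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
baseline = rng.normal(28.0, 1.0, n)   # hypothetical "1960s-style" summers (deg C)
warmed = rng.normal(29.0, 1.0, n)     # same variability, mean shifted about 1 deg warmer

threshold = 31.0                      # a fixed "extreme summer" threshold
p_baseline = (baseline > threshold).mean()
p_warmed = (warmed > threshold).mean()

print(f"P(exceed threshold), baseline: {p_baseline:.4f}")
print(f"P(exceed threshold), warmed:   {p_warmed:.4f}")
print(f"Probability ratio: {p_warmed / p_baseline:.1f}x more likely")
# Even a modest shift in the mean multiplies the odds of a fixed extreme many
# times over, which is how statements like "20 times more likely" arise.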
Kevin Trenberth, of the National Center for Atmospheric Research's climate analysis section, said that he found the Britain study to be reasonable, given what he called a flawed climate model. As for the Texas result, he said that given how the study was done, the calculated increase in likelihood "could well be an underestimate."
A third analysis considered unusually severe river flooding last year in central and southern Thailand, including neighborhoods in Bangkok. It found no sign that climate change played a role in that event, noting that the amount of rainfall was not very unusual. The scale of the flooding was influenced more by factors like reservoir operation policies, researchers wrote.
Also at the briefing, NOAA released its report on the climate for 2011, which included several statistics similar to what it had announced earlier.
Last year was the coolest since 2008 in terms of global average temperature, which was about 57.9 degrees Fahrenheit (14.4 degrees Celsius). But it still remained among the 15 warmest years since records began in the late 1800s, the agency said. It was also above average for the period 1980-2010.
China to launch moon rock-collecting probe in 2017
BEIJING — China said Monday it was on track to launch a fifth lunar probe with the aim of bringing back lunar soil and rock samples following the successful moon landing of a space probe.
The new mission planned for 2017 would mark the third and final phase of China’s robotic lunar exploration program and pave the way for possibly landing an astronaut on the moon after 2020.
On Saturday, Chang’e 3 set down on the moon, marking the first soft landing of a space probe on the lunar surface in 37 years. The landing vehicle will conduct scientific research for a year and its accompanying rover will survey the moon’s structure and probe for natural resources.
A challenge for both is to withstand temperatures ranging from 120 degrees Celsius (248 Fahrenheit) to minus 180 C (minus 292 F), said Wu Zhijian, spokesman for the State Administration of Science, Technology and Industry for National Defense.
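As a quick arithmetic aside (my check, not part of the original report), the standard Celsius-to-Fahrenheit conversion confirms the quoted range:

# Verify the quoted lunar surface temperature range.
def c_to_f(celsius: float) -> float:
    return celsius * 9 / 5 + 32

print(c_to_f(120))    # 248.0, so 120 C is 248 F
print(c_to_f(-180))   # -292.0, so minus 180 C is about minus 292 F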
Chang’e 4 is intended to be an improved version of the Chang’e 3 that will pave the way for the fifth probe.
First explored by the former Soviet Union and the United States in the 1960s and 70s, the moon has become the subject of renewed interest, even as the focus shifts to Mars exploration.
The U.S. Lunar Reconnaissance Orbiter is currently circling the moon to detail its features and resources as a prelude to building a lunar outpost. In 2009, India’s lunar orbiter, the Chandrayaan-1, detected water on the moon. Two years earlier, Japan sent a spacecraft to orbit it.
“People have started saying we’re looking for water and we’re looking for minerals and it’s becoming a much more interesting place to go to,” said Peter Bond, consultant editor for Jane’s Space Systems and Industry. “Especially for new countries like China and India who are testing out new technologies — it’s an ideal place to practice these before they go to Mars and beyond.”
China says its moon exploration program is about gaining a scientific understanding of the moon and developing space engineering and other technologies to prepare it for deep space exploration in the future. It is also a source of national pride.
President Xi Jinping and Premier Li Keqiang were at the Beijing Aerospace Control Center to hear lunar program chief commander Ma Xingrui declare the Chang’e 3 mission a success, the official Xinhua News agency reported.
“China’s moon probe is a way to exhibit to the world that we have acquired advanced space technology, which is more sophisticated than nuclear technology, and it is also a way to win international recognition as a big power,” said He Qisong, a space expert at the Shanghai University of Political Science and Law.
He said that finding and developing mineral resources on the moon could help solve future problems on Earth.
China sent its first astronaut into space in 2003, becoming the third nation after Russia and the United States to achieve manned space travel independently. It launched the Tiangong 1 prototype space station in 2011 and plans to replace it with a larger permanent station seven years from now.
The space program’s close military links have raised questions about its ultimate intentions and dissuaded other countries from cooperating too closely with it. In 2007, the military shot down a defunct weather satellite in an apparent show of force that created a large amount of debris imperiling other spacecraft. ► Join the DiscussionView/Add Comments »
Spotted Sandpiper Chick
The spotted sandpiper is the most common sandpiper found in North America. During the summer breeding season they can be found along almost any type of wetland in the northern half of the continent. During the winter these birds migrate down to the Gulf Coast, Mexico, Central or South America. They share the genus Actitis with the common sandpiper, which is found in Europe and Asia.

One of the most interesting things about the spotted sandpiper is that the female is the one that arrives on the breeding range first. She then will stake out and defend a territory. With other species that is typically the male's role. Following through on the role reversal, it is the male who stays to raise the young while the female will often go off to have a family with another male. So most likely this chick was raised by its father. Since the female can store sperm for about a month, it is possible that it was raised by a foster father.
Lake Superior by Samuel Griswold Goodrich
Father of Lakes! thy waters bend,
Beyond the eagle's utmost view,
When, throned in heaven, he sees thee send
Back to the sky its world of blue.

Boundless and deep the forests weave
Their twilight shade thy borders o'er,
And threatening cliffs, like giants, heave
Their rugged forms along thy shore.

Nor can the light canoes, that glide
Across thy breast like things of air,
Chase from thy lone and level tide,
The spell of stillness deepening there.

Yet round this waste of wood and wave,
Unheard, unseen, a spirit lives,
That, breathing o'er each rock and cave,
To all, a wild, strange aspect gives.

The thunder-riven oak, that flings
Its grisly arms athwart the sky,
A sudden, startling image brings
To the lone traveller's kindled eye.

The gnarled and braided boughs that show
Their dim forms in the forest shade,
Like wrestling serpents seem, and throw
Fantastic horrors through the glade.

The very echoes round this shore,
Have caught a strange and gibbering tone,
For they have told the war-whoop o'er,
Till the wild chorus is their own.

Wave of the wilderness, adieu--
Adieu, ye rocks, ye wilds, ye woods!
Roll on, thou Element of blue,
And fill these awful solitudes!

Thou hast no tale to tell of man.
God is thy theme. Ye sounding caves,
Whisper of Him, whose mighty plan,
Deems as a bubble all your waves!
Voyageurs National Park is the only National Park in Minnesota. Voyageurs was established in 1975 on Minnesota's northern border with Canada. The park was named after the French Canadian fur traders that travelled the area in birch bark canoes. Many visitors still come to the park to canoe, boat, kayak or fish. Over 344 square miles of the 220,000 acre park are covered in water. The major lakes in the park include Rainy Lake, Kabetogama Lake, Namakan Lake and Sand Point Lake. There are also a number of small lakes, rivers, and other wetlands in the park. Boaters can access the lakes at boat launches located at the 3 visitors centers. This photo is of the boat launch at the Kabetogama Lake Visitors Center.
Karner Blue
There are about 1200 species, not counting the plant species, in North America that are on the US Fish and Wildlife Service's Endangered Species List. Many people are familiar with some of the larger mammals and birds, such as bison, cougars, whooping cranes, and California condors, but did you know that there are 10 different butterfly species that are also on the list?

Each year I make a trip or two out to Necedah NWR in Wisconsin. One of the big reasons for the trip is to photograph the endangered Karner blue butterfly. The Karner blue is a subspecies of the Melissa blue. It was named after Karner, NY, where it was first discovered. The Karner was classified as a separate subspecies because it has a separate range (Karners are found around the Great Lakes while the Melissa is found throughout western North America) and because it has a different larval host plant. The Melissa blue larvae (caterpillars) feed on plants in the pea family, such as Astragalus, Lotus, Lupinus, Glycyrrhiza and Medicago species, while the Karner larva feeds only on wild lupine. Since it is limited to one source of food, the Karner blue population is limited by the amount of wild lupine. Unfortunately the amount of lupine growing in the range has been decreasing due to land development and a lack of natural disturbance, such as fire or large mammal grazing. Natural disturbances help to eliminate many of the other plants that tend to choke out the lupine.
The rose-breasted grosbeak is a medium-sized passerine. They summer in the deciduous woodlands of the northeastern half of the United States and up into central Canada. They winter in the tropical forests of Central America and northwestern South America. They are omnivores, eating a combination of seeds, tree buds, fruit and insects. They will sometimes crossbreed with the black-headed grosbeak where their ranges overlap in the middle of the United States.
Darner Invasion
Most of the dragonflies up here in Minnesota are gone. Our cold nights have for the most part finished off the season. That is not true everywhere in the U.S., however. Yesterday, while a football game between Miami and Kansas State was postponed due to a storm, Sun Life Stadium, where they were playing, was invaded by a cloud of dragonflies. If you would like to read more, here is a link.

It was hard to tell exactly what type of dragonfly they were from the pictures that were provided, but they appeared to be a member of the darner family. The Canada darner, pictured above, is one of the most common darners that we see around here. In the second photo the darners are mating. Shortly after the picture was taken the female flew off back to the water to lay her eggs.
Wild Lupine
Lupine is one of the earliest blooming wildflowers that we see around here. They usually begin to bloom in May and most of the flowers are gone before the end of June. They are also usually one of the first wildflowers to grow into a new area, such as a field that has been cleared or after a forest fire. Unfortunately it does not take them long to be choked out by other plants and grasses. So as our fire protection and suppression techniques have gotten better, the amount of lupine out in the wild has decreased, which has hurt species like the endangered Karner blue butterfly.
Today is the first day of autumn. I love the fall, with the warm sunny days and cool crisp nights. Fall also means that different birds are beginning to move into the area. One species of bird that has begun to show up is the rough-legged hawk. One has already been reported in the Twin Cities, where it wintered last year. These photos were taken at Crex Meadows last fall.

Rough-legged hawks spend their summer in Canada north to the Arctic Circle. They get their name from the feathers that cover their legs down to their feet. The feet themselves are smaller than normal for a Buteo hawk because most places that they can perch in their summer habitat are very small. Rough-legged hawks come in two different color morphs. The dark morph, pictured in the first photo, is not as common as the light morph, second photo. There is also an amount of crossbreeding between the two morphs, leading to different intermediate individuals.
Sandhill Cranes at Dusk
Here in the north our short summer has come to an end. Even though fall does not officially begin until tomorrow, I have been seeing the signs of autumn for the past couple of weeks. Leaves on the trees have just begun to change color, woolly bear caterpillars are on the move and the sandhill cranes have begun to descend upon Crex Meadows.

Each fall the sandhill cranes migrate south for the winter. Before they begin the big migration they gather in staging areas where they get ready for the long flight. Crex Meadows in Wisconsin is one of the places that the cranes use as a staging area. There are quite a few fields around Crex where the cranes can find food, but the main reason that they come to Crex is the shallow wetlands that the cranes roost in overnight. With their long legs the cranes will often spend the night in pools that are a couple of feet deep. This helps to protect them from land-based predators, who would have to cross the water to get to the cranes, alerting them to the danger. Many people come to Crex in the fall to watch the cranes fly out from the wetlands to the fields near dawn or return from the fields at dusk. The cranes will usually take their leave at some time in October.
A Flower for My Wife on our Anniversary
Bee Macro
Red-eared sliders are turtles native to the southeastern United States; this photo was taken in south Texas. They are mostly aquatic, coming out usually only to sunbathe or lay eggs. Red-eared sliders are the most popular pet turtles in the United States. Released pets have established populations outside of their normal range and have even become invasive in some areas, such as parts of California where they are able to outcompete the native western pond turtle. Like most turtles, red-eared sliders are omnivores. Their diet consists of fish, frogs, crustaceans, aquatic insects and aquatic plants.
Jays are medium-sized passerines that are part of the Corvidae family. They come in a variety of colors and are found throughout the world. In North America we have 10 different types of jays in five different genera. Probably the most common of the North American jays is the blue jay. Blue jays are found in the eastern half of North America. They share the Cyanocitta genus with the Steller's jay, which is found in parts of western North America. The genus Aphelocoma consists of three types of scrub jay as well as the Mexican jay. The western scrub-jay, pictured above, is the most common scrub jay. It is found in the southwestern United States and parts of Mexico. The Florida scrub-jay is found only in Florida and the island scrub-jay is found on Santa Cruz Island off the coast of California. Gray jays are the only member of the Perisoreus genus found in North America. Their range includes Canada, Alaska, the northern United States and the Rocky Mountains. Living in the north, these birds will often cache food to help them survive the winter months. They are very intelligent birds, as are most members of the Corvidae family, and very inquisitive. They are often referred to by the nickname camp robber. The green jay is a member of the genus Cyanocorax. Their range includes parts of southeastern Texas and eastern Mexico. They can also be found in Central America and northwestern South America. In North America the only other jay found within the same range as the green jay is the brown jay. Both are in the genus Cyanocorax, but it is very easy to distinguish between them since the green jay is very colorful and the brown jay is mostly brown.
Meadowhawks are members of the genus Sympetrum. They are a part of the skimmer family of dragonflies, which includes most of the colorful dragonflies. Meadowhawks are medium-sized dragonflies that are quite common from midsummer into early fall. Most mature male meadowhawks are red in color while females and young males are usually gold. Several of the meadowhawk species are so similar that they can only be identified under a microscope. Fortunately the white-faced meadowhawk is the only meadowhawk with a white face. When they first emerge the white face is more of an off-white color, which can make it difficult to identify them, but by the time that they mature, like the ones above, the white is very striking, making identification fairly easy.
I believe that this may be a northern crescent. The northern crescent is a butterfly that can be found in a variety of habitats in Canada and the northern half of the United States. They look very similar to the pearl crescent, which overlaps most of their range. When I help survey butterflies in the early summer with the St Paul Audubon Society, our guide and butterfly expert usually will not try to distinguish between the two species because they are so similar. The northern crescent larva (caterpillar) feeds on plants of the aster family. Eggs are laid in the summer, with the butterfly overwintering in the third stage of its larval form. When spring comes the caterpillar wakes from its dormant state and begins to eat. They will go through their final metamorphosis around the middle of June to the middle of July.
Flower Purgatory Creek
Verbena is a genus of plants in the Verbenaceae family. The genus contains approximately 250 different species. Some Verbena species are annuals, only lasting a single season, while others are perennials, coming back year after year. Throughout history many different cultures, such as the Egyptians and Romans, have viewed Verbena as a type of holy plant. Even ancient Christians used to refer to common vervain as Holy Herb.

Although common vervain is not native to North America, it has been introduced there and it is now invasive. Other species of Verbena are native to North America. The two species above are Verbena that are native to the area where I live. The top photo is an example of blue vervain and the bottom is an example of hoary vervain. Although both were photographed at Purgatory Creek, they prefer different types of habitat. Blue vervain prefers to live in areas with wet soil while hoary vervain usually is found where the soil is dry.
Endangered Whooping Cranes
For the second time this year we have had some very special visitors in the area in which I live. A pair of endangered whooping cranes has been spotted down near Northfield, MN for the past week. I had made a couple of trips earlier this week after getting off of work but I was not able to locate the birds. So I took a half day off today and spent some time looking around, and the third time was the charm. I found the pair of whoopers foraging for food in a field just a few blocks southwest of where they had been seen earlier. They were accompanied by a pair of sandhill cranes. It was interesting to see the two species side by side. These were greater sandhill cranes, so they are just under four feet tall, but since the whooper is the tallest bird in North America they were about half a foot taller.

Whooping cranes are on the endangered species list. There is one naturally migrating flock of whoopers left in the world. They nest up in Canada and winter down on the Gulf Coast of Texas. In 1940 the population of this flock fell to a low of only 22 birds. Through conservation of the birds and vital habitat, their numbers have increased about 4% per year until they numbered about 281 in 2010. Conservationists are worried, however, that if a disease or a disaster of some sort were to hit the flock it could destroy the entire population. So in 1975 several whooper eggs from the Canada flock were transferred into the nests of sandhill cranes in Idaho. Unfortunately when the birds matured they mated with sandhill cranes instead of other whoopers. In 1999 a new group called the Whooping Crane Eastern Partnership formed with the purpose of building a secondary whooper population that would breed in Wisconsin and migrate down to Florida for the winter. You can tell that these birds are a part of that population by the bands on their legs (see the first picture). They also have a tracking device attached to their legs.
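As a rough check on that growth figure (my own back-of-the-envelope arithmetic, not a number from the recovery program), the counts quoted above imply the following average rate:

# Implied average annual growth of the naturally migrating whooping crane
# flock, using the figures quoted in this post (22 birds in 1940, about 281 in 2010).
start_count, start_year = 22, 1940
end_count, end_year = 281, 2010

years = end_year - start_year
annual_growth = (end_count / start_count) ** (1 / years) - 1
print(f"Implied average growth: {annual_growth:.1%} per year over {years} years")
# Comes out to roughly 3.7% per year, consistent with the "about 4% per year" figure.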
Artur Loureiro – CIO of Sonae
Artur Loureiro is Chief Information Officer at Sonae S.A., Executive Board Member of Modelo Continente and Sonae SR, and President of Finco (the Group's Forum for IT & Telecommunications). He is a Non-Executive Board Member of WEDO, Saphety, Mainroad, Bizdirect and Tlantic.
He was the CIO of Optimus (the telecommunications company of Sonae Group), Executive Director of Novis, Delegate Director of Sonae Rede de Dados and Head of Information Systems – Infrastructures of Modelo Continente SGPS. He started his career at Dorbyl in South Africa and has over 30 years’ experience in IT. He has been in the Sonae Group for 20 years and he has participated and enabled many process changes in the Telecommunications and Retail Division.
With a degree in Information Systems Engineering from WITS University (South Africa), he has attended Information Systems executive programs at the Harvard Business School, Marketing Management and Services Management programs at the INSEAD Business School, and also holds a Pocket MBA from Boston University.
Artur Loureiro has launched major Technology projects such as new food eCommerce site (launch in Mar. ’13), new electronics eCommerce site based on the Cloud (launched in Jan. ’13), Security, Virtualization (storage & servers), Disaster recovery, Information management, Self-checkouts, Self-service checkout, new partnerships in Loyalty Card Program (more than 3 million customers and over 90% penetration). | 科技 |
Human ancestor 'Lucy' was a tree climber, new evidence suggests
Published by Théophile Niyitegeka on 30 November 2016 at 08:14
Since the discovery of the fossil dubbed Lucy 42 years ago this month, paleontologists have debated whether the 3 million-year-old human ancestor spent all of her time walking on the ground or instead combined walking with frequent tree climbing. Now, analysis of special CT scans by scientists from The Johns Hopkins University and the University of Texas at Austin suggests the female hominin spent enough time in the trees that evidence of this behavior is preserved in the internal structure of her bones. A description of the research study appears November 30 in the journal PLOS ONE.

Analysis of the partial fossilized skeleton, the investigators say, shows that Lucy's upper limbs were heavily built, similar to champion tree-climbing chimpanzees, supporting the idea that she spent time climbing and used her arms to pull herself up. In addition, they say, the fact that her foot was better adapted for bipedal locomotion (upright walking) than grasping may mean that climbing placed additional emphasis on Lucy's ability to pull up with her arms and resulted in more heavily built upper limb bones.

Exactly how much time Lucy spent in the trees is difficult to determine, the research team says, but another recent study suggests Lucy died from a fall out of a tall tree. This new study adds to evidence that she may have nested in trees at night to avoid predators, the authors say. An eight-hour slumber would mean she spent one-third of her time up in the trees, and if she also occasionally foraged there, the total percentage of time spent above ground would be even greater.

Lucy, housed in the National Museum of Ethiopia, is a 3.18 million-year-old specimen of Australopithecus afarensis — or southern ape of Afar — and is among the oldest, most complete fossil skeletons ever found of any adult, erect-walking human ancestor. She was discovered in the Afar region of Ethiopia in 1974 by Arizona State University anthropologist Donald Johanson and graduate student Tom Gray. The new study analyzed CT scan images of her bones for clues to how she used her body during her lifetime. Previous studies suggest she weighed less than 65 pounds and was under 4 feet tall.

"We were able to undertake this study thanks to the relative completeness of Lucy's skeleton," says Christopher Ruff, Ph.D., a professor of functional anatomy and evolution at the Johns Hopkins University School of Medicine. "Our analysis required well-preserved upper and lower limb bones from the same individual, something very rare in the fossil record."

The research team first had a look at Lucy's bone structure during her U.S. museum tour in 2008, when the fossil was detoured briefly to the High-Resolution X-Ray Computed Tomography Facility in the University of Texas at Austin Jackson School of Geosciences. For 11 days, John Kappelman, Ph.D., anthropology and geological sciences professor, and geological sciences professor Richard Ketcham, Ph.D., both of the University of Texas at Austin, carefully scanned all of her bones to create a digital archive of more than 35,000 CT slices. High-resolution CT scans were necessary because Lucy is so heavily mineralized that conventional CT is not powerful enough to image the internal structure of her bones.

"We all love Lucy," Ketcham says, "but we had to face the fact that she is a rock. The time for standard medical CT scanning was 3.18 million years ago. This project required a scanner more suited to her current state."

The new study uses CT slices from those 2008 scans to quantify the internal structure of Lucy's right and left humeri (upper arm bones) and left femur (thigh bone).

"Our study is grounded in mechanical engineering theory about how objects can facilitate or resist bending," says Ruff, "but our results are intuitive because they depend on the sorts of things that we experience about objects — including body parts — in everyday life. If, for example, a tube or drinking straw has a thin wall, it bends easily, whereas a thick wall prevents bending. Bones are built similarly."

"It is a well-established fact that the skeleton responds to loads during life, adding bone to resist high forces and subtracting bone when forces are reduced," explains Kappelman. "Tennis players are a nice example: Studies have shown that the cortical bone in the shaft of the racquet arm is more heavily built up than that in the nonracquet arm."

A major issue in the debate over Lucy's tree climbing has been how to interpret skeletal features that might be simply "leftovers" from a more primitive ancestor that had relatively long arms, for example. The advantage of the new study, Ruff says, is that it focused on characteristics that reflect actual behavior during life.

Lucy's scans were compared with CT scans from a large sample of modern humans, who spend the majority of their time walking on two legs on the ground, and with chimpanzees, a species that spends more of its time in the trees and, when on the ground, usually walks on all four limbs.

"Our results show that the upper limbs of chimpanzees are relatively more heavily built because they use their arms for climbing, with the reverse seen in humans, who spend more time walking and have more heavily built lower limbs," says Ruff. "The results for Lucy are convincing and intuitive."

Other comparisons carried out in the study suggest that even when Lucy walked upright, she may have done so less efficiently than modern humans, limiting her ability to walk long distances on the ground, Ruff says. In addition, all of her limb bones were found to be very strong relative to her body size, indicating that she had exceptionally strong muscles, more like those of modern chimpanzees than modern humans. A reduction in muscle power later in human evolution may be linked to better technology that reduced the need for physical exertion and the increased metabolic demands of a larger brain, the researchers say.

"It may seem unique from our perspective that early hominins like Lucy combined walking on the ground on two legs with a significant amount of tree climbing," says Kappelman, "but Lucy didn't know she was 'unique' — she moved on the ground and climbed in trees, nesting and foraging there, until her life was likely cut short by a fall — probably out of a tree."

Graduate student M. Loring Burgess of the Johns Hopkins University School of Medicine was also an author on the paper.

The study was funded by the Paleoanthropology Lab Fund, the University of Texas at Austin College of Liberal Arts and the Houston Museum of Natural Science. The University of Texas High-Resolution X-Ray CT Facility was supported by U.S. National Science Foundation grants EAR-0646848, EAR-0948842 and EAR-1258878. Comparative data were gathered with support from U.S. National Science Foundation grants BCS-0642297 and BCS-1316104.

(Image: The fossils that make up the Lucy skeleton.)
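Ruff's drinking-straw analogy maps onto a standard quantity from beam theory: for a roughly tubular bone shaft, resistance to bending scales with the second moment of area of the cross-section. The sketch below is only a back-of-the-envelope illustration of that point; the radii are made-up numbers, not measurements from the study.

import math

def second_moment_hollow_circle(outer_radius_mm, inner_radius_mm):
    """Second moment of area (mm^4) of a hollow circular cross-section."""
    return math.pi / 4.0 * (outer_radius_mm**4 - inner_radius_mm**4)

# Two hypothetical bone shafts with the same outer radius (10 mm):
thin_wall  = second_moment_hollow_circle(10.0, 9.0)   # 1 mm cortical wall
thick_wall = second_moment_hollow_circle(10.0, 6.0)   # 4 mm cortical wall

print(f"thin wall:  {thin_wall:8.0f} mm^4")
print(f"thick wall: {thick_wall:8.0f} mm^4")
print(f"thick-walled shaft resists bending ~{thick_wall / thin_wall:.1f}x more")

A difference of that size, a factor of a few between heavily and lightly built shafts, is the kind of signal the CT comparison between Lucy, humans and chimpanzees is designed to pick up.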
2017-09/1580/en_head.json.gz/10470

Mobility
By Federal News Radio Staff | January 17, 2011 1:25 pm
Download audio: http://federalnewsradio.com/wp-content/uploads/2011/01/207084.mp3
January 25th, 2010 at 12:30 PM
Workers, citizens, and customers need easy access to their tools and teams to make quicker, higher-quality decisions. They require a high level of responsiveness, agility, and data in a format they can use - in other words, efficient, mobile information transfer.
Panelists will examine how we provide instant worker status, state, and preferences before you make that first contact - saving up to one-third of your time. How do you help workers connect the right way the first time to speed decision making and elevate efficiency and productivity? And if you are the implementer, how do you implement presence-based mobility strategies while balancing today's communications innovation and productivity?
Robert J. Carey - Deputy Assistant Secretary of Defense (Information Management, Integration and Technology), Department of Defense Deputy Chief Information Officer
Casey Coleman - Chief Information Officer, Federal Acquisition Service
Gerald T. Charles, Jr. - Director, Public Sector, Cisco Internet Business Solutions Group (IBSG)
Chad Tompkins - Director of Information Technology Policy and Planning, U.S. Consumer Product Safety Commission
Moderator: Tom Temin - Co-host of the Federal Drive

Robert J. Carey
Deputy Assistant Secretary of Defense (Information Management, Integration and Technology), Department of Defense Deputy Chief Information Officer

Mr. Robert J. Carey serves as the Deputy Assistant Secretary of Defense (Information Management, Integration and Technology) / Department of Defense Deputy Chief Information Officer. Selected to this position after a brief tour as Director of Strategy and Policy for the US TENTH FLEET / FLEET CYBER COMMAND, his principal roles will be to help lead the consolidation of the Defense information technology enterprise as well as align, strengthen and manage the office of the DoD Chief Information Officer to have it better serve the Department's mission and help lead the IT workforce into the 21st century. From November 2006 to September 2010 he served as the fifth Department of the Navy (DON) Chief Information Officer (CIO), where he championed transformation, enterprise services, the use of the internet, and information security. In his new role, he will also help strengthen the enterprise architecture, network and information security.

Mr. Carey entered the Senior Executive Service in June 2003 as the DON Deputy Chief Information Officer (Policy and Integration) and was responsible for leading the DON CIO staff in developing strategies for achieving IM/IT enterprise integration across the Department. Mr. Carey's Federal service began with the U.S. Army at the Aberdeen Proving Ground in October 1982, where he worked as a Test Director evaluating small arms and automatic weapons and their ammunition. He began his service with the Department of the Navy in February 1985 with the Naval Sea Systems Command. He worked in the Anti-Submarine/Undersea Warfare domain, where he served in a variety of engineering and program management leadership positions within the Acquisition Community, culminating in his assignment as the Deputy Program Manager for the Undersea Weapons Program Office. Mr. Carey joined the staff of the DON CIO in February 2000, serving as the DON CIO eBusiness Team Leader through June 2003. During this period he also served as the Director of the DON Smart Card Office from February through September 2001.

Mr. Carey attended the University of South Carolina where he received a Bachelor of Science degree in Engineering in 1982. He earned a Master of Engineering Management degree from the George Washington University in 1995. He has been a member of the Acquisition Professional Community and has been awarded the Department of the Navy Distinguished Civilian Service Award (twice) as well as the Superior and Meritorious Civilian Service Awards, and numerous other performance awards. He received the prestigious Federal 100 Award in 2006, 2008 and 2009 recognizing his significant contributions to Federal information technology. Mr. Carey was also named Department of Defense Executive of the Year for 2009 by Government Computer News.

Mr. Carey is an active member of the United States Navy Reserve and currently holds the rank of CAPTAIN in the Civil Engineer Corps. He was recalled to active duty for Operation Desert Shield/Storm and Operation Iraqi Freedom, where, in 2006-2007, he served in the Al Anbar province with I Marine Expeditionary Force.

Casey Coleman
Chief Information Officer, Federal Acquisition Service
Casey Coleman was appointed the Chief Information Officer for GSA's Federal Acquisition Service (FAS) in October of 2006. Ms. Coleman is responsible for the delivery of information technology, management services and business applications to support the FAS, the new Service formed by the combination of the GSA Federal Technology Service and Federal Supply Service. As CIO, Ms. Coleman is responsible for aligning technology with GSA and FAS strategic business objectives. Her primary focus is leading and implementing the effective and efficient acquisition and management of information technology solutions across FAS. Ms. Coleman manages the Service's $180 million IT program, overseeing management, acquisition and integration of the Service's information resources. Her oversight includes strategic planning, policy, capital planning, systems development, information security, enterprise architecture, and e-government.

Prior to her designation as FAS CIO, Ms. Coleman served two years as the GSA Federal Technology Service CIO. She also headed the GSA Office of Citizen Services, Office of Citizen Services and Communications, from June 2002 through July 2004, where she developed and successfully launched the USA Services government-wide citizen customer service program. Ms. Coleman began her career at Lockheed Martin, where she spent several years in software and system engineering roles, developing onboard command and control systems for military systems employed during the Gulf War. She also served for a year as a Congressional Legislative Fellow in 1994. With more than 18 years of experience in the high tech sector and a background in electronic business commerce, Ms. Coleman is known for her ability to implement organizational change, and using technology to achieve business and mission goals. Ms. Coleman is a native Texan and graduated with honors from Texas A&M University with a degree in computer science. She later earned a master's degree in business administration and finance from the University of Texas at Arlington.

Gerald T. Charles, Jr.
Director, Public Sector, Cisco Internet Business Solutions Group (IBSG)
Gerald T. Charles, Jr. is a director in the Cisco Internet Business Solutions Group (IBSG). He brings more than 25 years of executive management, comprehensive technology, and strategic thinking expertise to roles of Executive Advisor, Chief Technology Officer (CTO), and VP/General Manager. As a thought leader in management and technology strategy, governance, and business models, workforce and workspace sustainability, security, IT-based solutions, and innovation, he is helping transform service delivery and economic value in the public sector. He has served various senior executives and officials in civilian, defense, and commercial organizations. Charles has developed and implemented companies' mergers and acquisitions as well as strategic, operational, and financial goals.

Prior to Cisco, Charles was a vice president for TRW and BDM. He brought in multimillion-dollar contracts, turned closed subsidiaries into profit centers, and was a key technical and program architect for implementation of the U.S. Securities and Exchange Commission's EDGAR system. After TRW, Charles was vice president for OAO Technology Solutions (OAOT), a commercial outsourcer and managed services provider, where he eliminated major losses in ebusiness divisions and corporate acquisitions, implemented an IT solution provider practice, and started the company's public sector division. Charles is a published author and wrote The LAN Blueprint: Engineering It Right (McGraw-Hill) as well as various Cisco global industry points of view, including the 21st century workforce and workspace, innovation in the Public Sector, transportation, and security. Charles also earned a congressional medal for outstanding service and achievement. Charles holds a master's degree in electrical engineering from the University of Maryland, and a bachelor's degree in electrical engineering from the Illinois Institute of Technology.

Chad Tompkins
Director of Information Technology Policy and Planning, U.S. Consumer Product Safety Commission
Chad Tompkins is the Director of IT Policy and Planning for the U.S. Consumer Product Safety Commission. He leads the Commission's IT governance program including Capital Planning and Investment Control, Enterprise Architecture, Project Management Office, Security, and Privacy. These services collectively assure that limited IT resources are applied effectively to enable the Commission to protect the public against unreasonable risks of injury from consumer products. He also serves as program manager for SaferProducts.gov, the Commission's new public website that allows consumers to report hazardous products, manufacturers to comment, and the public to consider these incidents when making purchasing decisions. In 2010, he worked with several recall issuing agencies and the General Services Administration (GSA) to create the Recalls.gov mobile app. The consumer can use the GSA-developed app to search for recalls across the federal government.

Before joining CPSC, Chad established the Pension Benefit Guaranty Corporation's CPIC and quality assurance programs. He also served on detail at the Office of Management and Budget where he evaluated more than $14 billion of agency IT portfolios and developed CPIC related policy recommendations. Chad has more than 20 years of IT management success in the Federal, commercial, and not for profit sectors. He has led development teams in industries ranging from telecommunications to HIV/AIDS research to poultry production. Chad graduated from the University of Maryland where he met his wife Cathy. They live in Fairfax, Virginia with their two beautiful daughters, Morgan, 12, and Grace, 9.
2017-09/1580/en_head.json.gz/10493 | Royal Society accepts global warming skepticism...
It's about time... Britain's premier scientific institution is being forced to review its statements on climate change after a rebellion by members who question mankind's contribution to rising temperatures.

The Royal Society has appointed a panel to rewrite the 350-year-old institution's official position on global warming. It will publish a new "guide to the science of climate change" this summer. The society has been accused by 43 of its Fellows of refusing to accept dissenting views on climate change and exaggerating the degree of certainty that man-made emissions are the main cause.

The society appears to have conceded that it needs to correct previous statements. It said: "Any public perception that science is somehow fully settled is wholly incorrect — there is always room for new observations, theories, measurements." This contradicts a comment by the society's previous president, Lord May, who was once quoted as saying: "The debate on climate change is over."

The admission that the society needs to conduct the review is a blow to attempts by the UN to reach a global deal on cutting emissions. The Royal Society is viewed as one of the leading authorities on the topic and it nominated the panel that investigated and endorsed the climate science of the University of East Anglia.

Sir Alan Rudge, a society Fellow and former member of the Government's Scientific Advisory Committee, is one of the leaders of the rebellion who gathered signatures on a petition sent to Lord Rees, the society president.

He told The Times that the society had adopted an "unnecessarily alarmist position" on climate change.

Sir Alan, 72, an electrical engineer, is a member of the advisory council of the climate sceptic think-tank, the Global Warming Policy Foundation.

He said: "I think the Royal Society should be more neutral and welcome credible contributions from both sceptics and alarmists alike. There is a lot of science to be done before we can be certain about climate change and before we impose upon ourselves the huge economic burden of cutting emissions."

posted by GayandRight @ 1:43 PM
2017-09/1580/en_head.json.gz/10515 | At the Printer, Living Tissue
By HENRY FOUNTAIN Published: August 18, 2013 SAN DIEGO — Someday, perhaps, printers will revolutionize the world of medicine, churning out hearts, livers and other organs to ease transplantation shortages. For now, though, Darryl D’Lima would settle for a little bit of knee cartilage. Dr. D’Lima, who heads an orthopedic research lab at the Scripps Clinic here, has already made bioartificial cartilage in cow tissue, modifying an old inkjet printer to put down layer after layer of a gel containing living cells. He has also printed cartilage in tissue removed from patients who have undergone knee replacement surgery. There is much work to do to perfect the process, get regulatory approvals and conduct clinical trials, but his eventual goal sounds like something from science fiction: to have a printer in the operating room that could custom-print new cartilage directly in the body to repair or replace tissue that is missing because of injury or arthritis. Just as 3-D printers have gained in popularity among hobbyists and companies who use them to create everyday objects, prototypes and spare parts (and even a crude gun), there has been a rise in interest in using similar technology in medicine. Instead of the plastics or powders used in conventional 3-D printers to build an object layer by layer, so-called bioprinters print cells, usually in a liquid or gel. The goal isn’t to create a widget or a toy, but to assemble living tissue. At labs around the world, researchers have been experimenting with bioprinting, first just to see whether it was possible to push cells through a printhead without killing them (in most cases it is), and then trying to make cartilage, bone, skin, blood vessels, small bits of liver and other tissues. There are other ways to try to “engineer” tissue — one involves creating a scaffold out of plastics or other materials and adding cells to it. In theory, at least, a bioprinter has advantages in that it can control the placement of cells and other components to mimic natural structures. But just as the claims made for 3-D printing technology sometimes exceed the reality, the field of bioprinting has seen its share of hype. News releases, TED talks and news reports often imply that the age of print-on-demand organs is just around the corner. (Accompanying illustrations can be fanciful as well — one shows a complete heart, seemingly filled with blood, as the end product in a printer). The reality is that, although bioprinting researchers have made great strides, there are many formidable obstacles to overcome. “Nobody who has any credibility claims they can print organs, or believes in their heart of hearts that that will happen in the next 20 years,” said Brian Derby, a researcher at the University of Manchester in Britain who reviewed the field last year in an article in the journal Science. For now, researchers have set their sights lower. Organovo, for instance, a San Diego company that has developed a bioprinter, is making strips of liver tissue, about 20 cells thick, that it says could be used to test drugs under development. A lab at the Hannover Medical School in Germany is one of several experimenting with 3-D printing of skin cells; another German lab has printed sheets of heart cells that might some day be used as patches to help repair damage from heart attacks. 
A researcher at the University of Texas at El Paso, Thomas Boland, has developed a method to print fat tissue that may someday be used to create small implants for women who have had breast lumpectomies. Dr. Boland has also done much of the basic research on bioprinting technologies. “I think it is the future for regenerative medicine,” he said. Dr. D’Lima acknowledges that his dream of a cartilage printer — perhaps a printhead attached to a robotic arm for precise positioning — is years away. But he thinks the project has more chance of becoming reality than some others. “Printing a whole heart or a whole bladder is glamorous and exciting,” he said. “But cartilage might be the low-hanging fruit to get 3-D printing into the clinic.” One reason, he said, is that cartilage is in some ways simpler than other tissues. Cells called chondrocytes sit in a matrix of fibrous collagens and other compounds secreted by the cells. As cells go, chondrocytes are relatively low maintenance — they do not need much nourishment, which simplifies the printing process Keeping printed tissue nourished, and thus alive, is one of the most difficult challenges facing researchers. Most cells need to be within a short distance — usually a couple of cell widths — of a source of nutrients. Nature accomplishes this through a network of microscopic blood vessels, or capillaries. But trying to emulate capillaries in bioprinted tissue is difficult. With his fat tissue, Dr. Boland’s approach is to build channels into the degradable gel containing the fat cells, and line the channels with the kind of cells found in blood vessels. When the printed fat is implanted, the tubes “start to behave as micro blood vessels,” he said. The body naturally produces chemical signals that would cause it to start growing small blood vessels into the implant, Dr. Boland said, but the process is slow. With his approach, he said, “we expect this will be sped up, and hopefully keep the cells alive.” With cartilage, Dr. D’Lima does not need to worry about blood vessels — the chondrocytes get the little nourishment they need through diffusion of nutrients from the joint lining and bone, which is aided by compression of the cartilage as the joints move. Nor does he need to be concerned with nerves, as cartilage lacks them. But there is still plenty to worry about. Although it is less than a quarter of an inch thick, cartilage of the type found in the knee or hip has a complex structure, with several layers in which collagen and other fibrous materials are oriented differently. “The printing demands change with every layer,” Dr. D’Lima said. “Most 3-D printers just change the shape. We are changing the shape, the composition, the type of cells, even the orientation of the cells.” Dr. D’Lima has been involved in orthopedic research for years; one of his earlier projects, a sensor-laden knee-replacement prosthesis called the electronic knee, has provided invaluable data about the forces that act on the joint. So he was aware of other efforts to make and repair cartilage. “But we didn’t want to grow tissue in the lab and then figure how to transplant it into the body,” he said. “We wanted to print it directly in the body itself.” He and his colleagues began thinking about using a thermal inkjet printer, in which tiny channels containing the ink are heated, producing a vapor bubble that forces out a drop. The technology is very reliable and is used in most consumer printers, but the researchers were wary because of the heat produced. 
“We thought it would kill the cells,” Dr. D’Lima said. But Dr. Boland, then at Clemson University, and others had already done the basic research that showed that the heat pulse was so rapid that most cells survived the process. Dr. D’Lima’s group soon discovered another problem: the newest thermal inkjets were too sophisticated for their work. “They print at such high resolution that the print nozzles are too fine for cells to squeeze through,” he said. They found a 1990s-era Hewlett-Packard printer, a Deskjet 500, with bigger nozzles. But that printer was so old that it was difficult finding ink cartridges; the researchers finally located a supplier in China who had some. Their idea was to replace the ink in the cartridges with their cartilage-making mixture, which consisted of a liquid called PEG-DMA and the chondrocytes. But even that created a problem — the cells would settle out of the liquid and clog the printhead. So the researchers had to devise a way to keep the mixture stirred up. The mixture also has to be liquid to be printed, but once printed it must become a gel — otherwise the end product would just be a watery mess. PEG-DMA becomes a gel under ultraviolet light, so the solution was to keep the print area constantly exposed to UV light to harden each drop as it was printed. “So now you’re printing tissue,” Dr. D’Lima said. But Dr. D’Lima and his group are investigating other materials for their gel. While PEG-DMA is biocompatible (and approved for use by the Food and Drug Administration), it would remain in the body and might eventually cause inflammation. So they are looking for substances that could degrade over time, to be replaced by the matrix produced by the chondrocytes. The printed material could be formulated to degrade at the same rate as the natural matrix is produced. There are plenty of other challenges as well, Dr. D’Lima said, including a basic one — how to get the right kinds of cells, and enough of them, for the printer. It would not make much sense to use a patient’s own limited number of cartilage cells from elsewhere in the body. So his lab is investigating the use of stem cells, precursor cells that can become chondrocytes. “The advantage of stem cells is that it would mean a virtually unlimited supply,” Dr. D’Lima said. Dr. D’Lima’s team is investigating other technologies that might be used in combination with bioprinting, including electrospinning, a method of creating the fibers in the matrix, and nanomagnetism, a way to orient the fibers. His lab takes a multidisciplinary approach — he even attends Siggraph, the large annual computer graphics convention, to get ideas. “They’re like 10 years ahead of medical technology,” he said. Meanwhile, the lab has upgraded its printing technology. The Deskjet is still around, but it has not been used in more than a year. It has been supplanted by a much more sophisticated device from Hewlett-Packard — essentially a programmable printhead that allows the researchers to adjust drop size and other characteristics to optimize the printing process. Dr. D’Lima said the biggest remaining hurdles were probably regulatory ones — including proving to the F.D.A. that printed cartilage can be safe — and that most of the scientific challenges had been met. “I think in terms of getting it to work, we are cautiously optimistic,” he said. Source Interesting Articles | 科技 |
2017-09/1580/en_head.json.gz/10672

Quantum simulator brings hundreds of qubits to bear on physics problems
By Brian Dodson | May 10th, 2012
(Image: NIST's quantum simulator illuminated by fluorescing ions. Photo: Britten/NIST)

Physicists at the National Institute of Standards and Technology (NIST) have built a quantum simulator that contains hundreds of qubits - quite a jump from the 2-8 qubits found in state-of-the-art digital quantum computers. The simulator has passed a series of important benchmarking tests and scientists are poised to study problems in material science that are impossible to model using classical computers.

Many important problems in physics and materials are poorly understood at present - not because their basic physical interactions are unknown, but rather because the quantum mechanical description is highly complex and difficult to solve. For example, even an apparently simple problem such as finding the ground state of the hydrogen molecule cannot be exactly solved. Even when using approximations which are physically fairly reasonable, modern computers cannot simulate quantum systems having more than a handful of interacting particles.

A great deal of scientific and popular interest has surfaced during the last 20 years concerning the use of quantum computers to get around this problem of complexity. Unfortunately, a rough rule of thumb is that you need the same number of qubits as your problem has quantum degrees of freedom to overcome the exponential complexity of quantum systems, and current quantum computers don't yet make the grade.

On to quantum simulators. What is meant by that term, how does one work, and why is the use of a quantum simulator a reasonable alternative to the use of a large-scale quantum computer?

A quantum simulator consists of a collection of quantum mechanical objects, typically atoms with spin, arranged in a simple geometry and with known interactions between the objects. An accurate description of the energy of any particular configuration of the spins must be known, even though the quantum state of the simulator is too complex to write down.

In use, the initial state of the simulator (e.g. which spins are up and which are down) is set, and you observe the evolution of the system. Sometimes this will be done as some measure of temperature is varied, and other times as the strength and/or direction of an external field, such as a magnetic field, is changed. Usually the observation is of some average property of the simulator, such as the total magnetic field of the spins in the simulator.

Effective use of a quantum simulator involves describing the physics of a difficult problem in terms of the physics controlling the behavior of the simulator.

On to NIST's new quantum simulator, which consists of a collection of hundreds of beryllium ions in an ion trap. The ions self-assemble into a flat crystal with a triangular pattern less than a millimeter in diameter. The beryllium ions have a spin of 1/2, and the simulator is placed in a strong (~4.5 Tesla) magnetic field perpendicular to the plane of the crystal.

The photo above shows the NIST quantum simulator with the ions located in a single triangular planar crystal. The ions are all fluorescing, telling us that all the ion spins are in the same state. The individual spins are controllable, so that the initial state of the lattice can be set. In addition, the lattice can be placed into a superposition of differing initial states, and pairs of ions can be entangled as well, adding additional depth to the control one has over the behavior of the simulator.

In the simulation, laser beams are used to cool the ion crystal to near absolute zero. Carefully timed microwave and laser pulses then are used to cause the qubits (individual spins) to interact. Controlling the interaction of the qubits can allow an analogy to be made between the simple simulator structure and complex physics defining a much more difficult problem. Although the appearance of the two systems may be very different, their behavior is engineered to be mathematically identical, so that a solution to one yields a solution to the other. Problems that can be studied in this way include gases, liquids, solids, alloys, neural networks, flocking behavior, the interaction of heart cells, and even social behavior.

Given the remarkable progress reported by the NIST researchers, it won't be long before the quantum simulator will be a part of the toolbox of every complex materials physics researcher.

NIST postdoctoral fellow Joe Britton describes the simulator in the video below.

Source: NIST
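The exponential blow-up described above is easy to see in code. The sketch below builds the Hamiltonian of a small transverse-field Ising chain, a standard stand-in for the kind of interacting-spin problem such simulators target, and shows how quickly brute-force classical simulation runs out of room. The coupling and field values are arbitrary illustrative choices, not parameters of the NIST device, and the 1-D chain is a simplification of NIST's 2-D triangular crystal.

import numpy as np
from functools import reduce

# Pauli matrices for a single spin-1/2
I2 = np.eye(2)
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def kron_chain(ops):
    # Tensor product of one operator per site
    return reduce(np.kron, ops)

def ising_hamiltonian(n, j=1.0, b=0.5):
    """Transverse-field Ising Hamiltonian on a 1-D chain of n spin-1/2 sites."""
    dim = 2 ** n
    h = np.zeros((dim, dim))
    for i in range(n - 1):          # nearest-neighbour spin-spin coupling
        ops = [I2] * n
        ops[i], ops[i + 1] = sz, sz
        h -= j * kron_chain(ops)
    for i in range(n):              # transverse field on every site
        ops = [I2] * n
        ops[i] = sx
        h -= b * kron_chain(ops)
    return h

for n in range(2, 11, 2):
    h = ising_hamiltonian(n)
    ground = np.linalg.eigvalsh(h)[0]
    print(f"{n:2d} spins -> matrix dimension {2**n:5d}, ground-state energy {ground:.3f}")

The matrix dimension doubles with every added spin, so a few dozen spins already exceed any classical memory, while the trapped-ion crystal described in the article simply is the interacting-spin system and needs no such matrix at all.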
2017-09/1580/en_head.json.gz/10675 | http://news.nationalgeographic.com/news/2013/07/130715-vampire-archaeology-burial-exorcism-anthropology-grave.html
Archaeologists Suspect Vampire Burial; An Undead Primer
Discovery of a suspected vampire burial site is not a first for archaeologists.
Polish archaeologists believe this skeleton with the head between the legs was found in a 'vampire' grave. Photograph by Andrzej Grygiel, European Pressphoto Agency
When archaeologists opened an ancient grave at a highway construction site near Gliwice, Poland, they came across a scene from a horror movie: a suspected vampire burial.
An Anti-Indiana Jones is Solving the Pyramids’ Secrets Ancient Cannabis 'Burial Shroud' Discovered in Desert Oasis What Can We Learn From a Nazi Time Capsule? Interred in the ground were skeletal remains of humans whose severed heads rested upon their legs—an ancient Slavic burial practice for disposing of suspected vampires, in hopes that decapitated individuals wouldn't be able to rise from their tombs.
But the recent Polish discovery isn't the first time that archaeologists have stumbled upon graves of those thought to be undead. Here's what science has to tell us about a few of history's famous revenant suspects.
How to Bury the Undead
To date, researchers have reported suspected vampire burials in both the Old World and the New World.
In the 1990s, University of British Columbia archaeologist Hector Williams and his colleagues discovered an adult male skeleton whose body had been staked to the ground in a 19th-century cemetery on the Greek island of Lesbos. Whoever buried the man had driven several eight-inch-long iron spikes through his neck, pelvis, and ankle.
"He was also in a heavy but nearly completely decayed wooden coffin," says Williams, "while most of the other burials [in the cemetery] were simply in winding sheets in the earth." Clearly, someone did not want the man to escape the grave. But when physical anthropologists studied the skeleton, Williams adds, they "found nothing especially unusual about him."
More recently, an archaeological team led by University of Florence forensic anthropologist Matteo Borrini came across another suspected vampire burial on the Italian island of Lazzaretto Nuovo. In this case, the body proved to be that of an elderly woman, who was apparently interred with a moderate-sized brick in her mouth—a recorded form of exorcism once practiced on suspected vampires in Italy.
Then there's the New World. In the 1990s, archaeologists working in a small 18th- to 19th-century cemetery near Griswold, Connecticut, came across something highly unusual: the grave of a 50-something-year-old man whose head and upper leg bones had been laid out in a "skull and crossbone" pattern.
Upon examination, physical anthropologists determined that the man had died of what was then called "consumption"—and what is now known as tuberculosis. Those who suffer from this infectious disease grow pale, lose weight, and appear to waste away—attributes commonly linked both to vampires and their victims.
"The vampire's desire for 'food' forces it to feed off living relatives, who suffer a similar 'wasting away,'" the researchers noted in a paper in the American Journal of Physical Anthropology. To play it safe, local inhabitants seem to have decapitated the body of the suspected vampire.
The Dead Truth
Most archaeologists now think that a belief in vampires arose from common misunderstandings about diseases such as tuberculosis, and from a lack of knowledge about the process of decomposition.
Although most 19th-century Americans and Europeans were familiar with changes in the human body immediately following death, they rarely observed what happened in the grave during the following weeks and months.
For one thing, rigor mortis eventually disappears, resulting in flexible limbs. For another, the gastrointestinal tract begins to decay, producing a dark fluid that could be easily mistaken for fresh blood during exhumation—creating the appearance of a postprandial vampire.
When and where the next one will appear is anyone's guess. | 科技 |
2017-09/1580/en_head.json.gz/10728 | Source of Stonehenge's rocks pinpointed
(NEWSER) – For nine decades, it's been established that many of Stonehenge's smaller rocks hail from the Preseli Hills in Wales. Now, a newly published study says that we've been wrong about an outcrop that has been accepted as a specific source since 1923. And the new research, published in February's issue of Journal of Archaeological Science, suggests the rocks—known as bluestones, of which there are many types—may not have been transported there by humans at all. As head researcher Dr. Richard Bevins explains, in 1923, geologist H.H. Thomas identified Carn Meini as the source of spotted dolerite bluestones, but a new analysis of the rocks' chemical makeup has fingered Carn Goedog as the true home of at least 55% of those used at Stonehenge, reportsPlanet Earth. | 科技 |
2017-09/1580/en_head.json.gz/10759 | Mark Zuckerberg defends Facebook as social network reaches one billion users, calls milestone 'an amazing honor'
Thursday Oct 4, 2012 3:59 AM
By Jessica Hopper and Tim Uehlinger, Rock Center

Facebook founder and CEO Mark Zuckerberg announced Thursday that his company has reached an unprecedented milestone: one billion users.

In an exclusive interview with NBC's Matt Lauer, Zuckerberg said the 8-year-old company added 200 million new users in the last year and called having a billion users across the world "an amazing honor."

"To be able to come into work every day and build things that help a billion people stay connected with the people they care about every month, that's just unbelievable," Zuckerberg said in an interview airing Thursday morning on Today and on NBC's Rock Center with Brian Williams at 10pm/9c.

Despite the excitement of crossing the billion mark, Zuckerberg acknowledged the company's leadership and its staff have been in a "tough cycle" in the months since the company's initial public offering in May.

Valued at $100 billion when it went public, Facebook is now worth nearly half that. The social network has faced new scrutiny from its users and investors over how it plans to make money from those billion users, especially the ones who access Facebook through mobile devices.
“Things go in cycles. We’re obviously in a tough cycle now and that doesn’t help morale, but at the same time, you know, people here are focused on the things that they’re building,” said Zuckerberg of his staff. “I mean, you get to build things here that touch a billion people, which is just not something that you can say at almost anywhere else, so I think that’s really the thing that motivates people.”The drop in the company’s value has left many questioning if the 28-year-old tech visionary has the business know-how to be CEO. “I take this responsibility that I have really seriously and I really think Facebook needs to be focused on building the best experiences for people around the world, right? And we have this philosophy that building the products and services and building the business go hand in hand,” Zuckerberg said.Zuckerberg said that he and his team are focusing on growing the number of people who use Facebook on mobile devices, such as smartphones, a move that he says will make money and respond to the changing needs of their users.“There’s five billion people in the world who have phones, so we should be able to serve many more people and grow the user base there,” Zuckerberg said.Of his strategy as CEO, Zuckerberg said that he has taken a few lessons from his late friend, Apple founder Steve Jobs.“I mean, he was just so focused, right? I mean for him, the user experience was the main thing that mattered, the only thing that mattered and I think that there’s a lot that every company can learn from that,” he said. “We’ve had a lot of that philosophy too, which is we just want to stay maniacally focused on building the best product for those people and I think that’s the path to building a great business and, you know, I think that’s something that Steve understood more than most.”Zuckerberg is also friends with current Apple CEO Tim Cook who recently gave him a new iPhone 5 as a gift. Zuckerberg, who called the phone a “wonderful device,” wouldn’t say which mobile device he thinks is the best. More people use Facebook on Android than on the iPhone, Zuckerberg said.“It’s a pretty diverse ecosystem and we spend our time building for all these different things,” he said.DIGITAL LIFE: How big is Facebook? If Facebook was a country, it would have the third largest population.Zuckerberg’s army of builders work on the company’s sprawling campus headquarters in Menlo, Park, Calif. The word “hack” is imprinted on one of the walkways; a homage to the spirit with which Zuckerberg founded the social network. In a building with a red sign saying, “The Hacker Company,” casually clad programmers, engineers and designers toil for the CEO they call 'Mark.'Zuckerberg, often wearing a uniform of jeans and a gray t-shirt, said he frequently leaves his glass office in the afternoons to roam the offices or walk outdoors to talk to his staff. Every Friday, Zuckerberg holds a weekly Q&A for team members to air their grievances and propose their ideas.Zuckerberg said he tries to keep his personal life “simple.” When his May wedding to Priscilla Chan was splashed across magazine covers, he said it felt “odd.”“It’s surprising, but you know, it doesn’t take away from these moments, right? I mean the wedding was an awesome thing,” he said.Zuckerberg met Chan when both were students at Harvard. 
He said they secretly planned their wedding to be around the time Chan finished medical school.“I sent out this email to all our friends, telling them that I was having a surprise party for her from graduating from medical school,” he said. “It was a really small wedding. We had it in our backyard. It was 80 or so people, but it was really nice.”Of his future, Zuckerberg said his company’s mission has morphed into his life’s mission.“Our responsibility as a company is just to do the best that we can and build the best products for people,” Zuckerberg said. “If we build the best products, then I think that we can continue leading in this space for a long time.”Editor's Note: Matt Lauer's exclusive interview with Mark Zuckerberg airs Thursday, Oct. 4 at 10pm/9c on NBC's Rock Center with Brian Williams. | 科技 |
2017-09/1580/en_head.json.gz/10763

Water samples teeming with information: Emerging techniques for environmental monitoring
Written by Julia Turan | Published: 29 June 2014

Stanford, California - Environmental policy must respond to ever-changing conditions on the ground and in the water, but doing so requires a constant flow of information about the living world.
In a paper published in Science this week, scientists from Stanford's Center for Ocean Solutions, the University of Washington and the University of Copenhagen propose employing emerging environmental DNA (eDNA) sampling techniques that could make assessing the biodiversity of marine ecosystems – from single-cell critters to great white sharks – as easy as taking a water sample.
Controlling invasive species and saving endangered ones are among the many applications of a new set of monitoring tools that use DNA recovered from the environment.
Although traditional sampling methods – including dive surveys and deploying sampling gear in the water – have been widely used in environmental monitoring, they are expensive, invasive and often focus only on a single species. Genetic monitoring via a form of DNA, known as eDNA, that is shed into the environment by animals could overcome some of these issues.
eDNA is like a fingerprint left at a crime scene. This material may come from metabolic waste, damaged tissue or sloughed off skin cells. Once it is collected, scientists can sequence the DNA to create a fast, high-resolution, non-invasive survey of whole biological communities.
"The eDNA work is potentially a game-changer for environmental monitoring," said Larry Crowder, a professor of biology at Stanford's Hopkins Marine Station, senior fellow at the Stanford Woods Institute for the Environment, science director at the Center for Ocean Solutions and a co-author of the study. "A number of laws require monitoring, but actually keeping tabs on large, mobile, cryptic animals is challenging and expensive."
Using DNA to inform policy
The cost of DNA sequencing is decreasing rapidly, a trend that has fueled eDNA studies in recent years.
"We wanted to know how to put these amazing new genetic tools to use," said lead author Ryan Kelly, an assistant professor at the University of Washington and a visiting fellow at the Center for Ocean Solutions. "Harnessing eDNA is a perfect example of how cutting-edge science can plug into many of the environmental laws we have on the books."
Nearly every environmental law imposes environmental monitoring obligations on government or the private sector, said Meg Caldwell, a senior lecturer at the Stanford Woods Institute and Stanford Law School, and executive director of the Center for Ocean Solutions, as well as a contributing author of the study. "Pushing the science of genomics to help society perform monitoring more cheaply and effectively is one of our core goals," she said.
The authors provide several examples of scientific-legal interactions, among them the use of eDNA to inform the enforcement of laws such as the Endangered Species Act and Clean Water Act with detailed, low-cost data.
So far, eDNA has been used to determine the presence or absence of certain target species. This technique is useful for detecting invasive species or changes in the distribution of endangered species. However, scientists are still evaluating how eDNA concentrations relate to specific numbers of organisms in the wild.
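A minimal sketch of that presence/absence idea is shown below, assuming a small set of species "barcode" sequences. The species names, sequences and thresholds are invented for illustration; real eDNA surveys rely on curated reference databases, quality filtering and statistical matching rather than exact string containment.

# Hypothetical reference barcodes; real work uses curated databases and fuzzy matching.
reference_barcodes = {
    "great_white_shark": "ACGTTGCAACGT",
    "zebra_mussel":      "TTGACGGATCCA",   # stand-in for an invasive species
    "coho_salmon":       "GGATCCTTAGCA",
}

def detect_species(water_sample_reads, references, min_hits=3):
    """Return species whose barcode appears in at least `min_hits` sequencing reads."""
    hits = {name: 0 for name in references}
    for read in water_sample_reads:
        for name, barcode in references.items():
            if barcode in read:
                hits[name] += 1
    return {name: n for name, n in hits.items() if n >= min_hits}

reads = [
    "TTACGTTGCAACGTGG", "CCACGTTGCAACGTAA", "GTACGTTGCAACGTCC",  # shark-like reads
    "AATTGACGGATCCAGG",                                           # single mussel-like read
]
print(detect_species(reads, reference_barcodes))
# -> {'great_white_shark': 3}

The thresholding step is one simple way to express the caveat in the text: a handful of matching reads says a species' DNA was in the water, but it says little about how many individuals produced it.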
A challenging aspect of the approach is determining exactly where the eDNA was generated, especially in dynamic marine systems. eDNA is thought to persist in water for only a few days.
With these limitations, eDNA alone is not yet enough for policy applications, but it is already being used to supplement existing monitoring. This combination approach has recently been used in California to detect human- and animal-based pathogens in waters off state beaches.
"There is much work left to do to develop and validate this approach, but the potential is amazing," Crowder said. "We will continue to work with other scientists at the Center for Ocean Solutions and worldwide to advance and test this approach."
The David and Lucile Packard Foundation provided initial funding for the original concept of the eDNA tool, as part of its core support to the Center for Ocean Solutions, as well as additional funding to begin testing the tool in the field. A recent Environmental Venture Project grant from the Stanford Woods Institute will help researchers refine the eDNA tool.
2017-09/1580/en_head.json.gz/10768 | Characterisation and pathogenicity of Rhizoctonia isolates associated with black root rot of strawberries in the Western Cape Province, South Africa
Botha A. ; Denman S. ; Lamprecht S.C. ; Mazzola M. ; Crous P.W. (2003)
Black root rot is an important disease of strawberry caused by a complex of fungi including species of Rhizoctonia. In this study, the Rhizoctonia species and anastomosis groups isolated from diseased strawberries in the Western Cape Province of South Africa were determined and their pathogenicity and relative virulence assessed. Both binucleate and multinucleate types were recovered from diseased roots and identified as R. fragariae and R. solani, respectively. Anastomosis grouping of the isolates was carried out on a sub-sample using the conventional method of hyphal fusion, and molecular techniques were employed to confirm results of the former. RFLP analysis of the 28S RNA gene was used to further characterise relationships among the isolates of Rhizoctonia. The molecular results correlated with those obtained from the conventional methods. In the sub-sample tested, all isolates of R. solani were members of Anastomosis Group 6, whereas three AG types were identified among isolates of R. fragariae, viz. AG-A, AG-G and AG-I at a relative occurrence of 69%, 25% and 6%, respectively. Pathogenicity trials were conducted on 8-week-old cv. Tiobelle plants. All Rhizoctonia isolates tested were pathogenic to strawberry, but R. solani (AG 6) was the most virulent causing severe stunting of plants. R. fragariae AG-A and AG-G were not as virulent as R. solani but also caused stunting. R. fragariae AG-I was the least virulent, and did not cause stunting of the plants; however, it incited small, pale, spreading lesions on infected roots. This is the first species confirmation and AG type identification of Rhizoctonia taxa causing root rot of strawberries in South Africa. | 科技 |
2017-09/1580/en_head.json.gz/10774 | Dale Partridge is a serial entrepreneur and Founder of Sevenly.org, a 40+ person socially conscious e-commerce company based in Los Angeles, California. Sevenly was named “The Most Social Media Driven Company in America” by Mashable.
Each week Sevenly partners with one qualified nonprofit, and donates $7 from every product sold to support that charity’s cause. Since it’s launch in June 2011, Mashable, Los Angeles Times, and Forbes have named Sevenly one of the fastest growing social good start-ups in the country. In less than two years, Sevenly has given over $2.8 million in $7 donations to charities across the globe.
Dale’s best known for his expertise in consumer trends, branding, marketing, and social media. He is a creative leader who is influencing an industry to rethink the models of how we do business today.
Dale started his first company while still a teenager and has partnered and launched a number of successful multi-million dollar organizations since. His personal mission is to lead a generation toward generosity and empower business leaders through the teaching philosophy that people matter.
He is an avid speaker and author of the upcoming book People over Profit. He has keynoted at Facebook, Adobe, Panasonic, Invisible Children, and many more. Dale has been featured in various business publications including the cover of Entrepreneur Magazine, INC Magazine, Mashable, MSN Money, Forbes and the Los Angeles Times. He has appeared on FOX News, NBC, and other various talk shows.
Dale is a dynamic speaker who leads with authenticity and beautiful design. His speaking philosophy of “speak to people how they need to hear it, not how you want to say it” has earned him trust and credibility with audiences around the world.
bookdale@seeagency.com
www.sevenly.org
"Dale delivered just the right amount of humor and intensity. It really set the tone for our event." - Founder of the Q Conference

"Dale is a true teacher. The audience could barely keep up with his wisdom." - Founder, Identity Conference

"Dale's dynamic style and personality will captivate any crowd." - Panasonic

"Dale was charismatic, insightful and really connected with our audience. His passion was contagious." - Cal State University Fresno

"Dale was amazing, and the audience could not get enough. We could not have asked for a better speaker." - Director of the Social Media Summit

"Dale was powerful, personable, and inspirational." - Invisible Children
2017-09/1580/en_head.json.gz/10806 | Barnes & Noble Puts Google’s Play Store and Apps on the Nook
The walled garden around the venerable bookseller's reading-centric tablets is coming down. By Harry McCracken @harrymccrackenMay 03, 2013 Share
Barnes & Noble Email
The walls around Barnes & Noble‘s Nook walled garden are tumbling down.
The company’s Nook HD and Nook HD+ are credible content-consumption tablets — remarkably credible, actually, considering that they come from a 127-year-old bookseller. But they sold so poorly over the holiday season that it raised questions about whether B&N would end up being forced to de-emphasize its hardware business in favor of selling content on other platforms.
The Nooks use Barnes & Noble’s own custom version of Android and provide its own stores for books, magazines, newspapers and apps. And therein lies an oft-raised argument against buying a Nook: the Barnes & Noble application store has had only 10,000 pieces of software — mostly for-pay ones — vs. the hundreds of thousands of choices in Google’s Google Play.
So with one fell swoop, in the form of a software update being rolled out today, B&N is eliminating that downside. It’s giving both Nooks the Google Play stores for apps, music, movies and books, plus key Google apps which the tablets have lacked until now: Chrome, Gmail and YouTube. (Google’s policies for its apps are an all-or-nothing proposition for device makers — if they want Google Play, they also have to pre-install Google’s apps.) New Nooks sold at Barnes & Noble’s bookstores and elsewhere will also carry the updated software.
The bottom line: if something’s available for Android, it’s now available for Nook, assuming it’s compatible from a technical standpoint. Among other things, that means you’ll be able to install Amazon’s Kindle app on a Nook and read books you’ve purchased from Amazon. For the first time, the notion of someone with a heavy investment in Kindle books buying a Nook doesn’t sound completely impractical.
Apps you bought from Barnes & Noble will be marked with an “n” label (for Nook). And if you’ve bought Google Play apps for another Android device, they’ll be downloaded to your Nook at no extra charge.
The arrival of the standard Android stores and apps doesn’t mean that the Nooks are becoming plain-vanilla, general-purpose Android devices. They still sport Barnes & Noble’s heavily-customized version of Android 4.0 Ice Cream Sandwich, with a reading-centric home screen and B&N’s own services and apps for books, magazines, newspapers and video. (It hasn’t had a music store of its own.) The fact that the tablets don’t run Android 4.2 Jelly Bean, the latest version of the operating system, is largely moot: B&N has modified the software’s look and feel so much that it’s not readily apparent which flavor of Android its devices run.
Stephane Maes, vice president of product for Barnes & Noble, told me that the company will continue to operate its own application store. But with Google Play on its devices, it won’t be under pressure to be comprehensive. Instead, it can focus in on particular categories, such as high-quality kids’ software.
All of this adds up to one gigantic, gutsy gamble for Barnes & Noble. It’s been selling its tablets at dirt-cheap prices — $199 for the 7″ Nook HD, and $269 for the 9″ Nook HD+ — in hopes of turning an overall profit by selling content. Starting today, you could buy a new Google-ized Nook at the same price and buy all your content through Google’s stores rather than from Barnes & Noble.
Maes told me that the company understands the risk. But “we’ve done a lot of analysis,” he says, “and while we may lose some sales to Google, a rising tide carries all ships.” In other words, it hopes to sell a lot more Nooks, thereby giving it a lot more customers for its content. (And B&N’s own stores remain hardwired into the Nook interface in a way which Google Play will not be.)
I wasn’t even sure whether what Barnes & Noble is doing was technically possible on devices with interfaces that depart so sharply from standard Android. But it is, and it’s hard to see any reason why this isn’t good news for Nook owners and potential Nook owners. It also draws a sharper distinction between the Nook and Amazon’s Kindle Fire tablets, which don’t have Google’s stores and offer no straightforward access to Google’s apps.
It’s way too early to know whether B&N’s gambit will help end the Nooks’ sales doldrums, but it’ll be fun to watch — and offhand, I can’t think of any other strategy with better odds of paying off.
Oh, and one odd side note. Microsoft is part owner of the Nook business, having invested $300 million in it. As part of its ongoing, Google-bashing “Scroogled” campaign, the company is currently telling consumers that they should find Google Play’s policy of sharing users’ names, e-mail addresses and zip codes with developers of paid apps to be deeply unsettling. The whole campaign is a counter-productive embarrassment, and I can’t wait until Microsoft figures that out. But I wonder how the company feels about the fact that products it owns a piece of will be, um, Scroogling people?
[UPDATE: Droid Life reports that Google is in the process of rolling out a Google Wallet update which doesn’t share customers’ names and e-mail addresses with developers. Microsoft’s Scroogled campaign is still saying that it does, though — in fact, I just saw one of the ads on MSNBC.]
3 Best Environmentally Eco-Friendly Green Android Smartphones
November 15, 2016
According to Ericsson’s 2015 Mobility Report, there will be a total of 9.2 billion mobile subscriptions by 2020. Given that the average smartphone life expectancy is 4.7 years—with most customers replacing their smartphones much sooner than that—we have to face the growing issue of electronic waste. Every year, over 125 million cell phones are thrown out in the United States alone, and the number will only increase unless we do something about it.
The good news is that eco-friendly smartphones make for pretty good publicity, so phone manufacturers have an incentive to invest money in sustainable technologies and environmentally conscious devices. As Cellular News reports, 400 million green cell phones are expected to be sold by 2017. Furthermore, Sprint announced this year that all smartphones they sell must meet standards set by UL Environment, an organization that strives to advance the recognition of sustainable products and drive purchaser clarity by bringing trusted transparency to the green marketplace.
But how do you choose a good eco-friendly smartphone and how does it differ from a regular one? Most eco-friendly smartphones share several key features: they use less energy and charge more efficiently, are built using sustainable and recycled materials, their manufacturing doesn’t negatively impact the environment, and radiation emission rates are kept to a minimum.
The problem is that the average smartphone review is more concerned with processing power, display resolution, and benchmark scores, often neglecting to pay any attention to less geeky features. Well, this is not your average smartphone review. We have selected the top 3 environmentally-friendly green Android smartphones, focusing predominantly on what makes them safer for the environment.
Fairphone 2
Not only is the Fairphone 2 the current green leader, but it’s also the first smartphone to earn a 10/10 repairability score on iFixit, a global community of people helping each other repair things. “We aim to create positive social and environmental impact from the beginning to the end of a phone’s life cycle,” claims the company on their official website. To achieve this goal, they only source materials from suppliers who share their vision of a green, sustainable future.
But even the greenest smartphone on the planet is far from eco-friendly when you have to ditch it and get a new one only because of a cracked screen. That’s why the creators of the Fairphone 2 designed the smartphone to be modular and easily repairable. They sell spare parts in their online shop and offer repair tutorials for retailers and individual users.
To repair the phone, just lift the back cover to access the inner compartment, which houses six modules: the display module, camera module, battery, core module, top module, and bottom module. Out of these, the display module is the most expensive one, currently costing €85.70. Smaller modules, such as the earpiece or vibration mechanism, cost just a few dollars each. We love that you can completely repair the Fairphone 2 using just one small screwdriver, instead of an assortment of obscure tools as is often the case with other smartphones.
The smartphone itself is made from recycled plastic and aluminum. Due to the focus on repairability, the design is considerably less premium than what you might expect for its rather high price of $590, but that’s a small sacrifice to make.
Inside is an older but still capable quad-core processor from Qualcomm, 32 GB of internal storage space, 2 GB of memory, and an 8 MP rear CMOS sensor coupled with a 2 MP front-facing sensor. The 5” IPS display has a Full HD resolution and is protected by Gorilla Glass 3.
The company behind the smartphone is committed to helping people improve their privacy, by providing alerts on app permissions. Each app is rated based on how much access it has to your personal information. All of this makes the Fairphone 2 the best green smartphone on the market.
What Users Like
Easy to repair
Swappable parts
Unassuming design
Sony Xperia ZR
Kazuo Hirai, President and CEO of Sony, says, “At Sony we always endeavor to reduce environmental impact in our wide range of products. These efforts are evident in compact, lightweight, energy-efficient designs that reduce wasted resources and power consumption, and our commitment to using recycled materials.”
The company has come up with their “Road to Zero” environmental plan, which aims for a zero environmental footprint. To put their words into action, Sony has released a new range of eco-friendly smartphones that have some of the lowest levels of toxic materials of all smartphones. A part of this range is the Xperia ZR, a compact 4.55” Android smartphone with a brilliant screen and an even more stunning rear camera.
Sony is using up to 99% recycled materials, in order to contribute to building a sustainable society. These materials make the smartphone environmentally friendly and remarkably lightweight, weighing only 138 grams. Its smallish display has an HD resolution and is protected by a shatterproof glass. The rest of the phone is water and dust resistant and capable of withstanding 1.5 meters of freshwater for up to 30 minutes.
At around $150, the Sony Xperia ZR is clearly a budget device, which is only confirmed by its outdated Qualcomm APQ8064 Snapdragon S4 Pro chipset and Adreno 320 GPU. Sony was wise enough to include 2 GB of memory, helping the smartphone run smoothly in most situations that don’t involve demanding 3D games or dozens of browser tabs.
If you don’t want to spend hundreds of dollars on an eco-friendly smartphone, the Sony Xperia ZR is a great alternative to the much more expensive Fairphone 2. Granted, you won’t be able to easily repair the smartphone by yourself, but the money you save make that a non-issue.
What Users Like
Bright display
Several colorful versions to choose from
Used/Refurbished Samsung Galaxy S6 (Or Any Used Smartphone)
Those 140 million cell phones users get rid of every 14 to 18 months often end up on the second-hand market and sites like eBay, Craigslist, and Amazon. Many cellular companies and retailers also refurbish older devices and offer them for a fraction of their original price. If you really think about it, the greenest smartphone you can buy is a used smartphone. And the best used smartphones money can buy are a generation or two old flagship models from major smartphone manufacturers, such as Samsung or LG. Owners of flagship devices tend to take good care of them, and they usually sell them just after a year or two of use.
Currently, our top pick for a used Android smartphone is the Samsung Galaxy S6, the flagship king of 2015. It immediately attracts the eye with its gorgeous metal body and 5.1” Super AMOLED capacitive touchscreen protected by Corning Gorilla Glass 4. The display has a resolution of 1440 x 2560 pixels, which is fantastic for media consumption and pictures. The rear 16 MP sensor with f/1.9 and 2160p video recording at 30 frames per second is in every way as good as you would expect Samsung’s camera to be. It takes sharp, vibrant pictures that never fail to amaze with their high dynamic range.
The Exynos 7420 Octa chipset is by far the most powerful processor on this list, easily handling heavy usage and demanding multi-tasking scenarios. The non-removable Li-Ion 2550 mAh battery could be the only problem, when buying the smartphone on the second-hand market. Its original capacity is great, but, as all smartphone batteries do, it will last less and less as it gets older.
What Users Like
Lovely metal body
Excellent camera
Sharp screen
INFORMATION RECEIVED DURING MY UFO ENCOUNTERS
By John Foster
As stated below, this is a summary or overview of my 46 plus years of experiencing UFO encounters from my very early childhood.
The information suggests, or in some cases solidly indicates, many insightful things that now affect, have affected, and will affect our human reality as we perceive it…including many elements of our sciences, religions and history.
Some of the information is self explanatory, while other information I was given…or information that was obviously suggested…demands further investigation, analysis and speculation. I will attempt to do this in another entry on this website.
Some of the information I received during the encounters:
✽ When I was 2 or 3 years old, 1939 or 1940, the so-called Ets told me that they were here to change humankind and the way they did it was to enlist help from people like me. They said we would have to fool some people because some people didn’t like change…etc.
✽ When I was 11 years old, early Summer of 1949, they told me that if I didn’t want to be like one cow in a herd of cattle that I should not resist what they were doing to me and telling me…that I should cooperate with them. They said the churches were not teaching what they were set up to teach and that modern corporations had too narrow a focus on profit. They said I needed to develop my metaphysical talents and that I would teach others in the future about construction techniques. They explained things about the four directions and told me to always know where I was in relation to North, South, East and West. They explained the purpose of “the familiar strangers” and encouraged me to join them in their tasks.
✽ When I was about 13 years old, around 1950, they encouraged me to join the boys’ auxiliary organization of the Masonic Lodge. They demonstrated a Masonic ceremony and put me through part of an Ancient Egyptian ceremony. I saw several small alligator-like creatures who they said were in charge of my life-long education and development. They indicated I would work with buildings and other people. They placed me on an examination table and told me I would have heart attacks and heart surgeries in my later years…which I did some 38 and 45 years hence. They admonished me for getting into trouble with my young buddies.
✽ The Summer of 1952, just before I passed from Junior High School into Senior High School, they told me I would take Mathematics, Chemistry and Physics in High School and that I would go to college and take them in college…and as a matter of fact that I would be an engineer…essentially whether I like it or not…which I did (become an electro mechanical research and development engineer) but without an engineering degree.
✽ In June of 1957 they told me that they were the Monitors or Keepers of the Earth, a brotherhood who was responsible to a hierarchy that included the Arch Angels Michael and Gabriel…who in turn were responsible to the entity of Christ, who was in charge of this corner of the Universe. They said they were here to save humankind from destroying itself (as they had done in the past) and that they were preparing people for a time when they would return and take them away. Those who were willing and qualified would go and those who weren’t would stay and suffer some kind of consequence that I don’t remember. We (my friend and I) met “the familiar strangers” and orphans who were to be a part of a future project that would involve others for the purpose of helping the orphans survive and educating the orphans and others. When we asked questions about what was happening, we were told we were only to see and hear those things we heard and saw…i.e. we were only to be presented what we experienced during the encounters and nothing more.
✽ Throughout the 1950s a friend or a relative and I infrequently, but periodically would meet the “familiar strangers,” who most often arrived in an elongated windowed flying saucer. We would then fly off into the future in the craft, to specific Earthly locations where we were shown and told what we were to do in the future. I even received technical information in minute detail. This involved the vast Et/human project that would not only help the orphans but many others in an area stretching from Minneapolis to Jackson Hole, Wyoming, including some people in Colorado. Also, throughout the 1950s into the 1960s and 1970s, we experienced future “time warps” in which we received information and saw what we were to do in the future. This occurred from coast to coast and from Canada to Mexico. The time warps were what seemed to be a physical phenomenon, as well as other-dimensional phenomenon that sometimes developed as we watched. Buildings and people (including “the familiar strangers”) developed in what appeared to be a distorted atmosphere and we witnessed and experienced the future happenings. I have found that others also have witnessed this throughout history. It most often was called “a bubble,” etc.
✽ Many other things happened and were said throughout the 46 plus years…too many to include here. However, it is important to note that I was never alone during the encounters, except for a few short episodes. Over 125 different people who I knew experienced one or more encounters with me.
✽ Also throughout the entire 46 plus years, I experienced what I call “conditioning sessions” where I was asked to enter a strange object or craft, then told to turn around, kneel down and face the door. After kneeling down, I would see a strange light approach my forehead and hover about one foot away…and I would began to feel excruciating forces penetrate my body and mind. When I yelled out that I couldn’t stand it anymore, they would tell me to hold on, that it wouldn’t last much longer. When I felt I might pass out, the forces began to fade away. At times, I would then find myself floating a foot or so above the ground. This, they said, was to improve me as a human being. We experienced and were told other things that allegedly would improve us as human beings.
✽ When asked to go public with my story in 1988, I did…but suffered massive heart attacks and heart surgery, losing over half of my heart. And I continued to suffer serious health problems for over 10 years…just as I was told during the experience on the “examination table” above (the experience that occurred in 1950). My state of health was so serious during the 1990s that my doctors called me “a walking time bomb.” In the meantime, I continued to try to understand what had happened to me, giving lectures and talks throughout the country…being interviewed on radio, TV…in newspapers and magazines. I also participated in several academic studies sponsored by prominent universities, etc, etc…and I finally finished writing the first two books, which took more than 14 years.
✽ Since my 1986 “flood of recollections,” my family and I have experienced a few UFO sightings, seeing a couple that flew over the roof of our house. And I mysteriously, actually met “the familiar strangers” in the Spring of 1987. We compared stories and found that we had been told and experienced some of the exact same things during are respective, so-called, UFO encounters. This involved the extensive “Et/human project.” I witnessed them doing some of the exact same things I had seen them do during the 1950s time travels…and I mysteriously performed some of the same things I had done during the time travels. It was…amazing! Many coincidental things occurred between us and we finally discovered that “the project” would not occur exactly as we were shown and told.
✽ As before…many other things happened and were said.
Visit John Foster's website: http://johnfosterufos.wordpress.com
Amateur 'Planet Hunters' Find One With Four Suns
Transcript

DAVID GREENE, HOST: Now is the moment in the program when I admit that I am a total Star Wars nut. Those of you with me, you might recall that Luke Skywalker's home planet of Tatooine enjoyed the warmth of not one but two suns. That dramatic scene, you remember Luke at dusk gazing at the weird peaceful sunset. Well, anyway, there is a reason that we're talking about this. A new planet, a real one, called PH1 was just discovered and it has not two but four suns. And what's more, it wasn't discovered by a professional scientist. Instead, this new planet was found by amateur astronomers, so-called citizen scientists who were part of a network called Planet Hunters. Joining us to talk about the discovery of PH1 and also the work of these armchair astronomers all over the world is Dr. Arfon Smith. He is one of the founders of Planet Hunters and the director of citizen science at the Adler Planetarium in Chicago. Dr. Smith, welcome to the program.

DR. ARFON SMITH: Thank you. Thanks for having me on.

GREENE: I really can't wait to hear about this new planet with all these suns. But can you - I want to hear first about citizen science and this group you founded, Planet Hunters. Who are they?

SMITH: Sure. So Planet Hunters is one of our citizen science projects. These are projects where members of the public play a fundamental role in the sort of scientific process and discovery that goes on in science. And so this is our website. It's at PlanetHunters.org. And you can go online and basically look at data from a telescope. And we need your brain, your human intuition to interpret and analyze these data that are presented.

GREENE: So is it a matter of you're pumping out data and there just aren't enough eyes to look at it in kind of the professional science world, so you were looking for more people who were interested and excited about this?

SMITH: It really depends. I mean, there are tasks that need doing that are usually done by grad students but are not insanely difficult and can be achieved quite easily. And so we think that citizen science has a great place to play here, because these data sets are often beautiful to look at - maybe they're pictures of galaxies or something a little bit more abstract like in Planet Hunters. But there are a lot of areas where human intuition and human interpretation is still far better than a machine can achieve. And so these are perfect candidate projects I think for citizen science.

GREENE: OK. And I'm just guessing here. PH1 must be named for Planet Hunters One, being the first planet that they found.

SMITH: It is. There was much kind of discussion within the group about whether we could name it after one of the authors. But the rules of International Astronomical Union state that you're not allowed to name planets after individual people.

GREENE: How is it possible for one planet to have four suns?

SMITH: That's a very good question and I wish I had a better answer for you. So what we know about this system is that there are two main stars here. There's something called an M dwarf, about half the mass of the sun, so this is called a red dwarf kind of star. So quite a lot cooler than the sun. And then there's a star that's a little bit bigger than the sun, about one and a half times the mass of the sun, that's a little bit whiter, a little bit hotter. And so Planet Hunters One is going around this pair of stars. But what's new about this one is of course there's actually another binary pair. It's about the distance between us and Pluto away. And I wish I could tell you exactly how this formed. Unfortunately, the models that are currently developed to try and predict how planets form really don't support this kind of system.

GREENE: Can I get totally cliche for just a second?

SMITH: Of course. Is there life on PH1? Probably not. Unfortunately, it's a bit hot. It's about 350 degrees on the surface.

GREENE: OK. That's hot.

SMITH: So there isn't liquid water on the surface. And currently the way that astronomers describe planetary candidates that might be Earth kind of analog - so places where life could exist - we usually say that we would want liquid water to exist on the surface. And it almost certainly doesn't in this case.

GREENE: That's Dr. Arfon Smith, who is director of citizen science at the Adler Planetarium in Chicago. We were talking to him about the discovery of PH1, a planet with four suns. Dr. Smith, thanks for joining us.

SMITH: You're welcome. Thank you.

GREENE: This is NPR News.

Transcript provided by NPR, Copyright National Public Radio.
Key To Unlocking Your Phone? Give It The Finger(print)
Sep 10, 2013
Photo: Phil Schiller, Apple's senior vice president of worldwide marketing, speaks about fingerprint security features of the new iPhone 5s Tuesday in Cupertino, Calif.
Originally published on September 11, 2013

The first note I sent out after Apple announced it was including a fingerprint scanner in the new iPhone 5s was to Charlie Miller. Miller, who learned how to hack at the National Security Agency and now works in security for Twitter, has hacked connected cars, wireless connections and NFC devices. But what he's best known for — what he seems to enjoy more than almost anything else — is hacking into Apple.

So I was curious. If Apple is rolling out a fingerprint scanner as a way to replace passwords, exactly how long would it be until Miller got to work trying to figure out how to exploit the system?

It is undeniable that passwords are only a half-effective form of security. They are a pain. Apple says roughly half of iPhone users don't even bother to set them up. Your password could be guessed, broken with brute force or stolen. No one will mourn the end of the password, which no doubt is why Apple is pinning its hopes for the 5s to a fingerprint scanning system, called Touch ID, that could make passwords obsolete. Apple spent more than $350 million to buy AuthenTec last year. AuthenTec owned a number of security patents, including some covering fingerprint scans.

But Apple isn't the first smartphone manufacturer to try this — and fingerprint scanning isn't foolproof. In 2011 Motorola released a phone with a scanner. Joshua Topolsky, then writing for Engadget, had this to say:

"As far as truly unique hardware goes, the fingerprint scanner seems fairly novel — but in practice it's a little frustrating. It does work as advertised, but being told to re-swipe your finger if it doesn't take when you're trying to get into the phone quickly can be a little bothersome. Unless you really need the high security, a standard passcode will suffice for most people."

A key test for Apple will be whether its version of this technology just works. But now, with a fingerprint scanner built into the iPhone 5s' home button, biometrics are taking a big step into a much bigger ecosystem. And the scan won't just be used to start the phone. Apple says you'll also be able to confirm purchases in the App Store using a print instead of your Apple ID password. But — for now at least — don't expect to pay for anything outside of Apple's ecosystem with your finger. App developers will not have access to the scan.

Apple did do its best to assure consumers that the fingerprint data it collects from users will be kept safe and private. The scanned print won't be uploaded to Apple's iCloud. Instead, it will be stored in a secure "enclave" on the iPhone, and Apple says the data will be encrypted.

"I don't think the encryption will be a big hurdle for a hacker," Miller said. "Apple is going to have to compare that encrypted data with a new scan before they unlock the phone. So they are going to have to decrypt it at that point. You could re-engineer that process."

"Of course, doing any of this is difficult," Miller added. "You have to remember you are starting with a phone that's locked and you can't get past the pass screen."
If the 5s sells as well as its predecessors it's conceivable that 100 million people could be using fingerprint scanning within the year. And that has already raised some privacy questions. If you are worried about someone, like the police, getting a copy of your prints, there are probably easier ways than hacking your phone. After all, if the authorities have your smartphone they could probably lift a print from the glass screen the old-fashioned way — by dusting for one.

Copyright 2013 NPR. To see more, visit http://www.npr.org/.
Westar to buy 200 MW of wind
Kansas-based investor-owned utility Westar Energy, Inc., said it has reached agreement with wind project developer Apex Clean Energy to purchase 200 megawatts (MW) of electricity from a wind farm Apex will build near Arkansas City.
In another favorable utility comment on the current cost of wind power, Westar CEO Mark Ruelle commented, “This project is great for our customers. For the first time, we’ve been able to add renewable energy at a cost comparable to our other energy resources. In addition, the project will bring about $90 million to the Kansas economy during construction and provide millions more in ongoing economic benefits.”
Westar is one of a number of utilities who have been adding wind power to their portfolios as the cost of the clean energy resource continues to fall.
The Apex project is near one of Westar Energy’s recently constructed high-voltage transmission lines, meaning that hooking it up will be less costly and less disruptive, the utility said. It comprises about 18,000 acres six miles south of Arkansas City, just across the state line. Construction is anticipated to begin in 2015, and the wind farm is expected to begin providing electricity in late 2016.
“When construction starts, Arkansas City [will] likely see a boost in our economy from the local hiring, as well as the presence of other construction workers who will eat, sleep and shop in our community,” Nickolaus Hernandez, city manager of Arkansas City, said in support of the project. “The Kay Wind project will bring significant economic benefits to the people of Arkansas City for decades to come.”
Westar currently has about 700 MW of renewable energy in its generation portfolio. It was the first utility in Kansas to add commercial scale wind energy to its generation, with a pilot project in 1999. It remains a leader in renewable energy.
Apex Clean Energy is developing a number of large-scale wind energy projects in the Midwest.
“We are very pleased to be working with Westar, and we look forward to a strong long-term relationship moving forward,” said Mark Goodwin, president of Apex. “The Kay project will bring clean energy, at a great value, to Westar and its customers.”
Photo credit: Smoky Hills Wind Farm, Kansas, by Drenaline via Wikimedia Commons
National Park Service and Autodesk Carry Out First Comprehensive Digital Survey to Preserve the USS Arizona and Memorial
Collaboration marks the first time LiDAR, Sonar, underwater laser scanning, and photographs are brought together to create an intelligent 3D model
Video: The National Park Service, Autodesk and other partners are working to create a 3D model of the USS Arizona
Don Stratton, a survivor of the bombing at Pearl Harbor, holds a 3D print of a cooking pot that rests on the deck of the USS Arizona (Photo: Business Wire)
A point cloud of the USS Arizona Memorial displayed in Autodesk ReCap software (Photo: Business Wire)
May 26, 2014 08:30 PM Eastern Daylight Time
HONOLULU--(BUSINESS WIRE)--In honor of U.S. military veterans, the National Park Service (NPS) and Autodesk, Inc. (NASDAQ:ADSK) hosted a press conference today to unveil preliminary results from the first comprehensive survey of the USS Arizona and Memorial in 30 years and its resulting 3D models.
Consistent with Autodesk’s vision to help people imagine, design and create a better world, the survey takes full advantage of the latest technology for the purposes of both historic preservation and public education. Scheduled to be completed later this year, the survey will provide the public with a more detailed view and understanding of this historic site while contributing to the ship’s ongoing preservation.
At today’s Memorial Day press conference, a 3D printout of the USS Arizona showcased details never seen before on paper. In addition, highly detailed 3D models of a cooking pot and Coke bottle that have sat on the ship’s galley for the past 72 years were also created and displayed. Each model featured intricate details including color and the barnacles now present on the cooking pot. Autodesk and NPS hope to create a 3D model of the USS Arizona in its entirety by the end of this year.
“This technological approach helps make the USS Arizona’s legacy come alive that just wasn’t possible before,” said National Park Service Superintendent Paul DePrey. “The USS Arizona is one of America’s most revered historical sites. As its steward, the National Park Service has a mission to share the story of December 7 with current and future generations. Creating 3D models allows people to see and touch these highly detailed and accurate replicas, something that will play an important role in our educational outreach program.”
Also present at the press conference was 92-year-old USS Arizona survivor Don Stratton, one of only nine remaining USS Arizona survivors still alive, and one of only a few hundred to make it off the ship. Stratton was only 19 years old when Pearl Harbor was attacked. He along with six other crew members went hand over hand on a heaving line across the burning deck to safety on the USS Vestal that was moored alongside the USS Arizona that fateful morning. Stratton suffered burns over 70 percent of his body.
When presented with the 3D print of the cooking pot for the first time, Stratton said, “That is amazing. I don’t know anybody in the galley that survived that day. At the time of the explosion, it was self-preservation. After that, it was extremely hard to return. Now, when I go back and remember, it’s a little easier. I think it [3D artifacts] will make an impression on a lot of people, I really do.”
Stratton’s son Randy Stratton, who was also present, said, “You can’t duplicate these artifacts. They represent the beginning of the war, the end of the war and the fact that there is still life there [on the USS Arizona].”
Don Stratton concluded, “I hope they remember all the shipmates that are still aboard the Arizona. And I hope they remember all the people that gave their lives for this great country.”
Approximately 900 remains of the 1,777 officers, sailors and Marines killed still remain inside the USS Arizona; therefore any work done on the ship must be done with extreme care and sensitivity. With this in mind, the NPS is leading the effort to create a highly accurate, 3D digital representation, while minimizing any disturbance to the ship. NPS is employing Autodesk’s reality computing technology, underwater photogrammetry, subsea LiDAR, high-resolution SONAR, and above water laser scanning to conduct investigation and analysis without disturbing the ship.
“The USS Arizona Memorial is such an important yet fragile piece of history,” said Brian Mathews, vice president, Autodesk. “Reality Computing is an emerging concept that bridges the physical and digital worlds, and Autodesk sees great potential in supporting the National Park Service and preservationists around the world with reality computing technology to capture, analyze and communicate these stories of our past for future generations.”
Other organizations involved in the survey include: HDR, Sam Hirota, Inc., Oceanic Imaging Consultants, Inc., 3DatDepth, Shark Marine Technologies, Inc., United States Coast Guard, and the US Navy Mobile Diving Salvage Unit One.
About World War II Valor in the Pacific National Monument
One of nearly 400 units in the National Park Service, World War II Valor in the Pacific National Monument preserves and interprets the stories of the Pacific War, including the events at Pearl Harbor, the internment of Japanese Americans, the battles in the Aleutians, and the occupation of Japan. This year, over 1.7 million individuals will visit the Pearl Harbor Visitor Center, making it the most popular visitor destination in Hawaii. Learn more at www.nps.gov/valr.
About Autodesk
Autodesk helps people imagine, design and create a better world. Everyone—from design professionals, engineers and architects to digital artists, students and hobbyists—uses Autodesk software to unlock their creativity and solve important challenges. For more information visit autodesk.com or follow @autodesk.
Autodesk, and the Autodesk logo are registered trademarks of Autodesk, Inc., and/or its subsidiaries and/or affiliates in the USA and/or other countries. All other brand names, product names or trademarks belong to their respective holders. Autodesk reserves the right to alter product and services offerings, and specifications and pricing at any time without notice, and is not responsible for typographical or graphical errors that may appear in this document.
© 2014 Autodesk, Inc. All rights reserved.
Autodesk, Inc.: Angela Simoes, 415-547-2388, angela.simoes@autodesk.com
National Park Service: Laurie LaGrange, 808-375-9335, ontai@pixi.com
The National Park Service (NPS) and Autodesk unveil preliminary results from the first comprehensive survey of the USS Arizona and Memorial in 30 years and its resulting 3D models | 科技 |
High Definition Content Through Analog Sockets
Existing HD-enabled TV sets and current PC monitors will possibly not be part of the upcoming HDTV home concept, due to their incompatibility with the Digital Rights Management schemes introduced by the movie and CE industries.
According to representatives of several consumer electronics companies, industry groups are set to rule on whether millions of HDTV owners will have to buy new sets and PC monitors to watch movies in high definition.
The problem concerning the reproduction of HD video content lies in the connection to the display. Movie studios want the High Definition Multimedia Interface (HDMI) connection that is only now becoming a common feature on HDTVs. The digital HDMI interface is favored because it includes copy protection that makes it difficult to break into the video signal when it makes its way from the player to the TV set. But the truth is that millions of HDTV sets already in people's homes don't have HDMI ports and use traditional analog methods to transfer video. However, analog signals pose a potential threat to studios because movie pirates could use them to copy the content.
The situation could be harder in the case of PC monitors. Taking the AACS protection scheme as a rule for HD video, both HDMI and DVI digital output interfaces would require the monitor to support the HDCP encryption scheme. However, HDCP-compatible monitors are not available in the market, at least for now.
"Considering the problem, it is true that HD DVD video will not be able to be reproduced on current PC monitors. Therefore, the industry is examining the possibility of allowing analog interfaces to output HD content to monitors," said Ryoichi Hayatsu, chief manager of NEC's 1st storage products division at Ceatec. He also added that negotiations are under way on whether a grace period could be given that would allow transmission of HD disc content across the DVI interface until perhaps 2010 to give people time to upgrade their monitors.
A decision on whether to allow high-definition over analog connections is expected sometime in October or November and will be made by the group behind the Advanced Access Content System (AACS) content protection system. AACS founders include IBM, Intel, Microsoft, Panasonic, Sony, Toshiba, Disney, and Warner, and the decision will be made with input from content providers. Note that AACS is used in both formats, so the decision is likely to affect both HD-DVD and Blu-ray Disc. Moreover, NEC plans to start production of its first HD DVD player for PC in late October. NEC's representatives expect that the first PC equipped with the HD DVD drive will be in the market by the end of 2005.
Samsung to add new features to the Galaxy S3 with its Premium Suite update
Samsung Galaxy S3 owners can look forward to another software update soon, which will bring with it improvements to the Android 4.1 Jelly Bean operating system, plus the company’s Premium Suite first seen on the Galaxy Note, which itself has been tweaked with features introduced on the Galaxy Note 2.
To give S3 owners a hint at what’s to come, Samsung has released the first of two videos demonstrating the Premium Suite’s features, which you can see below. They’re split into two sections, the first being called “contextual awareness” and the second, “enhanced features.”
Let’s look at contextual awareness first. This will include Page Buddy, where certain actions dictate what the phone does next, for example plugging in a pair of headphones will activate the music player. Contextual Menu let you re-order apps alphabetically, or by usage, while Contextual Tag instantly adds weather, date and location data, plus if it’s a snap of a friend, their name to your pictures.
Moving on to the enhanced features, it’s great to see Samsung include the split screen system from the Galaxy Note 2, where two apps can be run at the same time — a feature only possible on devices with screens as large as the S3’s. You’ll also be given the chance to set your Facebook news feed to show on the lock screen, plus the chance to share photos and videos using NFC and the S-Beam feature too. Finally, a Reader mode has been added to the Web browser, which focuses on the text by zooming in on that section.
That’s all we’ve been shown in part one of the Premium Suite video, so there’s still more to come in the near future. Samsung hasn’t said when Premium Suite will arrive, but it will be bundled in its Android 4.1.2 OS update, which has reportedly already started to seed in parts of Europe. Seeing as U.S. networks have only just got to grips with the Galaxy S3’s first Jelly Bean update, there’s little chance of this one being approved on AT&T and others just yet. | 科技 |
When it comes to audio, vintage tech still rocks
By Louie Herr
Obsolescence is an unfortunate part of our digital lives. My first-generation iPad is now too old to benefit from OS updates. Windows 8 looks to be a lot less fun if you’re not using it with a tablet device. Whatever top-notch phone you buy today will be replaced by something better in six months.
The audio realm defies all of these rules. While you would never use a “vintage” cell phone (Zack Morris costumes excepted), vintage audio gear not only stands the test of time but in some cases is highly prized.
PlayStation (SCPH-100X Model)
Why they’re great: Fidelity. And, yes, the original PlayStation console is now a vintage device.
Surprisingly, certain models of the now 18-year-old PlayStation pack CD players of incredible quality. The hardware on these models – those with RCA jacks on the back of the unit – is so good that the playback quality has been compared favorably to CD players that cost several thousand dollars. Audiophile magazines like 6moons and Stereophile have given PlayStations very positive reviews, along with many others.
Nothing vintage is perfect, though, so there are some drawbacks. First, the PlayStation must be left turned on for best results. Apparently, it can take as long as a week to reach peak audio quality. That may put it out of bounds for the environmentally conscious. Leaving your PlayStation on for long lengths of time will also put strain on already ancient hardware. I know there are a lot of original Nintendo consoles that still soldier on, but I doubt many were left on for weeks or months at a time. If you get hooked on the PlayStation’s sound quality, it may be a piece of hardware you’ll need to replace often.
Luckily, these devices are pretty cheap. At least three were available on eBay at the time of this writing, and none cost more than $22.99. That is a phenomenal price for the sound quality they afford — to say nothing of the shock value for your audiophile friends.
Why they’re great: Even better fidelity. No matter how great your PlayStation sounds, it will never achieve sound quality of a good, clean vinyl record played on a quality system.
The problem is that CDs are digital media. Vinyl records are analog. Without getting into the details, analog media preserve sound as a continuous analog wave. This is more similar to the way that sounds exist in reality than digital media, which instead preserve sound in a way that resembles a series of steps. The original analog wave is sliced into extremely small segments, and each of those segments is represented by a number. During playback, the sequence of these numbers then reconstructs a waveform similar to the original sound. No matter how finely the original audio is sliced, however, the digital representation will never be quite the same as the original analog wave. If you want access to even greater audio fidelity, you will have to look to vinyl.
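To make the "series of steps" idea concrete, here is a minimal Python sketch of sampling and quantization, the two operations that turn a continuous wave into the numbers a CD stores. It illustrates the general principle only, not the signal path of the PlayStation or any other player; the 44.1 kHz sample rate and 16-bit depth are the real CD-audio figures, while the sine-wave source, the function name and the 440 Hz test tone are assumptions made purely for this example.

import math

SAMPLE_RATE = 44_100     # samples per second (the CD-audio standard)
BIT_DEPTH = 16           # bits per sample (the CD-audio standard)
LEVELS = 2 ** BIT_DEPTH  # 65,536 discrete amplitude steps

def sample_and_quantize(freq_hz, duration_s):
    """Turn a continuous sine wave into the 'series of steps' a CD stores."""
    n_samples = int(SAMPLE_RATE * duration_s)
    samples = []
    for n in range(n_samples):
        t = n / SAMPLE_RATE                               # slice time into tiny segments
        amplitude = math.sin(2 * math.pi * freq_hz * t)   # the analog wave at that instant
        # Round the -1.0..1.0 amplitude onto one of 65,536 integer levels.
        samples.append(round(amplitude * (LEVELS // 2 - 1)))
    return samples

# One millisecond of a 440 Hz tone becomes just 44 integers; playback has to
# rebuild an approximation of the original wave from numbers like these.
print(sample_and_quantize(440, 0.001))

The rounding step is the whole point of the sketch: every sample is forced onto one of a finite set of levels, which is the staircase-versus-wave distinction the paragraph above describes.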
This could mean buying a record player. You have many options. Vintage turntables can be highly desirable. Both Sound and Vision Mag and Turntable Kitchen have suggested models to keep an eye out for. Don’t shy from contemporary models, though. The New York Times recently suggested several, in addition to providing a host of other tips for new fans of vinyl. Additionally, Stereophile.com offers reviews of many other recent releases.
Tape cassette player
Why they’re great: Access and cachet. You might be surprised how much excellent music is being released on cassette tape these days. In Portland alone, Apes Tapes, Eggy Records, and Gnar Tapes all release new cassette tapes on a regular basis. Many of these tapes are extremely affordable. Take, for example, the And And And / Woolen Men Split Tape from Apes Tapes, which costs just $5 and contains full-length releases from two of the best bands in Portland. These tapes can also be exceedingly rare. Those in the Gnar Tapes catalog are limited to 100 units, making each a collector’s item for a small but dedicated group of tape devotees.
Cassette tapes are also, somehow, cooler than vinyl. This may be due to their exclusivity. Not only are indie tapes produced in very small quantities, but not many people have working cassette decks. This makes the ability to play tapes a mark of distinction. The current hipness of cassette also comes from the people running the tape labels. The operators of Apes Tapes, Eggy Records, and Gnar Tapes are key members of bands like Radiation City, The Woolen Men, and White Fang respectively (among others). While tapes may not have superior fidelity, their exclusivity and the small communities involved with making them can lead listeners to very interesting content.
If you want to pick up a tape deck, the Tapeheads.net Cassette forums look to be a great place to start your research.
Don’t toss that tech
A computer from 1994 makes an interesting curio, but isn’t as useful today as a record player, cassette deck, or even an original PlayStation. Though technology continues to advance in all sectors, vintage audio gear retains its utility in a way very different from other tech.
Maybe I should have held on to my MiniDisc player and Sir Mix-a-Lot tapes. | 科技 |
Study: Billions of Earth-Size Planets in Milky Way
by Alicia Chang, Associated Press
Tuesday Jan 8, 2013
Our Milky Way is home to at least 17 billion planets that are similar in size to Earth, a new estimate suggests. That's more than two Earth-size planets for every person on the globe.

Just how many are located in the sweet spot where water could exist is "simply too early to call," said Francois Fressin of the Harvard-Smithsonian Center for Astrophysics, who presented his work at an astronomy meeting Monday.

It's the first reliable tally of the number of worlds outside the solar system that are the size of Earth, but the hunt for our twin is far from over.

Despite the explosion of exoplanet discoveries in recent years, one find remains elusive: A planet that's not only the right size but also in the so-called Goldilocks zone where it's not too hot or too cold for water to be in liquid form on the surface.

The sheer number of Earth-size planets gives astronomers a starting point to narrow down which ones are in the habitable zone.

Fressin and his team came up with their figure by conducting a fresh analysis of data collected by NASA's Kepler spacecraft, which was launched in 2009 to track down other Earths. They estimated at least one in six stars in the galaxy hosts a planet the size of ours, translating to at least 17 billion Earth-size worlds.

Using a different method, a team from the University of California, Berkeley and University of Hawaii separately came up with a similar estimate. They calculated 17 percent of distant stars have planets that are the same size as Earth or slightly larger.

The findings were presented at the American Astronomical Society in Long Beach, Calif.

Meanwhile, the Kepler spacecraft continues to spot planets as they pass between Earth and the star they orbit. It found 461 new candidate planets, bringing the total to 2,740 potential planets, said mission scientist Christopher Burke at the SETI Institute.

Most of the new Kepler finds were driven by discoveries of Earth-size planets and super-Earths. Four of those are thought to reside in the Goldilocks zone, but more observations are needed.

Fressin said it's clear that rocky planets abound outside the solar system.

"If you look up on a starry night, each star you're looking at - almost each one of them - has a planetary system," Fressin said.

Copyright Associated Press. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
From: Diyan Jari and Reuben Carder, Reuters
Published March 31, 2006 12:00 AM
Borneo Rainforests a Treasure Trove of Rare Species
JAKARTA - About three years ago, wildlife researchers photographed a mysterious fox-like mammal on the Indonesian part of Borneo island.
They believed it was the first discovery of a new carnivore species there in over a century.
Since then, more new species of plants and animals have been found and conservationists believe Borneo, the world's third-largest island, is a treasure trove of exotic plants and animals waiting to be discovered.
The new finds were all the more remarkable after decades of deforestation by loggers, slash-and-burn farming, creation of vast oil palm plantations, as well as rampant poaching. Conservationists hope that Borneo will reveal many more secrets, despite the myriad threats to its unique flora and fauna.
"There is vast potential," said Gusti Sutedja, WWF Indonesia's project director for Kayan Mentarang national park, a sprawling reserve on the island where the new mammal, nicknamed the Bornean Red Carnivore, was photographed in a night-time camera trap.
The animal itself is so rare, it's never been captured.
"In 2003, we conducted joint operations with Malaysian scientists and discovered many unknown species of lower plants. Three frogs discovered are being tested by German researchers. We also recorded five new birds in a forest survey in 2003."
Some conservationists believe Borneo could be the next "Lost World" after the recent discovery of a host of butterflies, birds and frogs in another Indonesian jungle on the island of New Guinea.
The tropical island's fate, along with other threatened areas on the planet, is at the center of a U.N. meeting in Curitiba, Brazil, that ends on Friday. Governments are discussing how to protect the world's biodiversity under a U.N. goal set in 2002 to slow the loss of species by 2010.
Progress towards meeting that goal looks bleak.
Borneo -- a territory shared among Indonesia, Malaysia and Brunei -- is home to about 2,000 types of trees, more than 350 species of birds, about 150 types of reptiles and 210 mammal species, including 44 only found on the island.
Many animals such as pygmy elephants, Sumatran rhinos, orangutans as well as the clouded leopard, the sun bear and the Bornean gibbon top the list of Borneo's endangered species.
NATURE'S HOTHOUSE
Environmentalists say the island, described by Charles Darwin as "one great untidy luxuriant hothouse made by nature for herself," is being stripped of vast swathes of forests by loggers. Mining, lax law enforcement and corruption are also threats.
According to some estimates, Borneo loses forests equivalent to an area of about a third of Switzerland every year, or at a rate of 1.3 million ha (3.2 million acres), much of it to feed the voracious appetite for timber in the West and Asia.
"Indones has so far managed to remain intact because of its rugged terrain and distance from the coast.
"There is opposition from most environmental NGOs. Their research says that areas of natural forest could be converted, and the project could affect rivers," Sutedja said.
"Flooding could occur, which would affect the indigenous Dayak people who live downstream," he said, adding that WWF did not oppose the plan, but was concerned it be carried out in accordance with environmental principles.
Environment Minister Rachmat Witoelar said the government plan to open major palm oil plantations had taken into account his ministry's concerns.
"We will start by making use (of) the areas that are already ready for planting. I strongly oppose ... cutting down forest for the replanting of palm oil plantations, which does not make sense," he told Reuters.
Environmentalists say they are particularly worried as island ecosystems are known as much for their fragility as their ability to harbor rare animals and plants.
Of approximately 800 species extinctions worldwide since accurate scientific recording began in 1500, the vast majority have been from island ecosystems, the World Conservation Union says.
Green groups say hundreds of orangutans are killed or captured every year on the Indonesian part of Borneo as part of an illegal trade that is driving the primates towards extinction.
According to a study by WWF International and wildlife trade monitor TRAFFIC, between 200 and 500 Borneo orangutans are traded in various parts of Indonesia each year. The vast majority are infants sold as pets.
WWF International estimates that poachers have also killed most of the endangered rhinos in Borneo, and that only about 13 may have survived.
"The current situation will continue until the forest is gone," Muhtaman said. Source: Reuters
Knocking on heaven's door?
by Ian Dean
Details from the Japanese Home beta test revealed just as many new questions as they did answers. The application, which is designed as a virtual world to customize and meet mates in, will be released for free, hopefully this summer. The 500MB code will enable you to set up and develop an online world inside a thriving gaming community.

Already games such as Devil May Cry 4 have been released with hundreds of embedded 'achievements' which, on Home's release, will be converted into trophies, furniture and clothing for your own specific part of Home and your avatar. The beta has revealed the flexibility of the software, enabling the 1,500 players who have tested it behind closed doors at Sony to connect with large groups of friends, tour the Home Square environment, and move from disc-based games to watching trailers to playing Home's free arcade games (which include takes on Chopper Lift, Q-Bert and Super Sprint), all without any trouble. The idea is that eventually you'll be able to include similar content in your Home apartment, and invited friends and clan members will be able to sift through your media (photos, videos, replays, scores, etc.) together while you wait for the party to gather.

When first announced, Sony stated Home wouldn't be a persistent world: when you leave, your space would close down, so friends wouldn't be able to tour your apartment if you weren't there. The Japanese beta seems to have thrown this into confusion. Sony is now saying that Home will be a persistent world to such an extent that you will be able to upload media, such as photos from your holiday, in real time as you take them on your digital camera.

The big question is how 'free' Home will be when it's released. Extra content will come at a price. Rumors suggest that future demos may come with a charge. The upside is that you should be able to sell content in Home in the same way as EA, Ubisoft and Capcom do. If you've created a level in LittleBigPlanet, perhaps you could charge for it?

Apr 2, 2008
Looking for a Few Good CIOs
by Tod Newcombe
November 18, 2003
The subject of this month's cover story was a pleasure to write about. Karen Evans started her career in federal government 20 years ago at one of the lowest civil service rankings, and through hard work and constant improvement of her IT skills, knowledge and leadership, she has risen to the pinnacle of her profession. The story doesn't stop there. As everyone in public-sector IT knows, Evans is stepping into big shoes. Her predecessor, Mark Forman, did an impressive job, by virtually all accounts, in getting the vast federal bureaucracy on the same page in terms of creating and launching a number of strategic e-government initiatives.
Now it's Evans' turn to take up the reins and keep the program moving forward. Most analysts agree the job won't be easy -- for one thing, Congress is not providing any new funding -- but they also agree if anyone can handle the job, it's Evans. As our interview with her shows, she has broad knowledge of federal IT issues, the drive to get things done, and people skills to work with diverse groups that often have differing agendas within government.
It's important to remember, however, that there's another story here as well -- namely CIO turnover. What is happening in Washington, D.C., is happening throughout the country. CIOs are leaving their jobs, creating a demand for skilled replacements. More than half the states have changed CIOs since December 2002, and that doesn't count the agency CIOs and deputy CIOs who have left their jobs as well. The question is: Where are the new CIOs going to come from, and will they have the skills to lead in the increasingly complex world of public-sector IT?
Our feature on CIO education -- Head of Their Class -- takes a close look at where the next generation of public-sector CIOs is coming from, and where they are getting their education. One thing is clear: the number of courses and degree programs available to would-be CIOs is growing, as is the number of people admitted into executive-level, public-sector IT education programs. For instance, CIO University, which is run by the General Services Administration in conjunction with a number of prestigious universities, has seen its graduation numbers skyrocket from 18 in 2000, to 138 in 2003. In five years, the Information School at the University of Washington has tripled in size.
It's a promising sign, but as everyone knows, more must be done. Today's public-sector CIO needs to be a business person as well as a technology strategist, and it doesn't hurt to have a good understanding of the political process in his or her jurisdiction. They need all this at a time when government is undergoing a transformation as the Information Age matures. Finding qualified people willing to take on that responsibility at public servants' pay isn't going to be easy.
Tod Newcombe
With more than 20 years of experience covering state and local government, Tod previously was the editor of Public CIO, e.Republic’s award-winning publication for information technology executives in the public sector. He is now a senior editor for Government Technology and a columnist at Governing magazine.
Michael E. Mann
Director of Penn State Earth System Science Center; Author of 'Dire Predictions' and 'The Hockey Stick and the Climate Wars'
Dr. Mann is Distinguished Professor of Atmospheric Science and Director of the Earth System Science Center at Penn State University. His research focuses on understanding climate variability and human-caused climate change. He was selected by Scientific American as one of the fifty leading visionaries in science and technology in 2002. He was organizing committee chair for the National Academy of Sciences Frontiers of Science in 2003 and contributed to the award of the 2007 Nobel Peace Prize with other IPCC lead authors. He was awarded the Hans Oeschger Medal of the European Geosciences Union in 2012 and the National Conservation Achievement Award of the National Wildlife Federation in 2013. He made Bloomberg News' list of fifty most influential people in 2013. In 2014, he was named Highly Cited Researcher by the Institute for Scientific Information (ISI) and received the Friend of the Planet Award from the National Center for Science Education. He is a Fellow of the American Geophysical Union, the American Meteorological Society, and the American Association for the Advancement of Science, and has authored more than 200 publications and three books, including Dire Predictions: Understanding Climate Change, The Hockey Stick and the Climate Wars, and The Madhouse Effect with Washington Post editorial cartoonist Tom Toles.
Nanotechnology: Breaking Through the Next Big Frontier of Knowledge
Joseph V. Kennedy, Former Senior Economist, Joint Economic Committee for Congress
Nanotechnology is getting big. It is already a driving force in diverse fields such as physics, chemistry, biology, and information sciences. Developments coming out of research labs this year will lead to breakthrough new products in medicine, communications, computing, and material sciences sometime in the next two decades. Its impact on our lives over the next fifty years could rival the combined effects of electricity, the internal combustion engine, and the computer over the last century. As with any new technology, nanotechnology raises some safety concerns. However, its overall effects will be strongly beneficial to all sectors of society.
This article describes what nanotechnology is and how it builds on previous scientific advances. It then discusses the most likely future development of different technologies in a variety of fields and how government policy is aiding scientific advance.
What Is Nanotechnology?
A nanometer (nm) is one billionth of a meter. For comparison purposes, the width of an average hair is 100,000 nanometers. Human blood cells are 2,000 to 5,000 nm long, a strand of DNA has a diameter of 2.5 nm, and a line of ten hydrogen atoms is 1 nm. The last three statistics are especially enlightening. First, even within a blood cell there is a great deal of room at the nanoscale; therefore, nanotechnology holds out the promise of manipulating individual cell structure and function. Second, the ability to understand and manipulate matter at the level of one nanometer is closely related to the ability to understand and manipulate both matter and life at their most basic levels: the atom and the organic molecules that make up DNA.
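To make those comparisons concrete, the short sketch below simply restates the scales quoted above in metres. Only figures from the text are used, with the lower end of the blood-cell range taken for the table.

```python
# Restate the scale comparisons from the text in metres and relative to 1 nm.
NM_IN_METRES = 1e-9  # one nanometre

objects_nm = {
    "average human hair (width)":  100_000,
    "human blood cell (length)":     2_000,   # lower end of the 2,000-5,000 nm range
    "strand of DNA (diameter)":        2.5,
    "line of ten hydrogen atoms":        1,
}

for name, size_nm in objects_nm.items():
    print(f"{name}: {size_nm:>9,} nm = {size_nm * NM_IN_METRES:.1e} m")
# A hair is five orders of magnitude wider than a nanometre, while DNA and a
# row of atoms sit within a factor of a few of it: the scale at which
# nanotechnology and cell biology overlap.
```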
It is difficult to overestimate nanotechnology’s likely implications for society. For one thing, advances in just the last five years have proceeded much faster than even the best experts had predicted. Looking forward, science is likely to continue outrunning expectations, at least in the medium-term. Although science may advance rapidly, technology and daily life are likely to change at a much slower pace for several reasons. First, it takes time for scientific discoveries to become embedded into new products, especially when the market for those products is uncertain.
Second, both individuals and institutions can exhibit a great deal of resistance to change. Because new technology often requires significant organizational change and cost in order to have its full effect, this can delay the social impact of new discoveries. For example, computer technology did not have a noticeable effect on economic productivity until it became widely integrated into business processes. It took firms over a decade to go from replacing the typewriters in their office to rearranging their entire supply chains to take advantage of the Internet. Although some firms adopted new technologies rapidly, others lagged far behind.
Nanotechnology is distinguished by its interdisciplinary nature. The most advanced research and product development increasingly requires knowledge of disciplines that, until now, operated largely independently. These areas include:
Physics: The construction of specific molecules is governed by the physical forces between the individual atoms composing them. Nanotechnology will involve the continued design of novel molecules for specific purposes. In addition, researchers need to understand how quantum physics affects the behavior of matter below a certain scale.
Chemistry: The interaction of different molecules is governed by chemical forces. Nanotechnology will involve the controlled interaction of different molecules. Understanding how different materials interact with each other is a crucial part of designing new nanomaterials to achieve a given purpose.
Biology: A major focus of nanotechnology is the creation of small devices capable of processing information and performing tasks on the nanoscale. The process by which information encoded in DNA is used to build proteins, which then go on to perform complex tasks offers one possible template. A better understanding of how biological systems work at the lowest level may allow future scientists to use similar processes to accomplish new purposes. It is also a vital part of all research into medical applications.
Computer Science: Moore's Law and its corollaries, the phenomena whereby the price performance, speed, and capacity of almost every component of the computer and communications industry have improved exponentially over the last several decades, have been accompanied by steady miniaturization (a short numerical sketch of this doubling appears after this list). Continued decreases in transistor size face physical barriers including heat dissipation and electron tunneling that require new technologies to get around. In addition, a major issue for the use of any nanodevices will be the need to exchange information with them.
Electrical Engineering: To operate independently, nanodevices will need a steady supply of power. Moving power into and out of devices at that scale represents a unique challenge. Within the field of information technology, control of electric signals is also vital to transistor switches and memory storage. A great deal of research is also going into developing nanotechnologies that can generate and manage power more efficiently.
Mechanical Engineering: Even at the nanolevel, issues such as load bearing, wear, material fatigue, and lubrication still apply. Detailed knowledge of how to actually build devices that do what we want them to do with an acceptable level of confidence will be a critical component of future research.
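As noted under Computer Science above, here is a back-of-the-envelope sketch of what sustained doubling implies. The two-year doubling period is an illustrative assumption, not a figure stated in this article.

```python
# Compound growth under a Moore's-Law-style doubling of transistor density.
# The two-year doubling period is an assumption for illustration only.

def density_growth(years, doubling_period_years=2.0):
    """Factor by which transistor density grows over the given span of years."""
    return 2 ** (years / doubling_period_years)

for span in (2, 10, 20, 40):
    print(f"over {span:2d} years: roughly {density_growth(span):,.0f}x the density")
# over  2 years: 2x; over 10 years: 32x; over 20 years: about 1,024x;
# over 40 years: about a million-fold, which is why the gains feel exponential.
```

The same compounding is why any halt to scaling, of the kind discussed under Information Technology below, would be such a sharp break with past experience.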
With so many sciences having input into nanotechnology research, it is only natural that the results of this research are expected to have a significant impact on four broad applications (nanotechnology, genetics, information technology, and robotics) that interrelate in a number of ways:
Nanotechnology: Nanotechnology often refers to research in a wide number of fields including the three listed below. But in its limited sense, it refers to the ability to observe and manipulate matter at the level of the basic molecules that govern genetics, cell biology, chemical composition, and electronics. Researchers can then apply this ability to advance science in other fields. The broader definition of nanotechnology applies throughout most of this paper, but it is worth remembering that advances in other sciences depend on continued improvements in the ability to observe, understand, and control matter at the nanolevel. This in turn will require more accurate and less expensive instrumentation and better techniques for producing large numbers of nanodevices.
Biotechnology (Genetics): Nanotechnology promises an increased understanding and manipulation of the basic building blocks underlying all living matter. Though the basic theory of genetic inheritance has been known for some time, biologists do not fully understand how life goes from a single fertilized egg to a living animal. Questions exist on exactly how the information encoded in DNA is transcribed, the role of proteins, the internal workings of the cell and many other areas. On a basic level, research is allowing us to tease out the genetic basis for specific diseases and in the future may reliably allow us to correct harmful mutations. But what would a full understanding of the genetic process give us? Could we develop DNA that uses a fifth and sixth molecule? Could the existing process be reprogrammed to code for more than 20 amino acids? To what extent is it possible to create brand new proteins that perform unique functions?
A better understanding of biological processes is obviously needed in order to deliver the health benefits that nanotechnology promises. But it is also important for many reasons outside of biology. Those comfortable with traditional manufacturing techniques may at first have difficulty with the concept of building a product up from the molecular level. Biology offers a template for doing so. A single fertilized egg in the womb eventually becomes a human being: a system of incredible complexity from a simple set of instructions 2.5 nm in diameter. Scientists are hopeful that similar processes can be used to produce a range of other structures.
Information Technology: Progress in information processing has depended on the continued application of Moore’s law, which predicts a regular doubling of the number of transistors that can be placed on a computer chip. This has produced exponential improvements in computing speed and price performance. Current computer technology is based on the Complementary Metal Oxide Semiconductor (CMOS). The present generation of computer chips already depends on features as small as 70 nanometers. Foreseeable advances in nanotechnology are likely to extend CMOS technology out to the year 2015. However, at transistor densities beyond that, several problems start to arise. One is the dramatic escalation in the cost of a new fabrication plant to manufacture the chips. These costs must be amortized in the cost of the transistors, keeping them expensive. Second, it becomes increasingly difficult to dissipate the heat caused by the logic devices. Lastly, at such small distances, electrons increasingly tunnel between materials rather than going through the paths programmed for them. As a result of these constraints, any continuation of Moore’s Law much beyond 2015 is likely to require the development of one or more new technologies.
Future advances will likely bring us closer to a world of free memory, ubiquitous data collection, massive serial processing of data using sophisticated software, and lightning-fast, always-on transmission. What happens when almost all information is theoretically available to everyone all the time?
Cognitive Sciences (Robotics): Continued advances in computer science combined with a much better understanding of how the human brain works should allow researchers to develop software capable of duplicating and even improving on many aspects of human intelligence. Although progress in artificial intelligence has lagged the expectations of many of its strongest proponents, specialized software continues to advance at a steady rate. Expert software now outperforms the best humans in a variety of tasks simply because it has instantaneous access to a vast store of information that it can quickly process. In addition, researchers continue to develop a much better understanding of how individual sections of the brain work to perform specific tasks. As processing power continues to get cheaper, more and more of it will be applied to individual problems.
Government Policy Toward Nanotechnology
Nanotechnology is still in its early stages. Many of the most valuable commercial applications are decades away and require continued advances in basic and applied science. As a result, government funding still constitutes a large proportion of total spending on research and development. Within the United States, this spending is guided by the National Nanotechnology Initiative (NNI). The NNI coordinates the policy of twenty-five government agencies, including thirteen that have budgets for nanotechnology research and development. It has set up an infrastructure of over thirty-five institutions across the country to conduct basic research and facilitate the transfer of technology to the private sector. For fiscal year 2008, President Bush has requested $1.45 billion for research directly related to nanotechnology (see Figure 1).
Figure 1. U.S. Federal Funding for Nanotechnology Research
The NNI’s strategic plan sets out four main goals:
Maintain a world-class research and development program to exploit the full potential of nanotechnology.
Facilitate the transfer of nanotechnology into products for economic growth, jobs, and other public benefits.
Develop educational resources, a skilled workforce, and the supporting infrastructure to advance nanotechnology.
Support responsible development of nanotechnology.
The NNI is clearly geared toward developing the technology on a broad front, correctly seeing it as the source of tremendous benefits to society. Its mission is not to see whether we should go forward with research and development. It is to go forth boldly, while trying to discover and deal with possible risks.
Presently, the United States leads the world in most areas of research. However, other countries, including China, also see research in nanotechnology as being vital to their ability to create value in tomorrow’s economy. It is not necessary, nor would it even be desirable, for the United States to lead in every aspect of this broad field. However, continued leadership on a broad range of applications is critical to our nation’s continued ability to compete in world markets. In addition, in a few areas, such as defense applications, international leadership has important strategic implications.
What Does Nanotechnology Mean for Us?
The simple answer is that, over the next fifty years, consumers will see a growing range of new products that dramatically transform their lives. If properly managed, these products will dramatically improve human health, change the structure of society, and open up new possibilities for human potential.
On a more basic level, managers must begin to study how today’s discoveries could transform their business in the next five to ten years. By now every business, even those far removed from the computer industry, has been significantly affected by the revolution in communications and computing. Nanotechnology’s influence will be equally broad. First, it will create the capacity for new products with much better performance characteristics and less waste. Second, by continuing the communications revolution, it will give companies new ways to organize work and distribution lines. Third, it will transform the environment within which the business competes.
The world we live in will continue to get faster, more complex, and smaller.
Summer 2007 | Volume 82, No. 2
Fuel cells supply parked trucks with electricity
Converting diesel fuel quietly and with low emissions
Truck cabins provide both the workplace and home for drivers for days and even weeks. The communications technology, lighting and air conditioning in trucks therefore also require onboard power supplies during rest periods.
The current BINE-Projektinfo brochure "Low-emission energy supplies at truck stops" (02/2013) presents a high-temperature fuel cell that has been developed for this purpose. For the first time, a system has been developed that runs on diesel fuel, which had previously been problematic because of its sulphur content.

The list of requirements for supplying on-board electricity with high-temperature fuel cells is long: the system needs to be light, compact and very robust, and it must be able to withstand vibrations. The components are subject to considerable thermomechanical loads caused by rapid heating and high operating temperatures. When reforming the gas from the diesel fuel, all components must be designed for a sulphur content of 10-15 ppm.

The developers have already made some important breakthroughs on these requirements and are approaching the pre-production stage. However, further optimisation stages, expected to take about three years, are necessary before production readiness is attained. These concern in particular the sulphur tolerance, the redox resistance, the thermomechanics and the heat transfer. The system currently has an efficiency of 22%. The production-ready model is intended to achieve 30%, deliver a net electrical capacity of 3 kW and thus consume one litre of diesel per operating hour.

The brochure can be obtained free of charge from the BINE Information Service at FIZ Karlsruhe, online at www.bine.info or by calling +49 (0)228 92379-0.
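The one-litre-per-hour figure follows from simple energy accounting. The sketch below is not from the brochure: it reads the 30% target as net electrical efficiency and assumes a typical heating value for diesel of about 9.8 kWh per litre, a value not stated in the article.

```python
# Energy-balance check for the target system described above. The diesel
# heating value (~9.8 kWh per litre) and the reading of 30% as net electrical
# efficiency are assumptions, not figures from the BINE brochure summary.

NET_ELECTRIC_KW = 3.0          # target net electrical output
EFFICIENCY = 0.30              # target efficiency, read as net electrical efficiency
DIESEL_KWH_PER_LITRE = 9.8     # assumed typical heating value of diesel

fuel_power_kw = NET_ELECTRIC_KW / EFFICIENCY              # chemical power drawn from the fuel
litres_per_hour = fuel_power_kw / DIESEL_KWH_PER_LITRE

print(f"fuel input:  {fuel_power_kw:.1f} kW")             # 10.0 kW
print(f"diesel use:  {litres_per_hour:.2f} litres/hour")  # about 1.0, matching the article
```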
Rüdiger Mack | idw
http://www.bine.info/en
Islands top a global list of places to protect
Rare and unique ecological communities will be lost if oceanic islands aren't adequately considered in a global conservation plan, a new study has found.
Although islands tend to harbor fewer species than continental lands of similar size, plants and animals found on islands often live only there, making protection of their isolated habitats our sole chance to preserve them.
Many conservation strategies focus on regions with the greatest biodiversity, measured by counting the number of different plants and animals. "Normally you want to focus on the most diverse places to protect a maximum number of species," said Holger Kreft, a post-doctoral fellow at the University of California, San Diego and one of the two main authors of the study, "but you also want to focus on unique species which occur nowhere else."
To capture that uniqueness, Kreft and colleagues at the University of Bonn, UC San Diego and the University of Applied Sciences Eberswalde used a measure of biodiversity that weights rare species more than widespread ones. They carved the terrestrial realm into 90 biogeographic regions, calculated biodiversity for each, then compared island and continental ecosystems. By this measure, island populations of plants and vertebrate animals are eight to nine times as rich.

Their results, plotted on global maps, will be reported the week of May 11 in the Proceedings of the National Academy of Sciences. The southwest Pacific island of New Caledonia stands out as the most unique with animals like the kagu, a bird with no close relatives found only in the forested highlands that is in danger of extinction, and plants like Amborella, a small understory shrub unlike any other flowering plant that is thought to be the lone survivor of an ancient lineage.
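The article describes the index only loosely (rare species count for more than widespread ones), so the sketch below shows one standard measure with that property, range-size-weighted richness, rather than necessarily the study's actual method; the species and region names are invented purely for illustration.

```python
# Illustrative sketch of a range-size-weighted richness ("weighted endemism")
# score: each species contributes 1 / (number of regions it occupies), so a
# single-region endemic counts fully while a widespread species adds little.
# This is one common measure with the property described in the article, not
# necessarily the index the study used; the data below are made up.

from collections import defaultdict

def weighted_richness(ranges):
    """ranges maps each species to the set of biogeographic regions it occurs in."""
    scores = defaultdict(float)
    for species, regions in ranges.items():
        weight = 1.0 / len(regions)
        for region in regions:
            scores[region] += weight
    return dict(scores)

example = {
    "endemic_shrub":   {"New Caledonia"},                        # found nowhere else
    "widespread_fern": {"New Caledonia", "Borneo", "Mainland"},
    "island_bird":     {"Borneo"},
}

for region, score in sorted(weighted_richness(example).items(), key=lambda kv: -kv[1]):
    print(f"{region}: {score:.2f}")
# New Caledonia and Borneo outscore the mainland because their endemics
# contribute their full weight to a single region.
```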
Fragments of continents that have broken free to become islands like Madagascar and New Caledonia often serve as a final refuge for evolutionary relicts like these. The source of diversity is different on younger archipelagos formed by volcanoes such as the Canary Islands, the Galápagos and Hawaii which offered pristine environments where early colonizers branched out into multiple related new species to fill empty environmental niches. The new measure doesn't distinguish between the two sources of uniqueness, which may merit different conservation strategies.
Although islands account for less than four percent of the Earth's land area, they harbor nearly a quarter of the world's plants, more than 70,000 species that don't occur on the mainlands. Vertebrate land animals – birds, amphibians, reptiles and mammals – broadly follow this same pattern. "Islands are important and should be part of any global conservation strategy," Kreft said. "Such a strategy wouldn't make any sense if you didn't include the islands."
Threats to biodiversity may also rise faster for islands than for mainlands, the team reports. Scenarios based on a measure of human impact projected to the year 2100 warn that life on islands will be more drastically affected than mainland populations. "That threat is expected to accelerate particularly rapidly on islands where access to remaining undeveloped lands is comparatively easy" said Gerold Kier, project leader at the University of Bonn and lead author of the study. Expanding farmlands, deforestation, and other changes in how people use land are among the alterations expected to cause the greatest damage.
The researchers also considered future challenges posed by climate change and report mixed impacts. Rising sea levels will swamp low-lying areas and smaller islands, but the ocean itself is expected to moderate island climates by buffering temperature changes. "Although disruptions to island ecosystems are expected to be less severe than on the continents, climate change remains one of the main threats to the biodiversity of the Earth," Kier said. "If we cannot slow it down significantly, protected areas will not be much help."
"We now have new and important data in our hands, but still have no simple solutions for nature conservation," Kreft said. "In particular, we need to answer the question how protected areas with their flora and fauna can complement each other in the best way. The part played by ecosystems, for example their ability to take up the green-house gas carbon dioxide, should be increasingly taken into account."
Co-authors included Tien Ming Lee and Walter Jetz of UC San Diego; Pierre Ibisch and Christoph Nowicki of the University of Applied Sciences Eberswalde; and Jens Mutke and Wilhelm Barthlott of the University of Bonn.
The Academy of Sciences and Literature Mainz, the Wilhelm Lauer Foundation, and the German Federal Ministry of Education and Research funded the research. Holger Kreft holds a Feodor-Lynen Fellowship from the Alexander von Humboldt Foundation.
Holger Kreft | EurekAlert!
http://www.ucsd.edu
http://www.pnas.org/cgi/doi/10.1073/pnas.0810306106