Understanding Mechanical Properties Of Silicon Nanowires
Silicon nanowires are attracting significant attention from the electronics industry due to the drive for ever-smaller electronic devices, from cell phones to computers. The operation of these future devices, and a wide array of additional applications, will depend on the mechanical properties of these nanowires. New research from North Carolina State University shows that silicon nanowires are far more resilient than their larger counterparts, a finding that could pave the way for smaller, sturdier nanoelectronics, nanosensors, light-emitting diodes and other applications.
It is no surprise that the mechanical properties of silicon nanowires differ from those of "bulk," or regular-size, silicon materials: as the diameter of a wire decreases, its surface-to-volume ratio increases. Unfortunately, experimental studies reported in the literature have yielded conflicting results on the properties of silicon nanowires. So the NC State researchers set out to quantify the elastic and fracture properties of the material.
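The geometry behind that statement is simple and worth making explicit (this derivation is our own illustration, not taken from the study). For a cylindrical wire of radius r and length L, ignoring the end faces:

```latex
\frac{S}{V} \;=\; \frac{2\pi r L}{\pi r^{2} L} \;=\; \frac{2}{r}
```

So halving a wire's diameter doubles its surface-to-volume ratio, meaning a growing share of atoms sit at or near the surface, where bonding conditions differ from those in the interior.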
"The mainstream semiconductor industry is built on silicon," says Dr. Yong Zhu, assistant professor of mechanical engineering at NC State and lead researcher on this project. "These wires are the building blocks for future nanoelectronics." For this study, researchers set out to determine how much abuse these silicon nanowires can take. How do they deform "“ meaning how much can you stretch or warp the material before it breaks? And how much force can they withstand before they fracture or crack? The researchers focused on nanowires made using the vapor-liquid-solid synthesis process, which is a common way of producing silicon nanowires.
Zhu and his team measured the nanowire properties using in-situ tensile testing inside scanning electron microscopy. A nanomanipulator was used as the actuator and a micro cantilever used as the load sensor. "Our experimental method is direct but simple," says Qingquan Qin, a Ph.D. student at NC State and co-author of the paper. "This method offers real-time observation of nanowire deformation and fracture, while simultaneously providing quantitative stress and strain data. The method is very efficient, so a large number of specimens can be tested within a reasonable period of time."
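For readers unfamiliar with the quantities such a test produces, the standard engineering definitions below show how the measured force and elongation become stress and strain; this is textbook notation, not symbols quoted from the paper itself:

```latex
\sigma = \frac{F}{A}, \qquad \varepsilon = \frac{\Delta L}{L_{0}}, \qquad E = \frac{\sigma}{\varepsilon}
```

Here F is the tensile force read from the cantilever load sensor, A the nanowire's cross-sectional area, ΔL its elongation, and L0 its initial length; the ratio of stress σ to strain ε in the elastic regime gives the elastic modulus E, one of the properties the team set out to quantify.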
As it turns out, silicon nanowires deform in a very different way from bulk silicon. "Bulk silicon is very brittle and has limited deformability, meaning that it cannot be stretched or warped very much without breaking," says Feng Xu, a Ph.D. student at NC State and co-author of the paper. "But the silicon nanowires are more resilient, and can sustain much larger deformation. Other properties of silicon nanowires include increasing fracture strength and decreasing elastic modulus as the nanowire gets smaller and smaller."
The fact that silicon nanowires have more deformability and strength is a big deal. "These properties are essential to the design and reliability of novel silicon nanodevices," Zhu says. "The insights gained from this study not only advance fundamental understanding about size effects on mechanical properties of nanostructures, but also give designers more options in designing nanodevices ranging from nanosensors to nanoelectronics to nanostructured solar cells."
The study, "Mechanical Properties of Vapor-Liquid-Solid Synthesized Silicon Nanowires," was co-authored by Zhu, Xu, Qin, University of Michigan (UM) researcher Wei Lu and UM Ph.D. student Wayne Fung. The study is published in the Nov. 11 issue of Nano Letters, and was funded by grants from the National Science Foundation and NC State.
Image Caption: These are silicon nanowires used in the in-situ scanning electron microscopy mechanical testing by Dr. Yong Zhu and his team. Credit: North Carolina State University
T. rex not a stand-up guy? Test your dino skills
By MALCOLM RITTER
Monday, February 25, 2013

NEW YORK -- Here's a test of your dinosaur knowledge: Did Tyrannosaurus rex stand upright, with its tail on the ground? The answer: No. But a lot of young people seem to think so, and the authors of a study are blaming toys like Barney and other pop influences for that misconception.
Scientists used to think T. rex stood tall, but they abandoned that idea decades ago. Now, the ferocious dinosaur is depicted in a bird-like posture, tail in the air and head pitched forward of its two massive legs.
The change led major museums to update their T. rex displays, study authors said, and popular books have largely gotten the posture right since around 1990. So did the "Jurassic Park" movies.
But when the researchers asked college students and children to draw a T. rex, most gave it an upright posture instead. Why? They’d soaked up the wrong idea from toys like Barney, games and other pop culture items, the researchers conclude.
"It doesn’t matter what they see in science books or even in ‘Jurassic Park,"’ says Warren Allmon, a paleontology professor at Cornell University in Ithaca, N.Y., and an author of the study.
It struck him when he saw a box of dinosaur chicken nuggets at a grocery store.
"What they grew up with on their pajamas and their macaroni and wallpaper and everything else is the tail-dragging posture," he said.Advertisement
If the explanation is correct, Allmon said, it’s a sobering reminder of how people can get wrong ideas about science. The study will be published in the Journal of Geoscience Education. The authors examined 316 T. rex drawings made by students at Ithaca College and children who visited an Ithaca museum. Most of the college students weren’t science majors. Seventy-two percent of the college students and 63 percent of the children drew T. rex as being too upright. Because the sample isn’t representative of the general population, the results don’t necessarily apply to young people in general.
When the authors looked at other depictions of T. rex, they found the obsolete standing posture remains in pop culture items like toys, games, cookie cutters, clothing, comics and movies. Mark Norell, a prominent paleontologist at the American Museum of Natural History in New York who didn't participate in the study, said he doesn't know if the upright-posture myth is as widespread as the new study indicates. But he said it makes sense that children's first impressions of T. rex can persist. If they don't study dinosaurs later, "that's what they're stuck with."
Resilient Electric Grid Project: Keeping the U.S. Electrical Grid Online
Fri, 08/28/2009 - 11:58am

Image Caption: With the new superconducting cable, Manhattan's electrical workers may be able to eventually clear out the aging, subterranean rats' nest beneath Wall Street that, amazingly, looks much the same today as it did a century ago (1913 image).
Barring the occasional thunderstorm, most Americans take the electric current behind their power buttons for granted, and assume the juice will be there when they're ready to fire up an appliance or favorite tech toy. Little do most know, the strain on our electric grid — which has led to rolling brownouts and the massive 2003 blackout that left 40 million people across the Northeast in the dark — will only intensify in coming years. According to the Department of Energy, the annual cost of power outages is approximately $80 billion. Now add to conventional challenges those risks posed by terrorists intent on crippling our economy. Suddenly, the aim of electrical engineers to develop a technology to keep the country's electrical grid online (and recover faster) really begins to resonate.
The Science and Technology Directorate (S&T) of the U.S. Department of Homeland Security is currently funding a promising solution — a superconductor cable that would link electrical substations and allow the sharing of excess capacity during emergencies. This generally is not done now, and so a flexibility like this strengthens the resiliency of the overall grid, reducing the likelihood of major power failures. This is S&T's Resilient Electric Grid project, and the superconducting cable is called an inherently fault current limiting (IFCL) superconductor cable.
A single superconducting cable (shown in blue) could one day replace a dozen traditional copper cables (shown in red), freeing up much-needed space beneath city streets. Courtesy of US Department of Homeland Security - Science and Technology
Engineers are putting decades of existing electrical research by industry electricity leaders from American Superconductor, Southwire and Consolidated Edison into practice, as they eye the aging rats' nest of power cabling under the crowded streets of New York City. S&T managers and scientists recently participated in a successful test of the new superconducting technology at Oak Ridge National Laboratory.
The benefits are simple but profound: these cables can deliver more power, prevent power failures, and take up less physical space. A single superconductor cable can replace 12 copper cable bundles, freeing up more space underground for other utility needs such as water, natural gas or phone service. The technology is capable of carrying 10 times as much power as copper wires of the same size, while also being able to adapt automatically to power surges and disruptions from lightning strikes, heat waves and traffic accidents, even sabotage.
"The IFCL superconducting cable being tested could well revolutionize power distribution to the country's critical infrastructure," said Dr. Roger McGinnis, Director of the Homeland Security Advanced Research Project Agency at S&T. "Eventually, these technologies will help incorporate localized clean, green electricity generation into the power grid."
As for the science, the cables work by transmitting electricity with near zero resistance at higher temperatures than usual. But "high" is a relative term among superconductors. The cables conduct electricity at a chill -320°F instead of an icy -460°F for traditional superconductor cables.
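To put those figures in the absolute units physicists use, the standard Fahrenheit-to-Kelvin conversion (our addition, not part of the original article) is:

```latex
T_{K} = \frac{5}{9}\,\bigl(T_{F} - 32\bigr) + 273.15
```

By this formula, -320°F works out to about 77.6 K, roughly the boiling point of liquid nitrogen, a cheap and plentiful coolant, while -460°F is essentially absolute zero (0 K), a regime traditionally approached with far costlier liquid-helium cooling.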
Holding and conducting energy better than traditional copper means these cables take up a fraction of the space. Manhattan's electrical workers may be able to eventually clear out the subterranean congestion beneath Wall Street that, amazingly, looks much the same today as it did a century ago.
Since the cables themselves better prevent extremely high currents from cascading through the system, they will help eliminate the power surges that can permanently damage electrical equipment, similar to a breaker switch in a home, explained McGinnis. The cable switches off during a surge or failure, but automatically resets when conditions return to normal.
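The breaker analogy can be made concrete with a toy model. The sketch below is purely illustrative -- in the real cable the current limiting is an intrinsic physical property of the superconductor (it "quenches" into a resistive state above a critical current), not software, and the threshold and resistance values here are invented:

```python
# Toy model of an inherently fault-current-limiting (IFCL) cable.
# Values are hypothetical; the real behavior is a material property.

CRITICAL_CURRENT_AMPS = 4_000  # invented rated threshold

def effective_resistance(current_amps: float) -> float:
    """Near-zero resistance below the critical current; resistive above it."""
    if current_amps <= CRITICAL_CURRENT_AMPS:
        return 1e-6  # effectively zero while superconducting
    # Above the critical current the cable quenches and becomes resistive,
    # throttling the fault current -- the "breaker" behavior described above.
    return 0.5

for amps in (1_000, 3_500, 9_000, 2_000):
    r = effective_resistance(amps)
    state = "superconducting" if r < 1e-3 else "quenched -- limiting fault"
    print(f"{amps:>6,} A -> {state}")
```

Because the quench is reversible, the final normal-current reading in the loop shows the cable back in its superconducting state -- the automatic reset the article describes.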
For some context, electrical substations take electricity delivered over transmission and distribution lines and lower the voltage so it can be used by homes and businesses. Even if power is lost to an individual substation, by creating multiple, redundant paths for the electric current, the cables allow quick power restoration to all the surrounding power loads. Ultimately, these cables may allow substations that had been intentionally isolated from one another in the past, for fear of cascading failures, to be interconnected in order to share power and assets.
Cutting-edge high temperature superconducting cables have been successfully tested in laboratories, and can be found in a handful of demonstration projects around the country, but they remain an emerging technology. S&T is interested in advancing the technology so that it can be used nationwide, and is pursuing an opportunity to connect two Con Edison Manhattan substations with the cable.
The Department of Homeland Security hopes to enable the Department of Energy and various utility companies around the country to replace more than 2,000 circuit miles of power cables in U.S. cities with resilient, safe and green IFCL cables.
New iPhones make a splash with colors, price
By Dan Nakaso and Troy Wolverton, San Jose Mercury News
CUPERTINO -- Confirming weeks of rumors, Apple executives on Tuesday unveiled a new, gold-colored iPhone 5S and a cheaper iPhone 5C designed to appeal to overseas markets.
The iPhone 5C borrows a page from Apple's iPods and will come in multiple colors. Prices start at $99 for a 16GB model and $199 for a 32GB model -- both with two-year contracts. The 5C features a case made out of plastic, which Apple's design guru, Jony Ive, described as "beautifully, unapologetically polycarbonate."
Apple marketing head Phil Schiller called the 5C's higher-end brother, the iPhone 5S, the "most forward thinking phone ever" that's been designed to run both 32-bit and 64-bit apps and will include an upgraded camera along with a new fingerprint sensor built into the phone's home button that's intended to provide convenient security.
Several analysts embraced Apple's upgraded 5S.
New models of the Apple iPhone 5C on display in the Apple Store in Berlin, Germany, 10 September 2013. The introduction of the new Apple smartphones was held in Cupertino, California, USA, and screened live in the store in Berlin. (via Bay Area News Group)
"You can't under-estimate how important security has become for consumers," said Tim Bajarin, president of Creative Strategies. "The camera clearly delivers a new set of features, larger pixels, a wide space for images and all these filters. It's just absolutely stunning. It'll make the iPhone 5S one of the best smart phone cameras available."
Investors and advertisers also may be impressed by the new phone's 64-bit upgrade, which Bajarin called the "kind of new processing power that will allow software developers to create even more interesting and powerful applications, not just games. It'll provide a more intense experience and increase the speed of video and the quality."
Apple unveils two new iPhones, the less expensive iPhone5C and a high-end upgrade, iPhone 5S. The 5C will come in new colors; the 5S debuts a fingerprint scanner in the home button, a faster CPU and major camera and video upgrades; a comparison. (TOBEY/The Washington Post)
The iPhone 5S will cost $199 for a 16GB model, $299 for a 32GB version and $399 for a 64GB model -- all for two-year contracts. An "unlocked and contract-free" version carried over T-Mobile will be available for $549 for the 16GB version and $649 for a 32GB model. The iPhone 5S will come in silver, gold and "space gray."
Pre-orders for the iPhone 5C and 5S will begin on Friday. They will be available for sale on Sept. 20. Apple also will keep its 8GB iPhone 4S, which will be available for free on a two-year contract.
It's unclear how the public will react to the announcements.
Some analysts said Tuesday's presentation offered no surprises following weeks of leaked media reports, and Apple stock fell $11.53, or 2.28 percent, Tuesday to close at $494.64. Shares dipped slightly in after-hours trading.
"There were no surprises at all," said Bob O'Donnell, an analyst at technology research firm IDC. "Some people are going to be disappointed."
The iPhone 5C is not a "cheap" version of the iPhone, noted Avi Greengart, an analyst with market research firm Current Analysis.
"It's an iPhone 5, just made out of different material" Greengart said.
Apple's announcements came as the company arguably needs another hit product. As a company, Apple's sales growth has slowed to a crawl and its profits have slumped. Meanwhile, its stock price, despite recovering recently, is still down more than 30 percent from the highs it set last year.
While Apple's iPhone sales have held up better than its tablet and computer sales, they still have been hit by the slowdown in the company's business. And thanks to that slowing growth, Apple's market share in smartphones has slumped. In the second quarter, Apple held about 14 percent of the worldwide smartphone market, compared with about 19 percent a year earlier, according to Gartner.
One of the attention-grabbing aspects of the iPhone 5S is its new level of security aimed at preventing anyone else from accessing the phone. Apple's fingerprint recognition "Touch ID" sensor is designed to scan through the sub-epidermal layer of skin.
Fingerprint information will be encrypted and stored inside the A7 chip and will not be backed up to the iCloud or to Apple's servers, according to an Apple video.
The Touch ID technology also can be used to make purchases at any of Apple's iPhone stores -- to buy books, music, movies and apps -- without entering a password.
Forrester Analyst Frank Gillett called the new fingerprint security system "jaw droppingly easy" and "the first painless biometric I've seen."
Tony Cripps, principal device analyst at Ovum, said, "Apple is certainly offering meaningful innovation here. Moving to a 64-bit architecture means Apple can genuinely claim to have brought something new to the smartphone party. It should certainly help the company further cement its lead as a mobile gaming platform and will give the Android fraternity something to think about in a space whose significance is sometimes downplayed beyond the gaming world."
Apple executives began their presentation by announcing that the iOS7 operating system will be available for download on Sept. 18 for iPhone 4 models and above and for iPad 2 models and above.
Yes, Online Privacy Really Is Possible
Feb. 14, 2014 6:19 PM
By Eva Galperin and Jillian C. York
Image caption: You can protect yourself online. Photo by PATRIK STOLLARZ/AFP/Getty Images

A few short weeks ago, we were conducting a security training for a group of journalists in Palestine. The journalists were deeply aware of the potential threats facing them from not one but three governments, but didn't have the first idea of how to mitigate those threats. "It's too confusing!" claimed one, while another said it was futile.
Unfortunately, these reactions are all too typical. We've heard the same from a variety of populations all over the world. Despite all of the awareness-raising around surveillance that has taken place over the last year, many individuals feel disempowered, helpless to fight back. Efforts such as the February 11 initiative the Day We Fight Back aim to empower individuals to lobby their representatives for better regulation of mass surveillance. But legislation and policy are only part of the solution. In order to successfully protect our privacy, we must take an approach that looks at the whole picture: our behavior, the potential risks we face in disclosing data, and the person or entity posing those risks, whether a government or company. And in order to successfully fight off the feeling of futility, we must understand the threats we face.
In a recent piece for Slate, Cyrus Nemati hems and haws over the complexities of creating a private online existence, ultimately choosing to give up on Internet privacy and embrace the convenience of sharing. While working at an organization that advocates for digital rights, Nemati found himself anxious about his personal privacy and took steps that made browsing "a chore"; later, after getting married and wanting access to social tools, he claims he "learned … to love a less private Internet."
The truth is that most of us simply can’t protect ourselves from every threat 100 percent of the time, and trying to do so is a recipe for existential dread. But once we understand our threat model—what we want to keep private and whom we want to protect it from—we can start to make decisions about how we live our lives online. You’ll find yourself empowered, not depressed.
Threat modeling is an approach undertaken by the security community. It looks at the specific circumstances of the individual and the potential threats facing him or her and makes a diagnosis (and a prescription) on that basis. Threat modeling looks at what a person has to protect (her assets), who she has to protect those assets from (her threat), the likelihood that she will need to protect them, her willingness to do so, and the potential consequences of not taking precautions.
A teacher in suburban California doesn’t have the same set of online privacy concerns than a journalist in Palestine. And the kinds of steps the teacher might take to protect his personal photos from nosey students and their parents are quite different from the precautions the journalist might take to protect her anonymous sources from being identified by the government. Some us don’t want our Internet browsing habits tracked by companies like Google and Facebook. Some of us don’t want the NSA reading our emails. But without enumerating our threats and our assets, it’s easy to choose tools that are inappropriate or unnecessary to the task at hand. The schoolteacher probably doesn’t need to PGP-encrypt his email or run every privacy-enhancing app and plugin, like Nemati did in his privacy hipster phase. The journalist might find that taking the time to use PGP gives her peace of mind.
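To make the exercise concrete, here is a minimal sketch of a threat model written as code. The fields mirror the questions above -- assets, threat, likelihood, willingness to act, consequences -- but the structure and example values are our own illustration, not a formal security-community schema:

```python
from dataclasses import dataclass, field

@dataclass
class ThreatModel:
    """One asset/adversary pair, following the questions posed above."""
    asset: str                 # what you have to protect
    adversary: str             # who you have to protect it from
    likelihood: str            # how likely you are to need that protection
    consequences: str          # what happens if you take no precautions
    precautions: list = field(default_factory=list)  # steps you're willing to take

journalist = ThreatModel(
    asset="identities of anonymous sources",
    adversary="government surveillance",
    likelihood="high",
    consequences="sources exposed or harmed",
    precautions=["PGP-encrypted email"],
)

teacher = ThreatModel(
    asset="personal photos",
    adversary="nosey students and their parents",
    likelihood="moderate",
    consequences="embarrassment",
    precautions=["social media privacy settings"],
)

print(journalist)
print(teacher)
```

Laying the two models side by side shows why the journalist's precaution list justifies the overhead of PGP while the teacher's does not -- the same point the paragraph above makes in prose.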
Nemati’s frustration may not have come from failing to list his threats and assets as much as may have come from misidentifying them. He writes that he “treat[ed] himself like a criminal, obsessed with keeping a very low online profile”—a perfect recipe for frustration, bearing little to no resemblance to how an actual criminal might behave. A successful criminal understands his threat—law enforcement—and recognizes the steps he needs to take to evade them, which may or may not include keeping a low profile online. Nemati might instead face the threat of his parent, spouse, or boss viewing his online activity and work to hide those activities from them. He might also be worried about criminals who want to steal his login credentials and gain access to his bank account. This requires an understanding of security settings, social media, and browser privacy settings, for sure, but not the elaborate privacy kabuki Nemati describes.
Don’t get us wrong: We’re sympathetic to Nemati and the many Internet users like him we meet every day. But we also know that the choice between a crippled Internet experience and an Internet in which privacy is a mere afterthought is a false one. Instead of heading down the rabbit hole of deep paranoia and subsequent nihilism, we recommend that you tackle the task of becoming safer online the way you would any other task: step by step. By starting slow and building on your repertoire of tools, you can protect yourself. For a list of 10 things you can do right now to protect yourself against surveillance, check out this blog post from the EFF, where we work.
Total privacy on the Internet may not be possible, but meaningful privacy is within your reach. And you don't have to go crazy trying to achieve it.

Future Tense is a partnership of Slate, New America, and Arizona State University.
Eva Galperin is a global policy analyst for the Electronic Frontier Foundation. Her work focuses on providing digital privacy and security for vulnerable populations.
Jillian C. York is the director for international freedom of expression at the Electronic Frontier Foundation. | 科技 |
The Worst Case Scenario Has Come True: California's Snowpack Is Now Zero Percent of Normal
May 29, 2015 2:56 PM
By Eric Holthaus
Image caption: A stump sits at the site of a manual snow survey on April 1, 2015 in Phillips, California. The current recorded level is zero, the lowest in recorded history for California. Photo by Max Whittaker/Getty Images

California's current megadrought hit a shocking new low this week: On Thursday, the state's snowpack officially ran out.
At least some measurable snowpack in the Sierra mountains usually lasts all summer. But this year, its early demise means that runoff from the mountains—which usually makes up the bulk of surface water for farms and cities during the long summer dry season—will be essentially non-existent. To be clear: there’s still a bit of snow left, and some water will be released from reservoirs (which are themselves dangerously low), but this is essentially a worst-case scenario when it comes to California’s fragile water supply.
Chart caption: This week's automated survey found California's statewide snowpack had officially run out. Credit: California Department of Water Resources

The state knew this was coming and has been working to help soften the blow—but they're fighting a losing battle. Bottom line: 2014 was the state's hottest year in history, and 2015 is on pace to break that record. It's been too warm for snow. Back in April, Gov. Jerry Brown enacted the state's first-ever mandatory water restrictions for urban areas based mostly on the abysmal snowpack. In recent days, the state's conservation efforts have turned to farmers—who use about 80 percent of California's water.
With a burgeoning El Niño on the way, there’s reason to believe the rains could return soon—but not before October or November. The state’s now mired in such a deep water deficit that even a Texas-sized flood may not totally eliminate the drought.
Welcome to climate change, everyone.
Eric Holthaus is a meteorologist who writes about weather and climate for Slate's Future Tense. Follow him on Twitter.
Introducing nanotechnology to industries
By Ananda Kannangara

Science and Technology Minister Prof. Tissa Vitarana stressed the importance of introducing nanotechnology to industries and said fund controllers in the country must extend their fullest co-operation to develop the technology. He was speaking at a seminar on 'Tapping the World of Nanotechnology', organised by the National Science Foundation and Small and Medium Enterprise Developers of the Federation of Chambers of Commerce and Industry in Sri Lanka (FCCISL).

Nanotechnology is a branch of engineering that deals with the design and manufacture of small electronic circuits and mechanical devices built at the molecular level of matter.

The Minister also said that there is a major economic crisis in the world and that Sri Lanka could take advantage of this situation by using nanotechnology. The Minister thanked the government for releasing a 60-acre plot at Homagama to set up a Nanoscience Park with facilities for research and development in nanotechnology. He also said that the Sri Lanka Nanotechnology Institute at Biyagama is a public-private partnership that would help to conduct research activities and apply nanotechnology to the advancement of technologies.

Dr. Rohan Munasinghe of the Moratuwa University Engineering Faculty said that governments and industries are investing in research and development in nanotechnology, since it is an interdisciplinary field that encompasses physics, chemistry, biology and engineering. Prof. Veranga Karunaratne of the Sri Lanka Institute of Technology said that nanotechnology is at an infant stage in our country and that scientists could contribute to developing technologies.
2016-40/3982/en_head.json.gz/2896 | Home / Science News Manta rays threatened by fishermen Nov. 24, 2012 at 2:53 PM Follow @upi Comments
RAJA AMPAT, Indonesia, Nov. 24 (UPI) -- Marine scientists say they are working to save a population of manta rays off the coast of Indonesia.
Manta rays, abundant around Raja Ampat, eastern Indonesia, were listed as "threatened" under the International Convention on the Conservation of Migratory Species in 2011, NBC News reported.
Scientists say mantas are being caught as bycatch in industrial fishing nets targeting various types of tuna and, increasingly, are being targeted for their gill rakers -- the structures that allow them to filter food from water -- which are used in traditional Chinese medicine.
A report called Manta Ray of Hope found that an estimated 3,400 manta rays and 94,000 mobulas, which are related to the manta ray, are caught each year, but the numbers reflect only reported catches.
"Unreported and subsistence fisheries will mean true landings are much higher," the report said.
Scientists in China are working to have manta rays protected by the government.
"In the last two years, we have conducted evaluations of the manta ray and submitted a recommendation to the government to list it as a protected species," said Professor Wang Yanmin from the Chinese Shandong University's Marine College.
Feng Yongfeng, founder of Green Beagle, a group that promotes environmental protection, said, "There is no regulation for protecting the manta ray so sales of mantas are not illegal."
"They're such an iconic species, beloved by divers," said Andrea Marshall, director of the Marine Megafauna Foundation. "They're just amazing."
2016-40/3982/en_head.json.gz/2897 | Home / Science News / Technology Stephen Hawking: Dismissing artificial intelligence would be a mistake
Scientists say not enough research being done on effects of artificial intelligence. By Danielle Haynes | May 3, 2014 at 2:40 PM Follow @upi Comments
LONDON, May 3 (UPI) -- Stephen Hawking, in an article inspired by the new Johnny Depp flick Transcendence, said it would be the "worst mistake in history" to dismiss the threat of artificial intelligence.
In a paper he co-wrote with University of California, Berkeley computer-science professor Stuart Russell and Massachusetts Institute of Technology physics professors Max Tegmark and Frank Wilczek, Hawking cited several achievements in the field of artificial intelligence, including self-driving cars, Siri and the computer that won Jeopardy!
"Such achievements will probably pale against what the coming decades will bring," the article in Britain's Independent said.
"Success in creating AI would be the biggest event in human history," the article continued. "Unfortunately, it might also be the last, unless we learn how to avoid the risks."
The professors wrote that in the future there may be nothing to prevent machines with superhuman intelligence from self-improving, triggering a so-called "singularity."
"One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand. Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all," the article said.
"Although we are facing potentially the best or worst thing to happen to humanity in history, little serious research is devoted to these issues outside non-profit institutes such as the Cambridge Centre for the Study of Existential Risk, the Future of Humanity Institute, the Machine Intelligence Research Institute, and the Future of Life Institute. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks."
2016-40/3982/en_head.json.gz/2924 | News Scientists Solve Mystery of Brilliant Northern, Southern Lights October 27, 2009 12:12 PM
Scientists have solved the mystery behind the brilliant northern and southern light show known as Aurora Borealis. The phenomenon is caused by electromagnetic energy from the sun, which experts say also wreaks havoc on ground-based power grids and satellites. VOA's Jessica Berman reports.

Just like atmospheric conditions can affect weather on the ground, experts say the sun is responsible for weather in outer space. They say the Sun's atmosphere emits high-energy solar winds that bathe the Earth continuously with electromagnetic energy.

Nicola Fox is with the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland. "You could really think of it as us living in the atmosphere of the Sun," Fox explained. "So, if the Sun changes, the Earth will feel its effects. So, if the Sun sneezes, the Earth will catch a cold."

But until now, space scientists have been unable to pinpoint the source of energy releases in the Earth's atmosphere that are responsible for the spectacular light show, aurora borealis, in the extreme northern and southern latitudes. The same energy releases are responsible for dangerous sub-storms that disrupt ground-based power grids and communications systems.

Scientists at the University of California at Los Angeles discovered the source of the space blasts using five satellites of the U.S. space agency's THEMIS program. The researchers explain that the Sun's and Earth's electromagnetic fields normally glide past one another in many different directions. But when enough energy builds between the two fields, they snap and right themselves in a process scientists call reconnection.

David Sibek is THEMIS project scientist with the U.S. space agency NASA. He says reconnection releases a huge amount of electrical current into the magnetosphere that surrounds the planet. "When reconnection occurs, that current is broken and it flows down to the Earth, so you have like a short-circuit out in the Earth's magnetic field," explained Sibek. "And it's that current that's going to power the aurora and dump into the Earth's ionosphere and cause power line disruption in Canada, for example, by blowing out transformers."

Scientists say it is important to know about sub-storms in order to take measures to protect valuable technical equipment, and possibly the lives of spacewalking astronauts. The discovery of the mechanism behind sub-storms is reported in the journal Science.
2016-40/3982/en_head.json.gz/3073 | NASA Administrator
NASA Administrator Charlie Bolden's Blog
NASA’s New Neighbor on the National Mall: Reflections on the Opening of the Smithsonian’s National Museum of African American History and Culture
Posted on September 23, 2016 at 5:09 pm by Stephen Fox.

Tomorrow, the National Museum of African American History and Culture opens its doors to the public. Located on the National Mall, the museum is less than a mile away from NASA's Washington Headquarters. Recently I spoke to NASA TV about the significance of this occasion, and you can watch a few clips from our conversation below.
On a personal note, I think it's critically important – and it's really impressive — that at long last we're going to have a museum on the Mall that's dedicated to people of African descent here in the United States. I never in my wildest dreams growing up in Columbia, South Carolina during segregation would have believed that I would be experiencing the opening of a museum dedicated to African American history and culture – let alone during the Administration of America's first Black president, whom I have the privilege of serving under as NASA's first African American Administrator.
Because I believe so strongly in its mission, I donated a few personal items to the museum, including my flight suit and mission patch from STS-60, my final space flight and a critical mission to what would become the International Space Station program; a model of the Hubble Space Telescope – what I believe to be the most incredible scientific instrument that humanity has created; rugby shirts from STS-45 and STS-31, among other items.
It’s my hope that young people who visit the museum will be encouraged to reach for new heights in their own lives, and use their dreams as inspiration to work hard, study hard, and refuse to be deterred by failure. The reason that I applied for the Astronaut program many years ago, was that the late, great Dr. Ron McNair – himself a hallmark figure in both African American history and the history of America’s space program — encouraged me to go for it. It is my hope that the objects and displays in this museum will have the same sort of impact on a new generation of future astronauts, artists, engineers, educators, physicists, philosophers, physicians and so forth.
Being the first African American Administrator is all well and good, but I want to make sure I’m not the last. Encouraging more young people from underserved communities to study the STEM subjects of science, technology, engineering and math is one important way to make sure of this. Reminding the next generation that even the sky is not the limit is another.
Best wishes to everyone involved in opening this important new museum. I cannot wait to visit!
NASA Administrator Charles Bolden discusses the flight jacket from mission STS-60 that he donated to the new National Museum of African American History and Culture.
NASA Administrator Charles Bolden talks about the advice he gives to young people, on the occasion of the opening of the National Museum of African American History and Culture.
NASA Administrator Charles Bolden discusses the historic significance of the new National Museum of African American History and Culture.
This entry was posted in Uncategorized on September 23, 2016 by Stephen Fox.

NASA Uses Federal Employee Viewpoint Survey to Improve Performance
Posted on September 20, 2016 at 8:11 am by Administrator Charles Bolden.

NASA is proud to have been named the "Best Place to Work" in the Federal Government (among large agencies) for the past four consecutive years by the Partnership for Public Service. Using the Federal Employee Viewpoint Survey (FEVS) as a focal point for guidance, over time we have developed a positive work culture with a high level of employee engagement through deliberate, proactive initiatives.
I’ve always told our employees that their voices matter. At NASA, it’s especially critical, as much of our work is difficult and dangerous, and sometimes lives are in the balance. We must have a culture where speaking up and providing feedback is encouraged. I’ve made nurturing that culture a centerpiece of my leadership, and we created a Workforce Culture Strategy to communicate and codify these values.
With some 18,000 employees at NASA, getting feedback can be daunting, and the FEVS helps provide a vehicle where people feel they can be candid and offer constructive comments without putting themselves or their jobs at risk. We use it to help offices within our organization to improve and to share their successes. At NASA, we consider ourselves a family and, like any family, there can be some bumps in the road. The FEVS helps us get past them.
Based on last year’s employee feedback, we focused this year on second-level performance reviews to support and encourage fairness in ratings, and we created a Leader’s Handbook to guide supervisors and employees, and to foster organizational health.
I’m still listening – and feel privileged to be working with such a talented, creative workforce. The best part of serving as NASA Administrator continues to be witnessing how open and honest opinions and ideas have changed NASA for the better. Our entire NASA senior leadership team sincerely cares about our workforce’s opinions and is ready to take action.
I want to thank my colleagues and their teams for using the FEVS to make progress on employee engagement. I know agencies across government are using this important tool to make similar strides. All of us need to work each and every day to make sure the talented people who work for the Federal Government feel valued, included, and engaged in their jobs.
This entry was posted in Uncategorized on September 20, 2016 by Administrator Charles Bolden.

Kibo: Our Shared Destiny Will Be Written by Us, Not for Us
Posted on August 4, 2016 at 11:00 am by Administrator Charles Bolden.

This week, I embarked on a visit to Japan for discussions with a variety of senior Japanese government officials about our mutual interest in space exploration. I will also visit NASA's outstanding partners at JAXA, the Japan Aerospace Exploration Agency.
With more than 30 active agreements in place, NASA and JAXA have one of the strongest, most comprehensive and longest lasting space bilateral relationships of any two nations in the world. One of the greatest illustrations of this partnership is the International Space Station (ISS), orbiting 250 miles (400 kilometers) above the Earth at 17,500 miles per hour (about 28,000 kph) with six astronauts on board as I write this!
NASA’s Journey to Mars is taking shape aboard the orbiting laboratory, where astronauts from different countries are working together to advance research and technology that will allow future astronauts to travel deeper into space, at the very same time we create jobs and improve our quality of life here on Earth.
Japan and the United States are working together aboard the Space Station with many other international partners – and we will be for the foreseeable future. Today, Japanese astronaut Takuya Onishi and American astronauts Jeff Williams and Kate Rubins are living and working together with their Russian crewmates at the cutting edge of innovation, science and discovery. Their research ‘Off the Earth, For the Earth’ promises to deepen understanding and expand human progress around such areas as medicine, biology, technology, Earth science, material production and communications – and that’s just the short list!
Because leaders in both the U.S. and Japan have chosen to extend our Space Station participation through at least 2024, the promise and potential progress that comes out of this research will continue for years to come. In the more immediate future, the research benefitting all of humanity will be bolstered by cargo delivered to station aboard Japan’s upcoming HTV-6 mission (which, as was announced recently, is set to launch in October of this year).
As we consider the bright future of our partnership, I’m very much looking forward to joining our friends at JAXA this week for a ceremony to officially open a control room for Kibo, the Japanese Experiment Module on ISS.
Kibo is, appropriately, a Japanese word meaning “hope,” and I believe that “hope” is an excellent description of the research that’s being conducted aboard the International Space Station and the cooperation that goes into it.
President Obama once said that “hope is the belief that destiny will not be written for us, but by us, by the men and women who are not content to settle for the world as it is, who have the courage to remake the world as it should be.”
The International Space Station is the embodiment of this sort of hope and effort. Consider this: more than 220 human beings from 18 countries have visited the International Space Station; tens of thousands of people have been involved in its construction and operation; and people from dozens of countries have had their research and experiments flown aboard it.
As we look forward to an exciting future exploring space, I am also enthusiastic about advances the U.S. is making in airspace travel a little closer to Earth. We are in the midst of an incredible moment in the history of aeronautics. With President Obama proposing an historic investment in green aviation, we have an opportunity to make air travel cleaner, greener, safer and quieter – even as our skies grow more crowded and aircraft fly faster.
One of the more important areas of NASA aeronautics research is air traffic management. Our country’s skies will have to absorb an estimated four billion more passengers over the next several decades and it’s essential that we do this without compromising the safety of our skies.
We in the United States are not the only country with an interest in building a more efficient air traffic management system. International commerce depends on air transportation and it is imperative that we work together with partner countries around the world to maximize human resources and investment for the benefit of all humanity.
With this in mind, after my visit to Japan I plan to travel to China to discuss areas of mutual interest in aviation research between NASA and the Chinese Aeronautical Establishment (CAE). This will be part of ongoing conversations that began in November of 2014 and have continued through a NASA-CAE workshop in Beijing that was held in August 2015.
Taken together, our partnerships around the world continue to instill optimism – and inspire hope – about the future of space exploration, aeronautics and our ability to write our own destiny – together.
This entry was posted in Uncategorized on August 4, 2016 by Administrator Charles Bolden.

Bringing Humans to Mars and Humanity Together
Posted on June 3, 2016 at 3:10 pm by Administrator Charles Bolden.

NASA's Journey to Mars is about more than sending American astronauts to the Red Planet in the 2030s; it's about bringing people together here on Earth. It's about strengthening the American economy and with it the economic security of families throughout our country. It's also about strengthening our friendships across sectors and also across national borders. This is why I'm fond of reminding virtually every audience to whom I speak that sending humans to Mars requires all hands on deck – government, industry, academic and international partners and citizen scientists – we need everybody.
Today, I’m embarking on a journey of my own — to meet with our global friends in international space agencies, governments, private companies, universities and other forums; folks who are eager to be part of NASA’s Journey to Mars. I plan to carry with me a message of partnership as I remind them of how much the American people value their friendship, especially when it comes to space – which in many ways is the great global connector.
It should not be lost on any of us that for the last decade and a half, human beings from multiple countries have been living and working together on the International Space Station (ISS) in common pursuit of human progress. It certainly is not lost on me, that a girl or boy age 15 or younger has lived every single second of every day of her or his life while human beings have been living and working together in space. Our grandchildren’s children may very well live every day of their own lives while human beings are living and working together on Mars.
For this reason, I’m a firm believer in the soft power that our country is able to demonstrate when we engage in space diplomacy. From our perspective at NASA, one of the most gratifying developments over the past few years has been the increasing number of nations who have joined the global exploration endeavor. Nations large and small, both with and without formal space agencies, have all come to the conclusion that everyone who has a passion for space can find a role and a place where their expertise is critical. In short, every single nation can play a part in our journey to Mars, in our scientific journey of discovery and in the next phase of humanity’s development as a spacefaring people.
Over the course of this trip, I will have the opportunity to discuss NASA’s Journey to Mars with the Israeli Minister of Science, Technology and Space, the Israel Space Agency (ISA), and Israeli innovators, students and entrepreneurs. I’ll also be meeting with students in both Israel and Jordan who participate in the Global Learning and Observations to Benefit the Environment (GLOBE) science and education initiative, of which NASA is a proud partner. I’ll also be traveling to the United Arab Emirates (UAE) to meet with colleagues at the UAE Space Agency. I’ll wrap up this trip with a meeting with NASA partners in the European Space Agency (ESA) at the ESA Council in Paris.
We recognize that NASA provides inspiration to dreamers and doers of all professions everywhere around the world, so we are looking forward to partnering with the U.S. Embassy in Amman and His Royal Highness Crown Prince Al Hussein bin Abdullah II to host a public dialog about NASA’s Journey to Mars while I am in Jordan.
Everywhere I travel, I meet people who are looking to the United States for leadership when it comes to space exploration. Time and again I hear enthusiasm about our Journey to Mars and an appetite for partnership in this remarkable pursuit of progress and possibility.
Together, we can bring humanity to the face of Mars and reach new heights for the benefit of all humankind … and we will.
This entry was posted in Uncategorized on June 3, 2016 by Administrator Charles Bolden.

NASA, NOAA Analyses Reveal Record-Shattering Global Warm Temperatures in 2015
Posted on January 20, 2016 at 4:33 pm by Administrator Charles Bolden.

By NASA Administrator Charles Bolden and
NOAA Administrator Kathryn Sullivan
Climate change is one of the most pressing issues of our generation and it affects every person on Earth. Tracking the changes in the global climate is the basis for understanding its magnitude and extent.
Today’s announcement that NASA and NOAA scientists have determined that 2015 was the hottest year in recorded history underscores how critical Earth observation is. The NOAA-NASA collaboration has served the country very well, from the origin of space-based remote sensing for weather forecasting to the Earth system monitoring and science that are so crucial to tackling the issues of our times. This announcement is a key data point that should make policy makers stand up and take notice — now is the time to act on climate.
The modern temperature record dates back to 1880, and 2015 was the warmest year by a long shot.
There has been a lot of talk about the strengthening El Niño in the Pacific Ocean and how that might be supercharging temperatures. El Niño did likely play an important role – but more significantly, 2015’s record temperatures are the result of the gradual, yet accelerating, build-up of carbon dioxide and other greenhouse gases in Earth’s atmosphere. Scientists have been warning about it for decades and now we are experiencing it. This is the second year in a row of record temperatures and what is so interesting is that the warmest temperatures often occur the year after an El Nino, like in 1998 compared to 1997.
Fifteen of the 16 warmest years on record have now occurred since 2001. Temperatures will bounce around from year to year, but the direction of the long-term trend is as clear as a rocket headed for space: it is going up.
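For readers who want to see how such a trend is quantified, here is a minimal sketch of fitting a line to annual temperature anomalies. The anomaly values below are invented placeholders, not actual NASA GISTEMP data (the real series is published at data.giss.nasa.gov/gistemp):

```python
import numpy as np

# Placeholder global temperature anomalies (deg C) for 2001-2015.
# Illustrative only -- substitute the published GISTEMP values for real work.
years = np.arange(2001, 2016)
anomalies_c = np.array([0.54, 0.63, 0.62, 0.54, 0.68, 0.64, 0.66,
                        0.54, 0.64, 0.72, 0.61, 0.65, 0.68, 0.75, 0.90])

# Least-squares linear fit; the slope is the warming rate in deg C per year.
slope, intercept = np.polyfit(years, anomalies_c, 1)
print(f"Warming trend: {slope * 10:.2f} deg C per decade")
```

The year-to-year values bounce around, as the post says, but the fitted slope isolates the underlying direction of the trend.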
This record-breaking moment is a good time to take stock of what we know of our changing planet and why it is important for NASA, NOAA and other federal agencies to continue studying Earth’s climate and how it is changing:
Sea levels are rising – nearly three inches in the past two decades alone. The successful launch earlier this week of the NOAA-led Jason-3 mission will continue our 23-year record of measuring sea level change from space with remarkable precision. In the coming years and decades, our work to understand how quickly seas are rising will be vital to coastal cities in the U.S., millions of people around the world who live in low-lying areas, and to NASA’s own facilities at Kennedy Space Center, where we will one day launch astronauts to Mars, and other affected facilities such as the Stennis Space Center, Wallops Flight Facility and Michoud Assembly Facility.
The Arctic ice cap is shrinking. In the 1970s and 80s, NASA scientists pioneered techniques to measure the extent of sea ice at the poles. That new ability quickly gave way to the realization that the Arctic ice cover – which plays a significant role in the planet’s climate and even the weather we experience in the U.S. – is retreating and growing thinner.
NOAA’s global drifting buoy program and other NOAA and international ocean temperature and land surface temperature measurements have provided the means to measure the temperature at the Earth’s surface, so critical to our survival.
Ice sheets and glaciers worldwide are shedding ice. Greenland is losing about 300 billion tons of ice per year, according to measurements from NASA’s GRACE mission. Observations from the agency’s Operation IceBridge have helped confirm rapidly accelerating changes in the West Antarctic Ice Sheet and the dramatic retreat of glaciers in Alaska. Given the pace of these changes and their significance for the climate and sea level rise, we need close and continuous monitoring. In 2017, NASA will launch two missions – GRACE-FO and ICESat-2 – that represent a major refresh of our capabilities to observe how ice sheets and glaciers are changing.
Rising temperature is not an isolated effect of rising greenhouse gas levels, and scientists are still studying the full implications of a warmer world. How might patterns of drought and precipitation change? Will ecosystems and species be able to adapt to human-induced climate change? What might these changes mean for wildfires, agriculture and the economy?
Climate change isn’t a problem for the future. Earth’s climate is changing now. At NASA, we use our unique vantage point from space to study the planet as a whole system. NOAA’s scientists are on the ocean, land and in the sky collecting data that help bring clarity. Our job is to answer these kinds of questions, to make the measurements needed to get to those answers and to provide our knowledge and our data freely so the world can address this fundamental challenge.
This entry was posted in Uncategorized on January 20, 2016 by Administrator Charles Bolden.

Building a Robust Commercial Market in Low Earth Orbit
Posted on January 14, 2016 at 2:58 pm by Administrator Charles Bolden.

NASA is on a Journey to Mars and a new consensus is emerging around our plan, vision and timetable for sending American astronauts to the Red Planet in the 2030s. Our strategy calls for working with commercial partners to get our astronauts and cargo to the International Space Station while NASA also focuses – simultaneously — on getting our astronauts to deep space.
Few would have imagined back in 2010, when President Barack Obama pledged that NASA would work "with a growing array of private companies competing to make getting to space easier and more affordable," that less than six years later we'd be able to say commercial carriers have transported 35,000 pounds of space cargo (and counting!) to the International Space Station (ISS) – or that we'd be so firmly on track to return launches of American astronauts to the ISS from American soil on American commercial carriers.
But that is exactly what is happening.
Since the first SpaceX Dragon commercial resupply mission to deliver cargo to the ISS in October 2012 and Orbital ATK’s first Cygnus mission in January 2014, American companies have delivered cargo to the Space Station that enables our astronauts to work off Earth for Earth on extensive and ongoing scientific research and technology demonstrations aboard the Space Station. This has included investigations that directly benefit life on Earth and expand commercial access to microgravity research through the U.S. National Laboratory (which is operated by the Center for the Advancement of Science in Space or CASIS).
All this matters because NASA research helps us understand our home planet as well as the solar system and beyond, while technology demonstrations and human health research like astronaut Scott Kelly’s one-year mission and the Twins Study aboard the Space Station prepare us for long-duration missions into deep space.
As a result, we are closer than ever before to sending American astronauts to Mars and at the very same time, we’re “insourcing” American jobs and empowering American entrepreneurs and innovators to expand the nascent commercial market in low-Earth orbit.
Today, thanks to the bold plan laid out by the President, Americans are working at more than 1,000 companies in nearly every state in the Union on NASA commercial space initiatives.
Across the board, about 80% of NASA’s activities are carried out by our partners in industry and at America’s academic institutions. We develop more than 1,600 new technologies a year and work with business partners to transfer thousands of products, services and processes into the market for job creation and economic growth. More venture capital was invested in America’s space industry in 2015 than in all the previous 15 years combined.
In other words, at NASA we’re exploring deep space, but we’re anchored right here on Earth, where we’re creating jobs and fueling innovation, technology development and growth, recognizing that it all depends on American ingenuity and innovation.
With the recent passage of the FY2016 federal budget and our selection of Robert Behnken, Sunita Williams, Eric Boe and Douglas Hurley to be the first NASA astronauts to train to fly to space on commercial crew vehicles, we are close to returning human launches to American soil and ending our sole reliance on the Russians to get into space.
In addition, the commercial crew spacecraft will enable us to add a seventh crew member to the normal Space Station crew complement, effectively doubling the amount of crew time available to conduct research off Earth for Earth. The additional research (and crew supplies) will be delivered during cargo resupply missions.
A NEW MILESTONE
Despite critics who may have said this was a pipe dream just five short years ago, we continue to transform the way NASA does business and as a result, today we’re able to mark another significant milestone that will carry President Obama’s vision further into the future.
This afternoon, our ISS team in Houston will announce that NASA is making its new award for commercial space cargo delivery to the ISS.
This is a big deal, because our commercial resupply missions enable NASA and our private industry and other government agency partners to continue the extensive, ongoing scientific research aboard the Space Station.
President Obama extended the life of the International Space Station through at least 2024 (with the support of Congress) and our commercial cargo providers ensure cargo resupply missions continue, enabling us to keep using the station as our springboard to the rest of the solar system and a test bed for human health in space. Today’s selection builds on our initial resupply partnerships. It will ensure that NASA maintains the capability and flexibility to operate the ISS and conduct the vital research of a unique National Lab through resupply services launching from the United States.
As President Obama said, “in fulfilling this task, we will not only extend humanity’s reach in space — we will strengthen America’s leadership here on Earth.” Our investment in commercial space is creating jobs and it’s bringing us closer to sending American astronauts to Mars. Competition, innovation and technology – it’s the American way. It’s helping us to Launch America.
This entry was posted in Uncategorized on January 14, 2016 by Administrator Charles Bolden.
NASA's Work to Understand Climate: A Global Perspective
Posted on December 4, 2015 at 3:40 pm by Administrator Charles Bolden. NASA is uniquely positioned to study our home planet, and Earth observation has been at the core of the agency’s work since our founding. In addition to a fleet of amazing satellites that we and our international partners use to study our planet in a range of wavelengths, and across the spectrum of planetary features from oceans to atmosphere and ground cover, the International Space Station is also rapidly becoming a significant platform to study Earth.
Our work has global implications. This week, a small delegation of NASA leaders has been participating with a larger U.S. delegation at the 21st session of the U.N. Framework Convention on Climate Change (UNFCCC) Conference of Parties, also known as COP-21. COP-21 will bring nearly 200 nations together to reach an agreement on limiting climate change.
Global climate change, driven by rising levels of carbon dioxide and other greenhouse gases in the atmosphere, represents a fundamental challenge to the U.S. and the world. It is the challenge of our generation. While NASA has no formal role in the COP-21 climate policy talks, the agency is hard at work providing the nation and the world the best information possible about how Earth is changing. Regardless of what world leaders decide in Paris, our job is to build an understanding of the whole planet now and what it will look like in the future.
NASA’s comprehensive study of Earth has provided much of the underlying understanding of current trends in the planet’s climate – including definitive measurements of rising sea levels, glacier retreat, ice sheet changes and the decline in the volume of the Arctic sea ice cap. Our satellites have provided global, long-term views of plant life on land and in the ocean. And our supercomputing power is allowing us to better understand how all the parts of the Earth system work together and help us to predict how this could change. We will continue to monitor climate trends and investigate other ways in which the planet is ultimately responding to increasing greenhouse gas levels.
We have discovered more than a thousand planets outside of our solar system, but none yet match Earth's complexity. That's one reason we have more satellites orbiting Earth than any other planet. We made a significant expansion of the Earth-observing fleet in 2014 and 2015, launching missions that are making unprecedented measurements of rainfall and snow (Global Precipitation Measurement), carbon dioxide in the atmosphere (Orbiting Carbon Observatory-2), and soil moisture (Soil Moisture Active Passive). Soon, with the help of NOAA, the French Space Agency CNES, the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and SpaceX, we will launch the Jason-3 mission to continue building on the vital, two-decade record of how much and where global sea level is changing.
The view from space is incredible – seeing our planet from orbit is one of the highlights of my life – but sometimes we need to get in a little closer. So in 2015 and throughout 2016, NASA is sending scientists on expeditions to all corners of the planet – by plane, by ship and even by foot – to get an on-the-ground look to help answer some important science questions. How are warming ocean waters melting Greenland glaciers and adding to sea level rise? How are the world's coral reefs responding to changes in the ocean? What will rapidly warming temperatures in the Arctic mean for the greenhouse gases stored in forests and permafrost? Our scientists are putting together multi-year campaigns that will complement our space-based perspective. Consider it planetary exploration right here at home.
Global meetings like COP-21 are important for discussion and policymaking, and NASA will continue the day-to-day work of monitoring our Earth observation satellites and making their wealth of data available to people across the globe. There's no more important planet for us to understand.
This entry was posted in Uncategorized on December 4, 2015 by Administrator Charles Bolden.
President Obama Meets With Space Pioneers
Posted on October 21, 2015 at 10:38 am by Administrator Charles Bolden. Monday, the stars were out at the White House — literally — as more than 100 students joined President Obama, twelve astronauts, scientists, engineers, teachers, and space enthusiasts — along with Americans participating virtually from more than 80 national parks, observatories, schools, museums, and astronomy clubs across our country — for White House Astronomy Night.
President Barack Obama greets NASA Commercial Crew astronauts: Robert Behnken, Eric Boe, Douglas Hurley and Sunita Williams, NASA Administrator Charles Bolden, and NASA Deputy Administrator Dava Newman, in the Map Room before White House Astronomy Night on the South Lawn of the White House, Oct. 19, 2015. (Official White House Photo by Pete Souza)
Some of the brightest stars of the night weren't celestial in nature. Rather, they were four space pioneers: astronauts Robert Behnken, Sunita Williams, Eric Boe, and Douglas Hurley.
These distinguished veteran astronauts are blazing a new trail, a trail that will one day land them in the history books. NASA selected these four, who privately met with the President earlier in the evening, to be the first astronauts to train to fly to space on commercial crew carriers.
It’s an important step on our Journey to Mars, and for President Obama’s ambitious plan to once again launch U.S. astronauts into space from U.S. soil and to create good-paying American jobs in the process – 350 American companies across 35 states are working toward this goal.
For as long as I’ve been Administrator, President Obama has made it very clear that returning the launches of American astronauts to American soil is a top priority.
Five years ago, when the President came to the Kennedy Space Center in Cape Canaveral, Florida, to ask NASA to work toward sending American astronauts to Mars in the 2030s, he talked about being inspired as a young boy when his grandfather lifted him on his shoulders so he could cheer on astronauts arriving in Hawaii.
His hope – and, really, all our hope – is that a new generation of young Americans will be inspired by people like Bob, Suni, Eric, and Doug to reach for new heights, both in their own lives and in the life of our nation.
Today’s young people are a part of what I like to call the “space generation.” Those who are younger than 15 have lived every day of their lives in a time when American astronauts are living and working in space aboard the International Space Station.
Our goal is to give them a future where Americans are pushing further into the solar system at the very same time that our Nation strengthens our leadership here at home. President Obama’s commercial crew vision represents a giant leap into this future.
#AskNASA Chat with NASA commercial crew astronauts. Photos from Astronomy Night 2015. Video of the President’s remarks at Astronomy Night.
This entry was posted in Uncategorized and tagged Astronomy Night, Bolden, Commercial Crew, Obama on October 21, 2015 by Administrator Charles Bolden.
Mars: A Journey We Will Take Together
Posted on October 13, 2015 at 8:33 am by Lauren Worley. Nearly everywhere I travel, I meet people who are excited to learn more about NASA's Journey to Mars and NASA's plan, timetable and vision for getting there. This past week, we released a detailed outline of our plan – a clear, affordable, sustainable roadmap for sending our astronauts to Mars in the 2030s.
It’s called “NASA’s Journey to Mars: Pioneering Next Steps in Space Exploration” and I hope you’ll take a moment to give it a look, here.
A Journey such as this is something that no one person, crew, or Agency can undertake alone. As I like to tell the young people with whom I meet, it will take not only astronauts, scientists and engineers, but also physicists, physicians, programmers, poets, teachers, designers, human capital professionals, entrepreneurs and parents who talk to their kids and get them excited about space. It will take folks working both in and out of government.
A mission of this magnitude is made stronger with international partnership – the sort of spirit and cooperation that is demonstrated so vividly by the tens of thousands of people across 15 countries who have been involved in the development and operation of the International Space Station.
This is the message I plan to share with our friends and partners next week at the International Astronautical Congress (IAC) in Jerusalem.
Yesterday I joined the leaders of space agencies from around the world to talk about NASA’s Journey to Mars and the partnerships and cooperation that help make humanity’s common dreams a reality.
Tuesday, I’ll join leaders of the Israeli Space Agency to sign a framework agreement to continue ongoing cooperation. It extends our decades-long relationship working together in Earth science, discoveries in space and new technologies.
The late Israeli astronaut Ilan Ramon – who grew up about 50 miles from where we’ll be meeting – commented that “There is no better place to emphasize the unity of people in the world than flying in space. We are all the same people, we are all human beings, and I believe that most of us, almost all of us, are good people.”
Having been blessed with the opportunity to see the Earth from space with my own eyes, I cannot agree more with this sentiment.
NASA’s Journey to Mars is ongoing right now — from our Space Launch System rocket and Orion spacecraft to new propulsion and habitation systems – and our partnerships across sectors, across states and across the world make it stronger.
This entry was posted in Uncategorized on October 13, 2015 by Lauren Worley.
Supporting the People of South Carolina
Posted on October 8, 2015 at 10:10 am by Karen Northon. The hearts of the entire NASA family go out to our friends, family, colleagues and countrymen and women in South Carolina. While the people of my home state have seen our share of tough times (including severe weather events), I cannot recall, in all my years growing up in the Palmetto State, rains and flooding as devastating as what has been going on this week.
As a child of Columbia, I can personally attest to the fact that South Carolinians are resilient. As the people of the Palmetto State turn to the tough task of recovery and rebuilding, we hope that they will know that NASA is with them every step of the way – and we have been since the storm began.
From the time the rain began to fall, our assets in space were watching it and our scientists were harnessing these unique capabilities day after day for weather forecasters and the emergency agencies dealing with the flooding and other impacts of the storm.
NASA provided regular updates on the amount of rain falling across the region using data from the Global Precipitation Measurement (GPM) mission. Data from the GPM Core Observatory that we launched with the Japan Aerospace Exploration Agency (JAXA) last year are combined with rainfall estimates from a constellation of international satellites to provide rainfall totals every three hours. These data not only confirmed the record-breaking rainfall totals in the Carolinas, they helped forecast the extent of flooding in the region.
Rainfall totals over the U.S. Southeast measured from space by the NASA/JAXA Global Precipitation Measurement mission aided weather forecasters and emergency agencies responding to extensive flooding in South Carolina. Image credit: SSAI/NASA/JAXA, Hal Pierce
NASA provided the National Weather Service with detailed information about how water-saturated the ground was across the U.S. Southeast from the heavy rains – a key factor in forecasting flood conditions. Data from GPM and another NASA satellite, the Soil Moisture Active Passive (SMAP) mission, were combined in the NASA Land Information System model to produce experimental soil moisture estimates as a new piece of information for short-term flood forecasting.
Maps of the location and severity of local flooding produced by a NASA-funded experimental modeling system at the University of Maryland were provided to the Federal Emergency Management Agency (FEMA) to help identify hard-hit areas across South Carolina. The system, fine-tuned with over a decade of previous NASA satellite precipitation data, used GPM data to estimate the intensity and location of floods every three hours.
It will be a while before South Carolina recovers from the enormous rainfall and flooding. The loss of life and property is a heartbreaking outcome of this disaster that will take more than time to heal. I want everyone in South Carolina and other parts of the world threatened by natural disasters to know that NASA is dedicated to using our scientific ingenuity and innovative satellite resources to help inform response and recovery efforts on the ground.
There are some who have suggested our country and our agency ought to be doing less when it comes to Earth Science. When tragedies like these occur, I believe it's a reminder that we ought to be doing more. As we make advances in studying Earth's climate, weather, oceans, ice caps, and land cover, that long-term effort of scientific discovery also yields benefits in improving our ability to respond to and recover from natural disasters.
Today, Americans everywhere are thinking about our brothers and sisters in South Carolina. We know that the Palmetto State will recover stronger, just like we always have.
This entry was posted in Uncategorized on October 8, 2015 by Karen Northon.
UK iOS 7 Usage Rates Higher than North America, Australia Slightly Behind
Chitika Insights previously found that iOS 7 users generate the vast majority of Web traffic from North American iPhones (89.7%) and iPads (84.8%). An examination of iOS Web traffic within the UK and Australia reveals many similarities and a few differences in iOS version distribution between those regions and North America.
To quantify this study, Chitika Insights analyzed millions of UK and Australian iOS-based online ad impressions generated within the Chitika Ad Network from May 22 through May 28, 2014, the same time period utilized for a previous study focused on North American iOS users. The results were then divided into iOS version distributions for iPhone and iPad users respectively.
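The arithmetic behind these distributions is straightforward aggregation. As a rough sketch only (Python, with invented field names and a made-up sample rather than Chitika's actual pipeline or data), each reported share is simply a version's fraction of one region-and-device impression pool:

from collections import Counter

def version_shares(impressions):
    # Percent of impressions per iOS version for one region/device slice.
    counts = Counter(imp["ios_version"] for imp in impressions)
    total = sum(counts.values())
    return {ver: round(100.0 * n / total, 1) for ver, n in counts.items()}

# Hypothetical records: one per observed ad impression.
sample = ([{"ios_version": "7"}] * 897
          + [{"ios_version": "6"}] * 80
          + [{"ios_version": "5 or older"}] * 23)
print(version_shares(sample))  # {'7': 89.7, '6': 8.0, '5 or older': 2.3}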
As seen above, the UK user base is on par with North America’s in terms of adoption of iOS 7 on iPhones. In both regions, 89.7% of iPhone Web traffic is generated by devices running iOS 7. In Australia, this figure is slightly lower at 86.3%.
Looking at iOS 6 usage rates, a greater share of Australian Web traffic (11.2%) is generated by iPhones running some version of iOS 6 than is observed in North America and the UK. The slightly greater shares for older iOS versions in Australia may be partially due to the much-publicized issues with Apple Maps in the country following the service's debut in 2012. While Apple has addressed many of these problems in the following months and years, it's possible that a small percentage of Australian users are still wary of upgrading to newer OS versions for this reason.
When it comes to iOS version distribution for iPad users, the UK adoption rate for iOS 7 (87.0%) is higher than what is exhibited from the North American (84.8%) and Australian (83.3%) user bases. Much like the iPhone figures, iOS 6 drives a higher share of iPad Web traffic in Australia as compared to North America and the UK.
Notably, iOS 5 or older iOS versions are better represented amongst U.S. and Canadian iPad Web traffic (7.3%) as compared to the UK or Australia, where the combined usage shares for those operating systems are 5.2% and 5.8%, respectively. While Apple has never broken out iPad sales by country, the original iPad, which is not compatible with iOS 6 or 7, was released in the U.S. a full month before it reached the UK or Australia. This likely means a greater number of those units were sold in North America, and are still in use by a comparatively larger portion of the user base considering the longer lifespan of tablets as compared to smartphones.
Overall, the high iOS 7 usage rates between all studied geographies and device types point to Apple’s iOS update strategy paying dividends from an adoption standpoint across multiple regions. Additionally, app and mobile Web developers can take some solace in the noticeable similarities in iOS version distribution between North America and the UK – particularly in regards to iPhones. Regarding Australia, its differences from the other two studied regions are slight and may change, but future studies should provide a better indication as to whether these higher rates of older iOS version usage are an ongoing characteristic of the Australian iOS user base.
Taxi! to the programming future
By NMA Staff
Entranet's TAXI! is the world's first digital TV programme to combine segments of linear entertainment, video-on-demand and interactive shopping.
Entranet, a London-based online commerce developer, is showcasing a fully interactive TV programme. The show, TAXI!, is touted to be the first digital TV programme in the world to combine segments of linear entertainment, video-on-demand and interactive shopping. E-commerce partners in the interactive show include financial services company Goldfish, Daimler Chrysler Smartcar, and Warner Bros Cinema.

"This style of programming is a natural progression with the advent of technologies like TiVo and VoD, where people can take control of their viewing," says Paul Hastings, head of production at Entranet. "Advertising will become marginalized and more targeted in the future. TAXI! is an example of the way consumers can take control - choose how they want to enjoy a programme, and also choose commerce propositions along the way."

The format of the show is an interactive city guide. The viewer is taken on a taxi ride with a Spitting Image-style cabbie, whose mood the viewer can choose, selecting from happy, grumpy or rude dispositions. "This is an example of the way interactivity is built into the show from the beginning, not retrofitted," says Hastings.

The viewer then gets to choose which area of the city - in this example London - they want to explore, selecting from shopping, nightlife, arts, sights, or playing a competition about the city. For example, selecting the nightlife option takes the viewer into a Holiday-style featurette showcasing London's theatreland, cinemas, bars and clubs. Incorporated into this are options for further mini-documentaries, about Andrew Lloyd-Webber for instance, and also commerce propositions and special offers, for example a 2-for-1 ticket deal at Warner Bros Cinema. "We have made the commerce less intrusive by offering sales for things when they occur naturally in the show," says Hastings.

An iTV kiosk can be called up at the end of the nightlife section, in which viewers can buy tickets for West End musicals and other attractions using their remote control. Product placement is also used - when a BMW appears in shot, viewers can follow the call-to-action, click on their remote and enter a prize draw for the car.

TAXI! is not currently being carried by any of the digital TV platforms, although Entranet is currently in discussion with Sky and ntl. The programme has been transmitted through a Sky digital set-top box for demonstration purposes, but the current problem is the amount of bandwidth the show requires for its different options and programme strands. Through Sky Digital, the show requires 8-10 different channels. TAXI! is currently available on DVD and will soon be offered through a broadband Web site.

The show is really being used as a calling card for Entranet's interactive programming, admits Hastings. "The model of broadcasting is changing so quickly, we have to keep pace and anticipate the future. TAXI! is the culmination of two years' worth of learning," he says. TAXI! is underpinned by OpenTV middleware, and requires around 11Mb per stream for broadcast-quality delivery.
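The bandwidth objection is easy to quantify. A back-of-envelope check (Python), assuming the "11Mb per stream" figure means roughly 11 megabits per second, which the article does not state explicitly:

streams_low, streams_high = 8, 10   # channels TAXI! needs through Sky Digital
mbps_per_stream = 11                # Entranet's broadcast-quality figure
print(streams_low * mbps_per_stream, "to", streams_high * mbps_per_stream, "Mbit/s")
# -> 88 to 110 Mbit/s for a single programme

On that reading, one interactive show would occupy many times the bandwidth of a conventional digital channel, which explains the platforms' hesitation.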
Published 8 February, 2001 by NMA Staff
Payoff from Idling Coal Plants Overestimated, Researchers Say
Four researchers from Carnegie Mellon University’s Green Design Institute discuss their more conservative estimates of greenhouse gas emission reductions in two papers this month. Oct 10, 2012
A quartet of researchers at Carnegie Mellon University's Green Design Institute conclude in two new papers that ignoring uncertainty in any coal-to-natural-gas transition for generating electricity can make a substantial difference in estimating the net environmental effect of the change. Researchers Aranya Venkatesh, W. Michael Griffin, H. Scott Matthews, and Paulina Jaramillo found that life cycle assessment (LCA, the study of impacts that occur from cradle to grave) can be useful in these analyses.
Their papers appear in the October issue of Environmental Research Letters and in Environmental Science and Technology, according to a news release posted by the university.
While many studies simply examine different emissions from coal and natural gas plants, suggesting roughly a 50 percent reduction in greenhouse gas emissions, these researchers conclude the reduction is likely to be 7-15 percent instead because of changes in grid operation in response to price changes in natural gas.
"As natural gas prices go down, it becomes cheaper to operate natural gas plants, and some of these plants start being operated more often. This results in some coal plants being operated less often. However, given certain technical constraints related to the operation of existing power plants, the displacement of coal-based generation is limited," said Jaramillo, an assistant research professor in CMU's Department of Engineering and Public Policy. To cut emissions by 50 percent using natural gas would require a significant retirement of coal plants and building new natural gas plants.
The second paper examined the uncertainty in emissions that could be expected from retiring coal-fired power plants. While it suggests reductions in greenhouse gas emissions from limited retirement of coal plants will be minimal, emissions of sulfur and nitrogen oxides would be substantial, up to 25 percent, in some areas. (The paper focuses on up to 7 gigawatts of coal capacity being retired without building new power plants to replace them.) "We found that if expected coal plants retire, that alone will not bring us dramatic reductions in climate change inducing greenhouse gas emissions," said Matthews, a professor in CMU's Civil and Environmental Engineering and EPP departments.
"In addition, the benefits achieved from reducing emissions of sulfur and nitrogen oxides, while substantial in aggregate measures, will not be evenly distributed; and while some counties will see reductions in the emissions of these criteria air pollutants, some counties will see increases," Jaramillo said. | 科技 |
Slow it down: The sail on NanoSail-D trims the speed and brings down the satellite sooner rather than later and cuts down on space junk. Courtesy NASA/Marshall Space Flight Center
Share: Further nanosatellite adventures in the cosmos—with SCU students at Mission Control.
Launching a 12-pound nanosatellite into orbit is a little bit like becoming the caretaker for a newborn baby. Suddenly you do things on its schedule, not yours.
In the weeks after the O/OREOS satellite was detached from an Air Force rocket last November, students with the SCU School of Engineering Robotics Systems Laboratory had to be ready any time the satellite streaked overhead. Be it at 3 a.m. or 3 p.m., they were at Mission Control on the third floor of Bannan Engineering, furiously sending commands and checking vital statistics before the tiny vessel disappeared over the horizon, out of reach till the next pass. “You never know how things are going to act in space,” says Associate Professor Chris Kitts, director of the robotics lab. SCU is the only university in the country to let students do all mission operations and ground development for NASA satellites.
Waking for satellites means a wearying schedule, doctoral student Michael Neumann '03 says. But like any guardian, he found it a relief to see things are going well 400 miles above. The satellite, whose name is an acronym for Organism/Organic Exposure to Orbital Stresses, carried astrobiology experiments testing how microorganisms found in soil and salt ponds respond to solar ultraviolet radiation and other rigors of space. Results could help scientists with questions about the origin, evolution, and durability of life.
The Small Spacecraft Division
The flight was a joint effort between NASA/Ames' Small Spacecraft Division, which built the 12-pound vessel, and Santa Clara, which managed it. Space missions are nothing new for Santa Clara's Robotics Systems Laboratory, a magnet for undergraduate and graduate students eager for real-world, high-tech challenges in environments as diverse as deep lakes and outer space. For more than 10 years, engineering students involved with the lab have been designing, building, and controlling nanosatellites that are often as small as a loaf of bread. The lab has been working with NASA since 2004.
A cosmic tail
In addition to the O/OREOS satellite, the Minotaur rocket that launched last November from Kodiak Island, Alaska, contained three more satellites with SCU connections. One of them, NanoSail-D, reported to SCU’s Mission Control, testing a novel way to force satellites into de-orbit—an important goal given the growing amounts of junk orbiting in space endangering other satellites. After reaching space, the NanoSail unfurled a 10-square-meter sheet of fabric no thicker than single-ply tissue to slow its speed.
The rocket also contained two satellites operated by the University of Texas at Austin, using flight computers provided by the Santa Clara team to guide the satellites in formation flying. O/OREOS, though, was the satellite most entwined with SCU. In addition to operating Mission Control for months, students provided the satellite with its own way of de-orbiting.
A satellite of O/OREOS’ size, altitude, and density would normally remain in space for more than 60 years before it burned up in Earth’s atmosphere, which is twice as long as NASA guidelines allow. So graduate student Eric Stackpole M.S. ’11 devised a spring-loaded, box-shaped tail that popped out of the satellite after O/OREOS reached orbit, increasing its surface area by more than 60 percent. The increased drag should gradually slow it down, hastening re-entry time for the satellite to less than 25 years. Stackpole’s device marked the first time NASA has used a propellantless de-orbiting mechanism on a scientific satellite.
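The article's own figures imply how hard such a tail must work. To first order, a satellite's drag-limited lifetime at fixed mass scales inversely with the average area it presents to the oncoming air, so a rough scaling check (Python; this is not NASA's actual de-orbit analysis) runs:

t_without_tail = 60.0   # years in orbit, per the article
t_required     = 25.0   # NASA guideline ceiling
print(t_without_tail / t_required)   # -> 2.4

Meeting the guideline therefore requires the deployed tail to raise the craft's average projected drag area by a factor of roughly 2.4 or more. A boxy tail can manage that even while adding "only" about 60 percent to total surface area, because what matters for drag is the area projected along the direction of travel, not the total skin area.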
The next project will give the lab's undergraduates a chance to show off their design skills. In August 2012, NASA will launch a nanosatellite studying E. coli in space. SCU students are designing a low-power, low-cost mechanical way to point the satellite in a particular direction, necessary for communicating with Mission Control.
“There is no other school that does mission operations for NASA the way we do,” says lab director Kitts, who started in satellite operations as an Air Force officer. “It’s really a student-centered operation.”
SCU is the only university in the country to let students do all mission operations and ground development for NASA satellites, he says. Students developed the Mission Control center itself, and they wrote the software and operating procedures.
EA Access Now Available For Xbox One! Pay Just $4.99/Month For Free Games, Discounts Etc.
Got an Xbox One or planning to get one? Then subscribe to EA Access for just $4.99 a month to save on game purchases, play select games for free, and much more!
Subscribe to EA Access @ EA.com here
First of all, this is only valid for Xbox One. Second of all, if you have or are planning to get an Xbox One, then $4.99 is hardly a bank-breaking fee, especially since you will end up saving so much more money in the long term. An average game usually runs upwards of $70, but with EA Access you gain access to select games through the Vault, and there are other benefits as well.
In fact, here are the three main advantages of subscribing to EA Access:
Play For Free
Once subscribed, you will gain access to the Vault, which lets you play select games for free. And unlike PS Plus' rather mediocre line-up of mostly indie games, EA Access is setting the bar much higher by promising only the best games, like Madden NFL 25, FIFA 14, Battlefield 4 and Peggle 2, with many others to be added later on as well.
Play For Less
EA Access members can also save 10% on all digital purchases of games. Sadly, this doesn't apply to physical copies of games, but with the way everything is transferring to cloud technology, I doubt many of you will mind, especially if you save money. Plus, most games tend to be cheaper digitally, so you might even be looking at saving even more than just 10% in comparison to the retail price of physical copies.
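For what it's worth, the discount alone has a simple break-even point; a quick sketch (Python) using the prices quoted in this post:

monthly_fee = 4.99
annual_fee = 12 * monthly_fee          # about $59.88 per year
discount = 0.10
print(round(annual_fee / discount, 2)) # -> 598.8

So you'd need to spend roughly $599 a year on digital EA games before the 10% discount by itself covers the subscription; in practice, playing even one or two Vault games you'd otherwise have bought tips the math in your favour much sooner.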
Play First
With this benefit, you are given the advantage of downloading and playing games five days before they are released! This is definitely a major perk, especially for full-time gamers, but I'm sure any gamer would appreciate the extra time to learn the ropes of a new game and take a shot at setting high scores. Plus, it would give a major competitive edge over friends who would be playing the game on a PlayStation 4! ;) The only drawback is that you are only given a limited amount of time to try the game, but it's still a neat feature to have nonetheless, especially if you have been eagerly awaiting the release of a specific game (e.g. NHL 15).
Moosers, do you own an Xbox One? Will you be subscribing to EA Access? Let us know in the comments section!
(Expiry: Never)
How DNA finds its match
IMAGE: This graphic shows DNA strung between two beads, which are held in position by laser. Credit: Stephen Kowalczykowski, UC Davis
It's been more than 50 years since James Watson and Francis Crick showed that DNA is a double helix of two strands that complement each other. But how does a short piece of DNA find its match, out of the millions of 'letters' in even a small genome? New work by researchers at the University of California, Davis, handling and observing single molecules of DNA, shows how it's done. The results are published online Feb. 8 by the journal Nature.
Defects in DNA repair and copying are strongly linked to cancer, birth defects and other problems.
"This is a real breakthrough," said Stephen Kowalczykowski, professor of microbiology and co-author on the paper with postdoctoral researcher Anthony Forget. "This is an issue that has been outstanding in the field for more than 30 years."
"It's the solution of one of the greatest needle-in-the-haystack problems in biology," said Professor Wolf-Dietrich Heyer, a UC Davis molecular biologist who also studies DNA repair but was not involved in this research.
"How can one double-stranded DNA break find its match in an entire genome, five billion base pairs in humans? Now we know the fundamental mechanism," Heyer said.
Forget and Kowalczykowski used technology developed in Kowalczykowski's lab over the past 20 years to trap lengths of DNA and watch, in real time, as the proteins involved in copying and repairing DNA do their work.
The first step in repairing a damaged piece of normally double-stranded DNA by a process called recombination is to strip it to a single strand. That single-stranded DNA then looks for a complementary sequence within an intact chromosome to use as a template to guide the repair.
How does a short, single-stranded piece of DNA find its exact matching partner out of perhaps millions of possibilities? In the 1970s, scientists discovered a protein, called RecA in bacteria and Rad51 in humans, which binds to the single-stranded DNA, forms an extensive filament and guides it to the right place in the chromosome.
"This is a very important aspect of chromosome maintenance," Kowalczykowski said. "Without it, your genome will start to scramble very quickly."
Defects in some proteins associated with DNA repair are associated with an increased risk of cancer - for example BRCA2, the breast cancer gene. But animals with defects in Rad51 don't even survive as embryos.
But how this search for DNA sequence compatibility works has been unclear. The RecA/DNA complex has to bump into and sample different stretches of DNA until it finds the right one, but the number of sequences to search is huge - it's like finding the proverbial needle in the haystack.
One model would be for RecA and its attached single-stranded DNA to slide along the intact duplex DNA until it gets to the right place. Or, if the DNA is in a coiled up form like a bowl of spaghetti, the RecA/DNA filament might be able to touch several different stretches of DNA simultaneously and thus shorten the time for the search.
Forget set out to test these ideas by stretching single molecules of duplex DNA between two tiny beads to make a dumbbell shape. Both beads were held in place by laser beams, but one of the beads could be steered around using the laser. Then he added the RecA assembled on single-stranded DNA to the DNA-dumbbells and watched to see how well they attached to the target DNA when it was stretched out, or relaxed and allowed to coil up.
"These are very complicated experiments to perform," Kowalczykowski said.
They found that the RecA complex attached most efficiently to the target DNA when it was in a relaxed, coiled form.
"The most efficient homology search is when the local DNA density is higher and the RecA-DNA filament can contact more areas of duplex DNA at the same time," Kowalczykowski said. "RecA doesn't slide along the DNA looking for a partner."
Consider a bowl of spaghetti, Kowalczykowski said. If you were looking for one tiny region on just one piece of spaghetti in the bowl, you could grab several strands at once and quickly examine each. But if the spaghetti were stretched out in one long piece, you could only touch one part of one piece at a time.
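A toy random-sampling model (Python) makes the spaghetti intuition concrete. This is an illustration of the scaling only, not the paper's analysis: if each encounter lets the filament test k segments at once, the expected number of encounters needed to find the one matching site among N drops roughly k-fold:

import random

def mean_touches(N, k, trials=2000):
    # Average encounters until a probe that samples k random sites per
    # encounter hits the single matching site (site 0) among N possibilities.
    total = 0
    for _ in range(trials):
        touches = 0
        while True:
            touches += 1
            if any(random.randrange(N) == 0 for _ in range(k)):
                break
        total += touches
    return total / trials

N = 1000
print(mean_touches(N, k=1))    # ~1000: stretched DNA, one segment per touch
print(mean_touches(N, k=10))   # ~100: coiled DNA, several segments per touch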
Kowalczykowski began working on the system for studying single molecules of DNA in 1991 with the late Ron Baskin, professor of molecular and cellular biology at UC Davis. In 2001, they demonstrated the technique by filming an enzyme called a helicase at work in real time unwinding the double helix of DNA. Since then, they have used the method to get new insights into the complex of proteins that copy and repair DNA.
Kowalczykowski's lab was also one of two UC Davis groups to purify the protein made by the BRCA2 gene, strongly associated with breast cancer. BRCA2, it turns out, loads Rad51 - the human equivalent of RecA in bacteria - onto DNA to search the human DNA for the correct region to use for repair.
The work was funded by the National Institutes of Health and the American Cancer Society.
Study of Andromeda's stellar disk indicates more violent history than Milky Way
Survey data reveal a more disordered stellar population in our galactic neighbor than in our own galaxy, suggesting more recent bombardment of Andromeda by smaller galaxies
IMAGE: This Hubble image of a crowded star field in the disk of the Andromeda galaxy shows that stars of different ages can be distinguished from one another on the basis of... Credit: Ben Williams, PHAT collaboration
A detailed study of the motions of different stellar populations in the disk of the Andromeda galaxy has found striking differences from our own Milky Way, suggesting a more violent history of mergers with smaller galaxies in Andromeda's recent past. The structure and internal motions of the stellar disk of a spiral galaxy hold important keys to understanding the galaxy's formation history. The Andromeda galaxy, also called M31, is the closest spiral galaxy to the Milky Way and the largest in the local group of galaxies. "In the Andromeda galaxy we have the unique combination of a global yet detailed view of a galaxy similar to our own. We have lots of detail in our own Milky Way, but not the global, external perspective," said Puragra Guhathakurta, professor of astronomy and astrophysics at the University of California, Santa Cruz. The new study, led by UC Santa Cruz graduate student Claire Dorman and Guhathakurta, combined data from two large surveys of stars in Andromeda, one conducted at the W. M. Keck Observatory in Hawaii and the other using the Hubble Space Telescope. The Spectroscopic and Photometric Landscape of Andromeda's Stellar Halo (SPLASH) survey has used the Keck/DEIMOS multi-object spectrograph to measure radial velocities of more than 10,000 individual bright stars in Andromeda. The recently completed Panchromatic Hubble Andromeda Treasury (PHAT) survey provides high-resolution imaging at six different wavelengths for more than half of these stars.
"The high resolution of the Hubble images allows us to separate stars from one another in the crowded disk of Andromeda, and the wide wavelength coverage allows us to subdivide the stars into sub-groups according to their age," said Dorman, who is presenting her findings on Thursday, January 8, at the winter meeting of the American Astronomical Society in Seattle. The study presents the velocity dispersion of young, intermediate-age, and old stars in the disk of Andromeda, the first such measurement in another galaxy.
Dorman's analysis revealed a clear trend related to stellar age, with the youngest stars showing relatively ordered rotational motion around the center of the Andromeda galaxy, while older stars displayed much more disordered motion. Stars in a "well ordered" population are all moving coherently, with nearly the same velocity, whereas stars in a disordered population have a wider range of velocities, implying a greater spatial dispersion.
"If you could look at the disk edge on, the stars in the well-ordered, coherent population would lie in a very thin plane, whereas the stars in the disordered population would form a much puffier layer," Dorman explained.
The researchers considered different scenarios of galactic disk formation and evolution that could account for their observations. One scenario involves the gradual disturbance of a well-ordered disk of stars as a result of mergers with small satellite galaxies. Previous studies have found evidence of such mergers in tidal streams of stars in the extended halo of Andromeda, which appear to be remnants of cannibalized dwarf galaxies. Stars from those galaxies can also accrete onto the disk, but accretion alone cannot account for the observed increase in velocity dispersion with stellar age, Dorman said.
An alternate scenario involves the formation of the stellar disk from an initially thick, clumpy disk of gas that gradually settled. The oldest stars would then have formed while the gas disk was still in a puffed up and disordered configuration. Over time, the gas disk would have settled into a thinner configuration with more ordered motion, and the youngest stars would then have formed with the disk in a more ordered configuration. According to Dorman, a combination of these mechanisms could account for the team's observations. "Our findings should motivate theorists to carry out more detailed computer simulations of these scenarios," she said.
The comparison to the Milky Way revealed substantial differences suggesting that Andromeda has had a more violent accretion history in the recent past. "Even the most well ordered Andromeda stars are not as well ordered as the stars in the Milky Way's disk," Dorman said.
In the currently favored "Lambda Cold Dark Matter" paradigm of structure formation in the universe, large galaxies such as Andromeda and the Milky Way are thought to have grown by cannibalizing smaller satellite galaxies and accreting their stars and gas. Cosmologists predict that 70 percent of disks the size of Andromeda's and the Milky Way's should have interacted with at least one sizable satellite in the last 8 billion years. The Milky Way's disk is much too orderly for that to have happened, whereas Andromeda's disk fits the prediction much better.
"In this context, the motion of the stars in Andromeda's disk is more normal, and the Milky Way may simply be an outlier with an unusually quiescent accretion history," Guhathakurta said. ###
Other researchers who collaborated with Dorman and Guhathakurta on this study include Anil Seth at the University of Utah; Daniel Weisz, Julianne Dalcanton, Alexia Lewis, and Benjamin Williams at the University of Washington; Karoline Gilbert at the Space Telescope Science Institute; Evan Skillman at the University of Minnesota; Eric Bell at the University of Michigan; and Katherine Hamren and Elisa Toloba at UC Santa Cruz. This research was funded by the National Science Foundation and NASA.
James Van Allen – Space Pioneer
Published 10 Jun 2016 by Kathy Chambers
(Image: James Van Allen with a rockoon.)
James Van Allen's space instrumentation innovations and his advocacy for Earth satellite planetary missions ensured his place among the early leaders of space exploration. After World War II, Van Allen began his atmospheric research at the Johns Hopkins University Applied Physics Laboratory and Brookhaven National Laboratory. He went on to become the Regent Distinguished Professor and head of the University of Iowa (UI) Department of Physics and Astronomy. Drawing on his many talents, Van Allen made tremendous contributions to the field of planetary science throughout his career. Van Allen used V-2 and Aerobee rockets to conduct high-altitude experiments, but the lift was limited. He devised a 'rockoon,' a rocket lifted by hot air balloons into the upper atmosphere, where it was separated from the balloons and ignited to conduct cosmic-ray experiments. The rockoon, shown with Van Allen in the image above, achieved a higher altitude at a lower cost than ground-launched rockets. This research helped determine that energetic charged particles from the magnetosphere are a prime driver of auroras. Read more...
Thorium – An Element with Promise
Published 09 May 2016 by Kathy Chambers
Thorium (232Th), the chemical element named after the Norse god of thunder, has a history that is as colorful as its namesake. Although discovered in 1828 by the Swedish chemist Jöns Jakob Berzelius, thorium had no known useful applications until 1885, when it was used in gas mantles to light up the streets across Europe and North America. Then in 1898, physicist Marie Curie and chemist Gerhard Schmidt observed thorium to be radioactive, and subsequent applications for thorium declined due to safety and environmental concerns. The scientific community would later find that the element thorium held promise for the planet to have clean, safe, cheap, and plentiful nuclear power as an alternative to plutonium-based nuclear power plants. Read more...
Climate Change Research 24/7
Published 11 Apr 2016 by Kathy Chambers
Image credit: ARM Program
One of the research programs managed by the Department of Energy (DOE) is the Atmospheric Radiation Measurement (ARM) Program, created in 1989 to address scientific uncertainties related to global climate change. ARM's Climate Research Facility, a DOE scientific user facility, provides the world's most comprehensive 24/7 observational capabilities to obtain atmospheric data specifically for climate change research. The ARM facility includes fixed, mobile, and aerial sites that gather continuous measurements used to study the effects and interactions of sunlight, radiant energy, clouds, and aerosols and their impacts on the global climate system. The ARM program serves as a model and a knowledge base for climate change research endeavors across the globe.
Read more...
What is Scientific and Technical Information (STI)?
Published 06 Apr 2016 by Judy Gilmore
Scientific and technical information, or STI: It's in OSTI's name. It's in the language of our most recent statutory authority, section 982 of the Energy Policy Act of 2005: "The Secretary, through the Office of Scientific and Technical Information, shall maintain within the Department publicly available collections of scientific and technical information resulting from research, development, demonstration, and commercial applications supported by the Department." A DOE policy directive, DOE Order 241.1B, entitled "Scientific and Technical Information Management," requires DOE offices, contractors, and grantees "to ensure that STI is appropriately managed as part of the DOE mission to enable the advancement of scientific knowledge and technological innovation." As provided in the directive, OSTI spearheads the DOE Scientific and Technical Information Program (STIP), a collaboration of STI managers and technical information officers from across the DOE complex responsible for identifying, collecting, disseminating, and preserving the results of DOE-funded research and development (R&D). STI is the heart of OSTI and its mission.
The STI that OSTI makes available is produced and published in a variety of media and formats. OSTI disseminates this STI publicly via a suite of web-based searchable databases featuring basic and advanced search capabilities, including semantic search, customized alerts, results displayed in relevance rank, in-document searching, and downloadable search results. SciTech Connect... Read more...
OSTI Helping High Energy Physics Collaboration to Register Datasets
Published 01 Apr 2016 by Sara Studwell
The Department of Energy (DOE) Office of Scientific and Technical Information (OSTI) is working with a researcher in the High Energy Physics (HEP) community to register scientific datasets produced by a domain collaboration, a recent blog post has reported.
OSTI offers a service for registering datasets to help increase access to digital data from DOE-funded scientific research. Through the DOE Data ID Service, OSTI assigns persistent identifiers, known as Digital Object Identifiers (DOIs), to datasets submitted by DOE and its contractor and grantee researchers and registers the DOIs with DataCite to aid in citation, discovery, retrieval, and reuse. OSTI assigns and registers DOIs for datasets for DOE researchers as a free service to enhance the Department of Energy's management of this important resource. | 科技 |
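For readers unfamiliar with dataset DOIs: registration essentially deposits a small citation-metadata record with DataCite and binds it to a persistent identifier that later resolves to the dataset's landing page. A minimal sketch of such a record (a Python dict shaped after DataCite's mandatory fields; the values and the DOI are invented, and this is not OSTI's actual submission format):

record = {
    "identifier": {"identifierType": "DOI", "identifier": "10.xxxx/example-dataset"},
    "creators": [{"creatorName": "Example HEP Collaboration"}],
    "titles": [{"title": "Example detector calibration dataset"}],
    "publisher": "DOE Office of Scientific and Technical Information",
    "publicationYear": "2016",
    "resourceType": {"resourceTypeGeneral": "Dataset"},
}

Once registered, the DOI gives researchers a stable handle for citing the dataset in papers, exactly as they would cite an article.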
Bipolar Bad News for Global Warming Alarmists
Global warming alarmists got some bad news from both poles recently:

One: The Telegraph's Christopher Booker reported yesterday that a research team jointly dispatched by the BBC and the World Wildlife Fund to the North Pole expressly to measure how quickly the Arctic ice sheet is melting ran into just one problem: It found no evidence of melting. In fact, since last March, it seems that the ice sheet has thickened by at least half a meter.

A tip-off that things were not going to turn out as anticipated came when the team, whose express mission was to raise awareness about global climate change ahead of the December confab in Copenhagen, saw one of those polar bears that are supposedly near extinction due to global warming wandering around aimlessly.

Separately, Booker reported that a London employment tribunal ruled that a firm had wrongly dismissed from the position of "head of sustainability" someone who, in his fervent commitment to "climate change," was trying to reduce the company's "carbon footprint". The tribunal chairman David Neath found the company guilty of discriminating against the employee under the 2006 Equality (Religion and Belief) Regulations, because his faith in global warming was a "philosophical belief".

"Recalling how 'eco-psychologists' at the University of the West of England are pressing for 'climate denial' to be classified as a form of 'mental disorder,'" writes Booker, "one doubts whether the same legal protection would be given to those who fail to share" the fired employee's "philosophical beliefs."

Two: A study in the journal Nature calculated that the time span for the Antarctic's melting is not on the scale of a hundred years, as alarmists have been hyperventilating, but thousands. The New York Times' Andrew Revkin, no global warming skeptic, reports:

"Dr. Pollard and Dr. DeConto ran a five-million-year computer simulation of the ice sheet's comings and goings, using data on past actual climate and ocean conditions gleaned from seabed samples (the subject of the other paper) to validate the resulting patterns. The bottom line? In this simulation, the ice sheet does collapse when waters beneath fringing ice shelves warm 7 to 9 degrees Fahrenheit or so, but the process — at its fastest — takes thousands of years. Over all, the pace of sea-level rise from the resulting ice loss doesn't go beyond about 1.5 feet per century, Dr. Pollard said in an interview, a far cry from what was thought possible a couple of decades ago."

So will President Obama please take a deep breath and exhale before committing the U.S. to an energy diet through a cap-and-trade scheme?
Shikha Dalmia is a senior analyst at Reason Foundation.
Does Stephen Hawking believe in God?
In an interview with The Guardian in 2011, Stephen Hawking was dismissive of the view that there is a God. He said he believed that the human brain is like a computer, which dies after its components fail. In Hawking's opinion, there is no afterlife to look forward to.
In 2010, Stephen Hawking published a book, "The Grand Design," which laid out his thoughts on religion. In it Hawking said, "There is no need for a creator to explain the existence of the universe." Hawking was diagnosed with motor neurone disease more than 50 years ago, but he has since become one of the world's most noted physicists.
2016-40/3982/en_head.json.gz/3506 | Apple’s New iPad Costs at Least $316 to Build, IHS iSuppli Teardown Shows
Apple’s new iPad hit store shelves today. That means that along with the lines at the stores and the requisite applause of store employees cheering people who buy them, there were among the many iPad buyers today people who just couldn’t wait to get the gadget torn apart.
The analysts at the market research firm IHS iSuppli, considered by the investment community to be the most reliable of the organizations that conduct teardowns, were among that set. Today, somewhere in Southern California, an iSuppli analyst stood in line at a store and promptly took an iPad to a lab, where it was torn into, initiating the interesting process of estimating what it all cost to build.
Here’s what iSuppli’s team found: First off, there weren’t many changes from the last iPad, in terms of suppliers. “It’s most of the same characters we saw last time around,” analyst Andrew Rassweiler told me today. Wireless chipmakers Qualcomm and Broadcom both reappeared — Qualcomm supplying a baseband processor chip, Broadcom a Bluetooth and Wi-Fi chip, TriQuint Semiconductor suppling some additional wireless parts. STMicroelectronics once again retained its position supplying the gyroscope. Cirrus Logic supplied an audio codec chip. The 16 gigabyte, Wi-Fi-only iPad that sells for $499 costs about $316 to make, or about 63 percent of the device’s retail price. On the upper end, the 4G-ready 64GB model that sells for $829 costs about $409 to make, or about 49 percent of the retail price.
The new cost figures represent an increase of between 21 percent and 25 percent, depending on the model, from the iPad 2, which iSuppli tore down last year.
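The percentages above are straightforward bill-of-materials (BOM) arithmetic. Here is a short, illustrative Python sketch of that math; the prices are the iSuppli estimates quoted in this article, and the implied iPad 2 figure simply applies the quoted 21-25 percent increase in reverse, which the article does not break out per model.

```python
# BOM share of retail price, using iSuppli's estimates quoted above.
models = {
    "16GB Wi-Fi ($499 retail)": (316, 499),
    "64GB 4G ($829 retail)": (409, 829),
}
for name, (bom, retail) in models.items():
    print(f"{name}: BOM ${bom} = {bom / retail:.0%} of retail")

# Reversing the quoted 21-25% increase gives a rough iPad 2 BOM estimate.
low, high = 316 / 1.25, 316 / 1.21
print(f"Implied 16GB iPad 2 BOM: ${low:.0f}-${high:.0f}")
```

Run as written, this prints 63 percent and 49 percent for the new models, and an implied $253-$261 build cost for the 16GB iPad 2, consistent with the increases iSuppli describes.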
So what did they find inside? An expensive Samsung display, for one thing. All those millions of pixels don’t come cheap. ISuppli analyst Andrew Rassweiler estimates that the display, which cost $57 on the iPad 2, has grown in cost to $87 on the latest iPad. Rassweiler says that two other vendors, LG Display and Sharp Electronics, have inked display supply deals with Apple for the latest iPad, but only Samsung is thought to have fully ramped up production. Depending on the vendor, the display may cost as much as $90, he said.
One set of components remained essentially the same as before: Those that drive the touchscreen capabilities. Rassweiler says that three Taiwanese companies, TPK, Wintek and Chi Mei, supply parts related to driving the central interface feature of the new iPad, but he says to expect a major shift in how Apple handles the touch interface on future iPads.
The combined cost of cameras, including the front-facing and back camera, is pegged at $12.35, more than three times the cost of cameras found on the iPad 2, Rassweiler says. But it’s essentially the same setup as that on the iPhone 4, he says. As has been the case with cameras, the identity of the supplier wasn’t easy to determine because they try hard to hide identifying information from the prying eyes of teardown analysts. The candidates, however, include Largan Precision Co., a Taiwanese supplier of camera modules to wireless phone companies, and Omnivision. On the iPhone 4S, a research firm called Chipworks identified the supplier of the CMOS sensor in one of the cameras as having come from Sony.
As with other Apple devices, the main processor chip is an Apple-made A5X processor, one manufactured under contract by Samsung. The estimated cost of that chip is $23, up from $14 on the iPad 2. Another part that’s more expensive than on the last iPad, but also better for a variety of reasons, is the battery. This one is estimated to have cost Apple $32, up from $25 on the iPad 2. But it constitutes a significant upgrade, Rassweiler says, with 70 percent more capacity than before. Apple benefited in part by lower prices in the lithium polymer material used to make the battery, offsetting the cost of adding a vastly improved battery.
ISuppli wasn’t the only outfit conducting teardowns of the iPad today. An enthusiast site called iFixit that encourages consumers to learn how to repair and upgrade their own electronics, flew technicians to Australia to conduct its own teardown analysis. Tagged with: Apple, Broadcom, chips, components, display, IHS ISuppli, iPad, iSuppli, manufacturing, Samsung, semiconductors, teardown | 科技 |
2016-40/3982/en_head.json.gz/3520 | National Aquarium and National Wildlife Federation Join Forces
The National Aquarium and the National Wildlife Federation have joined forces to protect wildlife and water resources for future generations. Approved by unanimous vote at the most recent Board of Directors meeting, the National Aquarium has been selected as the National Wildlife Federation’s Maryland affiliate. This partnership will link conservation efforts from Appalachia, to the Chesapeake Bay, and the Atlantic Ocean.
“This is a tremendous opportunity to align the efforts of this nation’s aquarium with one of its most effective conservation organizations,” said John Racanelli, CEO of the National Aquarium. “The National Aquarium team has worked tirelessly over the past 30 years to preserve and protect the Chesapeake Bay, in that time restoring 155 acres of bay shorelines with 1.4 million individual native plants, shrubs and trees. This exciting new alliance will allow us to further expand our reach and strengthen our impact.”
“We are delighted to welcome the National Aquarium into the Federation’s family of 48 affiliates,” said Larry Schweiger, president and CEO of National Wildlife Federation. “The National Aquarium is the trusted voice of the aquatic world, filling visitors with a sense of wonder, educating them about the threats to our oceans and water resources, and inspiring them to take individual action.”
Maryland is part of NWF’s Chesapeake Mid-Atlantic region, one of nine such regions throughout the United States. Affiliates in each region work together and with partners to advance conservation and protect the region’s unique natural treasures. The Mid-Atlantic region includes Pennsylvania, Delaware, Maryland, Washington D.C., Virginia, West Virginia and North Carolina.
“The National Aquarium will be a great addition to our region-wide efforts to conserve our resources and to connect families with the natural world,” said Tony Caligiuri, NWF Mid-Atlantic regional executive director. “We’re already collaborating on important efforts to restore the Chesapeake Bay and look forward to working together ensure that aquatic habitats are preserved for future generations.”
“Both of our organizations are dedicated to inspiring people to take an active role in protecting our natural resources,” said Laura Bankey, director of conservation at the National Aquarium. “We are excited about the national impact we will have by joining together to protect and restore our ecosystems.”
National Wildlife Federation, founded 75 years ago, has 4 million members and supporters nationwide. Affiliate representatives elect the NWF Board of Directors and set the organization’s policy objectives in the form of resolutions. NWF has more than 82,000 members and supporters in Maryland.
2016-40/3982/en_head.json.gz/3547 | Advertisement Home > Operations & Technology > Social media's impact: Your mistakes are public--and they live forever
Social media's impact: Your mistakes are public--and they live forever
Michele McDonald | ATWOnline
If you disappoint your passengers, "it will be public, and it will live forever," Forrester Research analyst Henry Harteveldt said at the recent SITA IT Summit in Cannes, France. Harteveldt was talking about social media phenomena such as Facebook, YouTube, Flickr and Twitter, which allow travelers to broadcast their experiences immediately to hundreds, thousands, even millions of people. The reality of social media is that airlines must treat their customers as though the entire world is watching. A good chunk of it may be doing just that.

Harteveldt's presentation, titled "Keeping Mr. 22D Happy: What changing passenger behaviors and attitudes mean for air transport industry IT," included two photos posted by Flickr users. One depicted the airport in Baden, Germany, where, the poster said, "Ryanair gave me a birthday present on Sunday of an extra 5 hours and 40 minutes in Germany. The bad news is that I spent them at Baden airport!" The other memorialized in less-than-appetizing detail a breakfast served on a BMI transatlantic flight.

An emphatic demonstration of Harteveldt's warning came a few days after he spoke in Cannes, when Canadian folksinger Dave Carroll posted a video on YouTube that details a battle with United Airlines in music and lyrics. Carroll's song describes his experience in March 2008, when he and his band traveled from Nova Scotia to Nebraska with a connection at Chicago O'Hare. While sitting on a plane in Chicago, he heard a passenger exclaim, "My God, they're throwing guitars out there!" He immediately alerted three employees that baggage handlers were mishandling expensive equipment, but none took any action. Sure enough, Carroll's guitar was severely damaged.

For more than a year, Carroll tried to get United either to replace the guitar, pay for the repair or provide travel vouchers as compensation. Nothing happened until he posted his video, "United Breaks Guitars," on the evening of July 6. On July 7, United took action, and it, too, used social media -- in this case, Twitter -- to say, "This has struck a chord w/ us and we've contacted him directly to make it right." In another "tweet" on July 8, United said Carroll's video "is excellent and that is why we would like to use it for training purposes so everyone receives better service from us."

But by July 10, the video had been viewed more than 1.38 million times, and more than 14,000 viewers went to the trouble of rating it (it got five stars). A Google search for the terms "Dave Carroll guitar United" returned 65,700 links. The video made the rounds on Facebook, too. Travel writer Peter Greenberg posted a story about it for his 1,652 friends. Carroll was invited to appear on CBS News' morning show, where he said that were it not for the video, he was sure United would not have gotten in touch with him. "They told me I wouldn't ever hear from them again."

The moral of the story: Passengers are now armed with the ability to air their complaints instantaneously and globally. "Public relations no longer controls the message," Harteveldt said in his presentation. The message is viral, and it lives in cyberspace for a long, long time. The flip side: Social media can be used as a vehicle to reach out to customers, Harteveldt said. JetBlue and United, for example, alert their Twitter followers to short-term offerings. Even more intriguing is the ability to use Twitter to nip customer service issues in the bud before they become public relations nightmares.
When a customer tweeted, "JetBlue, I need a wheelchair," she got an instant response from the carrier's Twitter account. "Twitter is becoming the customer service feedback loop," Harteveldt said.
2016-40/3982/en_head.json.gz/3574 | ChemImage Sensor Systems selected to develop portable chemical detector
Wednesday, Jul 23, 2014 @ 1:30pm by BioPrepWatch Reports
ChemImage Sensor Systems (CISS), a subsidiary of ChemImage Corporation, announced on Monday that it was selected by the U.S. Army as one of the organizations to develop a portable chemical agent detection system.

The U.S. Army Next Generation Chemical Detector (NGCD) program selected CISS and multiple other organizations to develop a portable system for the detection and location of chemical warfare agents (CWA) on environmental surfaces. The work will be performed under the direction of the Joint Project Manager for Nuclear, Biological and Chemical Contamination Avoidance (JPM-NBC-CA), according to a CISS press release.

"We welcome the opportunity to show how hyperspectral imaging can perform reliable CWA detection and location with the goal of saving lives," Charles Gardner, the project manager for CISS, said.

As part of the project, CISS will configure and evaluate its existing portable hyperspectral imaging technology for the detection of CWA at a breadboard level. The company will then progress the development of the system through brassboard and final prototype stages. The U.S. government will conduct rigorous testing in each phase to ensure the CISS system meets Army requirements.

"Recent events have shown us that CWA are still a very real threat in our world," Matthew Nelson, the chief scientist and business director at CISS, said. "CISS is excited about working with the U.S. Army to provide handheld instruments that dramatically lessen the warfighter impact of these terrible weapons of mass destruction."

ChemImage Corporation, the parent company of CISS, develops hyperspectral and Raman chemical imaging technology.
2016-40/3982/en_head.json.gz/3614 | FiRe Website
Strategic News Service
StratNews.com
The world's most reliable source of advanced information at the intersection of technology and economics.
Recent & Selected Issues
SNS Innovations
Focus Channels
Comp. & Comm.
Econ & Fin.
FiReFilms
INVNT/IP
Orca Relief
Pattern Computer
Project Inkwell
Presidents’ Club
Volume Licenses
SNS Media
FiRe Conference Media
FiRe Conference Galleries
Predictions Dinner Gallery
SNS FiRe Speaker Series
SNS Predictions Dinner
Future in Review Conference
SNS Predictions West
About SNS
Self-driven disruption
By Arunabh Satpathy
Everybody is talking about autonomous cars these days, but no one knows the exact contours of their effects. To answer that question, the FiRe 2016 conference brought together host Robert Anderson, Chairman and CEO of Hybrid Electric Vehicle Technologies (HEVT LLC), and Craig Giffi, Vice Chairman and US Automotive Leader at Deloitte LLP, to discuss this world-changing technology.
Right off the bat, Giffi identified five "areas" that would be critical to the success of the autonomous vehicle. These five areas were entertainingly titled "my mother the car," "what only goes up," "Bill Clinton 1992," "A Game of Thrones," and "consumers are fickle."
Anderson started by asking Giffi what he thought the biggest problems solved by autonomous vehicles would be. Giffi responded that safety would be the biggest part. He mentioned that over 35,000 highway fatalities occur annually, with over 94 percent attributable to human error.
“The vision is these things never crash,” he said. “For society, the most obvious benefit is reducing the risk of a traffic fatality.”
The session then returned to the five major areas. "My mother the car" refers to vehicle control and ownership, which ridesharing models and autonomous vehicles are increasingly challenging.
Giffi was especially vehement in mentioning the second area, "what only goes up," i.e. regulation. He pointed to existing areas where Uber and Lyft cannot go, and predicted that such regulatory barriers would inevitably cause uneven implementation.
The third area, "Bill Clinton 1992," played on the phrase "it's the economy, stupid" by emphasizing "it's the economics, stupid." He said that in companies (especially entrenched companies) investing in autonomous vehicles, return on investment isn't considered enough. He cited the $8-$12 billion invested in powertrains running on electricity, gas, diesel, and hybrids, and in new materials for fuel efficiency like graphene.
He also said that the auto industry has led the way in diminishing returns: a 1x return is considered big in the auto industry, with Ford and GM getting about 0.3x.
“With all of the investment that is being put into new technologies, how in fact do the automakers or the industry get any ROI?”
He put the smart money on disruptors and outsiders. He was also wary of the disruptive capabilities of the fourth area, titled “A Game of Thrones.” Continuing on the ROI point, he said insurance premiums will go down because of safer cars, and dealers will likely go out of business if the automakers or ridesharing companies control the business. He also mentioned massive worker disruption.
“If this happens rapidly, it will be fairly catastrophic,” he said. “The disruption will be significant.”
His final area was titled "consumers are fickle." Here, Giffi mentioned skepticism among American consumers about adopting autonomous vehicles despite the safety argument, while acknowledging that later generations like millennials and post-millennials would be far more receptive.
“Consumers are slowly warming up to the notion of safety tech,” he said. He further elaborated that on average, the American consumer is willing to save less than $1000 on safety tech, whereas the investment is much higher, causing a major disjunct between investment and ROI.
He ended the panel by saying that entrenched automakers would have to do the hard job of disrupting themselves, while newcomers would have to look at novel business models to monetize data from the experience.
To discover more or read other articles from the conference, visit StratNews.com or our Medium blog.
The CTO Challenge Team Reports Back
By Melissa Dymock
Team members of the 2016 CTO Challenge reported back on their progress to the judges at the closing Friday session of the FiRe Conference. The team had been tasked with building a flow computing system that can also measure the energy flows of the Earth.
Nathanael Miller, an aerospace engineer at NASA and spokesperson for the team, walked the judges through the work done so far. He said it was "a treat for all of us to work through the night to get the presentation together."
Miller said that when analyzing the assignment placed before them, the team changed the challenge from being only a flow system to also being an interactive system.
Ben Brown, department head at Molecular Ecosystems Biology, explained the system at its most basic. The steps will include sensing the data, routing, aggregating, identifying a flow/pattern, and then using, interacting, and/or archiving the data.
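To make Brown's five stages concrete, here is a minimal, hypothetical Python sketch of such a loop. The stage functions, the fake sensor, and the drift threshold are all stand-ins of my own, not the team's actual design.

```python
# Sketch of the five stages described above: sense, route, aggregate,
# identify a flow/pattern, then use/interact/archive. Illustrative only.
import random
from statistics import mean

class FakeSensor:
    def read(self):
        return 10 + random.random()  # stand-in for a real instrument

def sense(sensors):
    return [s.read() for s in sensors]

def route(readings, topic="earth.energy"):
    return {"topic": topic, "payload": readings}  # tag for downstream consumers

def aggregate(message):
    return mean(message["payload"])

def is_flow(history, latest, threshold=0.5):
    # Call it a "flow" when the newest aggregate drifts from the baseline.
    return bool(history) and abs(latest - mean(history)) > threshold

archive = []
sensors = [FakeSensor() for _ in range(100)]
for _ in range(10):
    latest = aggregate(route(sense(sensors)))
    if is_flow(archive, latest):
        print("pattern detected:", latest)
    archive.append(latest)  # use, interact, and archive the data
```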
Franklin Williams, a principal at Live Earth Imaging Inc., said that to make the system buildable in a timely manner, they suggested using sensors already in place. They would place their own aggregators on the sensors. "[There are] millions of sensors out there," he said. "We just need to pull them into the system."
Once they have data and flows, they can determine what holes exist in the knowledge. Then they can build their own sensors. Williams said that after the first iteration, they can drive the problem backward.
Miller said they hope the system they design will allow variability in what it can do. It can be used by a kid in his garage or a Fortune 500 company.
The judges were asked if they would vote up or down for this project. They all voted up except Ty Carlson, CTO at Coventry Computer, who voted up with an asterisk.
“The impact that we have here is pretty significant,” Carlson said. “This is a human surveillance system that you have basically provided.” He listed many aspects affected by it, including: political systems that will resist, private systems that will be directly threatened, and there’s also the effect of people’s livelihoods.
“Does everyone understand the significance of the design?” he asked.
Miller said those were all things they were including in their discussions.
“We want to make this data as useful as possible to better life on earth,” said David Zuniga, a commercial innovation manager at the Center for the Advancement of Science in Space.
To discover more or read other articles from the conference, visit StratNews.com.

Looking Further
By Nick Fritz
Eliot Peper is a former venture capitalist and strategist, and currently a science fiction author of books like "Cumulus" and "Neon Fever Dream." In this session, Berit Anderson, CEO of Scout, discussed the real-world inspirations for Peper's books and his motivation to become an author.
The discussion began with an explanation of Peper's uncommon background and transition to being an author. Peper realized from his time in venture capital that there is a locus of human drama in that world that nobody was writing about. He felt big Type A personalities, high-stakes deals, fortunes won and lost, and potentially world-changing technologies make for juicy writing. "This is the book that I wanted to read, but nobody had written it," he said. "So I did."
The discussion turned to the book "Cumulus," set in a dystopian future where a giant tech company governs the world. This tech company, although well meaning, has inadvertently created ubiquitous surveillance and crippling economic disparity. The theme of the book was inspired by the incredibly powerful social networking and software applications that are now being created more quickly than ever before. Further, it includes the suggestion that there may be very serious but unintended negative social externalities in this software age.
Digging into that idea of externalities, Anderson asked Peper about the negative externalities that he sees playing out over the next 15 years. His answer was primarily geopolitical. He said that information has reduced the usefulness of national borders and has increased the porosity of these borders with respect to information flow, economics, and crime. Traditional governments are not well equipped to deal with this border porosity, and as a consequence private companies are stepping in to fill these skill gaps in areas where governments typically operate. He gave the example of Google's Jigsaw, which is working on online crime prevention, the disruption of radicalization, and the protection of human rights, all functions typically filled by government entities.
The interview ended with a discussion about Mars as another example of a private firm working in the traditional government sector. Peper spoke about the motivation for Mars colonization not only as a hedge against Earth, but also as an intentional wake-up call to think seriously about taking care of the Earth and to think disruptively about solutions.
Following this discussion, it becomes clear that Peper’s transition from tech venture capitalist to science fiction is actually not much of a leap. In a world where the nature of international borders is changing through technology and a private company is planning to colonize Mars in the next ten years, Peper’s science fiction may not read much differently from the perspective of his former investment opportunities.
A vision for the Congo
The second breakout session on day three of the Future in Review 2016 conference was focused exclusively on Emmanuel Weyi, presidential candidate in the Democratic Republic of Congo (DRC), and the massive problems – political, social, and technological – that he faces in getting his country to "leapfrog" others into the 21st century. The panel was moderated by Weyi and Bruce Dines, VP, Liberty Global, and was attended by a combination of curious learners, educators, computer scientists, and entrepreneurs variously cautious and optimistic about the DRC's future.

Dines spoke of access to education and the opportunities for disruption in the DRC, particularly in telecommunications, which would become the two major themes of the session. "There's an opportunity to leapfrog not only in terms of technology but also in terms of systems and processes," said Dines.

Attendee Nelson Heller mentioned the success of some technology in Africa, and how it faces resistance. One educator mentioned her experience in trying to introduce computer education in the United States, and the analogous resistance she faced among parents. The responses to the issues in education dovetailed around experiments in self-learning for children, and identifying pain points of parents and specifically attacking them.

The other large theme of the session was telecommunications. Weyi spoke of telecommunications as one of his priorities, with special emphasis on 3G connectivity. Narrating a personal experience, he spoke of his days in a mining company where he had to wait for up to two days to speak with his employees when he was in rural areas. Leapfrogging was brought up again, particularly by computer scientist Dallas Beddingfield. The discussion shifted to tech companies and their attempts to connect the world cheaply; Google's and Facebook's internet initiatives were mentioned as solutions.

The political realities of Weyi's path were also acknowledged, including the Belgian-style election process in the DRC, where a primary system similar to that of the USA is followed by a runoff if no candidate gets 50 percent of the votes. This led to a discussion about Weyi's motivations for visiting the US. He said his vision of the DRC involves tech, and the US is the center of it. "In the politics, there are two layers," Weyi said. "The layer that everyone sees, and the one that no one sees." He said his visits to US politicians and tech centers were his attempt to capitalize on the second layer, build connections, and create demand for his vision of the Congo.

The final theme in the session was the environmental cost of rapid development. Weyi acknowledged that the DRC has the second-largest forest in the world after the Amazon, and the second-greatest amount of biodiversity in the world. He lamented that some of this is threatened by Chinese logging activities. On the question of co-opting locals into the task of protecting the environment, Dines mentioned his work in nature conservation and in including local tribal populations in eco-tourism models. He cited the example of the Masai, who earlier fought over land but now work together to benefit both tourists and the tribes. Weyi agreed and spoke of the need to bring people into the fold. "To bring change, you need involvement," he said. "If you need change, you need to involve people." Weyi spoke of his enthusiasm and faith in the youth of Congo, and of how the educational infrastructure could be built by companies in exchange for advertising exposure.

The session ended on a cautionary note, acknowledging the massive difficulties in executing Weyi's vision.
This led to a discussion about Weyi’s motivations for visiting the US. He said his vision of the DRC involves tech, and the US is the center of it. “In the politics, there are two layers,” Weyi said. “The layer that everyone sees, and the one that no one sees.” He said his visits to US politicians and tech centers was his attempt to capitalize on the second layer, build connections, and create demand for his vision of the Congo. The final theme in the session was the environmental costs of rapid development. Weyi acknowledged that that the DRC has the second largest forest in the world after the Amazon, and the second greatest amount of biodiversity in the world. He lamented that some of this was threatened by Chinese logging activities. On the question of co-opting locals into the task of protecting the environment, Dines mentioned his work in the nature conservation and working to include local tribal populations into eco-tourism models. He cited the example of the Masai, who earlier fought over land, but now work together to benefit both tourists and the tribes. Weyi agreed and spoke of the need to bring people into the fold. “To bring change, you need involvement,” he said. “If you need change, you need to involve people.” Weyi spoke of his enthusiasm and faith in the youth of Congo, and how the educational infrastructure could be built by companies in exchange for advertising exposure. The session ended on a cautionary note, by acknowledging the massive difficulties in executing Weyi’s vision. Share This Post: Sensing Advanced Data Flows
By Chance Murray
This panel at the FiRe conference brought together experts from fields as different as digitizing smells, quantifying underwater sound travel, and detecting seismic explosions. What unites them is the world of sensors, which is reaching new heights and depths.

"We're trying to recreate a dog in a device," said Chris Hanson, Founder and CEO of Aromyx. Aromyx's technology digitizes smell and taste by creating sensors that mimic biosensors in the human nose and tongue. The applications of such technology are significant: bomb dogs at security points could be complemented or even replaced by devices employing it.

Noise pollution in the ocean has been on the rise due to increased movement of goods around the world. Roger Payne, whale scientist and Founder/President of Ocean Alliance, has been studying the physics of whale songs. "In an unpolluted ocean, we determined that whale songs can reach as far as 13,000 miles due to the unique physics of such ocean depths," said Payne. Current sensors to monitor whale activity, commonly known as "critter cams," provide only a static view of what whales see. Payne elaborated on a sensory device that would connect to whales and project a camera when sensors indicate another organism is nearby. This device would collect important information about the whales' habits and environment, and would be powered by ocean currents rotating a small turbine incorporated into the device.

John Delaney, Professor at the School of Oceanography, University of Washington, elaborated on a sensory device 400 km off the Oregon coast that measures tectonic plate activity. The device sends 14 minutes of video every 3 hours of each day. Among the highlights of the data flow collected by the device is an underwater volcanic eruption, the audio of which Delaney played for the audience. "We've never heard anything like this before," said Delaney.
He then introduced a design for a series of sensors to be placed along the Pacific Rim that would provide flows of data regarding tectonic plate activity. The series would span the entire northwestern coast, a comparable technology to what Japan installed not too long ago. “We can choose to invest in these sensors now, enabling us to capture valuable information about tectonic activity in the coastal region, or choose to incur the cost of reconstruction after a significant earthquake,” said Delaney.
Promises and perils at the Bleeding Edge of Healthcare
By Shelby Cate
Healthcare is an industry that is often bemoaned, highly regulated, and endlessly complicated, which makes it a difficult place to be an entrepreneur. BBC's Ed Butler once again hosted Oren Gilad, Don Straus, Shawn Iadonato, and Caitlin Cameron, the CEOs of the FiRestarter companies Atrin Pharmaceuticals, First Light Biosciences, Kineta, and OtoNexus Medical, respectively, to discuss their experiences and the future of healthcare innovation. "It's certainly a competitive landscape," said Iadonato. "One thing that has evolved from the pharmaceutical industry is that they've largely gotten out of R&D; they are increasingly reliant on companies like my company, like [Gilad's] company, to get the most attractive new technologies."
Cameron sees the same story in the medical device industry. "They look to young companies like us, but they want the angels and the ventures to take the initial risk," she said. Straus wondered about the effect this trend has had on new ideas, and said he has seen a lot of good ideas wither and die on the vine because they aren't developed enough for venture capitalists and angel investors, and yet can't move forward without development money. "Right now it feels like we're on a great wave and things are working," he said. But he also said he has had periods of "sitting around waiting for a wave and it feels like maybe nothing is coming." Straus, whose company provides rapid diagnosis of infections, also touched on the challenges that high-profile failures pose for his industry, such as the recent Theranos bust. "I talk to angel investors and every other one asks me 'why are you not Theranos?'" he said. "We're not Theranos, we have something real."
According to Iadonato, the challenges of getting capital in pharmaceutical development in particular are that development is high risk, capital intensive, and takes a long time. This is in contrast to many technology ventures that are lower risk, require less investment and have quick development cycles. Gilad, however, took on a more hopeful perspective, pointing out that there are now venture arms associated with pharmaceutical companies that are on the ground in academic institutions looking for very, very early stage opportunities. "Capital is needed in the early stages where risk is high," he said, "but it is out there." All four CEOs acknowledged the ups and downs of being on the bleeding edge of this industry, but also the excitement that comes along with it.
“We’re part of this lunatic group of people that actually feel we can do something good,” said Gilad. “and it’s very fun.”
Investing in Climate Change
The evidence supporting climate change continues to rise, and the implications for environments and economies are too significant to ignore, according to Hans-Peter Plag, Director and Professor at Old Dominion University's Mitigation and Adaptation Research Institute. "Currently, emissions from worldwide annual energy usage is comparable to the Lake Toba explosion that occurred nearly 100,000 years ago," Plag said.
Compared to the former 100,000 years, the past 100 years have had climate metrics change drastically. Carbon dioxide levels have risen, coastal zones have moved, and water temperatures have risen. Data suggests that a 1 degree Celsius increase in global temperature equates to a 25-meter rise in sea level.
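Taken at face value, the quoted relationship is linear: sea-level rise ≈ 25 m × ΔT (in °C), so a 2-degree warming would imply on the order of 50 m. Note that this appears to be a long-run equilibrium sensitivity drawn from the paleoclimate record, not a forecast of near-term change.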
The implications of Plag’s research are broad. City planners and environmentalists can take action to ensure coastal zones have adequate infrastructure and are free of waste and pollution that will wash into the ocean. Real estate developers can invest in properties outside exposed coastal zones or utilize mobile components that are capable of adapting to rising sea levels.
“It’s time to divest of exposed coastal areas,” Plag said. “Or, if you build in the coastal zone, invest in mobile infrastructure that can relocate with rising water levels.”
Invisible to you
Katia Moritz has been undiagnosed for a long time. The director of the documentary Undiagnosed was motivated by her own illness and those of people like her. She was joined on a panel by Tristan Orpin, EVP of Clinical Genomics, Illumina, John Ryals, CEO of Metabolon, and Robin Y. Smith, CEO of ORIG3N. The session was hosted by Doug Jamison, CEO of the Harris & Harris Group. Upon introducing the panel, Sharon Anderson Morris spoke of the "medical refugees" of undiagnosed diseases.

The panel started with a clip of the film "Undiagnosed," displaying the problems faced by people who have medically unexplained symptoms, and how they often go untreated. The documentary has quickly led to the formation of UnDx, a consortium of five tech companies with providers and patients brought together by Moritz.

Moritz spoke of her personal story, and how she was working to give voice to the undiagnosed. She also spoke of the tragedy of not getting valuable data from people whom the medical system refuses to treat and who could offer something to the world. She got in touch with Dr. Isaac Kohane at Harvard and put together a database of the undiagnosed. The companies offered their respective products, including genome sequencing by Illumina. With the gathered data and tech, they decided to continue looking for answers for families. Moritz also spoke with Jamison, who put her in touch with the five biotech companies that set up the UnDx Consortium. "For the first time, all these companies who are working in different areas of biotechnology are working together," he said.

A big emphasis of the panel was on the fact that patients, providers, and companies were working together. The prevailing theme was "new technologies plus collaboration equals hope." Taft spoke of his personal experience with a friend's child who remains undiagnosed. There was further discussion of inflection points, including the fact that many patients were not diagnosed for five to eight years, and many would never be diagnosed.
Digitalization, the cloud, and the transformation of the 21st century
The emerging trend of digitalization is blurring the line between the physical and digital world. The dramatic reduction in the cost of data collection, storage and analysis in the last several years has opened the door for this change, and it’s changing the nature of business. Greg Ness guided a discussion panel on the consequences of digitalization on Day 2 of the Future in Review 2016 conference. Preston McAfee, Michael Schwarz, Mark Sunday, Tim Fitzgerald, James Urquhart, and Edy Liongosari were also present as panel members. Their responses have been aggregated below.
So what precisely is digitalization, and what does it mean for large enterprise? Its definition has changed over time. The dramatic reduction in the cost of data services is impacting the way we conduct business, and the equally dramatic improvement in connectivity is driving change. This ability to capture data in real time from multiple sources enables us to react in real time. This increased flow of data between the physical and digital world is at the core of digitalization.
This increased flow has the potential to drastically change industries, some perhaps more than others. Potentially there is no limit to which industries can apply this idea. Agriculture is one example, where the monitoring of moisture levels can dramatically reduce water usage and increase relative yields. The mass collection of data will allow for macro-analysis, which can then be micro-targeted down to individuals based on their specific needs. These sorts of "personal plans" will permeate many industries. Digitalization will also change the organizational structure of firms. To be used effectively, digitalization efforts will have to be embedded in all functional areas of a business, not siloed in one department. The bottom line is this: digitalization is happening. Those firms that choose to get in front of the wave will prosper.
This begs the question: Who is working in this space now? General Electric is a great example of a firm that is adapting well. Seemingly overnight they transformed from a hardware company to a software company, and are now collecting enormous amounts of data on their equipment. Another consequence of this flow is cloud computing, which is being used to allow firms to fail fast in innovation. In this environment, the slow movers will be damaged quickly, perhaps more quickly than ever before.

It's important to remember that although digitalization may spell big changes for the way that companies do business, it's likely that consumers will not experience life-changing effects. Something that cloud computing allows is collaborative filtering, whereby data sets and patterns are developed by millions of users but are accessible at a personal level. This is the biggest change for consumers, and cloud computing and mobile connections enable it. Furthermore, machine learning applications in voice, picture, and video digitalization are changing the applications of cloud computing.

One form of digitalization not often mentioned is the digitalization of human assets. Through this process, it will become possible to select an ideal candidate for a job based on their digital profile, or to select the best customer type from a group of potential customers. Additionally, this process may have a "flow" effect, whereby the network created by human digitalization will allow us to seek out particularly useful contacts for a project or position. This digitalization may drive longitudinal change through the rest of the decade. Thought-to-text and universal language translation may be the biggest change makers, probably by the end of the decade. These technologies will change the nature of human interaction and increase the digitalization speed of human assets tremendously.

The organizational structure of firms may begin to change as well. Conway's Law states that systems developed by organizations tend to mirror the communication practices of that organization. Digitalization will possibly reverse this trend, and organizational structures may begin reflecting the nature of the digital communication protocol.
Population Flows
Populations, like data, capital, and intellectual property, flows across borders. There are many drivers of population, which vary with time and region. This breakout session, hosted by Mike Winder, explored the nature and drivers of population flows in the 21st century.
Labor is one critical driver of current population flows, as demonstrated by the immigration debate here in the US. However, the nature of work in the future may be fundamentally different than it has been for the last hundred years. Automation is replacing human labor in positions across all industries. This displacement of traditional roles, which required attendance, by new service-based roles, which may be accomplished remotely, will necessarily change the dynamics of population flow with respect to labor. Other drivers of population flow may be environmental. War, poverty and famine are historically common reasons for human migration. However, in the future, environmental collapse may be a driver of population flow. China, particularly in highly industrialized and populated areas such as Beijing, is already experiencing acute pollution issues that are causing real human suffering. Populations will continue to flow as they always have. The drivers may be different in the 21st century, and understanding these trends will be critical to unlocking value in the coming hundred years.
Share This Post: View More Posts » Subscribe
Partner with SNS
© 2016 Strategic News Service LLC
FiRe, FiReFilms, Strategic News Service, Project Inkwell, and INVNT/IP are registered trademarks of Strategic News Service LLC
All other trademarks cited here are the property of their respective owners. | 科技 |
2016-40/3982/en_head.json.gz/3621 | Tag: automation
New Surveillance Program Listens for Gunshots, Gets Police There in Minutes
By Veronique Greenwood | May 30, 2012 12:09 pm

These days, our artificial ears and eyes are better than ever—and more ubiquitous than ever. A business recently profiled by the New York Times seems to embody both what's most promising about such pervasive surveillance and also what's potentially disturbing.
ShotSpotter sells and helps run an automated gunshot-reporting system for police departments, for a cost of $40,000 to $60,000 per square mile. Recording equipment is installed in neighborhoods and linked to software that records sounds that could be gunfire, analyzes them to identify which are actually shots, and then submits its findings for review by a trained employee in the company's Mountain View office. If a human verifies that the sounds are indeed gunfire, the police are notified with the location of the shots, pinpointed to within 40-50 feet. All this can happen in well under five minutes, meaning police can be there right away.
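The description amounts to a four-stage loop: record, classify, human-verify, dispatch. Here is a purely illustrative Python sketch of such a pipeline; the class, threshold, and function names are hypothetical stand-ins and are not ShotSpotter's actual code or API.

```python
# Toy version of the record -> classify -> review -> dispatch loop above.
from dataclasses import dataclass

@dataclass
class Candidate:
    lat: float
    lon: float
    score: float  # classifier confidence that the sound was gunfire

def classify(audio_events, threshold=0.9):
    # Keep only sounds the software is confident are actual shots.
    return [e for e in audio_events if e.score >= threshold]

def human_review(candidates):
    # Stand-in for the trained reviewer who confirms each detection.
    return [c for c in candidates
            if input(f"Gunfire at ({c.lat}, {c.lon})? [y/n] ") == "y"]

def dispatch(confirmed):
    for c in confirmed:
        print(f"Notify police: shots near ({c.lat:.5f}, {c.lon:.5f}), +/- 40-50 ft")

dispatch(human_review(classify([Candidate(37.3894, -122.0819, 0.97)])))
```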
Google Tries to Jump-Start the Driverless Car, But Big Questions Loom
By Veronique Greenwood | May 23, 2011 4:17 pm

What's the News: Google's self-driving cars have been generating buzz lately, with the news that the company has been lobbying Nevada to allow the autonomous vehicles to be operated on public roads. But it remains to be seen whether hordes of self-driving cars are really going to work in the real world.
Google's Self-Driving Cars Are Cruising the California Highways
By Jennifer Welsh | October 11, 2010 11:56 am

Google announced this weekend that it has been driving automated cars around California's roads, and that the vehicles have already logged about 140,000 miles. A fully automated car just finished a big trip–all the way from Google's campus in Mountain View, California to Hollywood.
Larry and Sergey founded Google because they wanted to help solve really big problems using technology. And one of the big problems we’re working on today is car safety and efficiency. Our goal is to help prevent traffic accidents, free up people’s time and reduce carbon emissions by fundamentally changing car use. [Official Google Blog]
A Google car drives with the help of a variety of sensors–including cameras on the roof and in front, radars, and laser range finders–which build a detailed map of the car’s surroundings. This information is transmitted to the Google servers and processed to detect and react to any obstacles that get in the car’s way, mimicking the decisions a human driver would make.
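That sense-map-react loop can be sketched in a few lines. The following toy Python example is illustrative only and is not based on Google's actual software; the obstacle format and the 30-meter braking distance are invented for the demo.

```python
# Toy sensor-fusion loop: merge detections into one world model, then react.
def build_world_model(camera, radar, lidar):
    # Merge overlapping detections from each sensor into one obstacle list.
    obstacles = {o["id"]: o for o in camera + radar + lidar}
    return list(obstacles.values())

def plan(obstacles, lane_speed=25.0):
    ahead = [o for o in obstacles if 0 < o["dist_m"] < 30]
    if not ahead:
        return {"speed": lane_speed, "action": "cruise"}
    nearest = min(ahead, key=lambda o: o["dist_m"])
    return {"speed": 0.0, "action": f"brake for {nearest['id']}"}

camera = [{"id": "pedestrian-1", "dist_m": 12.0}]
radar = [{"id": "car-2", "dist_m": 48.0}]
print(plan(build_world_model(camera, radar, [])))
# -> {'speed': 0.0, 'action': 'brake for pedestrian-1'}
```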
2016-40/3982/en_head.json.gz/3631 | Weekly Wire: The Global Forum
Submitted by Roxanne Bauer on Thu, 05/15/2014
These are some of the views and reports relevant to our readers that caught our attention this week.

Most Of What We Need For Smart Cities Already Exists
The compelling thing about the emerging Internet of Things, says technologist Tom Armitage, is that you don't need to reinvent the wheel — or the water and sewage systems, or the electrical and transportation grids. To a large degree, you can create massive connectivity by simple (well, relatively simple) augmentation. "By overlaying existing infrastructure with intelligent software and sensors, you can turn it into something else and connect it to a larger system," says Armitage.

Mideast Media Study: Facebook Rules; Censoring Entertainment OK
PBS Media Shift
A new study by Northwestern University in Qatar and the Doha Film Institute reveals that Middle Eastern citizens are quite active online, with many spending time on the web daily to watch news and entertainment video, access social media and stream music, film and TV. "Entertainment Media Use In the Middle East" is a six-nation survey detailing the media habits of those in Qatar, Egypt, Lebanon, Tunisia, United Arab Emirates (UAE) and Saudi Arabia. The results of the survey, which involved 6,000 in-person interviews, are, in part, a reflection of how the Internet has transformed Arab nations since the Arab Spring. More than ever, consumers in the Middle East/North Africa (MENA) region are using technology to pass along vital information, incite social and political change, become citizen journalists and be entertained.

Global Information Technology Report 2014
The Global Information Technology Report 2014 features the latest results of the NRI (Networked Readiness Index), offering an overview of the current state of ICT readiness in the world. This year's coverage includes a record number of 148 economies, accounting for over 98 percent of global GDP. The 13th edition of The Global Information Technology Report is released at a time when economies need to solidify the recovery of the past year and leave the worst financial and economic crisis of the past 80 years behind. Developed economies need to sustain their incipient economic recovery and find new areas of growth and employment creation; emerging and developing economies need to build their resilience against turbulence in the markets and foster their innovation potential in order to sustain the rapid economic growth they experienced in the past decade.
Africa Focuses on Building Resilient Cities
Developing viable, livable and resilient cities is increasingly seen as being of critical importance to giving opportunities to Africans to improve their lives. While rural development has long been a priority for governments and development agencies – with particular emphasis placed in recent years on Africa growing enough food to feed all her people – recent research is making a case for more attention to be given to urban areas. Already, more than half the world's people live in cities – as opposed to a tenth a century ago – according to United Nations statistics quoted by The Rockefeller Foundation's "resilient cities" initiative. And UN Department of Economic and Social Affairs figures quoted by Hannah Gibson of the Africa Research Institute in London suggest that Africa will be 50 per cent urban by the early 2030s and 60 per cent urban by 2050.
What Do New Price Data Mean for the Goal of Ending Extreme Poverty?
Every country in the world has a poverty line—a standard of living below which its citizens are considered poor. Typically, the richer the country, the higher the poverty line is set. The average poverty line of the poorest 15 countries in the world is used to define the global extreme poverty line—a minimum standard of living that everyone should be able to surpass. In 2005, this global poverty line was set at $1.25 per person per day. The Millennium Development Goals set out to halve the share of people living below this global minimum by 2015, and the successor agreement on sustainable development goals promises to finish the job and end extreme poverty everywhere by 2030. How feasible is the goal of ending extreme poverty, and where is poverty concentrated?
'Cow will make your baby fat': breaking food taboos in west Africa
"A pregnant woman should not eat cow. The child will be fat," said one respondent during research carried out on nutritional taboos among the Fulla people in the Upper River region of the Gambia. In comparison to the rest of western Africa, WHO classifies the Gambia's malnutrition rates as moderate. Nevertheless, the World Food Programme (WFP) is currently providing assistance to 12,500 pregnant and nursing mothers and 50,500 children in the Upper River region by distributing cereal (rice and millet) each month. Nutritional taboos can hamper NGOs' hunger and malnutrition relief efforts. The issue is even more of a concern during humanitarian crises, when food supplies are at a critically low level and people are likely to lack nutrients and be more susceptible to disease.Follow PublicSphereWB on TwitterPhoto credit: Flickr user fdecomite
2016-40/3982/en_head.json.gz/3644 | | Politics
| Odd News
Education IPCC’s warning about ice disappearing from mountain tops based on student dissertation
By ANI | Sunday, January 31, 2010

LONDON - In what may cause fresh embarrassment to the Intergovernmental Panel on Climate Change (IPCC), it has emerged that its warning about ice disappearing from the world's mountain tops was based on a student's thesis and an article published in a mountaineering magazine.
Earlier, the IPCC had to issue a humiliating apology after its inaccurate claim that global warming will melt most of the Himalayan glaciers by 2035 turned out to be based on a "speculative" article published in New Scientist.
In its recent report, the IPCC stated that observed reductions in mountain ice in the Andes, Alps and Africa were being caused by global warming, citing two papers as the source of the information.
However, it has emerged that one of the sources quoted was a feature article published in a popular magazine for climbers which was based on anecdotal evidence from mountaineers about the changes they were witnessing on the mountainsides around them, The Telegraph reports.
The other was a dissertation written by a geography student, studying for the equivalent of a master’s degree, at the University of Berne in Switzerland that quoted interviews with mountain guides in the Alps.
After it surfaced that the IPCC had been using unsubstantiated claims and sources for its warnings, sceptics cast doubt on the validity of the IPCC and called for the panel to be disbanded.
“These are essentially a collection of anecdotes. Why did they do this? It is quite astounding. Although there have probably been no policy decisions made on the basis of this, it is illustrative of how sloppy Working Group Two has been,” Professor Richard Tol, one of the report’s authors who is based at the Economic and Social Research Institute in Dublin, said.
“There is no way current climbers and mountain guides can give anecdotal evidence back to the 1900s, so what they claim is complete nonsense,” he added.
However, scientists from around the world leapt to the defence of the IPCC, insisting that despite the errors, the majority of the science presented in the IPCC report is sound and its conclusions are unaffected. (ANI)
2016-40/3982/en_head.json.gz/3646 | The Digital Skeptic: Will.i.am Brands Show Just How Hard It Is To Get Paid
Written by: Jonathan Blum
NEW YORK (TheStreet) --
Andrew Smits is a big fan of Will.i.am. But when it comes to the gadgets the musician-slash-entrepreneur is spinning up, we agree they probably "Will.not.work."
"The youth market is not the wide open blue water people think it is," Smits told me a few weeks back over drinks and finger food.
Smits is no idle fan boy. He's creative director of Concept 73 , a San Clemente, Calif.-based marketing agency with serious chops selling action brands such as Rip Curl , Simple Mobile and Vans to kids.
He and I were at an industry confab trying to get our brains around how celebs such as Will.i.am., the Black Eyed Peas producer and early collaborator on the Beats by Dre line of headphones, are swimming into ever-odder consumer electronics waters.
Late last year Will.i.am rolled out what had to be the oddest device of recent memory: a clip-on smartphone camera lens case called the foto.sosho. The unit is designed, built and wholesaled by Will.i.am -- or William Adams, as he is known to his parents and the IRS. And it slots over an iPhone to extend its imaging features. High-end U.K. retailer Selfridges & Co. retails it for a stout US$480 or so. Wider global release is expected later in 2013.
And no question, Mr. Adams is a legit master in cross-platform media jujitsu needed to get such a device off the ground.
"I travel a lot. I'm sponging all the time. I am a 'popthroplogist,'" he joked during the International Consumer Electronics Show as he explained his vision of entrepreneurship on stage during an event at the Las Vegas Hilton.
Back in 2003, he got it that the Black Eyed Peas track Hey Mama was hotter tied to a hip iPod commercial than as mere music on radio or the Web. He was early to leverage music's brand power to spin up electronics brands such as Beats by Dre, which recently sold to HTC for $300 million. Will.i.am even sells his insights to the Fortune 1000. Intel (INTC), Dell (DELL) and Coca-Cola (KO) take serious bets based on Will.i.am's advice. Witness Coke's launch of the Ekocycle brand, which supports products made from recycled material. Fortune magazine went so far as to plop Will.i.am on its January cover as "Corporate America's Hit Machine."
"Starting up a product was easier than I thought," he told the crowd.
That all may be true. But one does not need to be a new-media Will.i.am to see that the chance of an iPhone add-on -- or the larger trend of betting on celebrities as gadget rainmakers -- making any real money is almost incomprehensibly small.
"Will.i.am is definitely a success story in making money in the music business," Smits said. "But selling pricey iPhone parts, that's going to be a challenge." | 科技 |
2016-40/3982/en_head.json.gz/3647 | Google Hits ‘Glass’ Pedal as Apple Returns to Earth
By Sam Gustin | @samgustin | Feb. 26, 2013
Photo: Carlo Allegri / REUTERS. Google co-founder Sergey Brin wears Google Glass before the Diane von Furstenberg spring/summer 2013 collection show during New York Fashion Week on Sept. 9, 2012.
The U.S. technology industry is one of the most dynamic in the world, particularly with respect to mobile and Internet-based computing, two areas that are evolving at breakneck speed. Things can happen very quickly in the tech space: one day you’re up, the next day you’re down. Take Apple and Google, two tech titans currently battling for dominance in the mobile-Internet wars. Over the past several months, Google shares have increased by nearly 20% — last week topping $800 — while Apple shares have fallen by more than 30%.
Much of the movement happened in the past few months of 2012, as large investors, including hedge funds, pulled money out of Apple and, in some cases, poured it into Google, in order to maintain exposure to the large-capitalization technology sector, according to Colin Gillis, senior technology analyst and director of research at BGC Financial.
“As Apple started selling off, Google started taking off,” Gillis says in a phone interview. “If you’re an investor and you want exposure to large-cap tech stocks, there aren’t that many places you can go.”
The Apple sell-off is being driven in part by growing concerns about whether products like the iPhone and the iPad — devices that Apple is only incrementally improving — can continue to power revenue and profit growth, or whether Apple needs new, breakthrough products. After all, during his legendary career, Apple’s late co-founder Steve Jobs radically disrupted several markets with iconic products like the iPod and iTunes, and the iPhone and iPad, which set the standard for tech innovation. Current Apple CEO Tim Cook has yet to introduce a truly breakthrough new product of his own.
“Tim Cook keeps alluding to the company’s great product pipeline, but there’s been an innovation vacuum for a couple of quarters,” says Gillis. “I’m not going to say the story is over — let’s give it one more year — but we’re certainly in a period of incrementalism with Apple.”
Scott Kessler, head of technology research at S&P Capital IQ, also raised the issue of incrementalism in Apple’s product cycle. “There are some well-founded concerns about the company’s ability to innovate, especially in light of Steve Jobs’ passing,” Kessler says in a phone interview. “It’s not just about the next big thing, but the next big category. People have been looking for new products and new categories for some time, but they haven’t seen them.”
It doesn’t help that Apple has experienced several quarters of slowing growth, which has further spooked investors. Last quarter, Apple generated profit of $13.1 billion, but that was flat compared with the year-ago period — the company’s lowest rate of profit growth in a decade. Google, by contrast, continues to report solid growth thanks to its dominant search engine and online advertising business. Last quarter, net income increased 13% on revenue of $14.42 billion, a 36% increase over one year ago. Google has now jumped ahead of Apple as the most widely held long technology hedge-fund position, according to Goldman Sachs’ new Hedge Fund Trend Monitor report, which analyzed 725 hedge funds with $1.3 trillion in gross assets.
In short, Apple expectations are returning to earth. “Apple has had a tremendous run from 2001 until the end of last year,” says Kessler. “People want the company to invent a new category. In the past, they’ve done that so frequently and successfully that when they don’t seem to do it as much or as profoundly, questions arise.”
Meanwhile, Google is hot. For example, Google’s new Chromebook Pixel laptop is garnering positive reviews. (“Thank you, Google. For obsoleting my MacBook,” as one CNET writer put it.) And the company’s Google Glass wearable computing project — high-tech Internet-connected specs — is generating the sort of buzz usually reserved for Apple products. The futuristic eyewear will be available to consumers by the end of 2013, just in time for the holiday shopping season, according to several reports. Here’s the latest official Google Glass video:
The Google buzz has been further amplified by chatter about the tech giant’s massive, 42-acre (17 hectare) expansion to its Googleplex headquarters at NASA’s Ames Research Center in Silicon Valley. That’s a convenient location, because it’s right next door to NASA’s Moffett airfield, where Google executives keep no fewer than six private planes, including a 757, a 767 and several Gulfstream jets, according to a report last year from NBC Bay Area. (Cash-flush Apple also has an ambitious new headquarters due in 2016 under development.)
“Google is getting a lot of attention and a lot of kudos for taking risks and trying something new,” says Kessler. “While Apple is reducing screen size [see the iPad Mini], Google is introducing a whole new product with wearable technology, which reinforces the perception that it’s being innovative.”
As for Apple, it's telling that Cook has been on something of a p.r. tour in recent months, appearing on the cover of Bloomberg BusinessWeek and showing up for a rare on-camera interview with Brian Williams of NBC News. In an apparent attempt to burnish the company's image, Cook recently announced plans to spend at least $100 million to "do one of our existing Mac lines in the United States." (To put that into perspective, Apple made over $50 billion in profit over the past 12 months.) And earlier this month, Cook sat with First Lady Michelle Obama during President Obama's State of the Union address.
But Cook is going to need more than high-profile appearances if he wants to restore Apple's mojo. Incremental updates to existing product lines are well and good, but investors — and consumers — are looking for the company to unveil truly disruptive new products, as it did with the iPod, iPhone and iPad. There has been chatter that Apple might introduce a new TV product, or perhaps a "smart watch," but thus far those are merely rumors. It's time for Apple's next revolutionary product to become a reality.
Leigh Kish
Carnegie Museum of Natural History
412.622.3361 (0), 412.526.8587 (C)
kishl@carnegiemnh.org
BugWorks explores the anatomy of insects and how these physical forms work
A collaboration between Carnegie Museum of Natural History and Carnegie Mellon University School of Design
Pittsburgh, Pennsylvania… Observe insects up close, and see the world through the eyes of insects to figure out how bugs work. Now open at Carnegie Museum of Natural History, BugWorks provides the rare opportunity to get up-close and personal with a few creepy crawlies and the Carnegie scientists who study them, through large-scale photographs, models, video, specimens, and illustrations, as well as terraria of live insects. BugWorks is a collaboration between Carnegie Museum of Natural History's Center for Biodiversity and Ecosystems and Carnegie Mellon University (CMU) School of Design, and features exhibition elements designed and developed by CMU students as part of their senior capstone course. This collaboration and BugWorks are funded by The Fine Foundation and the Henry Lea Hillman, Jr. Foundation.

BugWorks focuses on the anatomy of various insects. For example, the lubber grasshopper has very long hind legs, which give it the remarkable ability to jump distances up to 20 times the length of its body. Examine the forms of different body parts to determine the functions or abilities of that particular bug, much like scientists do. Learn about the wide-ranging "jobs" a bug can do, from pollination to decomposition, and see how diverse, and sometimes downright bizarre, the insect world can be. The stars of the exhibition are alive: a giant water bug, an Emperor scorpion, darkling beetles, Allegheny crayfish, a young tarantula, and some lubber grasshoppers.

Other highlights include:
- Large-scale Gigaprints of insects—incredibly detailed, high-resolution photographs
- Six enclosures of live bugs going about their daily activities
- Specimens from the museum's own bug collections
- Video vignettes of Carnegie scientists behind the scenes in the insect collection
- An interactive photo booth to snap a picture with a favorite insect—bring your camera!
- Bug blueprints that illustrate the forms of insect anatomy
- Photographs from a bug's-eye view, showing environments from the perspective of insects
- Take-home collection cards of western Pennsylvania insects
- A map of western Pennsylvania for visitors to show what bugs they've seen near their homes

BugWorks is on view through July 28, 2013, and is free with museum admission.

Center for Biodiversity and Ecosystems
Carnegie Museum of Natural History's Center for Biodiversity and Ecosystems engages scientists from collaborating institutions worldwide to understand, manage, and sustain the health of local and global ecosystems. It utilizes the museum's vast collection and the environmental research center Powdermill Nature Reserve as a living laboratory for ecological research and as a site for visiting researchers and educators studying the mid-Appalachian ecosystem. The Center creates interdisciplinary research and educational projects that address some of the most pressing scientific questions of our time: questions regarding changes to the environment—past, present, and future—and how these changes affect nature and human cultures. The Center for Biodiversity and Ecosystems launched in January 2011 and is under the direction of John Rawlins.
Carnegie Museum of Natural History, one of the four Carnegie Museums of Pittsburgh, is among the top natural history museums in the country. It maintains, preserves, and interprets an extraordinary collection of 22 million objects and scientific specimens used to broaden understanding of evolution, conservation, and biodiversity. Carnegie Museum of Natural History generates new scientific knowledge, advances science literacy, and inspires visitors of all ages to become passionate about science, nature, and world cultures. More information is available by calling 412.622.3131 or by visiting the website, www.carnegiemnh.org.
The "National Climate Service" scam
By Alan Caruba
web posted January 21, 2002
The one thing you learn as you follow the activities of the environmentalists devoted to the biggest hoax of the modern era, Global Warming, is that they are relentless in their devotion to pursuing the hidden agenda of "climate control." It isn't about the climate at all and never has been. It is about crippling the economy and, thereby, the hegemony of the United States.
Yes, "hegemony", because we are without doubt the most powerful nation on the face of the Earth today and a lot of people really hate our devotion to capitalism, to the workings of the free market, to property rights, to guaranteed political freedoms, and the free flow of information. Some of them fly hijacked commercial jets into the symbols of our power, the World Trade Center and the Pentagon. Others seek to undermine our Constitution by luring us into international treaties that have noble-sounding names, but which serve to strip us of the control our government has over our landmass, our natural resources, and other vital aspects of our lives.
That has been the purpose of the United Nations Climate Control Treaty, otherwise known as the Kyoto treaty. Even Japan, where the treaty was first proposed, has recently rejected many of its provisions. There's nothing like a recession or depression to get governments and people focused on how to pay the bills. The United States, to its credit, rejected the Kyoto treaty outright as totally bogus and an attack on the stability and maintenance of our economy. Of course, not everyone in government rejected it. The treaty's primary defender and proponent was then-Vice President Al Gore. He was supported in this by former President Bill Clinton. Not surprisingly, Republican-turned-Democrat, Jim Jeffords of Vermont, is also a rabid environmentalist.
If you think that environmentalists-Greens-have given up on curbing our use of energy to keep our computers going, heat our homes, run our automobiles, fuel our industries, and generally ensure that everything functions as intended, you are very wrong. The list of environmental organizations fills fat directories. There are thousands of these enemies of prosperity, health, and common sense buried in the ranks of government at all levels. They find their greatest support from the Democratic Party. My friend, David Wojick, who writes for Electricity Daily, recently alerted me and others to a plan to create a "National Climate Service" in the Commerce Department. This nugget is tucked away in the huge Senate energy bill, S-1766. Now, we're not talking about the National Weather Service or the National Oceanographic and Atmospheric Administration. No, this new agency would exist to predict the weather ten years, fifty years, a hundred years from now!
At this point, you probably think I'm just making this up. After all, if there is anything we know for sure, it is that even the National Weather Service with the most sophisticated computer models and tracking technologies often gets a prediction for the next day's weather wrong. If it gets it right predicting what the weather will be five days from now, we think they are doing a dandy job.
The National Climate Service, we are supposed to believe-and fund-will have the capability of predicting what the weather will be decades, even centuries, from now. This is the United Nations Climate Control Treaty all over again! It is the deliberate lie that current computer weather models have the capability of predicting such distant events or trends. It is the Global Warming gangsters doing what they have been doing since they stopped predicting a coming Ice Age in the 1970s!
Another friend of mine, Christopher C. Horner, an expert on the way the Greens are perfectly willing to break in the backdoor if they can't get through the front one, points out that "it irrefutably is not possible to trace climate activity to man" and that "one volcanic eruption dwarfs anthropogenic (human) activity, exposing the follies of regulating energy use on the basis of man-as-weather-machine."
Horner, writing in The Washington Times, said, "Sane policy-making dictates ceasing all machinations toward CO2 (carbon dioxide) reductions and their related economic impact until science answers the threshold question: What concentration level constitutes 'dangerous interference'?" Good question. In the era of the dinosaurs, there was tons more CO2 in the Earth's atmosphere and they thrived for a hundred million years! The National Climate Service, proposed in Senate bill S1766, is intended to ignore real science and substitute the outright lies perpetrated by Greens who are devoted to advancing the Communist goal of global government. To achieve this, they came up with the ultimate scare, Global Warming. The proposed new agency would be charged with the responsibility of "developing assessment methods to guide national, regional, and local planning and decision making" based on its utterly, totally bogus predictions.
That's why this latest break-and-entry by the Greens is so dangerous. If the National Weather Service cannot predict next week's weather with any certainty, why would this nation want to predicate major policies on the predictions of a National Climate Service? The answer is that this is preposterous!
Preposterous, too, are claims about CO2 levels in the atmosphere as a danger to humanity or claims that the Earth is even experiencing a warming. It is not. It has not warmed at all in the past half-century since the 1950s. The Bush administration and the Republican Party have got to make a clean break with the environmentalists and their endless, deliberate falsification of science. This goes way beyond mere politics. This nation has declared and is waging war on Islamic terrorists around the world. It is time to do the same with the Green Taliban whose purpose is to destroy our economy and undermine our national sovereignty.

Alan Caruba is the founder of The National Anxiety Center, a clearinghouse for information about scare campaigns designed to influence public opinion and policy. The Center maintains an Internet site at www.anxietycenter.com.
Japan’s Lesson: It’s Time to Deal with Spent Nuclear Fuel
By Sterling Burnett | Filed under Energy on March 24, 2011 | 7 comments

There have been three main barriers to the construction of new nuclear power facilities: high construction costs, concerns about plant failure leading to a meltdown, and the question of what to do with the spent nuclear fuel (usually called waste). The second problem has been brought to the fore by the crisis at Japan's Fukushima nuclear plant following the horrific earthquake and subsequent tsunami.
Even if the leaked radiation doesn't ultimately result in significant illness or loss of life (and of course I hope it doesn't), the questions raised by the still ongoing problems at this plant have only increased fear of nuclear power and almost certainly the costs involved in developing and operating a new facility. Since costs are already steep compared to other alternatives for electric power production, it is doubtful more than a few of the nuclear plants currently in planning or development will be constructed in the next decade (and maybe ever in their current form). Whether or not we ever build or operate any additional nuclear power plants in this country, the third issue, what to do with the spent fuel, remains. As David T. Stevenson, Director of the Center for Energy Competitiveness at the Caesar Rodney Institute, notes, despite all that has been reported about the problems with the multiple failing reactors at the Japanese plant, the most troubling and immediate potential hazard stems from the loss of water cooling the plant's stored spent fuel rods. Stevenson states that, "The nuclear crisis in Fukushima, Japan shows, beyond a doubt, the time has come to open existing, secure nuclear storage facilities in the United States to avert a similar tragedy. Stored fuel is the biggest concern in Japan. We currently store spent nuclear fuel rods at power plants in above ground facilities in secure Transportation, Aging, and Disposal Canisters (TAD). These canisters can be shipped and stored without opening them. There are currently about 71,000 metric tons of spent fuel and high level radioactive waste stored at 121 nuclear power plants and non-military government sites. All of this waste, minus shipping containers, could be stacked forty-one feet high on one football field."
Stevenson proposes three solutions: storage at Yucca Mountain, storage at the Waste Isolation Pilot Plant (WIPP), and recycling. Interestingly, these are the same three solutions I examined in a paper released March 2010. I agree with Stevenson: the time for talk is past. Now is the time to either start shipping spent nuclear fuel to the permanent storage facilities which science has already demonstrated time and again to be safe, or to recycle the spent fuel for continued operation of currently existing facilities and to reduce the overall waste stream that ultimately needs to be stored. As Stevenson explains, both the money and the facilities exist to handle spent nuclear fuel — all that has been lacking is the political will to act. Hopefully, Japan's nuclear crisis will serve as a forceful prod getting U.S. politicians to act.
Simon says: March 28, 2011 at 11:40 am Great post–which of the two options is more cost-effective over time? How often would there be a need to transport waste?
Larry Harmon says: March 28, 2011 at 1:24 pm Sterling:
Great article.
I enjoyed talking to you about a month ago.
Keep up the great work.
Alexis says: April 1, 2011 at 8:44 am Interesting article, these are all great solutions and important to discuss.
Jean says: April 4, 2011 at 12:56 pm Interesting article. We’ll have to wait and see what Washington ultimately decides to do. A shame that it takes travesties to shock them into concern, and even still, there is no guarantee that they will act.
Hugh says: May 28, 2011 at 3:33 pm More could be said about the history of recycling spent nuclear fuel. The USA had the lead in recycling spent control rods in the 1970s until Pres. Jimmy Carter issued an executive order closing the recycling facilities. France recycles all spent nuclear fuel. The result is a small volume of less “toxic” waste that can easily be stored. Japan followed our lead on nuclear energy so part of the problems now are the unintended but predictable consequences of Jimmy Carter’s executive order. Much of the problem in Japan involved the overheating of stored spent fuel rods. I don’t think we can wait for Washington to make a decision without being pressured by citizen groups. They don’t have the knowledge or the “cojones” to make the decision themselves.
Big D says: October 25, 2013 at 11:09 am Good article highlighting the immediate effects of cronyism and radiophobia. Recycling nuclear waste isn’t done partly because it’s cheaper to use fresh U than reprocess the “spent” fuel rods.
The “spent” fuel rods have 95+% of the U still there–the Pressurized Water Reactors now in service are that inefficient.
Meanwhile, engineering of Liquid Fluoride Thorium Reactors (LFTRs) based on research done at Oak Ridge in the 60's is proceeding–in China. LFTRs have the ability to consume the "nuclear waste."
Oh by the way, LFTRs are safe (incapable of leaking, melting, or exploding), efficient (99% of actinides consumed), clean (no emissions, no CO2, tiny amt of short-lived waste), economical (after we defeat the regulatory and lawfare juggernauts fueled by radiophobia) and useful beyond electricity (process heat, desalinization, fuel generation, medical isotopes).
NO CO2
Globalands
The pressure on land and natural resources is increasing worldwide. While there are many sectoral policies tackling different environmental problems, land use is not regulated in an integrated and overarching way. The discussion on sustainable biofuels and biomass highlights the continued lack of an effective and innovative framework to deal with complex land use issues.
The aim of the GLOBALANDS project was to identify promising existing land use policies and to develop possible governance tools towards a more resource efficient and sustainable global land use.
In a first step, the project team assessed past and current land use patterns by sector for all countries in order to identify the driving forces and trends behind major historical changes in global land use.
In parallel, an international governance screening was undertaken to gain an overview of global and multilateral (including EU) policies and their relevance for sustainable land use, including those policies that do not have land use as an explicit objective but are still relevant regarding their impact. Within the international governance screening, both governmental and non-governmental approaches were taken into account.
Based on this analysis and through a close interaction with relevant scientific partners and international institutions, the project team developed conceptual and strategic suggestions for potential approaches towards a global sustainable land use standard, as well as regulation tools and other instruments, and the role Germany can play within this process. These approaches were discussed in workshops with selected German, European and international stakeholders, and respective research partners.
The project was funded by the Federal Environment Agency (UBA) and the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety (BMU). It ran from November 2011 until April 2015.
ARCHIVE CONTENT: This article was originally published in the May/June 1997 issue of Home Energy Magazine. Some formatting inconsistencies may be evident in older archive content.
Data Loggers:
An Interview with Some Heavy Users
Home Energy spoke with four experienced data logger users. Each of our interviewees learned their lessons the hard way. Here, they offer tips and discuss pitfalls.
The first data loggers were human. When labor was cheap, students and others were hired to sit next to wattmeters and record the reading every 15 minutes. Data loss occurred for various reasons, such as boredom.
Digital data loggers have been refined into valuable energy-auditing tools. They make it easy to collect detailed information over a long span of time with a high degree of accuracy. But many auditors and contractors are still hesitant to use these expensive, high-tech devices. They still depend on instincts, senses, and instruments that give one bit of data for one moment in time.
We think data loggers could be used more widely by auditors, HVAC contractors, utility staff dealing with high bill complaints, and others in the energy and comfort fields. We assembled a group of four experts to describe how they use these tools---what they've learned, and what they regret. Their experiences might suggest a way that data logging could help you to monitor buildings or equipment.
Danny Parker is principal research scientist at the Florida Solar Energy Center. His most recent Home Energy article covered an intensive data-logging experiment on a home in Florida (Florida House Aglow with Lighting Retrofit, Jan/Feb '97, p. 21). Fredric Goldner is principal at Energy Management and Research Associates in Brooklyn, New York. In the July/Aug '96 Home Energy, he wrote Try These On for Size: New Guidelines for Multifamily Water Heating (p. 32). This article describes how his company monitored 30 multifamily buildings in New York. Jonathan Beers works for Madison Gas and Electric, which helped create a program to lend portable energy meters to customers directly through Madison-area libraries. He wrote Condensing Furnaces: Lessons from a Utility for the Nov/Dec '94 Home Energy (p. 13). David Springer is President of Davis Energy Group, which has completed numerous monitoring projects for utilities. Home Energy Executive Editor Alan Meier spoke with these four experts via teleconference.
Home Energy: What sort of loggers do you use, and what do you use them for?
David Springer: We use Data Electronics models DT50, DT100, and DT500 series (made in Australia). Also, Synergistic Control Systems, Campbell Scientific CR10, Fluke, and ACR Smart Reader. One of our largest residential sites was in Palm Desert, where we monitored 56 data points at one house, including temperature, water and gas flow, power, solar radiation, and wind.
Danny Parker: We use them to monitor energy end uses within residential and commercial buildings as well as meteorological conditions. From a typical building we usually gather between 20 and 30 channels worth of data.
Other than Campbell multichannel data loggers, we use some of the Pacific Science and Technology motor loggers and light loggers, to pick up lighting and other individual devices that otherwise are difficult to monitor.
Fredric Goldner: For the last six years, we've collected data on heat and domestic hot-water use in multifamily buildings. Monitoring allows us to model using real-time flows and temperatures in systems instead of using theoretical consumptions. We have collected data on fuel consumption, weather, indoor air quality, boiler cycling, apartment temperature, domestic hot water temperature and flows, boiler stack temperature, and makeup water. This was then analyzed. We came up with new system sizing and design methods and determined building dynamics. Most of my projects have had a couple dozen data points; some of them have been down to as few as four channels in a building.
Jonathan Beers: In the residential part of the company, we mostly use portable kWh meters to measure appliance energy consumption. I've mainly metered fridges in low-income households to find guzzlers that are candidates for replacement. The results surprised us. We found only a loose correlation between electricity usage and labeled amp draw, age, or even side-by-side versus top freezer styles. We've used three different brands of meters. As part of a grant from the Wisconsin Environmental Education Board, we put portable energy meters in Dane County libraries, so people can check them out using library cards. In-house, our meter readers have been loaning out portable meters for the last 15 years or so.
HE: Do customers like them? And what do they use them for?
JB: There's a tremendous waiting list for the ones at the library. Mostly, they're using them to help decide whether to replace refrigerators. One guy was trying to figure out phantom loads--what used electricity even when switched off. They can measure anything that plugs in, at least if it's 110 volts. The meter readers have a couple of meters for electric clothes dryers, but they don't work on other 220 volt appliances, which have a different plug shape.
HE: What do you use?
JB: In-house we have some ancient Landis & Gyr meters. The ones in the libraries are Line Loggers from Pacific Science and Technology. We did a neighborhood-based program where the contractor chose to use the Energy Tellers, because they have the lowest cost. They weren't nearly as accurate, though. You get what you pay for.
DP: We've used different types of loggers. We've had the best luck with Campbell data loggers, which we now use almost exclusively. Even though their learning curve is steep, for a multichannel datalogger, they are extremely reliable. Reliability is really important.
DS: We've settled on Data Electronics for monitoring temperature. It's a really good analog input logger. For power monitoring, we prefer Synergistic Control Systems. The Data Electronics requires a separate power monitor to provide a pulsed input to a counter; with the Synergistic Control Systems you can connect a current transformer directly to the logger.
FG: Our greatest success has been with loggers that are integrated with existing building control systems. I take probably 80% of my data with OAS Heat Computers. Those act as the building boiler and temperature controller in multifamily buildings. These really extensive research projects would have been prohibitively expensive if we were not hooking into existing building systems. Controllers cost $5,000- $10,000 a pop, but for about $1,200 extra in each location, we had the company add memory board upgrades and a number of additional data input points.
HE: Danny, have you ever tapped into the HVAC thermostat for a single-family building? (See 'Read Me Your Thermostat,' Short-Term Evaluation Tools, Home Energy, Mar/Apr '93, p. 31)
DP: No, but we've tapped into the pulse-initiating meters that utilities have installed. We generally measure power with a single voltage tap off the main electrical panel and use split-core current transformers that go around the individual leads right there. We usually locate the data logger right there. To measure interior temperature, which is critical for cooling, we put a type-T thermocouple exactly where the thermostat is. We have people who are very clever at doing that so it's difficult to see it's there. You need craftspeople to install data loggers. They have to do nice looking work, so people don't see this as any kind of infringement on the beauty of their home or building.
HE: Have loggers ever answered questions you couldn't have answered any other way?
DP: All the time.
FG: In our first DHW research project, we collected 15-minute flow and temperature data for 14 months in 30 buildings, totaling over 12 million data points. There's no way we could have even considered doing that without loggers. I read a paper done by AGA Labs back in 1904 or something, where they looked at essentially the same parameters as we did, but they just had one gentleman sitting down in the basement, reading the meter for three days.
DS: We're currently monitoring geothermal heat pump systems. We couldn't accomplish what we're doing--taking data and calibrating computer models with that data--without being able to take short-interval, long-term data.
When Data Loggers Prove You Wrong
HE: Have you ever thought you knew what was happening with a house, only to have monitoring show that you were wrong?
DP: All the time.
JB: I've metered round-top, single-door fridges from the '50s where the doors wouldn't even close. But they're still not guzzlers, because the freezer section is small and is barely cold enough to make ice, and there's no automatic defrost. So they don't use much, even with a door that won't close right.
DS: You find things you don't expect, and those are sometimes the most important. We plotted a curve of coefficient of performance (COP) versus water temperature for one of the hydronic heat pump systems we're monitoring. We found that the COP at 140°F supply water temp was around 3, which is pretty good. But if you could run it down to 120°F, you'd have a COP more like 4.5. By doing some minor plumbing changes, they're able to run it consistently at 120°F. That's a substantial improvement, and we wouldn't have noticed it without the data.
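A curve like the one Springer describes falls straight out of the logged records. The short Python sketch below illustrates the arithmetic; the channel names, the 15-minute logging interval, and the 5-degree bins are illustrative assumptions, not details from the actual project.

from collections import defaultdict

INTERVAL_H = 0.25      # 15-minute records
BTU_PER_KWH = 3412
GPM_FACTOR = 500       # 8.34 lb/gal x 60 min/h x 1 Btu/(lb-F) for water

def cop_by_supply_temp(records, bin_f=5):
    # Each record is a dict with flow (gpm), supply and return
    # temperatures (deg F), and electrical input (kW).
    bins = defaultdict(lambda: [0.0, 0.0])  # temp bin -> [Btu out, Btu in]
    for r in records:
        btu_out = GPM_FACTOR * r["flow_gpm"] * (r["t_supply"] - r["t_return"]) * INTERVAL_H
        btu_in = r["power_kw"] * INTERVAL_H * BTU_PER_KWH
        if btu_in > 0 and btu_out > 0:
            key = bin_f * round(r["t_supply"] / bin_f)
            bins[key][0] += btu_out
            bins[key][1] += btu_in
    return {t: out / used for t, (out, used) in sorted(bins.items())}

Plotting the result against supply temperature reproduces the falling efficiency curve that made the plumbing change worth finding.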
FG: We wanted to know where it's warmest in multifamily buildings. Because of the stack effect, we expected the top floor apartment to be warmest. But .from what we can tell so far, a majority of the time the bottom floor apartment is warmest. Similarly, on the domestic hot-water (DHW) side, we found 30% to 68% greater consumption than the ASHRAE industry standard numbers had indicated, which I think was a shock to a lot of people.
Avoiding Data Overload
HE: Fred, you mentioned having 12 million data points. How much do you look at? Have you ever tried to do real-time analysis?
FG: In one problem building, I sat in my office for 20 minutes and watched data come in from a hot water system every five or ten seconds. Using that real time analysis, I determined there was a problem with the hot water coil. We found it, and it needed to be replaced. That was different from most studies, which look at data anywhere from a few weeks to months or years after it's recorded.
I'd say I probably look at half the data in any given project. Since you have the ability to monitor more stuff, I say, let's take it. You never know what you'll find a use for. In our big DHW study, the first time around we monitored but didn't look at the heating side of it at all. Now, in a second or third project we're using five-year-old data, looking at temperatures, fuel flows, and burner run times. From a research standpoint, the data is just as valuable today as it was five years ago.
DP: Our largest project thoroughly analyzed a little over 9 million data points. Sometimes we'll throw all the data up on a graph initially, to make sure the values are in the proper range and the performance is as anticipated. Once things are running, we have an automated process for screening the data. A simple program looks at the data and compares ranges. For example, if a compressor is running, a fan should also be running. The program verifies that those run together and simple things like that. It allows us to evaluate the data as it comes in and catch problems and correct them before we get too far.
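A screening pass of the kind Parker describes can be surprisingly short. The Python sketch below is a hypothetical illustration of the idea; the channel names, plausible-range limits, and power thresholds are invented for the example rather than taken from FSEC's actual software.

RANGE_LIMITS = {                # channel -> physically plausible window
    "indoor_temp_f": (55, 95),
    "attic_temp_f": (20, 160),
    "ac_power_w": (0, 6000),
}

def screen(records):
    # Return (record index, message) flags for an engineer to review.
    flags = []
    for i, r in enumerate(records):
        # Range checks: every channel should stay inside its window.
        for channel, (lo, hi) in RANGE_LIMITS.items():
            value = r.get(channel)
            if value is None:
                flags.append((i, channel + ": missing reading"))
            elif not lo <= value <= hi:
                flags.append((i, "%s: %g outside %g..%g" % (channel, value, lo, hi)))
        # Logical check: a running compressor implies a running air handler.
        if r.get("compressor_w", 0) > 200 and r.get("fan_w", 0) < 20:
            flags.append((i, "compressor on but air-handler fan off"))
    return flags

A few dozen lines like these, run as the data arrives, are usually enough to catch a dead sensor or a miswired channel within a day.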
We usually check each site when it's instrumented. Since we're using Campbell data loggers, we can call it up via PC 208 software and look at it real-time to make sure it's OK. If there's a problem with something, or an instrument appears flaky, you just call it up on the phone to see what's going on. Often we'll have somebody at the site looking at the data while I'm looking at it remotely. There can be somebody down in Miami working on a house. They'll call us and say, See if I can fix this thing and you watch the data.
FG: You need to make sure you're collecting the right data to answer your questions. But if you have the capability to take additional data points or greater detail, take it.
This single-channel Hobo-Temp temperature logger provides programmable digital recording capability and is simple to use. This self-contained unit sells for under $ 100, and is the property of the Vital Signs Project. Vital Signs maintains alend-ing library of monitoring equipment that can be checked out by the staff and stu-dents of architectural programs at the University of California, Berkeley.
DS: Especially with remote monitoring projects. It might cost you 15 minutes to add a data point when you're at the site. Once the equipment's installed, adding a data point could cost days. We write a detailed monitoring plan that outlines what objectives we're trying to achieve and how to achieve the objectives. We boil that down to a data point list.
DP: With each project, we have a list of measurements we need. We also leave an analog and a pulse count channel open in case something comes up that we didn't expect. In terms of data overkill, I've been in projects where more data was taken than was necessary. But when reviewing that data, you can always dump the excess and keep what's useful. You need a data management system that lets you use the data you have. On most projects, analysis is very time-intensive. It may take two weeks or longer to satisfactorily analyze, interpret, and write up the information.
We have about 25 buildings; we're collecting 20 to 30 points of 15-minute data on each. Several of those have been going for three or four years. The data is reviewed by software--that is, when we collect the data nightly, it's brought in over telephone lines and sifted by the VAX mainframe we use here to collect the data. It checks for ranges and some other logical checks. We do things the hard way, in that we have a project engineer in charge of each project who reviews the data. They come in each morning and there's a plot waiting for them. It only takes a few minutes, and catches a lot of things that have gone wrong.
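One check that pays for itself quickly is confirming that every expected 15-minute record actually arrived, since phone-line hiccups and failing loggers usually show up first as missing intervals. A hypothetical Python version, with the timestamp handling assumed:

from datetime import timedelta

def find_gaps(timestamps, step=timedelta(minutes=15)):
    # timestamps: sorted datetime objects from one site's nightly download.
    gaps = []
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev > step:
            missed = int((cur - prev) / step) - 1
            gaps.append((prev, cur, missed))
    return gaps  # (last good record, next record, records missed)

Run against each morning's file, a check like this separates a flaky phone line from a dead logger before any analysis time is spent.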
What's often forgotten is how much time is involved with doing a good job on the analysis. You have to manage large amounts of data and navigate through it. You need good software and you need to know how to use it.
FG: I, or a project engineer, usually download a building's data every three days and look at it the next morning. It takes a couple of hours the very first time, but you get to the point where you can sift through a few hundred kilobytes of data, which is God knows how many data points, in five minutes, or even two or three minutes if it is clean.
DS: It's easy to get sucked in by the data. I couldn't tell you how many hours I have spent pursuing some little deviation in the data down a path, looking at it in different ways. It's a lot of fun, and you learn an awful lot, but it can really absorb your time.
HE: What fraction of time do you spend installing the equipment, debugging, and collecting and analyzing data?
DP: Depending on its complexity, we have people who can install a site in between half a day and a full day. That would be a 20- to 30-channel project. Reviewing the data takes only a few minutes a day for the project engineer. Analysis is a long process, and as Dave mentioned, you can get sucked into working on some specific problem. You sometimes go down roads that aren't fruitful, trying to troubleshoot something that you don't understand.
FG: We've cut down on the first steps, installing and debugging, by using, for the most part, existing building systems. The debugging part is probably the largest part. It's much larger than the installation. Collecting and analyzing takes probably 80% to 90% of the time. Sometimes that can go on for months.
DS: A day and a half to two days for a 30- to 40-data-point monitoring project would be typical. That includes calibrating the sensors, checking out data logger programming and modem operation, and everything else. There's always something that comes up that costs more time than you expect. Data collection and error checking, per building, takes maybe five minutes per day. Analysis time depends on what you're doing with the data. For the geothermal heat pump project, a report of COPs, energy usage, and that sort of thing is easy. Developing graphs from data is a snap. Trying to project energy savings from the data is most time-consuming.
Stories from Hell
HE: Do you have any data logger stories from Hell?
DS: Running monitoring system wires through an attic in Palm Springs in July is the closest I've ever been to Hell.
DP: For a lot of our experiments on houses, we try to get answers on how something affects space cooling. The way we do that now, until we become brighter and find a better way, is we make a change right in the middle of summer and compare the before and after states. That involves doing a changeover during the second week in July, when it's pretty miserable. The summer before last, a radiant barrier was installed at the end of June. The installers planned to show up at 4 am, but they got to the house at 5 or 6. The way things get delayed, they were still up there at noon. One installer passed out and had to be taken to the doctor. The attic air temperatures were being monitored in real time, so we know that when he passed out, it was almost 130°F. It's amazing he was able to be up there at all. If your definition of Hell is thermal, that would be pretty good.
I also think of our experience using a donated data logger that someone wanted us to use in a project. The thing didn't work very well, and we lost most of the data as we went along.
This data logger (small box to immediate right of computer) is being set up to measure input power to an air conditioning unit. The computer and voltmeter (far right) are used for set-up and testing, but are not left at the site during monitoring.
FG: Always look a gift horse in the mouth. I worked with a client using 30 donated data loggers. These units take 4 data points. They were some of the most antiquated equipment I've ever used. They constantly went out and could not be reprogrammed on site. Many of the newer products I'm familiar with can be programmed for whether you want analog or digital input and all the other parameters. These have to go back to the factory to be reconfigured. You can try to hook three of them together to take 12 channels, but you have to use a phone switch. The manufacturer sold us a local area network to connect three data loggers in one building, but its outmoded design rendered it useless with currently available modems. The manufacturer finally suggested a third-party device, which they didn't even know how to hook up. They had us call an old client who helped us figure this out. It's nice to get a free data logger. It saved us maybe $30,000 or $40,000 in equipment. But it probably cost us three or four times that in man-hours, repairs, and hardware upgrades. There are also gaps in the data caused by malfunctioning units, which makes the analysis hours go up considerably.
HE: If an HVAC contractor comes to you and says, I'd like a bit of data-logging equipment in order to do troubleshooting, what would you recommend? That includes both hardware and software.
DS: We actually did a project with a contractor, and he wanted to take some of the equipment back with him. So he bought one of the data loggers that was used in the project. I gather he's been using it for troubleshooting. This was a Data Electronics unit. What made it attractive to us was that it allowed us to do a lot of programming. And it had a lot of depth, so if you want to do logical statements in the programming you can do that. The key for a contractor would be the ability to measure temperatures easily, and perhaps extensive measuring equipment, like thermocouples, that connect directly to the logger.
HE: You don't recommend HOBOs or stand-alone equipment?
DS: The HOBOs are great until you're measuring more than a couple of data points. By the time you buy the multiple HOBOs you need to get the job done, you could have bought a larger, more powerful system.
FG: Has anyone used the ACR Smart Readers? You can take up to eight temperatures with one of those, which seems nice. You don't need extra wires or phone lines. You install them and let them take data. After a time, you pull them and download the data to your PC.
DS: They might be the exception. We connected them to power monitors and used them for monitoring swimming pool pumps. However, the phase in the scanning varied enough from the phase on the power monitor output that for the first two weeks we got garbage. So that's something I'd be careful of if you're measuring power with those things. I'd recommend testing out equipment combinations before installing them in the field.
DP: The ACR loggers are good, I think, but they are expensive. But it depends on what you're trying to do. If you have very few channels, and it's always going to be the same channels you want to log, they're fine. These days, I'm leaning toward the Pacific Science and Technology loggers. They're cheap, easy to deploy once you have them set up, and they've been pretty reliable. They don't have anything to measure temperature, although we've been using one from a different company that's quite a bit less expensive than the ACR units. That's the Avatel T-101 high-temperature logger that sells for $220. You can collect about 120 days of data before you need to go retrieve it. We're using them in a project where we're trying to do temperature measurements inside homes inexpensively. Something that you can deploy quickly is always attractive.
Most of the time, what people from the A/C industry are interested in is Can you recommend a portable unit I can use to obtain temperature and enthalpy measurements before and after the coil? and so forth. There is a variety of equipment that will allow you to do that.
HE: How about in kWh meters?
JB: I like the Pacific Science and Technology Line Logger. It reads volts and watts, as well as kWh, and seems fairly straightforward. However, it retains the last reading indefinitely, so if customers don't reset it, they're adding new readings onto old. It's just a single data point kWh portable meter that plugs into the wall. You can hold it in the palm of your hand.
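Turning a few days of metering into an annual figure is simple arithmetic. A hypothetical Python helper, with the electricity rate as an assumed input:

def annualize(kwh_measured, days_metered, rate_per_kwh=0.08):
    # Project a short metering run to annual kWh and dollars.
    kwh_per_year = kwh_measured / days_metered * 365
    return kwh_per_year, kwh_per_year * rate_per_kwh

# Example: a fridge that used 12.6 kWh over 3 days
kwh, dollars = annualize(12.6, 3)   # about 1,533 kWh and $123 a year

A guzzler flagged this way can then be weighed against the cost of a replacement unit.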
DS: We're impressed with Synergistics' programmability. They're pretty idiotproof and have built-in modems. It's easy to reconfigure the analog channels for different functions just by throwing a switch.
DP: I would agree with that--they're very easy to set up, easy to configure, and are nearly idiotproof. But having used both those and the Campbells, we prefer the Campbells. For certain measurements, especially very low-voltage devices like a pyranometer, you have a problem with the Synergistics. Also, with the Campbell, you can program it to give output in engineering units like degrees Fahrenheit and watts, rather than volts and pulses.
DS: I think they're best when you're monitoring lots of loads.
The OAS Heat Computer can both control and monitor a building's existing boiler control system, saving time and money on installation of data loggers.
HE: What loggers haven't worked so well?
DP: Unreliable sensors are a big problem. For example, flowmeters fail. Any insertion or paddlewheel type must be calibrated in situ to get anything reliable out of it. We use displacement meters, although they usually fail within two or three years. So if you've got 20 or 30 of them out there, you have one fail every month. The reliability becomes more important as a meter is an air trip or a full day's journey away. It may only take a minute to get in and replace the meter, but travel time adds up fast.
Hygrometers by most manufacturers are worthless. Only a few are worth anything. Chilled mirror types are most accurate, but require constant maintenance, so they're not really appropriate for field deployment. We've had most luck with Vaisala for measuring relative humidity. Other companies' meters saturate and don't come back. Also, lightning strikes cause most failures. The last logger we lost, we saw solar insolation go down to about 100 watts per square meter because a big thundercloud came, and poof---no data logger.
DS: We've had pretty good luck with flowmeters until our current geothermal heat pump project. The ones we have trouble with are the ones in the ground loops. The loops are charged with methanol, and require a big piece of charging equipment to replace the water once you've broken the line. We've had three fail within two to five days after installation. The broken ones don't seem to fail for any particular reason. There is a direct relationship between the likelihood of a device failing and how hard it is to access.
FG: We've had exactly the opposite experience with ISTA flowmeters. If you pay a little more, you can get ones with less pressure drop. We've found these meters to be quite reliable in hot water, cold water, and fuel. One has been running in heavy #6 fuel oil for six years now and has needed cleaning just once, after four years. Water meters have been extremely reliable.
DS: Thermocouples are risky with long wire runs. We see a lot of grounding problems.
DP: Differential thermocouples, rather than single-wire runs, should solve that problem.
DS: Using heavy gauge type-T thermocouples also helps.
FG: Even using shielded cable and type-T thermocouples, we've had trouble running them to the top floor of a six-story building. Another chronic problem is that some units provide output that's encrypted in a way that only their proprietary software can understand. Anyone shopping for data loggers should be sure the units they buy provide output in unencrypted ASCII format, so it can be used in any computer program.
HE: Is there any other lesson for people who might use data loggers ?
DP: With real estate, it's location, location. With data loggers, it's experience, experience. Always talk to someone who has used them before you do your own project.
New search for global warming at poles
By Peter N. Spotts, Staff writer of The Christian Science Monitor
For the next two years, the coldest places on Earth will become some of the hottest laboratories in the history of modern science. This Thursday marks the official start of the International Polar Year (IPY), an unprecedented research assault on Antarctica and the Arctic.
Some 10,000 scientists from more than 60 countries launched the push because of significant changes they see taking place at these frozen ends of the Earth. Many hold that global warming is triggering these changes, including shrinking sea ice in the Arctic Ocean, thawing permafrost, and growing instability in Greenland's ice cap and in some floes coursing through Antarctica's ice cap.
The US kicks off its part of the $1.5-billion project with opening ceremonies Tuesday in Washington. The goal is to gain a deeper understanding of processes affecting everything from the flow of glaciers and key features of polar climate to plankton and polar bears. In addition, researchers plan to leave a legacy of networked, standard sensors and buoys that will help track changes in these crucial regions long after the IPY ends.

Why North and South poles matter

At first glance, the poles may seem too remote to matter to anyone who doesn't live there. But Earth's "cryosphere" – its high-latitude regions of snow and ice – represents a central piece of the climate system. The poles act as sinks for the heat generated in the tropics and carried toward higher latitudes by the oceans and atmosphere. Over many centuries, the ice caps on Greenland and Antarctica hold the key to future sea-level rise as the climate warms. Thus, the hidden hand of a changing Arctic reaches farther south than icebergs alone suggest. "There is no magic curtain that drops down at 60 degrees north," says ice scientist Jacqueline Richter-Menge, who heads climate-related research at the US Army Corps of Engineers' Cold Regions Research and Engineering Laboratory in Hanover, N.H.

Changes in ecosystems

For instance, ecosystems stretching from the Labrador Sea to the continental shelf off North Carolina are changing because colder, less-salty water is flowing along the continental shelf from the Arctic Ocean into the northwest Atlantic, according to two Cornell University scientists. Many researchers attribute the Arctic Ocean's freshening to global warming.
The scientists note that while overfishing triggered the collapse of lucrative cod fishing off the Canadian Maritime Provinces, this fresher, colder water along the shelf has hindered the cod's recovery there compared with stocks farther south. In their place, marine life, including shrimp and snow crab, that cod would have eaten are flourishing. The changes in water conditions have altered the timing for peak production among tiny plankton that nourish creatures higher up the food chain. "These timing changes are going to lead to changes in the ecosystem. There will be winners and losers in the ecosystem. And there will be winners and losers in society," says Charles Greene, a Cornell oceanographer who was a co-author of the report. Meanwhile, in the south, scientists working on the global Census of Marine Life say they see biologically significant shifts in marine life along the sea floor that once anchored two large ice shelves known as Larsen A and B. They broke away from the Antarctic Peninsula over the past 12 years. "The more we understand what's going on, the more winners there will be," Dr. Greene says.

International grass-roots effort

The IPY coincides with the 50th anniversary of the International Geophysical Year (IGY), the first postwar effort to study the entire planet, from the deep-sea floor and below to the outermost reaches of the atmosphere. Although this year's effort is dubbed the polar year, it spans two years to allow scientists to track conditions at both poles through a complete summer-winter-summer cycle. The IPY includes more biology and ecology to better gauge the effect changes are having on plants and animals, as well as on the organic carbon stored in frozen tundra. Scientists say that as the Arctic in particular warms, they expect this carbon to reach the atmosphere as carbon dioxide and methane – turning the Great White North into a source of heat-trapping greenhouse gases. Unlike the IGY, "this is a very grass-roots effort," says Robin Bell, a senior scientist at the Lamont-Doherty Earth Observatory in Palisades, N.Y. Last week, Dr. Bell and colleagues described how lakes in the right location beneath Antarctic ice "rivers" accelerate the ice's movement toward the sea. The poles "are the parts of the planet changing most rapidly" with global warming, she says. Understanding them is key to understanding how the rest of the planet is likely to respond.
Icy Material Thrown from Cratering Impact on Mars
This image, taken on May 19, 2010, shows an impact crater that had not existed when the same location on Mars was previously observed in March 2008. The new impact excavated and scattered water ice that had been hidden beneath the surface. The location is at 63.9 degrees north latitude, 44.9 degrees east longitude. The 50-meter scale bar at lower right is about 55 yards long.

The image is an excerpt from an observation by the High Resolution Imaging Science Experiment camera (HiRISE) on NASA's Mars Reconnaissance Orbiter. Additional image products from the same observation are at http://www.uahirise.org/ESP_017868_2440. The image has been processed to allow details to be seen in both the bright ice and the darker soil.

The University of Arizona, Tucson, operates HiRISE, which was built by Ball Aerospace & Technologies Corp., Boulder, Colo. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter Project for NASA's Science Mission Directorate, Washington.
Internet access is 'a fundamental right'
Internet users around the world are attracted by the availability of information
Almost four in five people around the world believe that access to the internet is a fundamental right, a poll for the BBC World Service suggests.

The survey - of more than 27,000 adults across 26 countries - found strong support for net access on both sides of the digital divide. Countries such as Finland and Estonia have already ruled that access is a human right for their citizens. International bodies such as the UN are also pushing for universal net access.
"The right to communicate cannot be ignored," Dr Hamadoun Toure, secretary-general of the International Telecommunication Union (ITU), told BBC News. "The internet is the most powerful potential source of enlightenment ever created." He said that governments must "regard the internet as basic infrastructure - just like roads, waste and water". "We have entered the knowledge society and everyone must have access to participate."

The survey, conducted by GlobeScan for the BBC, also revealed divisions on the question of government oversight of some aspects of the net. Web users questioned in South Korea and Nigeria felt strongly that governments should never be involved in regulation of the internet. However, a majority of those in China and in many European countries disagreed. In the UK, for example, 55% believed that there was a case for some government regulation of the internet.

Rural retreat

The finding comes as the UK government tries to push through its controversial Digital Economy Bill. As well as promising to deliver universal broadband in the UK by 2012, the bill could also see a so-called "three strikes rule" become law. This rule would give regulators new powers to disconnect or slow down the net connections of persistent illegal file-sharers. Other countries, such as France, are also considering similar laws.
Recently, the EU adopted an internet freedom provision, stating that any measures taken by member states that may affect citizens' access to or use of the internet "must respect the fundamental rights and freedoms of citizens". In particular, it states that EU citizens are entitled to a "fair and impartial procedure" before any measures can be taken to limit their net access.

The EU is also committed to providing universal access to broadband. However, like many areas around the world, the region is grappling with how to deliver high-speed net access to rural areas where the market is reluctant to go. Analysts say that is a problem many countries will increasingly have to deal with as citizens demand access to the net.

The BBC survey found that 87% of internet users felt internet access should be the "fundamental right of all people". More than 70% of non-users felt that they should have access to the net. Overall, almost 79% of those questioned said they either strongly agreed or somewhat agreed with the description of the internet as a fundamental right - whether they currently had access or not.

Free speech

Countries such as Mexico, Brazil and Turkey most strongly support the idea of net access as a right, the survey found. More than 90% of those surveyed in Turkey, for example, stated that internet access is a fundamental right - more than in any other European country.
Facebook has become a lightning rod for causes of all types
South Korea - the most wired country on Earth - had the greatest majority of people (96%) who believed that net access was a fundamental right. Nearly all of the country's citizens already enjoy high-speed net access.

The survey also revealed that the internet is rapidly becoming a vital part of many people's lives in a diverse range of nations. In Japan, Mexico and Russia, around three-quarters of respondents said they could not cope without it. Most of those questioned also said that they believed the web had a positive impact, with nearly four in five saying it had brought them greater freedom.

However, many web users also expressed concerns. The dangers of fraud, the ease of access to violent and explicit content and worries over privacy were the most concerning aspects for those questioned. A majority of users in Japan, South Korea and Germany felt that they could not express their opinions safely online, although in Nigeria, India and Ghana there was much more confidence about speaking out.
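For readers who want a feel for the precision behind those percentages, here is a minimal sketch (in Python, not part of the BBC's reporting) of the textbook 95 percent margin of error for a sample of this size. It assumes one simple random sample; the real GlobeScan survey spans 26 countries, so its true uncertainty is larger than this idealized figure.

```python
from math import sqrt

n = 27_000   # respondents, as reported
p = 0.79     # share calling net access a fundamental right

# normal-approximation 95% confidence half-width for a proportion,
# under the (simplifying) assumption of a single simple random sample
half_width = 1.96 * sqrt(p * (1 - p) / n)
print(f"79% +/- {100 * half_width:.1f} percentage points")  # about +/-0.5
```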
Students, faculty and staff gathered in Room E25-111 to hear a panel discussion about US and international security issues. Photo / Donna Coveney

International experts warn of threats, challenges
Sarah H. Wright, News Office
MIT political scientists delivered a sobering view of current US security and near-future domestic and military life in panel discussions organized by the Center for International Studies and held on Sept. 12 and 17. Richard Samuels, the Ford International Professor of Political Science and director of the Center for International Studies (CIS), moderated both panels, organized in the wake of the Sept. 11 terrorist attacks.
"In this time of unspeakable difficulty, we continue to try to understand and cope with the new world in which we are all going to live," Samuels said. The first panel focused on the nature of the attacks, on the intentions and capabilities of the attackers and on possible US reactions. The second explored the challenges and diplomatic skills of US leadership, the dangers of extreme nationalism in the United States, and on a much-altered national role in international relations. Speakers portrayed the United States as vulnerable to further attacks and terrorist groups as likely to strike again, and they advocated heightened awareness among the nation's leadership of how other cultures view American power even as US military action of some kind was being crafted. The panelists specialize in US security policy, international security issues, nuclear proliferation, domestic preparedness and human rights law. They included Stephen Van Evera, associate professor of political science and associate director of CIS; Barry Posen, professor of political science; Allison Macfarlane, senior research associate in CIS; Balakrishnan Rajagopal, assistant professor of urban studies and planning; and Jeremy Pressman, a graduate student in political science who is writing his dissertation on the Arab-Israeli conflict.
Nazli Choucri, professor of political science, and Gregory D. Koblentz, a graduate student in political science, participated in the second discussion along with Van Evera, Rajagopal, Posen and Pressman. Joshua Cohen, the Leon and Anne Goldberg Professor in the Humanities and head of the Department of Political Science, speaking from the floor, described Tuesday's events as a "slaughter of innocents," but challenged any response based on nationalist fervor.

"It is a very bad mistake to say our cause is the cause of the USA. The issue is not a specifically American cause. Nor is the issue freedom, the rule of law, pluralism or the open society and its enemies. Many billions of people--of different moral and religious convictions--know that the slaughter of innocents is wrong. To paraphrase what Lincoln said about slavery: if slaughtering innocent people is not wrong, then nothing is wrong. And we should never respond in a way that continues that evil," Cohen declared.

CHALLENGES FOR LEADERS
Posen, a specialist in American security issues and foreign policy, expressed confidence in the Bush administration but had reservations arising from President Bush's personal inexperience with life outside the United States. "The administration's direction looks right. But they'll need help. Bush himself has no internalization of international history. If they want to succeed, they will have to build coalitions, avoid collateral damages, and pay attention to subtleties--not the natural instinct of this crowd," he said. "They will have to sacrifice some sacred cows."
Van Evera is a specialist in the causes and prevention of war, and American foreign policy and security policy. "Americans must steel themselves for an Israelization of US life," he said. "There aren't any good military options. Are we willing to rethink assassinations? We'll need international help, we'll need diplomacy. Are we willing to rethink NATO expansion? The National Missile Defense Plan?"
Koblentz, whose research focuses on domestic preparedness, noted the measures already in place--more than $12 billion to combat terrorism and protect power plants and industrial sites--and urged protection of "soft targets," such as malls and sports stadiums, as well as federal funding for hospitals in preparation. "In public health, early intervention is the best method for saving lives," he said. "Our hospitals are not ready for mass infectious disease."

Rajagopal's research focuses on human rights law. He noted that terror destabilizes leadership but also warned the audience, "The history of states involved in counterterrorism is very ugly. Secret courts, secret evidence and secret processes are used against the populace. We do not need new laws. Laws are in place here and in the United Nations."

NUCLEAR THREATS
Macfarlane, whose field of expertise is preventing the proliferation of weapons of mass destruction, noted that US nuclear plants are vulnerable both to being struck by planes with big fuel tanks, as the World Trade Towers were, or to being infiltrated by terrorists or thieves. Another threat originates from the potential use of nuclear weapons by terrorists. "The most difficult part of making a nuclear weapon is obtaining the fissile material that powers these weapons," Macfarlane noted. Plutonium and highly enriched uranium stored in poorly guarded sites in the former Soviet Union are vulnerable to theft and to smuggling, she added.
MILITARY ACTION, REACTION & ROLE

"The post-Cold War world ended yesterday," Posen said. "We haven't heard the last from these people. We must be prepared for a grinding attrition campaign involving intelligence, conventional and unconventional operations. Americans involved in this fight are going to die. A failure of deterrence has occurred already. We have no choice but to root these enemies out wherever they are. And their hosts will not produce the terrorists unless they believe something awful is going to happen to them," Posen said.

CHALLENGES FOR ALLIES
The panelists noted the history of abuses of power and influence by the United States in Southeast Asia, South Asia, Central America and Africa. Pressman commented on the export of US pop and media culture as arousing resentment among traditional societies, and on non-consumerist, religious ways of life.

Choucri, an expert on moderate Arab states, urged the audience and the Bush administration to appreciate the tensions within countries such as Egypt, Jordan, Saudi Arabia and Pakistan as their leaders weigh the costs and benefits of allying with the United States. "These states face internal stresses over modernization, poverty, scarcity and gender issues that cannot be ignored. Their governments promise far more than they can deliver to their own people, and many of their regimes have serious concerns for their own legitimacy," Choucri said. "When the US seeks them to be allies, these leaders must attend to sources of domestic instability and internal opposition. What is needed now is the creation of security strategies that do not breed insecurities."

Pressman summarized the history of American involvement in Afghanistan, noting that the ultimate failure of the 1979 Soviet invasion, repulsed in 1989 by US-backed forces, was interpreted by militants as "the defeat of the USSR in the name of Islam. A network was formed. It was strengthened by a sense of betrayal in the Gulf War, when holy shrines were used as US bases; by the 'Made in USA' stamp on weapons used in the Israeli-Palestinian conflict; and by the containment of Iraq. A new foreign policy for the US will require new coalitions. It's not just about us and our allies versus bin Laden," he said.

Van Evera summarized increasing evidence that the author of the attack was Osama bin Laden, 44, the wealthy Islamic extremist likely based in Afghanistan. Bin Laden has called for a holy war against the United States. "He's been going after the US in increasing scale over the past eight years. He's personally very wealthy. He may have state support. And he's very well organized," said Van Evera.
This forum was sponsored by the Chancellor's Office, the Department of Political Science, and the Boston Review.
A version of this article appeared in MIT Tech Talk on September 19, 2001.
Topics: Political science, Security studies and military, September 11 | 科技 |
Tight Spacesuits and Other 'Prometheus' Pursuits

5:00 AM PDT, June 05, 2012

Working on Ridley Scott's Prometheus was a dream come true for Charlize Theron, Guy Pearce, Noomi Rapace, Michael Fassbender and Logan Marshall-Green -- except where those skin-tight spacesuits came in, requiring a team of people to help them just go to the bathroom. Watch the video to see the stars laugh about the sci-fi setting, plus Charlize talks about being a brand-new mom!

In theaters Friday, Prometheus follows a team of explorers who discover a clue to the origins of mankind on Earth -- leading them on a thrilling journey to the darkest corners of the universe, where they must fight a terrifying battle to save the future of the human race.
Glacier covered with blankets to reduce summer ice-melt
Thursday Jun 27, 2013 6:02 AM
People living near the Rhone glacier in the central Alps of Switzerland have come up with a striking tactic to counter the effects of climate change. Each summer the glacier is protected by blankets to keep ice melt to a minimum, the European Pressphoto Agency reports. Like other glaciers in the Alps, the Rhone has retreated dramatically in the past 150 years, according to research cited by the Earth Institute at Columbia University. Each year, a tunnel is carved into the ice to enable visitors to walk inside the glacier, which is the source of the Rhone River.

Image captions (photos by Olivier Maire / EPA): the entrance of the grotto of the Rhone glacier; a general view of the Rhone Glacier; a person entering the grotto of the Rhone Glacier.

EDITOR'S NOTE: Images taken on June 26, 2013 and made available to NBC News today.
PIA12698: Closest Daphnis
Target Name: Daphnis
Original Caption Released with Image: The Cassini spacecraft captures here one of its closest views of Saturn's ring-embedded moon Daphnis.

This image was taken July 5, 2010, at a distance of only about 75,000 kilometers (47,000 miles) from Daphnis. Seen at the upper left of this image, Daphnis (8 kilometers, or 5 miles, across) appears in the Keeler Gap near the edge waves it has created in the A ring. The moon's orbit is inclined relative to the plane of Saturn's rings. Daphnis' gravitational pull perturbs the orbits of the particles of the A ring that form the Keeler Gap's edge, and sculpts the edge into waves having both horizontal (radial) and out-of-plane components. Material on the inner edge of the gap orbits faster than the moon, so that the waves there lead the moon in its orbit. Material on the outer edge moves slower than the moon, so waves there trail the moon. See PIA11656 to learn more about this process.

Daphnis can also be seen casting a short shadow on the A ring. This view looks toward the northern, sunlit side of the rings from about 14 degrees above the ringplane. The image was taken in visible light with the Cassini spacecraft narrow-angle camera at a sun-Daphnis-spacecraft, or phase, angle of 58 degrees. Image scale is 452 meters (1,483 feet) per pixel.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

For more information about the Cassini-Huygens mission visit http://saturn.jpl.nasa.gov/. The Cassini imaging team homepage is at http://ciclops.org.
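As a rough cross-check of the caption's numbers, the image scale should be close to the spacecraft range times the camera's angular pixel size. The sketch below assumes a narrow-angle-camera pixel scale of about 6 microradians, a commonly quoted approximate figure that is not stated in the caption itself.

```python
# back-of-the-envelope check: pixel scale ~ range x angular pixel size
range_m = 75_000 * 1_000   # ~75,000 km from Daphnis, per the caption
ifov_rad = 6.0e-6          # assumed NAC pixel scale, ~6 microradians (approximate)

print(f"~{range_m * ifov_rad:.0f} m per pixel")  # ~450 m, consistent with the quoted 452 m
```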
NSF Centers Will Use Nano-Interface Control and Bioengineering for Materials by Design

September 27, 2005

The National Science Foundation (NSF) has established two new Materials Research Science and Engineering Centers (MRSECs) at Yale University and the University of Washington, with a combined NSF investment of up to $14 million over the next six years. The centers will also receive substantial support from the participating academic institutions, state governments and industry.
The Center for Research on Interface Structure and Phenomena will investigate the electronic, magnetic and chemical properties of complex oxide materials and their interfaces, with potential applications to magnetic storage, spintronics, and chemical sensing. The Center is a partnership between Yale University, Brookhaven National Laboratory and Southern Connecticut State University.

The Genetically Engineered Materials Science and Engineering Center at the University of Washington will support innovative research and education that integrates modern biology with state-of-the-art chemical synthesis to construct hybrid materials that cannot be achieved through traditional biology or chemistry.
Each award is initially for six years; renewed NSF support is possible through competitive review in the fifth year of the award.
In addition to the two new centers, another eleven existing MRSECs successfully renewed support in open competition in FY 2005. (A total of 29 Centers are currently supported by the MRSEC program with annual NSF support of $52.5 million.) Each Center has made a substantial commitment to effectively integrate its educational activities with its scientific research program, and to fully develop its human resource potential. The educational outreach activities can range from the elementary school to the postgraduate level. Additionally, the MRSECs constitute a national network of Centers that seeks increased impact on materials science and education beyond what is expected from any one Center.

"Advanced materials are the hidden 'stuff' that enables the modern world to function," said Lance Haworth, Executive Officer for NSF's Division of Materials Research (DMR). "Fundamental research on materials is essential to the nation's health, prosperity and welfare. New materials are key to a whole range of rapidly changing technologies such as energy, computers and communications, transportation and increasingly health- and medicine-related technologies as well. These two new awards join a vigorous network of NSF-funded interdisciplinary Centers that are doing exciting work at the frontiers of materials research and preparing the next generation of materials researchers."
Source: NSF
"NSF Centers Will Use Nano-Interface Control and Bioengineering for Materials by Design" September 27, 2005
http://phys.org/news/2005-09-nsf-centers-nano-interface-bioengineering-materials.html | 科技 |
Weather Center: 50 Percent Chance Of El Nino Later This Year
Date: 12-Jun-12 Country: USA Author: Josephine Mason
Homes sliding into the sea in Pacifica, California, are seen from the air, March 4, 1998. Photo: Sean Ramsey
There is a 50 percent chance the feared El Nino weather pattern, which can trigger droughts in Southeast Asia and Australia and floods in South America, may strike later this year, the U.S. Climate Prediction Center warned on Thursday.

In its strongest prediction so far that El Nino could emerge, the CPC said conditions are still expected to be neutral between June and August, but there is a 50 percent likelihood that El Nino will develop in the remainder of the year. The CPC issues an El Nino watch when conditions are favorable for the coming six months. In its last update in May, it said it was still uncertain if it would develop.

El Nino is a warming of sea surface temperatures in the tropical Pacific that occurs every four to 12 years and has far-ranging effects around the globe, particularly on food output.

The CPC forecast will be closely watched by the U.S. crude oil industry, as El Nino reduces the chances of storms in the Gulf of Mexico that could topple platforms and rigs there. Forecasters have already said they expect the Atlantic hurricane season, which started on Friday and runs to November 30, to be less active than last year. The phenomenon creates wind shear that makes it harder for nascent storms to develop into hurricanes in the Atlantic-Caribbean basin, but it also can produce drought and crop failure in parts of South Asia and unseasonably wet conditions in western coastal areas of South America.

While drier conditions could benefit crops such as coffee and cocoa, which were hit by heavy rains last year, analysts have warned that prolonged heat can also hurt yields. Malaysia and Indonesia account for 90 percent of the world's palm oil supplies, while most of the world's rice is exported from Asia. Asia also produces nearly 40 percent of global wheat supplies and the bulk of natural rubber output. The last severe El Nino in 1998 killed more than 2,000 people and caused billions of dollars in damage to crops, infrastructure and mines in Australia and other parts of Asia.

(Editing by Edwina Gibbs)
TODAY IN BUSINESS
AT&T AND VERIZON WIN EXEMPTIONS
AT&T and Verizon Communications can each merge their local and long-distance telephone operations with fewer government restrictions than some rivals wanted, the Federal Communications Commission said yesterday. In an order posted on its Web site, the F.C.C. exempted the two companies from price caps and other so-called dominant-carrier rules on long-distance service. The decision gives AT&T and Verizon the same type of exemption that Qwest Communications International, the third-biggest local-phone company, won in February. The carriers said the rules were outdated and would hinder their ability to compete with other long-distance, mobile and Internet-phone services. (BLOOMBERG NEWS)

RULING FOR MOTOROLA
Motorola did not knowingly mislead investors about the prospects of its failed Iridium satellite unit, a bankruptcy judge ruled yesterday. Judge James Peck of Federal Bankruptcy Court in New York said it could not be proved that Iridium was insolvent before its 1998 launch. His finding will prevent Iridium creditors from recovering at least some of the $3.45 billion they claim Motorola owes them under a theory that it was liable for Iridium's collapse because it saddled the unit with debt and a poor business plan as consumers shunned its bulky, expensive phones. (BLOOMBERG NEWS)

G.E. COMPLETES SALE
General Electric completed its $11.6 billion sale of its plastics division yesterday, and said it planned to use the proceeds to help pay for a stock buyback program. The sale, to the petrochemicals manufacturer Saudi Basic Industries, is expected to create a net gain, after taxes, of $1.5 billion. G.E. will use the proceeds to complete its current $27 billion stock buyback program, with $12 billion of the $14 billion planned for 2007 in the second half of this year. SABIC intends to expand the business globally and is not planning to cut jobs, company officials said. (AP)

DELPHI SETTLES SUITS
The auto parts maker Delphi said yesterday it had settled class-action lawsuits by shareholders and employees that grew out of financial restatements that were the subject of civil fraud charges the company settled last year. Participants in Delphi retirement plans will receive an allowed interest of $24.5 million and $22.5 million in cash from insurance carriers, Delphi said. Debt purchasers will receive an allowed claim and stock purchasers will receive an allowed interest for a combined $204 million in the case and about $90 million in cash from other defendants and insurance carriers, it said. The agreements require approval of the Federal District Court in Michigan, where the class complaints were consolidated, as well as the Federal Bankruptcy Court in New York. (REUTERS)

AT&T DROPS WIRELESS PROJECT
AT&T, following a similar move by EarthLink, has abandoned a plan to build a wireless Internet network covering the city of Springfield, Ill., after failing to settle financial terms. AT&T still plans to build its first citywide network using wireless-fidelity, or Wi-Fi, technology in Riverside, Calif., a spokesman, Fletcher Cook, said yesterday. EarthLink abandoned talks this week to build a wireless network in San Francisco with Google. Chicago also has scrapped plans for a Wi-Fi network after failing to reach a deal with EarthLink or AT&T. AT&T is in talks to build Wi-Fi networks in San Antonio and Napa, Calif., Mr. Cook said. EarthLink continues to work on citywide Wi-Fi projects in Philadelphia and Anaheim, Calif. (BLOOMBERG NEWS)

COKE BUYS BRAIN-TWIST STAKE
Coca-Cola announced yesterday that it bought a 20 percent stake in beverage developer Brain-Twist. The investment gives Coca-Cola ''access to a pipeline of innovative ideas and products,'' Deryck van Rensburg, president of the emerging brands unit at Coca-Cola, said in an e-mailed statement. The terms of the investment were not disclosed. Brain-Twist, based in New York, has created drinks including Cinnabon canned coffee and Defense vitamin and mineral beverages. It was founded by Larry Trachtenbroit, who in 2001 sold his Planet Java bottled coffee drinks to Coca-Cola. (BLOOMBERG NEWS)

CHINESE COURT RULES FOR STUDIOS
Six Hollywood film studios, including 20th Century Fox Film Corporation, were awarded a total of $25,500 from a Beijing seller of pirated DVDs, the China Court Web Site reported yesterday. The pirated films included ''Lord of the Rings'' and ''The Day After Tomorrow,'' the Web site said. It did not identify the five other studios. The case was heard in Beijing Xicheng District People's Court. The defendant was a video shop belonging to the Beijing Yongsheng Century International Cultural Development. (AP)
Explaining the mysterious age gap of globular clusters in the Large Magellanic Cloud
Please use this identifier to cite or link to this item: http://hdl.handle.net/1959.3/358
Bekki, Kenji; Couch, Warrick J.; Beasley, Michael A.; Forbes, Duncan A.; Chiba, Masashi; Da Costa, Gary S.
The Large Magellanic Cloud (LMC) has a unique cluster formation history in that nearly all of its globular clusters were formed either ~13 Gyr ago or less than ~3 Gyr ago. It is not clear what physical mechanism is responsible for the most recent cluster formation episode and thus the mysterious age gap between the LMC clusters. We first present results of gasdynamical N-body simulations of the evolution of the LMC in the context of its Galactic orbit and interactions with the Small Magellanic Cloud (SMC), paying special attention to the effect of tidal forces. We find that the first close encounter between the LMC and the SMC about 4 Gyr ago was the beginning of a period of strong tidal interaction that likely induced dramatic gas cloud collisions, leading to an enhancement of the formation of globular clusters that has been sustained by strong tidal interactions to the present day. The tidal interaction results in the formation of a barred, elliptical thick disk in the LMC. The model also predicts the presence of a large diffuse stellar stream circling the Galaxy, which originated from the LMC.
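The abstract does not include the authors' simulation code. Purely as an illustration of what one step of a gravitational N-body calculation looks like, here is a generic textbook kick-drift-kick (leapfrog) update in Python with gravitational softening; it omits the gas dynamics and the Galactic tidal field that the actual study models.

```python
import numpy as np

def leapfrog_step(pos, vel, mass, dt, soft=0.05):
    """One kick-drift-kick step of a direct-sum N-body integrator (units with G = 1)."""
    def accel(p):
        d = p[None, :, :] - p[:, None, :]      # d[i, j] = p[j] - p[i], shape (N, N, 3)
        r2 = (d ** 2).sum(-1) + soft ** 2      # softened squared separations
        np.fill_diagonal(r2, np.inf)           # exclude self-interaction
        return (mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

    vel = vel + 0.5 * dt * accel(pos)          # half kick
    pos = pos + dt * vel                       # drift
    vel = vel + 0.5 * dt * accel(pos)          # half kick
    return pos, vel
```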
Swinburne University of Technology. School of Biophysical Sciences and Electrical Engineering. Centre for Astrophysics and Supercomputing
The Astrophysical Journal, Vol. 610, Part 2 (2004), pp. L93-L96
Copyright © 2004 The American Astronomical Society. Reproduced in accordance with the copyright policy of the publisher. | 科技 |
Title: Climate Change and Water Use/Availability in Maine
Focus Categories: CP, DROU, ECON, HYDROL, M&P, WQL, WQN, WS, WU
Keywords: Climate, Conflict Management, Drought, Hydropower, Information Dissemination, Irrigation, Water Resources.
Duration: 3/1/00-2/28/01
Federal Funds Requested: $14,075
Non-Federal Funds Pledged: $22,355.
Principal Investigators: John Peckenham (Water Research Institute), J. Steve Kahl (Water Research Institute), and Robert M. Lent (United States Geological Survey).
Congressional District: Second
Statement of Critical Regional Problem
There are several important interrelationships between potential climate change and water resources. These relationships are important policy and planning issues, due to our poor ability to predict climate change and the natural variations in weather that occur over periods of years. For example:
1. What are the predicted regional effects of climate change?
2. How will these changes affect Maine's water resources and supplies?
3. What will be our water requirements in the next two decades?
4. Are there conflicts with current water resources and what new conflicts can be expected?
5. How do we evaluate the need for alternative water supplies?
Maine's economy has a critical dependence upon its water resources, with 1,542 square miles of lakes and ponds, 31,672 miles of streams and rivers, and 5,249 miles of coastline (MDEP, 1996). This includes water supply for consumers, hydropower impoundments, freshwater fisheries, and transportation. Recreationally, aesthetically, and economically, Maine's 6,000 lakes are important features that alone contribute billions of dollars to the state economy (Boyle et al., 1997). Subtle changes in regional climate, such as changes in the frequency and intensity of rainfall, can have a profound effect upon key water resources.
According to summary data from the National Weather Service in Gray, Maine, there have been record-setting temperature extremes and rainfall events over the last five years, including higher-than-normal temperatures, drought, and intense rainfall. These changes in weather patterns affect the management of hydropower, i.e., balancing water input with power demands as drought periods increase in length and storm intensity increases. Water consumers are impacted by drought-induced stresses on water quality and quantity. Freshwater fisheries are stressed by drought conditions and degradation of water quality due to increased runoff during intense storms, and spring spawning runs could be nonexistent without ample snowpack melting. The fisheries are already under stress on many major rivers in Maine, and recovery efforts are hampered by climatic change.
Statement of Results and Benefits
The Climate Change and Water Use Workshop will be a meeting between climate change researchers (e.g. NWS/NOAA and University), water scientists (USGS and University), emergency management agencies, state water-related agencies, hydropower companies, agricultural agencies, and fishery-related agencies. The goal is to have this informational sharing meeting result in a draft action plan that identifies key indicators that should be monitored and what actions should be taken when the key indicators change. As an example, we need to determine the balance between maintaining a water impoundment for power production and minimum base flow in a river for fisheries management when there are drought conditions. Another example is determining how many acre-feet of water can be drawn from rivers for irrigating blueberry land when salmon are spawning. Some aspects of water management have been addressed already on an issue-by-issue basis without examination of the broader scale problems. Obviously, some issues may not be resolvable during this workshop and research objectives will be developed in lieu of an action plan.
Part of this proposed effort will involve organizing the workshop. The first step will be assembling the information on the state�s water resources and recent streamflow trends (USGS), weather trends (NWS/NOAA), and climate model projections. The organization committee will then determine what information will be most useful for the workshop. The tentative structure is to have the workshop take place over two days. The first day will be devoted to information transfer with presentations on climate trends, weather patterns, existing water resources, and projections of future demand. Probably this will include presentations from agencies collecting weather and water data and from researchers. The workshop participants will then form working groups according to their specialty (water supply, hydropower, agriculture, fisheries, recreation, etc.). The working groups would meet in the evening of Day 1 with an informal agenda to analyze the presentations and prepare an agenda for Day 2. The second day will start with the working groups preparing draft action plans that will be presented to all participants in the afternoon of the second day. The end result being an exposition of what we know about water resources and weather trends followed by an inventory of what information is needed by each water resource sector. Finally, the results need to be merged into a format so that competing needs can be assessed. It is unrealistic to expect that a consensus will be reached over the course of two days and each working group will need to set a deadline (i.e. 1-month) for finalization of a draft action plan.
After the draft action plans are completed, the Water Research Institute will prepare a summary of the meeting proceedings and a compilation of draft action plans for all participants. This will be followed by a presentation to the Drought Task Force on how to implement the workshop results as a strategic action plan. The Water Research Institute and the USGS will provide technical support as the Drought Task Force debates policy issues. In summary this proposal will form the basis for organizing a workshop and resultant draft action plan that will be a template for managing Maine�s response to weather and climate-induced changes of our water resources. It will provide an opportunity for the main providers of basic information on water resources (USGS) and weather/climate (NWS/NOAA) to work closely with the key people involved in water management policy in Maine.
Nature, Scope and Objectives of the Research
Alteration of weather patterns, possibly an indication of decade-long changes in climate, has been very noticeable in the last two years. These changes have lodged themselves in the popular press, as these news excerpts from the Maine Climate Change web page (www.downeast.net/nonprof/cse/climate.html) and other sources reflect:
Record low snowfall for state (Bangor Daily News, January 10, 2000)
Record heat in our region, nation, and internationally. Over 271 people dead in US from two heat waves. (Bangor Daily News, August 10, 1999)
105-year April-July record drought in MD, NJ, DE and RI. Second driest April-July in NY, CT, MA, and WV, third driest in ME. -- (Bangor Daily News, August 12, 1999).
Federal emergency aid sought to help drought-stricken farmers in the state. (Portland Press Herald, Saturday, August 14, 1999)
Record low ground water in central, western and eastern Maine according to US Geological Survey. (Bangor Daily News August 5, 1999)
Dry spring and summer in region lowers hydropower production, blueberry and corn yield, increases irrigation, lessens boating recreation. (Bangor Daily News, April 16, 1999)
14 out of 16 Maine Counties declared eligible for drought relief. (Bangor Daily News, August 26, 1999)
Arctic temperature has risen 1.5 degrees C since 1965; Inuit food supply disappearing; polar bears are dying and having fewer cubs as ice thins. (The Mail and Guardian, 8/6/99)
Case of malaria from mosquitoes reported in MI and CT in the 1990s for the first time ever.
According to a study commissioned by the Pew Center on Global Climate Change (Frederick and Gleick, 1999), climate change will have a profound effect on water resources. In particular, they predict changes in water quality due to changes in storm intensity, runoff duration, and water temperature. An evaluation of climate models by the U.S. Geological Survey (Wolock and McCabe, 1999) unveiled a range of uncertainty regarding changes in precipitation and runoff (increase or decrease). Either result would affect water resources in a profound manner. The scientific data supports some of the media coverage of climate change effects. According to Robinson and others (1993), the snow pack in higher latitudes has been smaller and snowmelt runoff has been less in the last two decades. Nicholls and others (1996) noted that in recent years there has been earlier lake ice melting, earlier snowmelt flooding and earlier land warming in high latitudes. The discrepancy between local weather and climatic factors may be explained by the scale of the climate models (large areal blocks and decades of time) compared to weather systems (local effects and time periods of hours to days) (Frederick and Gleick, 1999). The implication of these studies is that weather patterns have changed and will continue to change. According to climatic data analysis (NOAA, 1999; Kahl, 1995), changes already affect our region's water resources through extremes in rainfall intensity (24-hour record set in September 1999), drought (summer of 1999) and mean temperature (April through November 1999).
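To see why the precipitation uncertainty flagged by Wolock and McCabe matters so much, consider a toy annual water balance in which runoff is simply precipitation minus evapotranspiration. The numbers below are illustrative assumptions, not Maine data; the point is that a modest precipitation change produces a proportionally larger runoff change.

```python
# toy annual water balance: runoff ~ precipitation - evapotranspiration
precip_in = 42.0   # assumed annual precipitation, inches (illustrative only)
et_in = 22.0       # assumed annual evapotranspiration, inches (illustrative only)

base_runoff = precip_in - et_in
for change in (-0.10, 0.10):   # +/-10% precipitation scenarios
    runoff = precip_in * (1 + change) - et_in
    print(f"precip {change:+.0%} -> runoff {runoff:.1f} in "
          f"({runoff / base_runoff - 1:+.0%})")   # about +/-21%: runoff amplifies the change
```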
The changes in weather patterns and short-term climate in Maine have a direct and measurable relationship with water resources. This includes the mass balance of the hydrologic cycle, the timing of demand and recharge, the maintenance of flowing rivers, the ability to support fisheries, hydropower production, and water-related tourism. When totaled, the state's water resources account for billions of dollars of the economy. The stakeholders are not prepared to react to all the changes in water resources, and it is expected that demands on these resources will only increase over the next decade (Frederick and Gleick, 1999).
In order to manage the human response to climate-induced changes it is crucial to bring together the stakeholders and the researchers evaluate the resources needed to develop a strategic action plan. The Water Research Institute at the University of Maine and the U.S. Geological Survey-Water Resources Division propose to sponsor a Climate Change and Water Use Workshop to develop the criteria for dealing with water resources at risk and to formulate a draft action plan for the next decade. Although the primary benefactor will be the Drought Task Force, the plan will become a basic tool for helping state agencies and water resource managers to deal with weather extremes and climate-induced changes to our water resources. Additional support is expected from the National Weather Service. This pro-active approach to managing the impact of weather and climate changes has never before been applied in Maine.
The Climate Change and Water Use Workshop will be the first organized attempt in Maine to look beyond problem identification and apply research to generate management objectives. To accomplish this project it is necessary to bring together the research community, regulatory agencies, and resource managers so that information can be shared and water resource objectives be melded into a common action plan. The potential scale of water resource change could have a tremendous negative impact on all stakeholders. The scope of the workshop should be broad enough so that the participants can identify how these critical resources can be shared for mutual benefit, or at least minimizing mutual loss.
Our objectives are:
1) To identify the key researchers on climate change and weather patterns. It is essential to get as clear a prediction of decade-scale trends as possible. The USGS is the primary provider of water resource measurements and the NWS/NOAA is the primary provider of weather and climatic data. There are researchers in Maine and the northeast who have particular expertise on climate changes (temperature regime, moisture regime, seasonal changes, etc.). This will be the opportunity to get the most current interpretations of trends to the stakeholders.
2) To enhance communication among stakeholders.
Changes to the quantity or quality of water resources will affect everyone in the state. If trends in water resources cause stress to natural systems, then there will certainly be secondary sociological stress. Since water resources are something that is common to an otherwise diverse and disjointed population, this workshop will be the first step in forging new management strategies. This workshop will give key stakeholders an opportunity to work together to develop a strategic plan to respond to the stresses.
3) To develop a draft strategic action plan.
The University of Maine WRI is taking the lead to foster water resource educational efforts in the state. The workshop will result in a draft action plan for the Drought Task Force that can be used to manage our collective response to changes in the water resources. As a management tool, it is more effective to have a response action plan with some uncertainties than no plan and a certainty of unclear action.
4) To develop a framework for future action.
Once this workshop is complete, a summary of proceedings and a draft action plan will be prepared. The workshop organization will be a template for future workshops if deemed necessary by the stakeholders.
Methods, Procedures and Facilities:
This proposed work will be divided into three phases: (1) workshop organization and operational management, (2) workshop facilitation, and (3) editing of proceedings for publication. WRI and USGS staff will take the lead in workshop preparation, including forming an organization committee to solicit contributors for the technical presentations and identification of key stakeholders (Phase 1). This includes devising the program, soliciting contributions to the program, advertising the meeting through direct mailings, the WRI Web Page, and others means. WRI, USGS, and the organization committee (Phase 2) will manage workshop facilitation. This includes securing physical space for the meeting and solving logistical problems during the workshop. Following the meeting the WRI will collate and publish proceedings of the workshop, including the draft action plan to be developed (Phase 3).
The workshop will be scheduled to occur in mid-summer so that University of Maine facilities can be used for meetings and lodging. This arrangement will also allow participants to familiarize themselves with resources available at the university.
An organization committee is proposed to oversee the workshop and to ensure that this effort is brought to a successful conclusion. This statewide group will include representatives from the federal and state regulatory agencies, water utilities, and hydropower companies, the U.S. Geological Survey, and the National Weather Service, as well as members of the University of Maine staff and associated faculty of the Water Research Institute.
The organization committee will be formed from key stakeholder groups. The Drought Task Force will be the primary beneficiary of this workshop and its membership will be used as a basis for forming the organizing committee. Membership on the task force (Table 1) is from a diverse cross-section of regulatory agencies and businesses.

TABLE 1. Drought Task Force Membership
David Breau - DHS, Drinking Water Program
Steven Burgess - Maine Emergency Management Agency
Mark DesMeules - State Planning Office
Bob Devlin - Maine Municipal Association
Harry Doughty - Department of Conservation
Francis Drake - DHS, Drinking Water Program
Shelley Falk - Department of Agriculture
Rachel Fisher - U.S. Army Corps of Engineers
Wes Hallowell - Florida Power and Light
Ray Hammond - Public Utilities Commission
Tom Hawley - National Weather Service
Steve Kahl - Water Research Institute
Ruth Kitowiez - U.S. Army Corps of Engineers
Robert Lent - U.S. Geological Survey
Steven Levy - Maine Rural Water Association
Chris Lockwood - Maine Municipal Association
Marc Loiselle - Maine Geological Survey
Jeff Martin - Great Northern Paper Company
Jeff McNelly - Maine Water Utilities Association
Brent Mullis - Consolidated Farm Services Administration
Dana Murch - Department of Environmental Protection
Joe Nielsen - U.S. Geological Survey
Tom Parent - Department of Conservation
Arnold Roach - Consolidated Farm Services Administration
Clough Toppan - DHS, Health Engineering
The University of Maine Water Research Institute and USGS will provide the oversight of this committee. In addition to the Drought Task Force, potential membership for this committee will be derived from these organizations and agencies:
U. S. Geological Survey
Maine Geological Survey
Maine Emergency Management Agency
Maine Department of Agriculture
Maine Department of Environmental Protection
Maine Department of Conservation
Maine Department of Inland Fisheries and Wildlife
Maine State Planning Office
Consolidated Farm Services Administration
Paper Products Industry
Penobscot Nation
Hydro-Power Industry
University of Maine, Water Research Institute
Maine Water Utilities
Specific groups with water resource interests that do not participate in the organization committee will be invited to participate in the workshop. This may include out of region experts and citizen action groups.
Previous Climate Change Meetings hosted by the University of Maine. In 1993 the Water Research Institute at the University of Maine sponsored a program entitled, A Regional Response to Global Climate Change: New England and Eastern Canada. This conference was attended by 135 natural resource managers, scientists, and policy makers from New England and Eastern Canada. Nine areas of the regional economy were identified as being sensitive to global climate change:
Tourism and Recreation
Wetland Ecosystems
Coastal Infrastructure
Water Resources
The conference attendees proposed an action plan to address climate change:
Diversify the natural resource based economy,
Reduce risks to human health, ecological communities, and economic infrastructure, and
Share information on climate change issues.
Specific issues needing attention were identified to clarify the three recommendations. Conference presentations and conclusions were summarized in a proceedings volume. The implementation of recommended policy changes has been very limited.
Other Climate Change Conferences in Maine. Pam Person (Coalition for Sensible Energy) co-chaired the "Climate Change in Maine - The Risks and Opportunities" conference in April 1999 in Lewiston, Maine. The conference had participation from 50 sponsoring organizations, 43 members on the Planning Committee, over 70 speakers and 300 attending, with great representation from all sectors. There were speakers from diverse groups - United Technologies (corporate as well as the Maine Pratt and Whitney plant), Maine Chamber and Business Alliance E2 Center, Shaw's Supermarkets, J.D. Irving Corporation, Champion International, J.M. Huber, Bangor Gas, the American Petroleum Institute, Maine Oil Dealers, solar and wind energy producers, Independent Energy Producers of Maine, Bangor and Aroostook Railroad, Maine Organic Farmers and Gardeners Association, Maine Snowmobile Association, Northeast Energy Efficiency Partnerships, The Colony Hotel, the Canadian Home Builders Association, Brunswick Naval Air Station, Town of Wells, City of Portland, and representatives from the Maine congressional delegation. Agencies, academic institutions and non-governmental organizations were well represented. Information was shared on agriculture, climate change, sea level rise, marine resources, ecosystem effects, recreation, economics, and industrial energy efficiency. A Proceedings volume summarizing most of the talks was given to each conference attendee. Actions initiated at this meeting are continuing.

Climate Change and Water Resources in New Brunswick, Canada. The New Brunswick government has been examining the effects of climate change and extreme weather events on its water resources (B. Burrell, New Brunswick Committee on River Ice and the Environment, personal communication, December 1999). Of special concern is the Saint John River and the sensitivity of river ice and flooding to climate change. The government has identified areas that will need to be assessed in response to anticipated climate changes, such as bridge engineering, cold-water fish survival, ice jam flooding, drinking water quality, and coastal flooding.
There is a wealth of weather and climate data available for Maine that is accessible via the internet. These data include real-time weather and stream flow, as well as historical data going back several decades. These resources are the most important part of the water management process.
The U.S. Geological Survey measures snowfall and streamflow at a variety of locations in Maine. These measurements are available in real time (http://aug1dmeags.er.usgs.gov/), and compiled data covering multi-year records for specific stations can be viewed or downloaded. Finding this information and learning how to use it will be an important part of the workshop.
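The real-time URL given in the proposal dates from 2000. As an illustration of the kind of automated retrieval the workshop could teach, here is a sketch against the present-day USGS NWIS daily-values web service; the gauge number and date range are example choices, not ones named in the proposal.

```python
import requests

# USGS NWIS daily-values service (current public endpoint; the proposal's URL is older)
URL = "https://waterservices.usgs.gov/nwis/dv/"
params = {
    "format": "json",
    "sites": "01022500",       # example Maine gauge number (illustrative choice)
    "parameterCd": "00060",    # daily mean discharge, cubic feet per second
    "startDT": "1999-06-01",   # the drought summer discussed in the proposal
    "endDT": "1999-09-30",
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
values = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
flows = [float(v["value"]) for v in values]
print(f"{len(flows)} daily values, mean discharge {sum(flows) / len(flows):.0f} cfs")
```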
The National Weather Service monitors atmospheric conditions and makes projections about weather in the near term (hours to months). The data used to predict weather and measurement of weather conditions (including remote sensing) are available via the internet (www.nws.noaa.gov/er/gyx/main_menu.shtml). The type of information available here that is of greatest importance is prediction of storm events and compilation of weather extrema.
Weather data from weather observations collected over many years are the basis of climate data. Assessing how weather variation can reflect climate change is a complex problem. The objective of using climatic data is to look at long-term trends to help predict the future. In this instance, the focus will be on water resources. Climatic data is accessible via the internet from the Northeast Regional Climate Center at Cornell University (http://www.cit.cornell.edu).
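A minimal sketch of the kind of long-term trend analysis described here, run on synthetic annual temperatures rather than actual Climate Center records:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1970, 2000)
# synthetic annual mean temperatures: a 0.02 deg C/yr trend plus weather noise
temps = 6.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.5, years.size)

slope, intercept = np.polyfit(years, temps, 1)   # least-squares linear trend
print(f"fitted trend: {slope * 10:.2f} deg C per decade")
```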
References

Boyle, K., J. Shuetz, and J.S. Kahl (1997). Great Ponds Play an Integral Role in Maine's Economy. Dept. Res. Economy Policy Paper REP473, 50 p.

Frederick, K., and P. Gleick (1999). Water & Global Climate Change: Potential Impacts on U.S. Water Resources. Pew Center on Global Climate Change, 55 p.

Kahl, J.S. (1995). Climate change and surface water resources in Maine. Report for the Maine Environmental Priorities Project, Technical Appendix V.1, 6 p. Maine DEP, Augusta, ME.

Maine Department of Environmental Protection (1996). State of Maine 1996 Water Quality Assessment. Doc. No. DEPLW96-15, 206 p.

National Oceanic and Atmospheric Administration (1999). Climate of 1999, Annual Review - Preliminary Report. National Climatic Data Center.

Nicholls, N., G.V. Gruza, J. Jouzel, T.R. Karl, L.A. Ogallo, and D.E. Parker (1996). Observed Climate Variability and Change. In Climate Change 1995: The Science of Climate Change, IPCC Working Group I Assessment. Cambridge University Press, Cambridge.

Robinson, D.A., K.F. Dewey, and R.R. Heim, Jr. (1993). Global Snow Cover Monitoring: An Update. Bull. of the Amer. Meteorological Soc. 74: 1689-1696.

Wolock, D.M., and G.J. McCabe (1999). Simulated Effects of Climate Change on Mean Annual Runoff in the Conterminous United States. Paper presented at the American Water Resources Association's conference, Potential Consequences of Climate Variability and Change to Water Resources of the United States, Atlanta, GA (May 10-12).
Under Its Frozen Exterior, Scientists Say Europa's Ocean Is Salty Like Ours
By editor
Mar 6, 2013

The mosaic was constructed from individual images obtained by the Solid State Imaging (SSI) system on NASA's Galileo spacecraft during six flybys of Europa between 1996 and 1999.
NASA/JPL-Caltech/University of Arizona
Here's a quote we found awe-inspiring: "If you could go swim down in the ocean of Europa and taste it, it would just taste like normal old salt."

That's California Institute of Technology (Caltech) astronomer Mike Brown talking about Jupiter's moon Europa. Brown and his colleague Kevin Hand from NASA's Jet Propulsion Laboratory believe that if you could drill your way through the moon's frozen exterior, the ocean beneath it would taste a lot like our own sea water.

How, you are wondering, did the scientists come to this conclusion? The simple answer is that they looked at the moon's surface and, using spectroscopy — that is, inferring physical properties through analysis of an object's light — they were able to discern the chemical makeup. But the remarkable part is not that. The remarkable part about this study, which is scheduled to be published in the Astronomical Journal, is that they were able to get past all the frozen stuff to find out what the ocean beneath is made of. They were able to do that because bits of the ocean are working their way toward the surface.

"We now have evidence that Europa's ocean is not isolated—that the ocean and the surface talk to each other and exchange chemicals," Brown said in a press release issued by the W.M. Keck Observatory, which they used to make the measurements.

The end-game here is that scientists have always thought that where there is water, there's a chance for life. So NASA reports:

"Europa is considered a premier target in the search for life beyond Earth, Hand said. A NASA-funded study team led by JPL and the Johns Hopkins University Applied Physics Laboratory, Laurel, Md., has been working with the scientific community to identify options to explore Europa further. 'If we've learned anything about life on Earth, it's that where there's liquid water, there's generally life,' Hand said. 'And of course our ocean is a nice, salty ocean. Perhaps Europa's salty ocean is also a wonderful place for life.'"

Copyright 2013 NPR. To see more, visit http://www.npr.org/.
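The article does not give Brown and Hand's actual fitting procedure. As a loose illustration of the general idea behind spectroscopy-based composition estimates, here is a sketch that models an observed spectrum as a least-squares mixture of two laboratory "endmember" spectra; every spectrum in it is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(1.0, 2.5, 60)            # microns, illustrative band

# synthetic stand-ins for laboratory reference spectra
ice = np.exp(-((wavelengths - 1.5) ** 2) / 0.02)   # "water ice" endmember
salt = np.exp(-((wavelengths - 2.2) ** 2) / 0.05)  # "hydrated salt" endmember

# a fake "observed" surface spectrum: 70% ice, 30% salt, plus noise
observed = 0.7 * ice + 0.3 * salt + rng.normal(0.0, 0.01, wavelengths.size)

# recover the mixing fractions by ordinary least squares
A = np.column_stack([ice, salt])
coeffs, *_ = np.linalg.lstsq(A, observed, rcond=None)
print(f"fit fractions: ice {coeffs[0]:.2f}, salt {coeffs[1]:.2f}")
```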
---
Yahoo Buys News App From British Teenager For A Reported $30 Million
By Jeff Brady
Originally published on March 27, 2013 5:58 pm

Transcript

AUDIE CORNISH, HOST: A British teenager has sold his mobile application to Yahoo for a reported $30 million. Seventeen-year-old Nick D'Aloisio created his app called Summly when he was only 15. As NPR's Jeff Brady reports, the teen will now go to work for Yahoo.

JEFF BRADY, BYLINE: When Yahoo was founded in 1995, Nick D'Aloisio wasn't even born yet. He says he taught himself to program computers using books and video tutorials. In an advertisement for the launch of his application, D'Aloisio says he was looking for a way to skim through the oceans of news articles on the Internet.

NICK D'ALOISIO: So I started a company called Summly. You tell it your interests and it shows you summarized content. But instead of just a headline, Summly gives you 400 characters. That's more than a tweet, but less than a full article.

BRADY: It's a simple concept, but one that attracted loyal advocates. Speaking on CNN, D'Aloisio says he originally created the app under a different name.

D'ALOISIO: So I released a demo of the application when I was 15 years old and it was called Trimit. And I've been told that the investors read about this on a few technology blogs and it was actually featured by Apple as one of their apps of the week.

BRADY: D'Aloisio says investors flew to London to meet him and committed $300,000 to the project. In November, Summly launched its iPhone application. The company says a half million people downloaded it in the first month. On Monday, Yahoo announced it was buying Summly. Yahoo will shut down the application and incorporate it into the company's products. D'Aloisio and two of his colleagues will become Yahoo employees.

D'ALOISIO: I'll be focusing on other projects kind of on the side as well as completing my A-level exams.

BRADY: Those are the tests D'Aloisio will take to get into college. That's right. He's developed a company, attracted investment and sold the whole thing for millions even before entering college. Kara Swisher is executive editor at All Things Digital. She says Yahoo is buying more than D'Aloisio's app.

KARA SWISHER: He brings, you know, a great story. If you notice, there are about 20 stories about, you know, Yahoo boy genius kind of thing. He's very good at getting good PR. He's a lovely kid. Very smart.

BRADY: Swisher says in Yahoo's announcement, the company focused on that and didn't reveal how much it will pay for Summly. Swisher pressed her industry sources for that information.

SWISHER: It was $30 million. It's mostly in cash, 90 percent in cash and 10 percent in stock.

BRADY: The positive media coverage comes at a good time for Yahoo. CEO Marissa Mayer was widely criticized recently for a decision to end the practice of employees working at home. Swisher says this narrative plays right into Mayer's effort to transition Yahoo from desktop computers.

SWISHER: It sort of gets people thinking about Yahoo as a mobile company, even though they've been really a laggard in that area for many, many years.

BRADY: This is the latest in a string of acquisitions Yahoo has announced as Mayer tries to overhaul the company. The news prompted investors to bid up Yahoo's stock a few percent. The company says it expects to complete its purchase of Summly within a few months. Jeff Brady, NPR News.

(SOUNDBITE OF MUSIC)

CORNISH: I'm Audie Cornish. And you're listening to ALL THINGS CONSIDERED from NPR News.

Transcript provided by NPR, Copyright NPR.
---
We are losing species of birds, along with every other type of living creature, at a rate unprecedented in history. Dr. Stuart Pimm, world-renowned conservation biologist, summarized the severity of the situation in an 1998 interview with the American Museum of Natural History:
There are about 10,000 species of birds on the planet at the moment. If those species were going extinct at a geological rate, we ought to see one extinction about every 100 years. In other words, it ought to be, at best, a one-in-a-lifetime event. I have seen species that are now extinct — several of them — extinctions that I have known and hated. It’s not hard to go around the world and say good-bye to about one or two or three species per year. That very immediacy of those extinctions — of birds that we know very well — tells us that the current extinction crisis represents an extinction rate that must be at least 100 times what it ought to be.
It’s so easy to get caught up in the lament over loss of species that we sometimes forget that there are at least a few feathered friends as yet unmet. Just recently, in fact, a bird species has been added to the plus, rather than the minus, column, as reported by Birdlife International. Please welcome the Serendib Scops Owl.
The Serendib Scops-Owl (Otus thilohoffmanni) is a small, short-tailed scops-owl. This bird lacks apparent ear-tufts and has a weakly-defined facial disk. Small, rufous, and earless, the Serendib Scops Owl is quite unlike any other owl in Sri Lanka or anywhere else in the Indian subcontinent. Its closest relative is the Reddish Scops Owl (Otus rufescens), which ranges from Thailand to Indonesia. The Serendib joins 47 other species of the Otus genus, all members of the family Strigidae, which are commonly known as 'typical owls' as opposed to Tytonidae, or barn owls. Screech owls, sometimes given the genus Megascops, are also considered part of the Otus entourage. These owls can be found all over the world.
More than fifty years ago, an authority on the matter stated, "It is most improbable that a bird entirely new to science could now exist in Ceylon." How is it, then, that a new species could be discovered on the Resplendent Isle for the first time since 1868? Birders know that, in the field, ears come before eyes. Deepal H. Warakagoda, a bird watcher with a background in electronics, first heard the Serendib's unfamiliar call in 1995. He was able to tape the vocalization at a number of opportunities, but didn't spot his quarry until January of 2001. The next month, a photographer captured the very first photos of the serendipitous scops. True capture, via mist-net, occurred in August of that year. Now that the bird's discovery has been published in the Bulletin of the British Ornithologists' Club (Vol. 124, 2004), we can accept it as official.
The good news is that we have a new bird. The bad news is that it’s already endangered. The Serendib Scops Owl is only found in the government-protected lowland rainforests of the south-west quarter of Sri Lanka. Like so many species, it seems to require unbroken tracts of a certain size. As of January 2004, only 45 individuals have been counted. There may be more, but it is likely that the boundaries of its habitat exert a terminal cap on the owls’ potential population.
With luck, we will continue to discover brand new bird species faster than we lose them. But as fortune gives, it also takes away. Consider the poor Cozumel Thrasher, endemic to that resort island off the coast of Mexico. At one time, the thrasher population numbered about 10,000, but most of them apparently died in the wake of Hurricane Gilbert in 1988. Right now, their numbers have dwindled a bit; as far as anyone knows, there is only one Cozumel Thrasher left. So, don’t expect us to change the name of this site just yet.
---
NASA marks 10 years since loss of Columbia, crew
By MARCIA DUNN, Associated Press
A wreath placed at the Space Mirror Memorial is seen during a remembrance ceremony on the 10th anniversary of the loss of the space shuttle Columbia crew at the Kennedy Space Center Visitor Complex.

CAPE CANAVERAL, Fla. — Schoolchildren joined NASA managers and relatives of the lost crew of space shuttle Columbia on Friday to mark the 10th anniversary of the tragedy and remember the seven astronauts who died.
More than 300 people gathered at Kennedy Space Center for the outdoor ceremony, just a few miles from where Columbia was supposed to land on Feb. 1, 2003, following a 16-day science mission. It never made it, bursting apart in the sky over Texas, just 16 minutes from home.
Representing the families of the Columbia seven, the widow of commander Rick Husband told the hushed audience that the accident was so unexpected and the shock so intense, “that even tears were not freely able to fall.”
“They would come in the weeks, months and years to follow in waves and in buckets,” said Evelyn Husband Thompson.
She assured everyone, though, that healing is possible and that blessings can arise from hardships. She attended the ceremony with her two children, her second husband and Sandra Anderson, widow of Columbia astronaut Michael Anderson.
“God bless the families of STS-107,” said Thompson, referring to the mission designation for Columbia’s last mission. “May our broken hearts continue to heal and may beauty continue to replace the ashes.”
A pair of songs added to the emotion of the day. The young nephew of a NASA worker performed a song he wrote, “16 Minutes from Home,” on the keyboard, along with a vocalist. And Grammy award-winning BeBe Winans, an R&B and gospel singer, performed “Ultimate Sacrifice,” which he wrote for soldiers serving overseas in harm’s way.
As it turns out, Anderson had taken a CD of Winans’ music into orbit with him. It was recovered in the debris that rained down on East Texas that fateful morning. Winans did not know that until it was mentioned at Friday’s ceremony.
“I honor you today, I really do honor the families and those who have given the ultimate sacrifice,” he added. Some in the crowd wiped away tears as he sang.
Also present were 44 students from Israel, the homeland of Columbia astronaut Ilan Ramon. He was Israel’s first astronaut.
The teenagers were proud to note that they go to the same school as Ramon once did. They wore white sweat shirts with an emblem of their nation’s first spaceman and the religious items he took into orbit.
“He represented Israel in the best way possible, so I think it’s an honor for us to be here,” said Eden Mordechai, 15.
The other Columbia crew members were co-pilot William McCool, Kalpana Chawla, Dr. Laurel Clark and Dr. David Brown.
NASA’s human exploration chief, Bill Gerstenmaier, said no single person or event caused the Columbia disaster. Rather, “a series of technical and cultural missteps” were to blame, dating back to the first shuttle launch in 1981 when fuel-tank foam insulation started coming off and doing damage.
A chunk of foam punched a hole in Columbia’s left wing during liftoff, leading to the catastrophic re-entry.
The astronaut who led the charge back to shuttle flight two years later, Eileen Collins, stressed that the 30-year shuttle program had its share of successes along the way and achieved its ultimate goal, building the International Space Station. The shuttles were retired in 2011.
“We still miss you,” Collins said of the Columbia seven. “How can we ever thank you for your contributions to the great journey of human discovery.”
The hourlong ceremony was held in front of the huge black granite monument bearing the names of all 24 astronauts who have died in the line of NASA duty. The three-man crew of Apollo 1 died in the Jan. 27, 1967, launch pad fire. The Challenger seven were killed Jan. 28, 1986, during liftoff. Husband and his crew honored them during their own flight, just four days before dying themselves.
On Friday, the names of each of the dead were read aloud. Afterward, mourners placed carnations and roses on the grating in front of the mirror-faced monument.
“I felt compelled to be here to memorialize those who were a big part of my life,” said David Nieds, 39, a grocery store manager who got up early to drive from Fort Lauderdale with his mother and 16-year-old nephew.
He attended dozens of launches. Some people like sports, he explained, while he follows the space program.
Memorial services also were held at Arlington National Cemetery, where three of the Columbia crew are buried; in East Texas, where the shuttle wreckage fell; and in Israel. | 科技 |
---
Study: Arctic getting darker, making Earth warmer
Feb 18th 2014 5:49AM
WASHINGTON (AP) -- The Arctic isn't nearly as bright and white as it used to be because of more ice melting in the ocean, and that's turning out to be a global problem, a new study says.
With more dark, open water in the summer, less of the sun's heat is reflected back into space. So the entire Earth is absorbing more heat than expected, according to a study published Monday in the Proceedings of the National Academy of Sciences.
That extra absorbed energy is so big that it measures about one-quarter of the entire heat-trapping effect of carbon dioxide, said the study's lead author, Ian Eisenman, a climate scientist at the Scripps Institution of Oceanography in California.
The Arctic grew 8 percent darker between 1979 and 2011, Eisenman found, measuring how much sunlight is reflected back into space.
"Basically, it means more warming," Eisenman said in an interview.
The North Pole region is an ocean that mostly is crusted at the top with ice that shrinks in the summer and grows back in the fall. At its peak melt in September, the ice has shrunk on average by nearly 35,000 square miles - about the size of Maine - per year since 1979.
Snow-covered ice reflects several times more heat than dark, open ocean, which replaces the ice when it melts, Eisenman said.
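To see what that darkening means in energy terms, here is a rough back-of-envelope sketch in Python. The insolation and starting albedo below are illustrative assumptions, not figures from the study; only the 8 percent darkening comes from the article.

```python
# Back-of-envelope: extra solar energy absorbed as the Arctic darkens.
# The insolation and 1979 albedo are illustrative assumptions.

summer_insolation = 200.0  # W/m^2, assumed mean summer sunlight over the Arctic
albedo_1979 = 0.52         # assumed fraction of sunlight reflected in 1979
darkening = 0.08           # the study's reported 8% drop in reflected sunlight

albedo_2011 = albedo_1979 * (1 - darkening)  # 8% less light reflected
extra_absorbed = summer_insolation * (albedo_1979 - albedo_2011)

print(f"Albedo fell from {albedo_1979:.2f} to {albedo_2011:.3f}")
print(f"Extra energy absorbed: ~{extra_absorbed:.1f} W per square meter of Arctic")
```

With these assumed inputs the sketch gives roughly 8 extra watts absorbed per square meter of Arctic, which is the right order of magnitude for a change large enough to register against carbon dioxide's global heat-trapping effect.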
As more summer sunlight dumps into the ocean, the water gets warmer, and it takes longer for ice to form again in the fall, Jason Box of the Geological Survey of Denmark and Greenland said in an email. He was not part of the study.
While earlier studies used computer models, Eisenman said his is the first to use satellite measurements to gauge sunlight reflection and to take into account cloud cover. The results show the darkening is as much as two to three times bigger than previous estimates, he said.
Box and University of Colorado ice scientist Waleed Abdalati, who was not part of the research, called the work important in understanding how much heat is getting trapped on Earth. | 科技 |
---
Biodegradable Nanoparticles Can Carry DNA to Brain Cancer Cells
Published on April 30, 2014 at 5:47 AM
Written by AZoNano, Apr 30 2014
Working together, Johns Hopkins biomedical engineers and neurosurgeons report that they have created tiny, biodegradable "nanoparticles" able to carry DNA to brain cancer cells in mice.
Biodegradable plastic molecules (orange) self-assemble with DNA molecules (intertwined, black circles) to form tiny nanoparticles that can carry genes to cancer cells. Credit: Stephany Tzeng
The team says the results of their proof of principle experiment suggest that such particles loaded with "death genes" might one day be given to brain cancer patients during neurosurgery to selectively kill off any remaining tumor cells without damaging normal brain tissue.
A summary of the research results appeared online on April 26 in the journal ACS Nano.
"In our experiments, our nanoparticles successfully delivered a test gene to brain cancer cells in mice, where it was then turned on," says Jordan Green, Ph.D., an assistant professor of biomedical engineering and neurosurgery at the Johns Hopkins University School of Medicine. "We now have evidence that these tiny Trojan horses will also be able to carry genes that selectively induce death in cancer cells, while leaving healthy cells healthy."
Green and his colleagues focused on glioblastomas, the most lethal and aggressive form of brain cancer. With standard treatments of surgery, chemotherapy and radiation, the median survival time is only 14.6 months, and improvement will only come with the ability to kill tumor cells resistant to standard treatments, according to Alfredo Quiñones-Hinojosa, M.D., a professor of neurosurgery at the Johns Hopkins University School of Medicine and a member of the research team.
Because nature protects the brain by making it difficult to reach its cells through the blood, efforts turned to the use of particles that could carry tumor-destroying DNA instructions directly to cancer cells during surgery.
The initial experiments made use of cancer cells that Quiñones-Hinojosa and his team removed from willing patients and grew in the laboratory until they formed little spheres of cells, termed oncospheres, likely to be the most resistant to chemotherapy and radiation, and capable of creating new tumors.
Quiñones-Hinojosa then worked with Green to find a vehicle for genes that would cause death in the oncospheres. Green's laboratory specializes in producing tiny, round particles made up of biodegradable plastic whose properties can be optimized for completing various medical missions. By varying the atoms within the plastic, the team can make particles that have different sizes, stabilities and affinities for water or oil. For this study, Green's team created dozens of different types of particles and tested their ability to carry and deliver a test sequence of DNA — specifically a gene for a red or green glowing protein — to the oncospheres.
By assessing the survival of the cells that engulf the particles and measuring the levels of red or green light that they emitted, the researchers determined which formulation of particles performed best, then tested that formulation in mice with human brain cancer derived from their patients.
They injected the particles directly into mice with an experimental human brain cancer, and into the brains of healthy mice for use as comparison. Surprisingly, healthy cells rarely produced the glowing proteins, even though the DNA-carrying particles were entering tumor cells and non-tumor cells in similar numbers. "This is exactly what one would want to see, cancer specificity, but we are still researching the mechanism that allows this to occur," says Green. "We hope our continued experiments will shed light on this so that we can apply what we learn to other scenarios."
"It is exciting to have found a way to selectively target gene delivery to cancer cells," says Quiñones-Hinojosa. "It's a method that is much more feasible and safer for patients than traditional gene therapy, which uses modified viruses to carry out the treatment."
He adds that the particles can be freeze-dried and stored for at least two years without losing their effectiveness. "Nanoparticles that remain stable for such a long time allow us to make up formulations well in advance and in large batches," says Stephany Tzeng, Ph.D., a member of Green's team. "This makes them easy to use consistently in experiments and surgeries; we add water to the particles, and they're good to go."
In a related study, published online on March 27 in the same journal, Green's group also showed that a different particle formulation could effectively carry and deliver so-called siRNAs to brain cancer cells. siRNAs are very small molecules that carry genetic information to cells, but unlike DNA that can turn genes on, siRNA interferes with the production of particular proteins and can turn cancer genes off.
Green explains that siRNAs must be encapsulated in particles that are different from those used to carry DNA because siRNAs are about 250 times smaller than the DNA molecules usually used for gene therapy. "siRNAs are also much stiffer than DNA, and they don't have to enter the cell nucleus because they do their work outside it, in the cytoplasm," he says.
An initial library of 15 biodegradable particle formulations was tested for their ability to carry siRNAs into human glioblastoma cells that were genetically engineered to make green fluorescent protein (GFP). The siRNAs added to the particles contained the GFP code, so successfully targeted cells would stop glowing green.
By tweaking the chemical properties of the particles, the team was able to find a composition that decreased GFP's glow in the human brain cancer cells by 91 percent. To test the ability of the particles to deliver death-inducing siRNAs, the team loaded the particles with a mixture of siRNA codes designed to prevent important proteins from being made. They then added these particles to brain cancer cells and to non-cancerous brain cells growing in the laboratory.
As in their mouse study, the siRNA was more effective — in this case at causing cell death — in the brain cancer cells (up to 97 percent effective) than in the non-cancerous cells (0 to 27 percent, depending on nanoparticle type).
Green emphasizes that for nanoparticle-based genetic therapies that are safe for patients, the specific siRNA or DNA being delivered in a clinical treatment would be carefully chosen so that, even if there was off-target delivery to healthy cells, it would only be detrimental to cancer cells. Green is encouraged by the results so far. "Combining what we've learned in these two studies, we might even be able to design particles that can deliver DNA and siRNA at the same time," he says. "That would allow us to fine tune the genetic self-destruct code that our particles deliver so that cancer cells die and healthy cells do not."
"Dr. Green and his colleagues have taken important steps in developing polymeric nanoparticles for DNA and siRNA delivery, with promising specificity for tumor cells and enhanced stability," said Jessica Tucker, Ph.D., program director for drug and gene delivery systems and devices at the National Institute of Biomedical Imaging and Bioengineering, which provided partial funding for these studies. "Though many challenges still remain, such work could potentially transform treatment outcomes for patients with glioblastoma and related brain tumors, for which current therapies provide limited benefits."
Source: http://www.hopkinsmedicine.org/ | 科技 |
---
The high cost of our throwaway culture
In the second of a two-part series, Gaia Vince explains how our consumer appetite is draining the planet of its resources, and why it is time to act more responsibly.
By Gaia Vince
As I said in my previous column, we are the biggest force in moving the planet’s rocks and sediments around. Our global extractions are environmentally damaging and depleting some resources to the extent that they are in danger of running out.Many of those resources find their way into the goods, gadgets and machines that we find indispensable in our everyday lives. But do we really need so much stuff, or are we simply addicted to the new? There's no doubt that our consumption of resources from food to gadgets has risen dramatically over the past 60 years, and much of the world seems to be in the grip of a shopping epidemic. But is there a conscious effort driving it?There have been many times during my travels when I've needed something repaired, from rips in my backpack, to holes in my clothes, zippers that have broken or memory cards that have lost data. From India to Ethiopia, I have had no trouble in finding someone who can sort the problem out, repair what is broken or find an ingenious way of side-stepping the issue. In rich countries, such items often would be thrown away and replaced with new ones without a second thought. But the developing world is still full of menders, make-doers and inspired users of others’ scrap. I’ve seen a bicycle in Nairobi made from bits of car, a colander and a leather belt; aerials made from all manner of implements; and houses constructed out of old boat sails, rice sacks and plastic drinks bottles.But then there are those items that seemingly can’t be repaired. My camera shutter, battered by the dust and grime of travel, no longer works. I'm told I should throw away my camera, even though it works fine, apart from the shutter mechanism. Like the majority of consumer electronics, my camera has not been designed to be easily reparable. Thirty years ago, I could have found service manuals and spare parts for all camera models, as well as a thriving repair industry. But things have changed. Camera models have got far more numerous and complicated, and manufacturers no longer release repair manuals.Since the mobile phone handset market reached saturation in Europe and the United States about a decade ago, we have chosen not to wait for our devices to fail. Almost all new phones purchased are “upgrades”, replacing functioning phones simply for reasons of fashion or for technological additions that many of us rarely use, and which could otherwise easily be achieved through software upgrades to existing handsets. Indeed, most phone companies compete for business by automatically upgrading customers’ phones every year. And these smart phones use mined resources from some of the most ecologically sensitive areas of the world, as a recent Friends of the Earth campaign points out.Made to failThe idea that something that works fine should be replaced is now so ingrained in our culture that few people question it. 
But it is a fairly recent concept, brought about by a revolution in the advertising and manufacturing industries, which thrived on various 20th century changes, including the mass movement of large populations to cities, the development of mass production, globalisation, improved transport, international trade and public broadcast media.The earliest example of manufacturers convincing people to frequently replace a product may be the so-called “lightbulb conspiracy”, in which a group of companies is supposed to have orchestrated the Phoebus Cartel to prevent companies from selling lightbulbs with a longer than 1,000-hour lifespan (most settled on 750 hours), even though bulbs lasting more than 100,000 hours existed. The cartel said its intent was to develop international standards, but the net result was that households needed to replace their bulbs regularly, providing a far larger consumer market.This way of selling more products by designing products that deliberately fail, cannot be repaired, or have a set lifespan imposed in some other way is known as planned obsolescence. However, it is not just a cynical plot dreamed up by manufacturers to boost profits, many politicians and economists believe it to be a societal necessity. The idea was born in the US during the 1930s depression as a way to get the economy moving again by compelling people to buy more stuff. There were plenty of factories and masses of unemployed looking for gainful work, the trouble was the people who could afford to buy things already had them. What was needed, strategists proposed, was a reason for shoppers to buy things they already had, or didn’t know they “needed”.By the 1950s, planned obsolescence had become the dominant paradigm in mass production with things no longer built to last. A sophisticated advertising industry persuaded people to shop. Mechanisms flourished to make this easier, from department stores to credit. Consumerism was born. Some industries, such as fashion are predicated on planned obsolescence, with items being made to last a single season or less. Other industries are following fashion’s high-turnover model and bringing out products that have cosmetic gimmicks or seasonal appeal but which will soon appear dated.Computer says noMany people wonder whether electronic products are being designed to fail, from television sets, which have heat-sensitive condensers deliberately fitted onto the circuit board next to a heat sink connected to the transistors, to washing machines with ball-bearings fitted inaccessibly into the drums so they cannot be replaced. Specific lifespans are programmed by the manufacturers into chips in some equipment, so that printers will stop working after a preset number of pages (sending a message: “internal error”), coffee makers will cease functioning after a preset quota of brews, and memory cards will stop uploading after a preset number of photo uploads. The user is then forced to buy a replacement. Some computers are difficult to upgrade – new versions of software are upwardly compatible so files won’t work on older models or, as in the new Apple MacBook Pro, the RAM is soldered to the motherboard, making it difficult or costly to upgrade or replace the hard drive and battery. As a result, users can find themselves buying a new computer every couple of years.Another trick is to make parts and accessories incompatible between brands or even models of the same brand. 
Thus, not only do consumers need to buy a different memory card or battery or charger for each device or brand of electronic equipment they buy, from phones to laptops to toothbrushes, but they must also buy a new charger or adaptor when they upgrade from an iPhone 4 to the iPhone 5, for example.Some consumers are starting to hit back, though, advising people on the internet how to find and remove the printer chip or over-write the memory card software. But this is technically challenging and time-consuming, and most people aren’t even aware of the reason for their device no longer working.Californian Kyle Wiens is the type of sophisticated electronics tinkerer that emperors of the IT industry like Microsoft and Apple dream of employing. But Kyle is a guerilla geek – he slipped through their net and crossed over to the consumer side. Instead of dreaming up intricate ways of fitting more and more components into ever-slimmer gadgets, Kyle is taking them apart.It started a decade ago, when he was a student and his iBook stopped working, Kyle explains. “I couldn’t get find a service manual for it, so as my friend [fellow student Luke Soules] and I took it apart to fix it, we created our own manual.”Kyle and Luke began taking apart computers and other equipment, getting around the copyright protection by creating their own service manuals from scratch, and posting them online for free. The pair called their site iFixit.com and over the past decade, they have expanded to hundreds more manuals, encouraging home hackers to create their own manuals for the site – “we’re a Wikipedia of service manuals” – and they also sell the tools and spare parts users need to fix their own electrical devices.They are contributing to a cultural shift in perception, Kyle says. “When someone has repaired their broken iPad or whatever, it’s a life-changing experience. That person will become a different consumer. They will choose new products based on how long-lasting they are and how easy they are to take apart and repair.”Market forcesAt the moment, there is little onus on manufacturers to improve, and indeed things may be getting worse. As Apple’s beautiful-looking and increasingly thin gadgets take an ever-larger share of the market, other manufacturers are following suit. Motorola and Nokia, both of which were known for their durable phones with easily replaced long-lasting batteries, have now both released thin phones with glued-in batteries.But some companies have been bucking the trend. HP and Dell release service manuals and make computers that are easily upgradable and repairable – iFixit.com recently gave HPs Z1 model a score of 10 out of 10 for repairability.And other companies are joining the move towards a circular economy, in which economic growth is uncoupled from finite-resource-use. Instead of the linear manufacturing route: mining materials, fabricating, selling, throwing them away; a circular economy is based around making products that are more easily disassembled, so that the resources can be recovered and used to make new products, keeping them in circulation. 
British yachtswoman Ellen MacArthur is a strong advocate of the concept and commissioned a report into the idea, which found that the benefits to Europe’s economy alone could be $630 billion, based on cycling just 15% of materials in 48% of manufacturing and just being recycled once.The high-end outdoors clothing company Patagonia, for example, issues a guarantee that it will repair any of its products for free over their lifetime, and employs a team of seamstresses to do just that. And in 2011, it actually ran an advertising campaign asking consumers to buy less of the stuff they don’t need, to curb waste and environmental damage.As mines become depleted, it could soon force the market to change its recent ways. During the course of the 20th century there was a steady fall in the price of commodities, but that price has been climbing since 2000. “We’re moving towards a resource-limited economy, which means manufacturers are going to have to move their business model towards remanufacturing and repairs, as the cost of rare earth metals and other resources slash profits,” Kyle says. Perhaps, then, planned obsolescence will begin to reach its own expiration date.Do you agree with Gaia? If you would like to comment on this article or anything else you have seen on Future, head over to our Facebook page or message us on Twitter.
---
$7b proposed for particle study
By Jia Hepeng
Updated: 2007-02-09 06:45

International scientists yesterday proposed a $7-billion-plus plan to build an international linear collider (ILC) to find unknown particles and conduct studies.

The international scientific community has been working for two decades to build an ILC, but they agreed on a budget only yesterday at an International Committee for Future Accelerators (ICFA) meeting in Beijing. The meeting continues today.

Scientists hope the proposed ILC will be completed by 2016, but it is yet to be decided where it will be based or how the funds would be realized.

A linear collider is a gigantic device that makes electron beams hit each other, and scientists can find unknown particles by studying the collision results.

Scientists say the collisions can create an array of new particles that could answer some of the most fundamental questions about the nature of the universe, such as the origin of mass, dark matter and dark energy.

The collider will hurl about 10 billion electrons and other particles toward each other at almost the speed of light, making them collide 1,400 times every second and creating extremely high levels of energy: 500 billion electronvolts (GeV).

The energy level will increase to 1,000 GeV, or 1 trillion electronvolts (TeV), in the second stage. GeV and TeV are units of energy used in particle physics. One TeV is roughly the kinetic energy of a flying mosquito. But the proposed ILC will squeeze a TeV into a space about a million million times smaller than the size of a mosquito.

There are more than 70 colliders across the world, but most of them are not large and long enough to enable scientists to observe the basic nature of a material. Many of them, like the Beijing Electron Positron Collider (BEPC), have a ring-like structure akin to a giant tennis racket.

Difference in colliders

The proposed ILC will be different from the existing colliders, and will be built in a 31-kilometre-long tunnel that can be extended up to 50 kilometres in the second stage.

The costs, ICFA scientists said, will include $1.8 billion for site-related construction, $4.9 billion for high-tech equipment and for 2,000 scientists and engineers working on the construction.

Over 1,000 scientists and engineers from 100 universities and laboratories across two dozen countries are already working for an ILC. Albrecht Wagner, a leading ICFA scientist, said the US National Academy of Sciences had suggested the US Government be the base for the project.

"Chinese scientists and industries have contributed a great deal to the development of an ILC by providing theoretical know-how and advanced equipment," Wagner said.

Chen Hesheng, director of the Chinese Academy of Sciences' Institute of High-energy Physics, said China's degree of involvement and whether it would be a candidate to host the huge project would be decided by the country's top policymakers, in accordance with its economic strength and industrial development level.

"But one thing is clear. The greater involvement of Chinese scientists in ILC will not only promote the international collider, but also boost the country's scientific capacity and train a new generation of physicists," Chen said.

(China Daily 02/09/2007 page3)
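As a quick sanity check on the mosquito comparison above, the arithmetic is simple. The mosquito's mass and speed below are rough assumptions for illustration; the electronvolt conversion is the standard SI value.

```python
# Sanity check on the "one TeV = a flying mosquito" analogy.
# Mosquito mass and speed are rough assumptions for illustration.

eV = 1.602176634e-19   # joules per electronvolt (exact SI definition)
TeV = 1e12 * eV        # one trillion electronvolts, in joules

mosquito_mass = 2.5e-6   # kg, assumed (~2.5 milligrams)
mosquito_speed = 0.4     # m/s, assumed cruising speed

kinetic_energy = 0.5 * mosquito_mass * mosquito_speed**2
print(f"1 TeV = {TeV:.3e} J")
print(f"Flying mosquito ~ {kinetic_energy:.3e} J ~ {kinetic_energy / TeV:.1f} TeV")
```

With these assumptions a mosquito in flight carries on the order of 1 TeV of kinetic energy, which is why the analogy works: the collider's feat is packing that everyday amount of energy into a subatomic volume.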
---
EDITORIAL: MSNBC.com Report on U.S. "Nuclear Risks" Features Many Flaws
Today's MSNBC report on U.S. nuclear risks misinterpreted government data and overstated realistic risks by as much as two orders of magnitude. For example the site stated that the Indian Point 3 reactor (pictured) had a 1 in 10,000 chance of core damage from an earthquake. The actual estimate is one in 670,000. (Source: Mike Segar / Reuters)
Misinformed by the media, many in the public are stocking up on radiation pills and suggesting banning nuclear power. (Source: FOE Europe)
The Japanese government is also releasing contradictory and alarming information. According to its latest statement no cores have been breached, so there's no immediate danger to the population, even in this "worst case" scenario. (Source: The Times)
Fear, uncertainty, disinformation -- news sites offer misinformation, speculation on nuclear power for profit
The nuclear crisis in Japan is drawing international attention. And there's plenty of misinformation in current media reports. We wanted to examine a couple of the top reports circulating, including a report on the risk of a similar disaster occurring in the United States.
I. Is the U.S. at Risk? Do You Want the Truth?
Are you at risk of a quake breaking a nuclear plant's core containment vessel and exposing you to potentially cancer-causing levels of radiation? Yes. You are also at risk of dying from lightning, getting mauled by a pig, being killed by a falling coconut, and a myriad of other unforeseen, unlikely events.
But according to a joint U.S. Nuclear Regulatory Commission (NRC) and U.S. Geological Survey (USGS) report (PDF), the odds of that happening are extraordinarily low. Risk, after all, is an assessment of uncertainty -- not a prediction that something will happen. There are plenty of catastrophic but incredibly unlikely risks we face on a daily basis -- the chance of plant damage in the U.S. is one of them.
The report, which is gaining a great deal of attention in the wake of the Japanese incident, should be considered reassuring, if anything.
According to the report, the greatest risk any plant in the U.S. faces is a 1 in 10,000 risk of core damage per year at the worst-case earthquake frequency. Note this is the probability of core damage, not "large early release" (LER) -- an actual release of radiation into the environment.
MSNBC.com did an excellent job digging up the document. Unfortunately, they made numerous factual mistakes in interpreting it.
First, their report offers the hyperbole:
It turns out that the U.S. Nuclear Regulatory Commission has calculated the odds of an earthquake causing catastrophic failure to a nuclear plant here. Each year, at the typical nuclear reactor in the U.S., there's a 1 in 74,176 chance that the core could be damaged by an earthquake, exposing the public to radiation. No tsunami required. That's 10 times more likely than you winning $10,000 by buying a ticket in the Powerball multistate lottery, where the chance is 1 in 723,145.
First, not all statistical chances are created equal. There are 104 commercial nuclear power plants in the U.S. (69 pressurized water reactors and 35 boiling water reactors). That means there's roughly a 1 in 742 chance per year of core damage somewhere in the fleet -- or roughly a 1 in 7.4 chance per century of such an incident at some plant. By contrast, there are dozens of $10,000 "Powerball multi-state lottery" winners every year, and there will likely be thousands of winners per century. Thus the comparison itself is a bit puzzling.
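Here is a minimal sketch of that aggregation, treating MSNBC's quoted per-plant average as uniform across all 104 reactors (a simplifying assumption). The compounded arithmetic lands close to the rounded figures above.

```python
# Rough fleet-wide arithmetic behind the per-year and per-century figures.
# Assumes MSNBC's quoted average applies uniformly to all 104 reactors.

plants = 104
p_per_plant_year = 1 / 74_176  # quoted average annual core-damage probability

# Probability of at least one core-damage event somewhere in the fleet:
p_fleet_year = 1 - (1 - p_per_plant_year) ** plants
p_fleet_century = 1 - (1 - p_per_plant_year) ** (plants * 100)

print(f"Fleet-wide, per year:    1 in {1 / p_fleet_year:,.0f}")
print(f"Fleet-wide, per century: {p_fleet_century:.1%} (about 1 in {1 / p_fleet_century:.1f})")
```

Run as written, this gives about 1 in 713 per year and roughly a 13% chance per century, in the same ballpark as the rounded "1 in 742" and "1 in 7.4" figures.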
But the error runs far deeper.
Note, the MSNBC report says that the risk is of "the core being damaged by an earthquake, exposing the public to radiation". But as we mentioned earlier, that's not what the government report says. It references the risk of core damage, which does not estimate the actual probability of a "large early release" of radiation at all. As the report says, in the case of core damage, such a release would be a "possibility", but given additional containment measures, it would likely be a far lower probability than the core damage frequency (CDF) estimate.
In other words, the report does not directly estimate the risk of the public being exposed to radiation at all.
And the errors continue. The MSNBC report offers a list of plant yearly risks, compiled in handy text format online and in the form of an Excel document. These risks were taken from the report, but they were the risks at a specific earthquake frequency. For example, the most "at risk" plant -- New York's Indian Point 3 plant -- has a 1 in 10,000 annual risk of core damage if an ultra-powerful 10 Hz earthquake were to strike (thus this is dubbed the "maximum risk" or "weakest link" model). The actual risk is far lower. The report gives what is likely the most accurate estimate in the form of a weighted average. For example, for Indian Point 3, the risk is 1 in 670,000 per year.
Now consider the difference between 1 in 10,000 and 1 in 670,000. We've now gone from a 1 in 100 chance of quake-induced core damage per plant per century to 1 in 6,700. Looking at the actual numbers, the conclusion shifts from expecting roughly one core-damage event somewhere in the U.S. fleet over the next millennium to expecting none.
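A quick calculation makes that contrast concrete. Note that applying a single plant's rate to the whole fleet is a simplification for illustration; the weighted-average figure used here is Indian Point 3's, the highest in the report, so the second estimate is if anything an overcount.

```python
# Expected core-damage events over a millennium under the two readings.
# Applying one rate uniformly to all 104 plants is a simplification.

plants, years = 104, 1000

for label, annual_risk in [("MSNBC's misread average (1 in 74,176)",  1 / 74_176),
                           ("weighted-average reading (1 in 670,000)", 1 / 670_000)]:
    expected_events = plants * years * annual_risk
    print(f"{label}: ~{expected_events:.2f} expected events per millennium")
```

The first reading gives about 1.4 expected events over a thousand years; the second gives about 0.16, i.e., most likely none.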
II. So Reports are Sensationalized -- Why Should I Care?
Now it would be far too easy to turn a blind eye to this kind of misinformation. All news sites and networks make errors. But the problem is that in the wake of the earthquake the media has seized on this topic with particular sensationalist fervor and offered much speculation and hyperbole.
The net result is that the U.S. public is becoming mistrustful and fearful of nuclear power. Anecdotal evidence of that is given by the run on radiation pills in the U.S.
This could have a tremendously deleterious effect on the energy future and security of the U.S. Nuclear power in the U.S. is arguably the cheapest and most tested form of alternative energy. The U.S. contains many rich deposits of uranium and other fissile isotopes -- enough to drastically reduce the reliance of the U.S. on fossil fuels from volatile foreign sources.
But public fear can and does have a number of direct effects that may sink that effort. First, past efforts to build plants in the 1970s and 1980s led to massive lawsuits that raised costs of construction so high that no new U.S. plants were even seriously considered until a year or two ago.
Finally, the U.S. has its first new plant application in three decades and is preparing to embark on a new era of nuclear energy.
It is important to also consider that fission power is widely viewed by the scientific community as only a stopgap solution that at most will be used for power generation for a couple more centuries before being replaced by fusion power. Nature shows us that fusion is a far more abundant and lucrative source of energy in our universe, so if we can't harness fusion power within 300 years we've done something wrong, given how close we seemingly are.
In other words, nuclear power is a short-term solution and thus risk should only be considered in the short term (as discussed above).
Further, the risks at these new plants will be orders of magnitude lower, and they will produce less nuclear waste and more energy.
And last, but not least, any discussion of risks should put things in perspective by providing information on the equivalent dangers of fossil fuel power generation -- something virtually no outlet has done. As underscored by the recent coal mining disasters in Chile and West Virginia, fossil fuel power is hardly safe and human friendly. Every energy source has a cost. For some alternative energy sources, like solar and wind, that cost is high production expense. For fossil fuels, it's loss of life. In total, 6,400 people died between 1970 and 1992 during coal mining operations, and 1,200 died extracting natural gas [source].
The importance of the truth and accuracy in this situation cannot be overstated. It is of the utmost importance that the media offers accurate information to the public in countries with nuclear interests, particularly in the wake of the Japanese incident.
III. Confusion in Japan
MSNBC.com and other news organizations are not solely responsible for the confusion and misinformation that's permeating all news outlets. Some of it is coming from those who should be reassuring, not speculating -- the government of Japan.
Japan's Fukushima nuclear plants still face a precarious situation in the wake of the record-setting 9.0-magnitude Sendai earthquake. Smoke has been billowing up from Fukushima I's reactor three -- first attributed to steam escaping through a damaged roof, and now to steam from broken water pipes in the reactor building's cooling system.
Earlier today, Japanese officials suggested in a report [PDF] that the reactor core may have been breached and that radiation could be carried in the steam into the environment, endangering the public. But later in the day, they said that the core was not compromised.
Of course by then a score of outlets had already reported that it was compromised.
Similarly, many reports stated that the reactor rods had "melted down" -- a serious problem. These reports are based on statements made by Japanese officials that the rods may have melted. There is some evidence for this conclusion -- water was observed to have boiled off of some of the rods, leaving them uncooled. But officials don't know, or haven't disclosed, to what extent the rods have melted.
The levels inside the most radioactive plant reached approximately 6.4 millisieverts per hour, before dropping. To put this in context, a full chest CT scan gives you 7 millisieverts [source] of radiation. In other words, an hour spent inside the most damaged plant with no protective gear would expose you to roughly the radiation dose of a common medical procedure. Now that's absolutely not to say that there aren't more serious risks if certain possibilities play out, but the risk of loss of life from the nuclear accident just isn't there yet.
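The arithmetic behind that comparison is simple enough to show directly; treating the peak dose rate as constant is a simplifying assumption, since actual levels varied over time and by location.

```python
# Dose comparison using the figures quoted above.
# Assumes a constant dose rate at the reported peak, for illustration.

dose_rate_plant = 6.4  # mSv per hour, peak level reported inside the plant
chest_ct_dose = 7.0    # mSv, typical full chest CT scan

hours = chest_ct_dose / dose_rate_plant
print(f"~{hours:.1f} hour(s) at peak plant levels ≈ one chest CT scan ({chest_ct_dose} mSv)")
```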
Ultimately, the fault for these confusing and contradictory reports rests largely on the shoulders of Japan's government and international regulators, which have cooperated to publish contradictory and overly speculative reports.
IV. Conclusions
The situation in Japan is ongoing. Officials are using helicopters and fire-trucks to spray water, and possibly boric acid to cool the smoldering cores. We won't have the final picture of what -- if any -- significant long-term radiation release the reactors will create for some time now.
If the media wants a sensational story, they can get some great coverage of the efforts to contain the overheating rods. But in the interest of accuracy they should be wary of the statements of government officials, or at least offer disclaimers on them, given their contradictory track record. And they should most definitely avoid creating more misleading statements themselves by misinterpreting obscure U.S. government reports.
---
Google Invests in World's Largest Wind Farm
Tiffany Kaiser - April 19, 2011 10:40 AM
Rick Needham (center) with partners Arielle Bertman and Matthew Stepka at the Shepherds Flat Wind Farm (Source: The Official Google Blog)
Google has invested $100 million in the Shepherds Flat Wind Farm in Arlington, Oregon
Aside from running the successful Android operating system and the world's most popular search engine, Google has been making some environmentally conscious efforts as well. Just last week, the web giant invested $168 million in the Ivanpah Solar Electric Generating System located in the Mojave Desert in California.
Now, Google is investing $100 million in the Shepherds Flat Wind Farm in Arlington, Oregon. It will be joining this project with Caithness Energy, which is the project developer, and GE, an early investor and turbine manufacturer as well as an operations and maintenance supplier. Other investors include Tyr Energy and Sumitomo Corporation of America. The Shepherds Flat Wind Farm is still under construction, but is expected to be the largest wind farm in the world. Once completed, it will produce 845 megawatts of power, enough to supply over 235,000 homes. "This project is exciting to us not only because of its size and scale, but also because it uses advanced technology," said Rick Needham, Director of Green Business Operations for Google, in The Official Google Blog. "This will be the first commercial wind farm in the U.S. to deploy, at scale, turbines that use permanent magnet generators - tech-speak for evolutionary turbine technology that will improve efficiency, reliability and grid connection capabilities. Though the technology has been installed outside the U.S., it's an important, incremental step in lowering the cost of wind energy over the long term in the U.S."
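The homes figure checks out with a rough estimate. The capacity factor and household consumption below are assumed typical values, not figures from the project's announcement.

```python
# Rough check of the "235,000 homes" claim for an 845 MW wind farm.
# Capacity factor and household draw are assumed typical values.

nameplate_mw = 845
capacity_factor = 0.30  # assumed typical for onshore wind
avg_home_kw = 1.1       # assumed average U.S. household draw (~9,600 kWh/yr)

avg_output_kw = nameplate_mw * 1000 * capacity_factor
homes_powered = avg_output_kw / avg_home_kw
print(f"Average output: {avg_output_kw / 1000:.0f} MW")
print(f"Homes powered on average: {homes_powered:,.0f}")
```

Under these assumptions the farm averages roughly 250 MW of output, enough for about 230,000 typical households, consistent with the project's claim.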
The Shepherds Flat Wind Farm is expected to benefit Oregon economically, and will also help California meet its renewable energy goals. In addition, the electricity generated at the wind farm will be sold to Southern California Edison under "long term agreements." The Shepherds Flat Wind Farm will be completed in 2012.
---
Beats Electronics Won't Renew Contract with Monster for Headphones
Tiffany Kaiser - January 13, 2012 9:36 AM
The relationship went downhill due to financial terms
Beats Electronics and Monster Cable Products have decided to go their separate ways once their contract expires later in 2012.
Beats Electronics, a brand of headphones and loudspeakers that is marketed by hip hop artist Dr. Dre and Interscope-Geffen-A&M Records chairman Jimmy Iovine, has decided not to renew its five-year contract with electronics company Monster Cable Products, which has manufactured the headphones since 2009, when it expires at the end of the year.
The relationship went downhill due to financial terms, according to Businessweek. Both sides would argue over who deserves the most credit for the idea and for the success of the top-of-the-line headphones.
At the Consumer Electronics Show in Las Vegas this week, Noel Lee, chief executive of Monster Cable Products, unveiled some of his plans for the future of his company post-Beats Electronics. While Beats tends to cater to young 20-year-olds, Monster is looking to appeal to other groups such as athletes, business professionals and women. He showed off some of the company's new headphones at CES, which are now available for pre-order.
An example of Monster's new offerings is a pair of in-ear headphones dubbed the Miles Davis line, where the buds are shaped like trumpets and have volume controllers that look like piston valves. There are currently eight new lines.
"We can be the Apple of the headphones space, with or without Beats," said Lee.
Beats will continue to hold the rights to the sound technology, the brand and the circular design after the break-up. While Beats can be found in HTC smartphones, HP computers and the Chrysler 300 S sedan, it's looking to expand into audio gear for athletes and TVs as well.
"We have very big ambitions for Beats beyond headphones," said Iovine. "Music has got to succeed on the phone or else the record industry will never thrive."
As far as the split goes, Iovine said, "They're doing their thing, and we're doing ours."
And the beat goes on.
Source: Businessweek
---
Japan considers national solar power to replace nuclear dependency
Japan’s Prime Minister, Naoto Kan, is expected to announce the country’s energy plans at the upcoming G8 Summit in Deuville, France. Along with assurances about the current safety of Japan’s nuclear power plants which they will continue using for now, Kan is also expected to unveil a plan for nation-wide renewable energy.
According to the Nikkei business daily, the Japanese government wants to make it compulsory for all buildings and houses to be equipped with solar panels by 2030. If the mandate is successful, Japan would be home to the world’s first national solar array.
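For a sense of scale, here is an order-of-magnitude sketch of what such a rooftop mandate could add. The household count, per-roof capacity, and capacity factor are all assumptions for illustration; none of these figures appear in the Nikkei report.

```python
# Order-of-magnitude estimate of a nationwide rooftop solar mandate.
# Every input below is an assumption for illustration only.

households = 50e6       # assumed number of Japanese households
kw_per_roof = 3.0       # assumed typical residential rooftop array, kW
capacity_factor = 0.12  # assumed average for solar PV in Japan

peak_gw = households * kw_per_roof / 1e6
avg_gw = peak_gw * capacity_factor
print(f"Installed capacity: ~{peak_gw:.0f} GW peak, ~{avg_gw:.0f} GW average output")
```

Under these assumptions, residential rooftops alone could amount to roughly 150 GW of peak capacity, a nationally significant amount of generation even after accounting for solar's low capacity factor.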
This push for nationwide clean energy comes in the wake of the Fukushima Daiichi nuclear power plant disaster, where an earthquake and tsunami caused a nuclear crisis in the country. The No. 1, No. 2 and No. 3 Fukushima reactors all had meltdowns. According to Reuters, engineers are still fighting to bring radiation leaks under control, months after the 9.0 earthquake of March 11. Scientists from the IAEA are currently at the site to investigate the crisis.
Japan’s goals are for renewable energy and conservation. Naoto Kan believes that the push for a national solar array would not only release Japan from dependency on nuclear power and create a cleaner and safer energy source, but would also spur technological innovation in solar energy. The nation’s massive solar focus would pour more money into the industry creating better efficiency and ultimately drive down costs. Solar power could potentially be a solution, rather than a supplement.
Massive as the project seems, Naoto Kan says it would not only promote wider use of renewable energy but also show Japan's resolve. If accomplished, it would certainly brighten the future.
Microsoft may start watching you through your Xbox Kinect
A newly published Microsoft patent shows the Redmond, Washington-based computing giant may soon have the ability to monitor Xbox users through their Kinect hands-free controller. The system would allow content providers, like movie studios or record companies, new ways to monetize their goods. Only problem: The technology sounds like something straight out of 1984.
Microsoft’s patent, entitled “Content Distribution Regulation by Viewing User,” was first filed in April of 2011, and was published by the U.S. Patent and Trademark Office (USPTO) on November 1 of this year. The patent describes a “content presentation system and method allowing content providers to regulate the presentation of content on a per-user-view basis.”
So what, exactly, does that mean? Well, according to the patent, that means content providers will be able to offer content licenses to customers based not only on time (i.e. “this rental lasts 24 hours”) but also on the number of people watching a movie, for example, or listening to a song or album. Microsoft’s system would then use the Kinect’s camera to actually see how many people are in the room. If it’s more than the amount allowed by the purchased license, the entertainment will stop, and the system (probably Xbox Live) will demand that the customer upgrade to a new license that allows for more viewers.
Or, in the words of the patent itself: “The users consuming the content on a display device are monitored so that if the number of user-views licensed is exceeded, remedial action may be taken.” Furthermore, Microsoft’s technology may enable individuals to be “specifically identified and the amount of their consumption of the content tracked relative to their specific use.”
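To make the patent's mechanics concrete, here is a minimal sketch of the per-user-view check it describes. Everything below is illustrative: the function and type names are hypothetical and do not correspond to any real Kinect or Xbox API.

```python
# Illustrative sketch of the patent's per-user-view license check.
# All names are hypothetical; this is not a real Kinect or Xbox API.

from dataclasses import dataclass

@dataclass
class ContentLicense:
    title: str
    max_viewers: int  # number of "user-views" the buyer paid for

def count_viewers_in_room() -> int:
    # Stand-in for the Kinect camera's people-counting step.
    return 4

def enforce(lic: ContentLicense) -> str:
    viewers = count_viewers_in_room()
    if viewers > lic.max_viewers:
        # The patent's "remedial action": halt playback and demand an
        # upgraded license that covers the larger audience.
        return f"Playback paused: {viewers} viewers, licensed for {lic.max_viewers}."
    return "Playback continues."

print(enforce(ContentLicense("Movie night rental", max_viewers=2)))
```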
A note on copyright law: Right now, U.S. copyright law contains a provision known as "Public Performance" that gives copyright holders the right to control when their protected intellectual property is performed or shown in public. So, what does "public" mean in this case? The law says "public" is a "place open to the public or at a place where a substantial number of persons outside of a normal circle of a family and its social acquaintances are gathered."
The tendency might be to assume that Microsoft’s patent is there to give copyright holders a better way to police public performances of their movies and music – and that certainly seems like a no-brainer application of the technology. But that is not really what Microsoft is proposing here – this is some other beast entirely.
What Microsoft’s enabling with this patent is the exact opposite of public performance monitoring. Indeed, it seeks to monitor what are obviously private performances, which are not prohibited by copyright law. Not only that, but it greatly strengthens copyright holders’ power over users by allowing them to block their legally obtained content if, say, someone else walks in the room.
The benefits for Microsoft and content providers go far beyond protecting copyright; the technology allows them to establish entirely new pricing for content. Only one person watching this movie? That'll be $2. There are 10 of you? Well, you're going to have to cough up $15. And you'd better hope nobody shows up late to the party.
For consumers who are already disgusted with the disintegration of ownership in the 21st century, technology like that which Microsoft describes in this patent is clearly a giant leap in the wrong direction. We can only hope this is one of those ideas that dies in the prototype phase, and doesn’t become an industry norm.
And to think, these are the same companies that want to dissuade people from pirating their content. Here's a hint: This won't help.
iPhone 5: Everything you need to know
Confirming all the rumors detailing its appearance, internals, and bundled earbuds, the iPhone 5 launch was very predictable. But lack of shock factor aside, the new handset still has a lot to offer. Apple's newest iPhone has been slimmed down, souped up, and redesigned. If you want to know more about the iPhone 5, then you've come to the right place.
Check out our complete iPhone 5 review. If you want to know how it compares to the iPhone 4S, check out our iPhone 5 vs. iPhone 4S spec showdown.
Advanced Retina Display
With the iPhone 5, Apple changed the size of the iPhone screen for the first time ever. The new screen is elongated, with a widescreen aspect ratio of 16:9. The new 4-inch Retina display is described by Apple as "panoramic" and boasts 44 percent more color saturation than the iPhone 4S, making it full sRGB (aka: it's gorgeous).
Apple also introduced new touch technology to the display of its latest smartphone. Rather than having two layers, the pixels in the screen and the touch sensors have been integrated into a thinner, single layer. The pixels actually sense the user’s touch all on their own, without the need for separate components. What does this mean for you? Much better viewing angles, a clearer image, a lighter device, and a thinner profile.
Updated Design
The design of the new iPhone was no surprise as Apple has obviously lost control of its secrets in a post-Jobs era. The iPhone 5 looks exactly as it was portrayed in many leaked images. Lead designer Jony Ive said Apple has “never built a product with this extraordinary level of fit and finish.” Made with a glass display and an aluminum unibody, the iPhone 5 comes in the standard color options: black or white. Thanks to its new touch-integrated display, the iPhone 5 is the thinnest LTE smartphone in the world at just 0.29 inches. It’s also 20 percent lighter than the 4S at 112 grams or 3.95 ounces.
A6 Processor
A new iPhone means a faster CPU. This one's called the A6, following the established numbering system, and Apple claims it is two times faster than its predecessor, able to load web pages 2.1 times speedier.
iOS 6

One advantage of buying the latest Apple product is having a device designed specifically for the latest and greatest software version. iOS 6 brings a lot of new features to the table, and builds on many of those introduced in iOS 5. For instance, Siri will now play nice with Yelp, OpenTable, and Fandango. It's also able to launch apps (finally). Facebook is now integrated into the OS alongside Twitter, and Safari has an offline reading list. Apple's thrown in its own Maps app to replace Google Maps, with Yelp support and turn-by-turn navigation. Passbook is a new feature as well, essentially a wallet app that can store your boarding pass, Starbucks card, and more. Read our full breakdown of iOS 6 for extra details.
"Ultrafast Wireless" is the moniker Apple's chosen for its latest wireless chip. The single chip supports the standards necessary to operate on all carriers, according to Apple. It can handle HSPA+ (20Mbps), DC-HSDPA (42Mbps), and LTE (max 100Mbps). The iPhone's Wi-Fi has been beefed up as well, sporting 802.11n (dual-band, 2.4GHz and 5GHz) and a max speed of 150Mbps. The iPhone can, like most modern routers, switch between connections for the best results.
Sadly, Verizon said recently that the iPhone 5 will not support simultaneous voice and data connections, even on LTE. This is in direct contrast to the rest of its LTE phones, which can all handle voice and data. The feature won’t be supported by Sprint either. AT&T, however, has confirmed with The Verge that its customers can simultaneously access voice and data over HSPA+ or LTE.
Improved Audio
Wanting to make its iPhone refresh as thorough as possible, Apple made sure its audio components underwent a makeover as well. The iPhone 5 features three microphones: one each on the bottom, back, and front. As a result, the sound quality of recordings and voice calls will be improved. In a truly impressive move, Apple has managed to include noise cancelling without the need for external hardware. In real-world use, this means background noise should be muted as much as possible during voice calls. And with what Apple calls "wideband audio," the iPhone 5 will use more bandwidth to transmit higher-quality audio over your network. This feature will depend on carrier support, however.
The internal speakers are now 20 percent smaller and yet have grown from three to five magnet transducers for a richer sound. Last but certainly not least, the EarPods are Apple’s latest bundled earphones for its new mobile devices. Designed to be much more comfortable and better-sounding than the current unwieldy earphones, the EarPods are also more ergonomic and meant to fit comfortably in the ear. We look forward to trying these out.
Camera Features
Apple spent a lot of time talking up the iPhone’s camera improvements in its live event, putting the strongest emphasis on its new panorama mode, though it blissfully ignored the fact that most high-end smartphones already have this feature. It works much the same here, you simply pan the phone across an area and the software will stitch the images together for you. The iPhone 5 will even correct blur from your shaky hands and remove any moving objects.
The back camera keeps the five-element optics and 8-megapixel sensor (3264 x 2448) found in the 4S. It also has backside illumination, an f/2.4 aperture (slower than the f/2.0 aperture on the HTC One X and Evo 4G LTE), and a hybrid infrared filter. A few video features have been tossed in as well: still photo capture while recording, improved image stabilization, and face detection.
The iPhone 5 also ships with a dynamic low-light mode. The new image processing chip picks out the areas of a photo in need of noise reduction and leaves the rest alone. Because of this, Apple claims photography in low light is much improved. In addition, photos can be taken 40 percent faster than on the 4S, which already has a super fast shutter. Even the lens has been revamped, this time made with a sapphire crystal (second in hardness only to diamond).
FaceTime is now in 720p HD and, as long as carriers play nice, video conversations will be available over a user's cellular network. Like the rear-facing camera, the front-facing one has face detection built in. According to The Verge, Verizon has recently stated users may run FaceTime over the network with no extra charge, regardless of which data plan they have. Sprint has also said it won't be charging for the service, while AT&T will only allow it if you sign up for a new Mobile Share data plan.

Lightning Connector
Here's where Apple's going to frustrate a lot of users who have undoubtedly amassed a large collection of 30-pin connectors. Luckily (or unluckily, depending how you look at it), the new, smaller 8-pin connector can still be used with older cords by purchasing a $30 adapter. A neat feature of the Lightning connector is its reversible orientation: it will plug into your device no matter which side is up.
Battery Life

The battery life of the iPhone 5 is nowhere near the impressive numbers shown earlier this month in Motorola's newest RAZR line, but for the average user, it will do. Apple states the official numbers as 225 hours on standby, 8 hours of 3G talk, 8 hours of browsing with LTE, 10 hours on Wi-Fi, and 30 hours of video.
Availability and Pricing
On September 21, the new iPhone will be available in the United States, Canada, UK, Germany, France, Australia, Hong Kong, Singapore, and Japan. Availability will extend to the remaining countries on September 28. Pre-orders began at midnight on Sept. 14. If you’re trying to get an iPhone 5 at launch, keep informed with our “How to Find an iPhone 5” article, which we’re updating with availability information as we get it.
The iPhone 5 will come in three different storage sizes and price points: 16GB for $200, 32GB for $300, and 64GB for $400. These prices include a two-year contract, however. At full price, the phone starts at $650 for the 16GB model. And since the iPhone 4S is old hat now, it will be offered for $100 on a two-year contract with the iPhone 4 costing zilch, nada, nothing after signing on the dotted line.
After the dust has settled, the iPhone 5 still stands tall as a premium gadget offering superior performance, outstanding build quality, and snappy wireless speeds. But is that enough? Will you be waiting in line for the sixth generation of Apple's famous smartphone-to-rule-them-all, or are you sitting this one out?
Beats by Dr. Dre Beatbox heading exclusively to AT&T
If you’ve ever passed by an AT&T store and become overwhelmed with the sudden need to purchase Beats by Dr. Dre products, only to grow distraught with the knowledge that Beats couldn’t actually be found there, you’re now in luck: On March 11, AT&T will release an exclusive, special edition Beats by Dr. Dre Beatbox. The wireless high performance portable audio system will come in black and white and be available in retail stores and online.
The $399 device, which lifts its name from a previous Beats by Dre boombox released a few years ago, will feature a built-in Dock Connector for all your iDevices, as well as Bluetooth connectivity and the ability to run on six D-cell batteries for a truly wireless audio experience. The tech specs include a 5.25-inch woofer for the bass-heavy Beats sound you've come to expect, as well as a 3.5mm input for your line-in needs. Handles are built in, but we imagine only to help you extract it from its packaging, as no one would really carry something like this around with them.
“The Beatbox Portable is the perfect mobile sound system,” says Luke Wood, President & Chief Operating Officer of Beats by Dr. Dre, in a statement. “In addition to having the advantage of Bluetooth wireless technology and battery-powered portability, the Beatbox Portable delivers the power and emotion found in all Beats By Dr. Dre products.”
It's no secret that Jimmy Iovine, the co-founder of Beats by Dre (with, you guessed it, Dr. Dre), has been positioning his company as a mobile audio brand leader. A recent and acrimonious split from Monster has left the popular headphone maker open to new opportunities, and its strategic $300 million partnership with HTC could herald a new era of high-end mobile audio, if Iovine has anything to say about it: "Music on the cellphone should sound and feel great and we don't want you to just download bad-sounding MP3s and play them on bad-sounding $3 earbuds," he told The Los Angeles Times last year. "Why spend hundreds of dollars on a phone or a tablet and listen to music out of $3 earbuds? It just doesn't add up."
AT&T also announced it will be pushing an Android 4.0 upgrade to the HTC Vivid smartphone in the coming weeks, which will include an update to HTC’s Sense UI — the long anticipated refresh that will purportedly bring HTC’s Beats audio to third-party music apps. Initially, HTC’s Beats audio feature was limited to its phone’s built-in music player.
In addition to the exclusive Beatbox, AT&T will be adding a few more Beats products to its shelves. Wireless HD Stereo Bluetooth headphones will be available for $280 and feature a built-in microphone and control buttons. The wildly popular Beats Solo HD headphones and UR Beats in-ear headphones will also be available, for $200 and $99, respectively.
"As more customers begin to use their smartphone as their primary portable music device, it becomes more important for them to be able to enjoy premium sound quality no matter where they are," said Michael Cowan, director, product marketing management - Accessories, AT&T Mobility, according to the press release. "The Beats by Dr. Dre Beatbox gives consumers an opportunity to experience their music in its purest form without sacrificing quality for portability."
Although it's always great to have more high-end audio choices for the increasingly itinerant modern music lover, these AT&T/Beats announcements are decidedly more marketing gimmick than audiophile breakthrough. Hey Dre: We're still waiting for that rumored HTC/Beats streaming audio service.
Samsung releases Galaxy Tab pricing
The tablet wars are heating up. We’ve been saying that ever since the release of the iPad, but it has turned out to be a slow simmer rather than a fast boil. There are more tablets on the way, many more tablets on the way, but so far, for the most part, we have been left to stare longingly at the specs and pictures of what will soon be the next wave of tablets. One of those tablets, perhaps one that will actually be able to rival the iPad, is the Samsung Galaxy Tab. While we do not have an exact U.S. release date yet, we now know the price.
The Galaxy Tab will retail for between $200 and $300, but the final pricing will vary based on wireless-carrier subsidies; no U.S. carriers have been announced. According to the Wall Street Journal, Samsung is in talks with multiple carriers, and the tablet will likely not be sold by Samsung directly, but rather through the carriers that agree to carry the device. The Galaxy Tab is technically similar to Samsung's Galaxy S smartphone, which is now carried by T-Mobile, Sprint and AT&T, with Verizon soon to follow.
The Galaxy Tab should first debut in Italy next month, then Vodafone will debut the tablet in several European markets shortly after. It is expected to be available worldwide in time for the holidays, and Samsung has claimed that it expects to ship 10 million Galaxy Tabs this year, which would give it one-third of the tablet market in the world. Following the release of the seven-inch tablet, Samsung is also considering increasing the Galaxy family to include a six- and a 10-inch model, but neither have been officially confirmed.
The Galaxy Tab features a seven-inch touch screen, an Android 2.2 operating system, Bluetooth 3.0, two cameras and a 1GHz Cortex processor. Check out our head-to-head between the Galaxy and the iPad for a full list of the specs.
Freescale claims compound MOSFET progress
By Peter Clarke, 1/30/2006 02:00 PM EST
LONDON -- Freescale Semiconductor Inc. said Monday (Jan. 30) that it had developed the industry's first gallium arsenide (GaAs) metal oxide semiconductor field effect transistor (MOSFET). The development is set to enable improved power amplifiers and low-power, high-speed semiconductor devices, and could transform analog-to-digital conversion technology, Freescale (Austin, Texas) claimed.
Silicon-based MOSFET technology is the basis of CMOS and digital logic and is used increasingly for analog and radio frequency applications. Gallium arsenide is intrinsically a faster material than silicon, but prior to Freescale's development, fundamental limitations prevented the application of industry-standard MOSFET processes, equipment, and interconnect methods in GaAs.

Freescale's latest development is set to change the technology landscape for a material that generates less noise and conducts electrons up to 20 times faster than traditional silicon, Freescale said. The industry's previous inability to deploy silicon dioxide or other dielectric materials into GaAs device technologies had prohibited the incorporation of metal oxide gate structures that are critical to the creation of viable GaAs-based MOSFET devices. Freescale has identified GaAs-compatible materials and devices that provide scaling capabilities on par with traditional silicon materials. This eliminates oxide-semiconductor interface defect issues that had discouraged the creation of high performance MOSFET devices based on GaAs compounds in the past.

Freescale said that it anticipates that early generations of GaAs-based MOSFET devices will be highly specialized and designed to complement traditional semiconductor technology. Freescale plans to accelerate deployment of the technology by collaborating with partners, with particular efforts being applied to communications infrastructure, wireless and optoelectronic products that require extreme computing performance.

"This remarkable achievement overturns industry assumptions and has the potential to fundamentally change the way high performance semiconductors are designed, manufactured and deployed," said Sumit Sadana, senior vice president of strategy and business development and acting chief technology officer for Freescale, in a statement.
"Freescale's GaAs MOSFET technology holds the promise of having a disruptive impact in the industry," said Asif Anwar, GaAs Services director for industry analysis firm Strategy Analytics, in the same statement.
By Sean Michael Kerner | May 2, 2014 | Print this Page
http://www.enterprisenetworkingplanet.com/nethub/what-is-the-future-of-citrix-netscaler-video.html At the core of Citrix's networking product portfolio is the Netscaler product family. Over the years, Netscaler, much like the needs of the networks it serves, has changed and evolved.
In a video interview with Enterprise Networking Planet, Sunil Potti, VP and GM for the Netscaler product group at Citrix, detailed the direction that Netscaler is now taking as an Application Delivery Controller (ADC) platform.
"You now use an ADC or a Netscaler in front of every tier of your applications," Potti said. "Netscaler has moved from being just a web front-end for scaling, to an app and database front-end, and that has occurred over the last few years."
Potti noted that application security and policy are needed across web, desktop and mobile devices, which is why the Netscaler is finding a home as a unified gateway. He added that Netscaler is now evolving further into a service delivery platform.
Back in 2008, in the early days of Netscaler, Citrix partnered with content delivery network (CDN) provider Akamai. Now in 2014, Potti said there really isn't as much need to align with a CDN as there once was.
"I think that the IaaS (Infrastructure-as-a-Service) providers actually have a better footprint to extend their compute cloud all the way to the content delivery cloud," Potti said.
Watch the full video interview with Sunil Potti below:
Sean Michael Kerner is a senior editor at EnterpriseNetworkingPlanet and InternetNews.com. Follow him on Twitter @TechJournalist | 科技 |
projects of note
The Remote Village IT project has been the centerpiece of our early efforts. Dubbed the "Pedal Powered Internet" by The New York Times Magazine, it has focused our ideas on effective use of information technology in rural economic development. Starting from the express request for local and Internet telecommunications from villagers in Laos as relayed by the Jhai Foundation of San Francisco, the initial outlines of the system first took form as sketches on a napkin drawn by Lee Felsenstein at a Silicon Valley restaurant in 2001.
At that meeting Lee Thorn and Vorasone Dengkayaphichith of the Jhai Foundation described the situation: a cohesive group of five refugee villages, displaced from their ancestral homes in the Plain of Jars by bombing, had been settled in a valley out of cell phone range and without electrical power or telephones. They had voluntarily taxed themselves to improve their school, and while adult literacy was about 50 percent, the children were 100 percent literate in Lao. The villagers lived by farming and the women wove fine cloth for which there was demand, but their efforts at taking both products to market were hampered by lack of communication. They had many relatives overseas, and wanted to be able to talk with them by phone - only possible through Internet telephony from cafes in Vientiane. They also wanted some elementary computer capabilities (word processing, spread sheet) to enable them to assemble construction bids on paper.
The initial concept was of a telegraph station or telephone office in each village, using Wi-Fi links to interconnect the villages and to reach the nearest telephone line. This office would be run by schoolchildren, who could provide the needed literacy component for written communication. The software would be for email. Further research showed that Voice over Internet Protocol (VOIP) was a viable and growing technology, whose variable quality was not considered a problem by Lao users. The system concept was therefore modified to be a telephone system implemented through low-power computers and open-source software.
The project was taken on as a pro bono effort in early 2002. With the help of early volunteers Mark Summer and Steve Okay a pilot prototype was assembled and demonstrated in July of 2002 in San Francisco, transmitting data and images over a 3 km link using directional antennas. With the help of satellite-generated topographical maps the villages were located relative to the terrain, and likely relay points were identified on a mountain ridge overlooking the villages.
Computers were assembled from low-power modules manufactured for industrial applications -- single-board PCs with expansion capability (PC/104 type). A simple aluminum frame was designed to mount the modules in NEMA-4 standard outdoor electrical enclosures. Advice was solicited from wireless user groups in the San Francisco area concerning the best transceivers, and Cisco Aironet LM-352 cards were specified.
Architecturally, the system could be described somewhat like a palm tree. The long trunk represents the 9 km wireless link from the town of Phon Hong, where telephones and power were available, to the relay station on a 300 meter mountain. The branches of the tree represent the wireless links from the five villages to the relay point, which functions as an access point, mediating among the multiple villages. Where necessary the relay station routes data to the Phon Hong link, but otherwise the villages can communicate as a local area network.
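As a rough sanity check on those distances, standard free-space path-loss arithmetic suggests why directional antennas make the links workable. The formula is textbook RF engineering; the transmit power, antenna gains, and receiver sensitivity below are assumed values typical of 802.11b gear of that era, not figures from the project.

```python
# Back-of-envelope 802.11b link budget for the village wireless links.
# The free-space path loss formula is standard RF engineering; the
# transmit power, antenna gains, and receiver sensitivity are assumed
# values typical of early-2000s gear, not figures from the project.

import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44"""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

TX_POWER_DBM = 20         # assumed transmit power (Cisco Aironet class)
ANTENNA_GAIN_DBI = 14     # assumed directional antenna gain at each end
RX_SENSITIVITY_DBM = -85  # assumed receiver sensitivity at 11 Mbps

for name, km in [("village-to-relay branch", 3), ("relay-to-Phon Hong trunk", 9)]:
    loss = fspl_db(km, 2437)  # 2.4 GHz band, channel 6
    received = TX_POWER_DBM + 2 * ANTENNA_GAIN_DBI - loss
    margin = received - RX_SENSITIVITY_DBM
    print(f"{name}: loss {loss:.1f} dB, received {received:.1f} dBm, margin {margin:.1f} dB")
```

Even with these modest assumed gains, both hops show double-digit fade margin, which is consistent with the pilot tests described above.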
The village PC drives an LCD display screen and a dot-matrix printer, selected for ruggedness and minimal use of consumable products. To power the printer, a 120-volt power inverter is built into the housing, and a switch circuit controls the power to the LCD and to the printer through the inverter. A Quicknet Phone Card VOIP adapter is plugged into the PCMCIA card slot not used by the wireless transceiver. Cables to the phone, display, printer, keyboard and pointing device are conducted through the case in conduit fittings, which can be sealed up with caulking compound upon installation. A desiccant cartridge containing silica gel keeps the internal atmosphere dry, requiring attention once every two months, at which point it must be baked in an oven.
Due to the monsoon season, it was considered unreliable to power the village PC from solar panels, although two panels were used for the relay installation, which was in remote terrain. The higher power requirements of the village PC called for an alternative, and a bicycle generator manufactured in India was located as an inexpensive way of harnessing the children's power. The generator on the bicycle will charge a small lead-acid battery, whose charge will then be transferred over to the large 360 ampere-hour battery that runs the village system.
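A back-of-envelope sketch shows the pedaling arithmetic. Every wattage below is an assumption for illustration; the text gives only the relay chips' 5-watt figure, not the village PC's total draw.

```python
# Rough pedal-power arithmetic for the village station. Every wattage
# here is an assumption for illustration; the article only gives the
# relay chips' 5 W figure, not the village PC's total draw.

PC_DRAW_W = 15            # assumed average draw: PC, radio, display duty-cycled
PEDAL_OUTPUT_W = 60       # assumed sustained output at the generator
CHARGE_EFFICIENCY = 0.7   # assumed generator/charging/battery losses combined

hours_of_use = 4.0
energy_needed_wh = PC_DRAW_W * hours_of_use
pedal_minutes = energy_needed_wh / (PEDAL_OUTPUT_W * CHARGE_EFFICIENCY) * 60

print(f"{energy_needed_wh:.0f} Wh for {hours_of_use:.0f} h of use -> "
      f"about {pedal_minutes:.0f} minutes of pedaling")
```

Under these assumptions, well under two hours of pedaling covers a day's use -- plausible work for a rotation of schoolchildren.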
All components were selected with an eye to a ten-year operating life.
Originally it was planned to use the Millennium flash disks-on-chips provided with the computer boards, but the requirements of the software and unforeseen driver problems with large versions of the disks-on-chip forced a move to cheaper and larger Compact Flash cards. Adapters were purchased allowing these cards to plug into the IDE disk interface connectors on the computer boards. In order to gain the time needed to modify the file system, hard disk drives based upon the IBM Microdrive design were used instead of solid-state Compact Flash cards. These disks will later be replaced with Compact Flash once we solve the problem of running a journaling file system in memory and writing to the flash cards only on power-down.
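The journaling-in-RAM idea mentioned here is a standard embedded-Linux pattern: buffer writes in memory and commit to flash once, at shutdown, to limit flash wear. Below is a minimal sketch of that pattern, assuming hypothetical paths; it illustrates the general technique, not the project's actual code.

```python
# Minimal illustration of "journal in RAM, write flash on power-down":
# writes accumulate in memory and are committed to flash once, at
# shutdown, minimizing wear on the Compact Flash card. Paths are
# hypothetical; this sketches the general technique, not project code.

import json
import os
import signal
import sys

RAM_STATE = {}                    # lives in RAM (or a tmpfs mount)
FLASH_PATH = "/flash/state.json"  # hypothetical CompactFlash mount point

def record(key: str, value: str) -> None:
    RAM_STATE[key] = value        # no flash write happens here

def flush_to_flash(*_args) -> None:
    tmp = FLASH_PATH + ".tmp"
    with open(tmp, "w") as f:
        json.dump(RAM_STATE, f)
        f.flush()
        os.fsync(f.fileno())      # force the data onto the media
    os.replace(tmp, FLASH_PATH)   # atomic swap: old state survives a crash
    sys.exit(0)

# Commit only when the system is going down, e.g. on a battery-low signal.
signal.signal(signal.SIGTERM, flush_to_flash)
```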
The operating system was Linux from the beginning, and has stabilized as Debian with the Linux 2.4 kernel. The village PC runs a localized version of the KDE application suite known as Laonux, the fruit of a project initiated by Anousak Souphavanh with many Lao and Thai volunteers. The X graphical user interface system must be run in order to run Laonux, and the resulting processor load necessitated that the village PCs be upgraded to boards running the National Semiconductor Geode chipset, a Pentium 2 class processor clocked at 300 MHz. The relay and "server," which interfaces to the phone lines, remain on 486-class processors clocked at 128 MHz. These chips, from ZF Micro, use only 5 watts of power and can upload their own BIOS from an outside port if necessary, making for a maintenance advantage.
The server originally was to use a Line Jack card manufactured by Quicknet in order to place and receive telephone calls, but this product was discontinued and an alternative Quintum interface was used, operating through an Ethernet port.
In January 2003 a test was run using laptops to communicate from the village of Phon Kham over the mountain to Phon Hong and thence to the Internet. Phone calls, email and images were transmitted, laying to rest reservations expressed by some about the technical feasibility of the project. In February the first attempt was made to install the system, but problems with software drivers caused major delays, and the effort had to be called off when two hard disk drives were scrambled by a power failure. The project members traveled to Phon Kham to discuss the situation with the villagers, who expressed their understanding and continued support, and who held a party for us and others who had traveled far to witness the event.
In March Bob Marsh was asked to participate, as was Elaine Sweeney, an experienced software development manager. Many other volunteers came forward and work proceeded up to a second attempt at installation in June. This might have succeeded, but political problems arose which prevented installation of the relay in the time available. As of this writing the system has been working reliably and has been demonstrated at the ICT for Development platform at the World Summit on the Information Society in Geneva in December 2003.
In October, Lee Felsenstein was honored as a Laureate of the Tech Museum of Innovation in San Jose, California for work in developing the Jhai Remote Village IT system.
Copyright © 2004 The Fonly Institute
IFT Launches Campaign Touting Food Science
Thu, 06/28/2012 - 12:35pm

LAS VEGAS (Newswise-IFT) -- The Institute of Food Technologists (IFT) today launched a new public education campaign called "A World Without Food Science" that will aim to generate greater awareness of the role food science plays in ensuring a nutritious, safe and abundant food supply. The campaign is a multimedia, national initiative featuring a series of videos that highlight how food science has responded to major food issues and provided positive solutions on a global scale.

The overarching kick-off video, unveiled during the keynote session at IFT's Annual Meeting & Food Expo in Las Vegas, accurately depicts what a grocery store would be like without the existence of food science. The black-and-white footage shows empty shelves, rotten fruit, insect-infested grain and spoiled meat to show the realities of a world without food science. The scene changes to color when the voiceover explains how dedicated food science professionals make it possible to have food that is safe, flavorful and nutritious. The concepts of the video are based on an IFT scientific review titled "Feeding the World Today and Tomorrow: The Importance of Food Science and Technology" published in the peer-reviewed journal Comprehensive Reviews in Food Science and Food Safety.

The campaign also includes five separate video segments that feature interviews with experts from various food science disciplines to show the positive impact of food science on the public. The first two video segments of the series were presented during the keynote address at IFT's Annual Meeting & Food Expo. The first video highlights the challenges surrounding availability of food and how we will need to feed approximately 9 billion people by 2050. The second video focuses on food safety and the important role of food science in ensuring that the food we eat is safe.

"As a scientific society, education is at the core of our mission as we advance the science of food. It's especially important for the public to understand where their food comes from," said IFT President Roger Clemens, DrPh. "This campaign tells the story of food science in a new visual way so that consumers understand the role of food science in their daily lives."

In addition to consumer education, another goal of this campaign is to reach and inspire students to pursue food science careers. Food science incorporates concepts from many different fields, including microbiology, chemical engineering, biochemistry and more. The ever-expanding field of food science encompasses a wide range of careers in areas such as food production and processing, quality assurance and control, food product development, food science research, and regulation and enforcement of food laws. IFT.org has information on becoming a food scientist, as well as lesson plans and activities for teachers. IFT also produced the Day in the Life of a Food Scientist videos to help people understand what it's like to walk in the shoes of a NASA food scientist, a product developer at Disney Consumer Products, and a food packaging professional at a multinational food packaging and processing company.

As part of the World Without Food Science campaign, three more videos will be released within the year. Topics include Nutrition, Environmentally Responsible Food Production, and Developing Food Products for Specific Populations.
Each video will be distributed nationwide and featured along with facts and additional resources on www.worldwithoutfoodscience.org. The videos complement IFT Food Facts, a multimedia website created to show the practical applications of food science for consumers, such as food safety at the farmer's market, how to store leftovers and understanding expiration dates. For more information, please visit iftfoodfacts.org.

This video campaign was produced thanks to funding from the following IFT Divisions: Product Development, Quality Assurance, Citrus, Food Microbiology, Nutraceuticals, and Refrigerated & Frozen Foods.

For more than 70 years, IFT has existed to advance the science of food. Our nonprofit scientific society, with more than 18,000 members from more than 100 countries, brings together food scientists, technologists and related professions from academia, government, and industry. For more information, please visit ift.org.
Six Signs That We've Entered a New Geologic Age
By Maddie Stone

We've heard a lot of buzz recently about the Anthropocene, the geologic epoch of man and machine. Does it exist? Are we in it right now? Later this summer, the International Stratigraphic Union will convene and attempt to answer these weighty questions. Deciding whether or not we've entered a new chapter in geologic history isn't going to be easy. Normally, scientists use shifting rock layers, fossils, and geochemical evidence to place new ticks on the geologic scale. But the fingerprints of industrial society are not yet buried within sedimentary strata; they're all around us. We're creating them right now. To figure out if humanity has truly become a geologic force of nature, we need to be sure that our traces will persist long after we're gone. Here are six pieces of evidence scientists are considering.

Technofossils

If there's one thing modern humans are great at producing, it's waste. From CD-ROMs to plastic cups to e-waste, we're quickly filling up our landfills, our oceans, and even our solar neighbourhood with stuff that doesn't decompose. So-called "technofossils" are likely to remain on Earth for thousands to millions of years, even if we humans don't survive the century.

[Image: An e-waste dismantling junkyard. Credit: EarthFix/Flickr]

While eyesores like the Great Pacific Garbage Patch offer striking evidence of our trash problem, the big technofossil beds of the future will come in the form of landfills, according to a recent paper in the journal Anthropocene:

"Over geological timescales, the plastics buried in landfill sites may be in part a 'time-bomb' of plastic release. Some landfills, in low ground in tectonically subsiding areas, will simply be buried by more strata, to be fossilized as palaeontological middens. Where landfills are eroded, though, they will begin releasing their debris, including plastic, into the sedimentary cycle."

As bleak as it sounds, when alien archaeologists excavate society's remains in the distant future, they may assume that shopping bags, not humans, were the dominant life form on our planet.

Actual Fossils

Plastic legacy aside, the age of humans will be marked by dramatic changes to the natural fossil record. For starters, there's us. The human population has grown exponentially since the start of the Industrial Revolution, from roughly a billion people at the turn of the 19th century to more than seven billion today. Another four billion of us could be added to the world population by the end of this century. And humans haven't risen to global dominance alone: we've brought along our domestic animals, including cows, pigs, sheep, cats, and dogs. In a geologic blink, the fossil record will be overtaken by a handful of two- and four-legged mammals.

[Image: Humans, there are a lot of us. Credit: James Cridland/Flickr]

To counter our meteoric rise, other species are fast disappearing: as scientists verified last year, we're in the early stages of a sixth mass extinction. Meanwhile, organisms that aren't going extinct are being scrambled about the planet in new and unnatural ways: think the introduction of Cane toads to Australia, zebra mussels to Lake Michigan, or rabbits to basically everywhere. Global travel, climate change and urbanisation have resulted in a planet-wide migration, with some species marching northward as the poles warm, others moving into cities to occupy new niches, and still others hitching a plane, train or boat to the far corners of the Earth.
If the fossil record were a book, the Anthropocene chapter would be the one attacked by a hyperactive five-year-old with scissors, crayons, and glitter glue.

Carbon Pollution

It's no secret humans are burning fossil fuels and releasing tremendous quantities of carbon into the air: some 10 billion tons a year at last check-in. Carbon dioxide is warming our climate, but it's also reshaping atmospheric chemistry in a way that'll leave an indelible mark, especially when you stack it alongside all the nitrous oxide, sulphur dioxide, chlorofluorocarbons and other industrial pollutants we're pumping skyward. Given tens of thousands of years, newly formed ice layers at the north and south poles will trap tiny samples of our modern atmosphere as air bubbles, offering the geochemists of the future a taste of the smog-filled skies of yesteryear. That is, unless we burn all of our fossil fuels and melt away the evidence.

Nitrogen Fertilizer

Agriculture has been reshaping our planet for the past ten thousand years, but all previous agrarian achievements pale in comparison to the technological advances of the mid 20th century. One of those, the so-called "Haber-Bosch" process, radically transformed the way we feed ourselves and our planet. Developed by the German chemists Fritz Haber and Carl Bosch, the process uses high pressure and heat to convert atmospheric nitrogen (the inert gas N2) into ammonia fertiliser, a feat that was previously only possible with the aid of "nitrogen fixing" bacteria. Suddenly, fertilizer was fast-acting and inexpensive, and farmers could apply it liberally to their fields. Crop yields boomed.

[Image: Wikimedia]

The most obvious consequence of Haber-Bosch, enabling the human population to double again and again, overshadows the more insidious impact of all that extra nitrogen on our biosphere. Overspill from fertilisation has roughly doubled the amount of actively cycling nitrogen in our biosphere, which has caused some species to become weedy at the expense of others. For instance, as fertiliser seeps into lakes, rivers, and coastal waters, it fuels vast algae blooms that soak up oxygen and choke out other forms of life. The sudden turbocharging of Earth's nitrogen cycle will leave an indelible mark in the geochemistry and ecology of the Anthropocene.

Boreholes

One of the strangest ways humans are reshaping the planet right now has nothing to do with the Anthropocene at all: it has to do with the geologic epochs that came before. Humans are digging, drilling, mining, and blasting their way deep into our planet's crust, through thousands of metres of sediment accumulated over hundreds of millions of years. No other species or natural process has ever done anything like this.

[Image: Wikimedia]

So-called "anthroturbation" may be the most permanent, and therefore truly geologic, scar that humans leave. As Jan Zalasiewicz, chair of the International Commission on Stratigraphy's Anthropocene Working Group, told me in 2014, "The only way these marks can go away is by coming to the surface and being eroded, or getting caught up in a continental collision, or some other tectonic activity. Any scenario for erasing them will take tens to hundreds of millions of years."

Nuclear Weapons

For academics, a key point of debate is not whether the Anthropocene exists, but when exactly it began. Some argue for an early start date: say, the first evidence for a human-caused shift in atmospheric CO2, when Europeans migrated to the New World and proceeded to kill everybody.
(This caused huge swathes of farmland to revert to forest.) Others say the proverbial "golden spike" should land in 1964.

The year 1964 was a big one for nuclear weapons testing, so big that it caused a dramatic uptick in the amount of radioactive carbon, or carbon-14, in our atmosphere. The extra carbon-14 worked its way into the food chain and the biosphere, from plants to animals to humans to soil. If you lived on Earth during the 1960s or '70s, you contain an indelible trace of the Cold War in your bones, and it might literally herald the dawn of a new age.

***

Defining a new geologic age, especially one as weird as the Anthropocene, won't be easy. Despite all the evidence that seems to point toward a new epoch, there's still considerable academic debate, particularly on the matter of whether the Anthropocene has begun. What if the biggest changes to our planet are yet to come? Should we really be rushing off to place this painfully brief moment of time all by its geologic lonesome? Are we too blinded by the present, too awestruck by our own participation in a planet-wide experiment, to put ourselves in the appropriate context?

Perhaps. Then again, to the best of our scientific knowledge, the present is like nothing the planet has ever seen. That alone makes the age of humans, however short-lived, geologically remarkable.
Pluristem Awarded a $3.1 Million Grant by Israeli Government
April 30, 2012 06:00 ET
Source: Pluristem Therapeutics Inc.

HAIFA, Israel, April 30, 2012 (GLOBE NEWSWIRE) -- Pluristem Therapeutics Inc. (Nasdaq:PSTI) (TASE:PLTR) today announced that its wholly owned subsidiary, Pluristem Ltd., has received approval for an 11.8 million New Israeli Shekel (approximately $3.1 million) grant from the Office of the Chief Scientist (OCS) within the Israeli Ministry of Industry, Trade and Labor. Once received, the grant will be used to cover R&D expenses for the period March to December 2012. Under the OCS grant terms, Pluristem Ltd. is required to pay royalties at a rate of 3%-5% on sales of products and services derived from technology developed using this and other OCS grants, until 100% of the dollar-linked grant amount plus interest has been repaid. In the absence of such sales, no payment is required.
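For a sense of how such terms play out, here is a hypothetical repayment sketch. Only the 3%-5% royalty band and the repay-grant-plus-interest structure come from the announcement; the sales figure and interest rate are invented for illustration.

```python
# Hypothetical illustration of the OCS royalty mechanics: a royalty is
# paid on sales until the grant plus interest is fully repaid. The
# sales figure and interest rate are invented; only the 3%-5% band and
# the repay-until-100% structure come from the announcement.

GRANT_USD = 3.1e6
ANNUAL_INTEREST = 0.02   # assumed for illustration
ROYALTY_RATE = 0.035     # within the announced 3%-5% band
ANNUAL_SALES = 20e6      # hypothetical sales of grant-derived products

owed, year = GRANT_USD, 0
while owed > 0 and year < 50:
    year += 1
    owed *= 1 + ANNUAL_INTEREST
    payment = min(owed, ANNUAL_SALES * ROYALTY_RATE)
    owed -= payment
    print(f"Year {year}: paid ${payment:,.0f}, remaining ${owed:,.0f}")
```

Under these invented numbers the grant would be repaid in about five years; with no sales, nothing is owed at all.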
The OCS, empowered by the Law for the Encouragement of Industrial Research & Development – 1984, oversees all Government sponsored support of R&D in the Israeli hi-tech and bio-tech industries. This broad-spectrum support stimulates the development of innovative state-of-the-art technologies, enhances the competitive power of the industry in the global hi-tech market, creates employment opportunities and assists in improving Israel's balance of payments.
"We are pleased Pluristem's PLX cells were recognized as an innovative state-of-the-art technology with a potential to create long term sustainable competitive advantage in the cell therapy industry," said Zami Aberman, Chairman and CEO of Pluristem. "This grant will assist the company in enhancing its R&D plans and clinical trials, helping us bring the PLX product candidates to market for the treatment of millions of patients around the world."
About Pluristem Therapeutics
Pluristem Therapeutics Inc. (Nasdaq:PSTI) (TASE:PLTR) is a leading developer of placenta-based cell therapies. The company's patented PLX (PLacental eXpanded) cell drug-delivery platform releases a cocktail of therapeutic proteins in response to a variety of local and systemic inflammatory diseases.
PLX cells are grown using the company's proprietary 3D micro-environmental technology and are an off-the-shelf product that requires no tissue matching or immune-suppression treatment prior to administration.
Data from two Phase I clinical trials indicate that Pluristem's first PLX product, PLX-PAD, is safe and potentially effective for the treatment of end-stage peripheral artery disease (PAD). Pluristem's pre-clinical animal models have demonstrated that PLX cells are also potentially effective against nerve pain and muscle damage when administered locally, and against inflammatory bowel disease, multiple sclerosis (MS), and stroke when administered systemically.
Pluristem has a strong patent portfolio, company-owned GMP certified manufacturing and research facilities, strategic relationships with major research institutions and a seasoned management team. For more information visit www.pluristem.com, the content of which is not part of this press release. Follow Pluristem on Twitter @Pluristem.
The Pluristem Therapeutics Inc. logo is available at http://www.globenewswire.com/newsroom/prs/?pkgid=6882
This press release contains forward-looking statements within the meaning of the "safe harbor" provisions of the Private Securities Litigation Reform Act of 1995 and federal securities laws. For example, we are using forward looking statements when we discuss the safety and potential effectiveness of PLX-PAD for the treatment of end stage peripheral artery disease, or when we discuss the potential effectiveness of PLX cells in treating nerve pain and muscle damage and in inflammatory bowel disease, MS and stroke. These forward-looking statements are based on the current expectations of the management of Pluristem only, and are subject to a number of factors and uncertainties that could cause actual results to differ materially from those described in the forward-looking statements. The following factors, among others, could cause actual results to differ materially from those described in the forward-looking statements: changes in technology and market requirements; we may encounter delays or obstacles in launching our clinical trials; our technology may not be validated as we progress further and our methods may not be accepted by the scientific community; we may be unable to retain or attract key employees whose knowledge is essential to the development of our products; unforeseen scientific difficulties may develop with our process; our products may wind up being more expensive than we anticipate; results in the laboratory may not translate to equally good results in real surgical settings; our patents may not be sufficient; our products may harm recipients; changes in legislation; inability to timely develop and introduce new technologies, products and applications; loss of market share and pressure on pricing resulting from competition, which could cause the actual results or performance of Pluristem to differ materially from those contemplated in such forward-looking statements. Except as otherwise required by law, Pluristem undertakes no obligation to publicly release any revisions to these forward-looking statements to reflect events or circumstances after the date hereof or to reflect the occurrence of unanticipated events. For a more detailed description of the risks and uncertainties affecting Pluristem, reference is made to Pluristem's reports filed from time to time with the Securities and Exchange Commission.
Pluristem Therapeutics Inc.:
William Prather R.Ph., M.D. Sr. VP Corporate Development
William.PratherMD@pluristem.com
Daya Lettvin
Investor & Media Relations Director
+972-54-674-5580
daya@pluristem.com
Matthew Krieger
Ruder Finn - for Pluristem
matthew@finnpartners.co.il
With the arrival of Palm's Pre and a new generation of iPhones on the way, it's now a much more complicated world of smartphone buying.
Brian Nadel (Computerworld) on 12 June, 2009 06:50
I have a confession to make: About two years ago I made a big mistake and bought a phone designed for mere mortals when what I really wanted was an Apple iPhone. Don't get me wrong -- my Sony Ericsson W580i has served me well -- but I'm ready to move up to a super-phone.

The problem is that with the arrival of Palm's Pre and a new generation of iPhones on the way, it's now a much more complicated world of smartphone buying. I've now had a chance to test the new Pre, comparing it to a friend's iPhone 3G running the iPhone OS 2.0 and examining the spec sheet of the upcoming iPhone 3G S, trying to figure out what to get.

Size matters

For me, size matters, and the Pre is a little smaller but thicker than the iPhone. I prefer the Pre's rounded organic shape to the rectangular iPhone, and the Pre feels more comfortable in my hand. On the downside, its case is a bit too slippery for my clumsy fingers.

The Pre's 3.1-inch display is smaller than that of the iPhone, but it's still a step up from my previous phone and just big enough for comfortable Web cruising, reading e-mails and viewing videos. The good news is that the display is mounted flush with the case's surface, making tapping and sliding my finger on the screen easy to accomplish.

Without a doubt, it's also the brightest, richest phone display I've seen. But like so many other phones (including the iPhone), the screen picks up stray smudges easier than a first grader's face at a birthday party.

I really want a phone with a touch screen, and the Pre's capacitive technology is accurate and reliable for tapping, pressing and moving things around. I particularly like the two-finger gestures: Spread your fingers apart to zoom in or bring them together to zoom out, for instance. As with the iPhone, there's also a built-in accelerometer that automatically switches from a landscape to portrait view when it senses that the device has been turned on its side; it takes a couple of seconds, but it works like magic.

Keying in

My personal and business lives revolve around e-mail, and the Pre's slide-out mechanical keyboard makes typing easier on the go than the iPhone's onscreen keyboard. It's not quite as comfortable as a BlackBerry, but the 34 keys are arranged in the familiar QWERTY format. At 4.75mm wide, the keys make my fingers feel fat and stubby (and probably will have the same effect on you).

Even though they're small, the keys provide enough feedback to make typing more accurate than tapping on the iPhone's screen. The text appears on screen immediately, with no annoying lag to throw off my typing rhythm. There's even a dedicated "@" key that makes addressing e-mails much easier, but all the same, typing on the Pre takes practice and concentration to master. I had to purposely slow myself down to become more accurate.

Software

With a grid of program icons (3 across, 4 down), the Pre's main screen doesn't show as many items as the iPhone's does, but the colors are more vivid and it's easier to find the right one. All the info I need is there at a glance. There's also a bar on top of the screen that shows the time, network status and a battery gauge, and another at the bottom that gives access to the phone app, contacts, e-mail and calendar. There's also an onscreen button for switching between the program list and the active item.

To my surprise, the Pre also does a couple of tricks that make using it easier. I love that sliding my finger just below the screen from the middle to the edge makes the system go back one screen.
I can close any app by literally flicking it away.

I like that any of the apps can run full-screen vertically or horizontally, but the Pre goes further. When an application is up but not active, a thumbnail of it, which Palm calls a card, appears onscreen. These cards appear to float above the background; with the flick of a finger I am able to shuffle my programs to, say, read headlines on a Web site while listening to music and then change tracks quickly or go to a different site.

The Pre comes with more than enough software to get started, including the expected (e-mail, Web browsing and music player) and unexpected (links to YouTube and Google maps). Of course, with only 18 downloadable programs available, it can't compare to the tens of thousands available for the iPhone. As of this writing, the Pre had add-ons for Pandora (for downloadable music), AP News (for the latest headlines) and MotionApps (for using PalmPilot apps). Palm promises to have 1,000 third-party programs available within a month. Ambitious, but it will take a long time to catch up with Apple.

Making connections

What gets lost in the hype over everything else it does is that the Pre is an excellent phone. Palm is also promising a worldwide HSDPA version that will fit into the Australian market, but that's further out on the horizon. I made many calls on the Pre, which connected quicker than my current phone. It also sounded better, with wider dynamic range and fewer audio dropouts.

For data-intensive work, the Pre has 802.11b/g Wi-Fi built in; it can't work with the latest 802.11n gear, however. It connected reliably with my office's Linksys WRT54GS router and stayed in contact up to 80 feet away -- a little on the short side, but adequate for working at a cafe or airport hot spot.

Feel the power

When it comes to power, I like that the Pre's 1,150 mAh lithium-ion battery pack ran for about four hours of intensive calling, Web surfing, e-mailing and online video viewing. In my experience, that translates to five or six hours of stop-and-go use.
While working with the Pre, I noticed that its back got warm to the touch, but it seemed cooler than an iPhone doing the same work. However, the iPhone beat it in performance -- the Pre was noticeably sluggish at times, leaving me waiting for apps to load. It took 1 minute and 33 seconds to fully load up from a dead start, twice as long as the iPhone did.

When I travel, I often don't see an AC outlet for hours, so the removable battery of the Pre is a big advantage. I plan to keep an extra battery charged and ready to go in my bag.

Palm Pre Touchstone

The Pre is one of the easiest phones to charge I've seen. Its US$70 Touchstone base looks like a small angled block and is a technological marvel that uses inductive technology to charge the phone without a physical connector. Just place the Pre on top of Touchstone's angled surface and the power flows into the device through the back cover. The included special cover weighs just 2.8g more than the standard cover, and the battery charges up in about 2 hours.

I love that the Touchstone doubles as a stand for either horizontal or vertical viewing of the Pre. And when the Pre is in the Touchstone base, calls are automatically set to speakerphone mode. Pick up the phone and it's changed back to handset mode. This is so convenient that I think I'll dump my clunky landline speakerphone. Unlike the iPhone's US$50 dock, though, the Touchstone can't synchronise with a computer.

For that, you need to use the included USB cable (you can also use the cable to charge the Pre more traditionally). However, there's a big snag. In a design move I can't understand, the Pre's mini-USB plug is slightly thinner than a standard mini-plug. In other words, a $2 off-the-shelf cable won't work with the Pre -- you need to buy Palm's US$20 cable. I lose cables as fast as I get them, and having the option of using generic cables would have been a big positive of the design.

Photos and fees

Although the Pre can't record video (something the third-generation iPhone promises and most mobile phones do today), its 3-megapixel camera creates detailed and vivid images. They're close to the quality of a point-and-shoot camera, and the Pre handles high-contrast scenes particularly well.

The one flaw, however, is that in low-light situations, I found that the Pre's photos had a green cast to them. When I used the LED flash (something the iPhone lacks) they came out much brighter, with near-perfect color balance.

The Pre includes an AC charger, USB cable, headset and a soft pouch in the sales package.

Conclusion

When my present phone contract expires at the end of summer, I'm going to get a Pre -- that is, if my budget allows. It's just the right size, has a screen that's big enough for Web work and e-mail, and now that I've used the Pre's micro-keyboard I don't ever want to go back to a screen-based one again.

For me, an even bigger step forward is the Touchstone charger and the way it automatically goes between speakerphone and handset. I only wish that Palm had used a standard cable for it.

It does so much in such a small package that the Pre is a winner. All told, for my own use, it's a better iPhone than the iPhone is.

At a Glance

Palm Pre
Palm, Inc.
Price: TBA
Pros: Excellent design, bright screen, mechanical keyboard, swappable battery, inductive charging
Cons: Small screen, lack of add-on apps, slower performance than iPhone, doesn't use standard USB cable
Brian Nadel
2016-40/3982/en_head.json.gz/5106 | India’s Satyam rebounds, but progress is slow
The company posted lackluster revenue growth in the quarter ended Dec. 31

John Ribeiro (IDG News Service) | 14 February, 2011 19:23
Indian outsourcer Satyam Computer Services, which is recovering from a corporate scandal, said that revenue and net profits grew in the third quarter ended Dec. 31.

The company, which was once India's fourth largest outsourcer, is far from reaching the revenue and profit levels of its competitors, such as Tata Consultancy Services, Infosys Technologies, and Wipro. All three companies reported strong revenue and profit growth in the quarter ended Dec. 31, benefiting from a recovery in the outsourcing market.

Satyam said on Monday that revenue had grown to 12.8 billion rupees (US$281 million), up by about 3 percent from the previous quarter, while net profit had more than doubled to 590 million rupees from 233 million rupees in the previous quarter. The results are in accordance with Indian accounting rules.

The company is now stable at the $1 billion to $1.1 billion revenue level, and there isn't an exodus of customers any more, said Sudin Apte, principal analyst and CEO of Offshore Insights, a research and advisory firm in Pune, India.

Satyam has also started focusing on select markets, but these efforts have not yet translated into high revenue and profit growth for the company, although the outsourcing market is bouncing back, he added.

A comparison with the company's revenue and profit figures in the same quarter of the previous year is not available, as Satyam was exempted by India's Company Law Board from publishing financial results for the quarters ended from December 31, 2008 to March 31, 2010.

The company's operating margins are still very thin, at less than 4 percent, Apte said. Going by current revenue levels, the company will show a revenue drop of about 8 percent in its fiscal year ending March 31, 2011, he added.
Satyam plunged into a financial crisis in 2009 over an accounting scandal in which revenue and profit had been inflated for several years. Now, Satyam plans to merge with another Indian outsourcer, Tech Mahindra, which acquired a dominant 43 percent stake in Satyam in 2009 as part of a revival package for the company. Minority shareholders have demanded that the merger be delayed until the company has fully recovered and equity valuations are reasonable.

The company reported in November last year that it had returned to profits in the quarters ended June 30 and Sept. 30.

The company is still saddled with a lot of potential liabilities and costs, including a claim by some 37 companies who say that they want to be repaid 12 billion rupees that they had allegedly advanced to Satyam.

The company also faces a class action suit in the U.S. alleging violations of the U.S. federal securities laws. The company delisted last year from the New York Stock Exchange after it failed to publish its results according to U.S. accounting rules within a stipulated period.

Its profits in the quarter were whittled down by 533 million rupees in exceptional items relating to restructuring costs, forensic investigation and litigation support, and erosion in value of assets in subsidiaries.

However, these are risks that were already taken into account earlier, and are not likely to affect the company's operations or customer confidence, Apte said.

The company added 764 staff in the quarter, taking the total to 28,832 at the end of the quarter. The company had 217 customers at the end of the quarter.

John Ribeiro covers outsourcing and general technology breaking news from India for The IDG News Service. Follow John on Twitter at @Johnribeiro. John's e-mail address is john_ribeiro@idg.com
2016-40/3982/en_head.json.gz/5272

From a Wee Girl in Derry to mass retail in Walmart
By James O'Shea
Grainne Kelly, CEO of child safety product company BubbleBum, is a force of nature, a woman who doesn't take no for an answer and is in a hurry to disrupt the traditional child car seat industry. In a few short years her small Derry-based company has sold over 300,000 car seats worldwide and featured on a recent episode of Dragons' Den, Ireland and the UK's version of the popular show Shark Tank.
Grainne Kelly candidly says "who would have thought a wee girl from Derry would launch her product in Walmart?"
She came up with the idea for the world's first inflatable car booster seat when she was traveling with her own kids and was totally frustrated by the car rental firms' lacklustre approach to her child's safety. "They never seemed to have the booster seats, even when I had prebooked them," she said. In a moment of madness, she decided to create a solution to the vile rigid booster seats that the car rental firms charged anywhere between $7 and $11 per day to rent and, more often than not, didn't supply when it came to the crunch.
She knew it had to be lightweight and compact, so she chose inflatable. She called all the testing laboratories across the world and asked them to treat her like a 3-year-old so she could understand the regulatory requirements for car booster seats. They told her that inflatable was not an option, as the seat had to be crash tested in the 'worst case scenario,' which is of course deflated. This was a great challenge for Grainne, who worked out that the seat could easily pass the crash testing if she used memory and air technology along with a harness apparatus, so she set sail for China to make a prototype, which is a story in itself. Within nine months she had taken the product from concept to shelf, through crash testing, patent protection, marketing and website, with sales starting locally on the first day the site went live. She continually called on her board of 12 kids under the age of 11, who told her where she was going wrong with the colors and the designs and identified new opportunities in the market place.
In February 2010, Grainne discovered that she was getting no response (or, in her own words, not a "quick enough for an Irish girl" response) from the testing laboratories in the USA. In order to sell the product in the USA, it had to be tested for that market too. The only solution for Grainne was to move her family to the US to speed things up.
While living in the USA, she trained with Safekids and became a Nationally Certified Child Passenger Safety Technician and worked at car seat check points to better understand the market place. By engaging directly with the safety coalitions she better understood any objections that may be raised in relation to this new technology and built the trust and relationships that formed her grassroots promoters. She spent time with crash test laboratories and very importantly with her new American friends who helped her navigate her way around the new culture.
In January 2011 she returned to Ireland, where she set about launching the product in the US market, launching online in June 2011. In October 2011 she was a finalist for the Ernst & Young Entrepreneur of the Year, an IIHS Best Bet Booster Award winner, and the USA JPMA Innovation Award winner.
BubbleBum is now selling in 24 countries globally and is an Amazon best seller, and the product has just launched in Walmart in over 2,100 stores nationwide. With a pipeline of new innovative travel solutions due to hit the shelves early in 2015, the company is growing from strength to strength. There are six people in the company, and Grainne describes their days as like drinking from a fire hydrant.
Grainne, CEO of the now global company, tells us, "They say the Irish are full of hot air, and there may be some truth in that, as some of the best inflatable technology came from Ireland. Look at the pneumatic tyre, as well as the BubbleBum, of course; both invented in Ireland! We have led the way in inflatable technology and now world leaders such as Volvo and Ford are following with their inflatable infant carrier and inflatable seat belts."
When asked what motivates her, she quite simply says, "Teaching the children that anything is possible and making them proud is my motivation."
What is the best lesson learned? "Listen to your gut; it is there for a reason."
She lives her life by: "Do things for the right reasons and only the right thing can happen."
Her AVA profile reads: Sees no as a temporary obstacle.
2016-40/3982/en_head.json.gz/5329

Research channels powerful Kansas wind to keep electricity running
By Julie Fosberg

One of Kansas' most abundant natural resources may hold the key to preventing major power outages. A team of Kansas State University engineers is researching ways to use Kansas wind and other distributed energy sources to avoid cascading failures.
Sakshi Pahwa, doctoral student in electrical and computer engineering, India, explored the topic for her recently completed master’s project, "Distributed Sources and Islanding to Mitigate Cascading Failures in Power Grid Networks." The project was a winner at the recent Capitol Graduate Research Summit in Topeka.
Pahwa's co-advisers on the project include Caterina Scoglio, associate professor of electrical and computer engineering, and Noel Schulz, Paslay professor of electrical and computer engineering and K-State's first lady. Pahwa is continuing this work for her doctoral research under Scoglio and Ruth Douglas Miller, associate professor of electrical and computer engineering.
The research looks at using distributed energy sources to avoid cascading failures in power grids. A cascading failure occurs when an interconnected part of a power system fails and then triggers successive parts to fail – like the one that happened in the Northeast Blackout of 2003, a power outage that affected 55 million people in the United States and Canada.
To prevent cascading failures researchers are investigating a technique called islanding, which works to minimize the impact of a power system fault to a small area. Islanding prevents this fault from affecting other areas and stops further disturbances in the network.
"We used a network partitioning algorithm, and then depending on where the fault is I can disconnect that portion of the network," Pahwa said. "That disconnected portion can then be powered using renewable or distributed energy sources, such as wind turbines or solar panels, and the remaining parts are still being powered by conventional sources."
The Kansas wind can potentially provide abundant renewable energy that could power the disconnected portion of the network. For data collecting and testing purposes, the researchers plan to use the university's wind turbine north of campus, near the intersection of Denison and Kimball avenues, as well as four other wind turbines installed at the Riley County Public Works Facility.
The university turbine was installed for Wind for Schools, a project led by Miller, director of the Kansas Wind Application Center. The Riley County wind turbines were installed for the Resourceful Kansas project, a cooperative effort between Miller, Scoglio, Riley County and the Kansas City-based consulting firm GBA, and funded by the U.S. Department of Energy.
"We need to set up power systems that are reliable and stable so that when that wind is blowing, we can use that power, but when the wind isn’t blowing, there are also stable systems," Schulz said. "That's what this project is about -- modeling the network so we understand the different aspects for when there are changes, when the wind blows, when it doesn't and how that affects the power system."
Scoglio and Pahwa started the project when Pahwa was a master's student. As they began studying complex network systems, they turned to Schulz, a power grid expert who has done previous work with islanding. They also collaborated with power systems expert Anil Pahwa, professor of electrical and computer engineering, and Shelli Starrett, associate professor of electrical and computer engineering.
"With the proper design and the right intelligence, some of the problems related to power failures can be prevented," Scoglio said. "We need to make sure that the communication network will monitor the network and detect the problem and will implement the reaction securely to implement these solutions."
Sakshi Pahwa's research aims to not only study the problem from a theoretical aspect, but also provide practical solutions to real-world problems. It also fits in with the Renewable Energy Standards Act, which was signed in 2009 and states that major Kansas utilities should be able to generate about 10 percent of their power from renewable sources by 2011 and 20 percent by 2020.
"This project benefits the state because it reduces carbon emissions through renewable energy," Pahwa said. "It is a good opportunity to create jobs, and renewable energy incorporation is also a support to the conventional sources so we don’t need to import fuels from other countries. It helps the economy as well."
Pahwa's research was supported by the four companies involved in the K-State Electrical Power Affiliates Program: Westar Energy, Burns and McDonnell, Nebraska Public Power District and Omaha Public Power District. Schulz directs the program, which supports undergraduate and graduate research programs.
"This research is a benefit for Kansas and the whole nation because I think that innovation, coming from research and support from companies such as those that are part of the power affiliates, can really bring the country back to a better economic situation," Scoglio said. "Innovation comes with jobs and can really improve the whole nation."
2016-40/3982/en_head.json.gz/5436 | Lockheed Martin Team Moves Forward in 'Elite Eight' Following DARPA Robotics Challenge Trials
HOMESTEAD, Fla., Dec. 23, 2013 – Lockheed Martin [NYSE: LMT] Advanced Technology Laboratories (ATL) recently completed the Defense Advanced Research Projects Agency (DARPA) Robotics Challenge trials at the Homestead-Miami Speedway. The Lockheed Martin-led team, which includes the University of Pennsylvania and Rensselaer Polytechnic Institute, guided an Atlas humanoid robot through a number of tasks designed to simulate disaster response scenarios.
Lockheed Martin is one of eight teams to move forward onto the next phase of the Challenge. In 2014, the team will continue to refine and expand its robotic system concept in preparation for the DARPA Robotics Challenge finals. The final winner will receive a $2 million prize.
“The DARPA Robotics Challenge presented an exciting competition for our team,” said Bill Borgia, director for ATL’s Intelligent Robotics Lab. “It helped us further our expertise in developing robotic autonomy. We’ll continue to move that technology forward both in this challenge and our other efforts.”
DARPA Robotics Challenge
As part of the DARPA Robotics Challenge, Lockheed Martin is developing autonomous systems that work together with human operators.
As a top qualifier in DARPA's Virtual Robotics Challenge held earlier this year, the Lockheed Martin team received an Atlas robot to combine with advanced control algorithms and an operator station. The team developed a conceptual system and programmed Atlas to perform a series of disaster relief tasks. Tasks included driving a vehicle, walking over various hazards, climbing a ladder, walking over debris, opening doors, drilling a shape in a cement wall, closing various valves and attaching a hose to a hydrant.
This wasn't Lockheed Martin's first participation in a DARPA challenge. In 2007, for example, Lockheed Martin ATL competed in the DARPA Urban Challenge in Victorville, Calif., where autonomous cars navigated through a 60-mile urban course in less than six hours. Out of hundreds of initial entrants, the Lockheed Martin car was the fourth to finish the course.
ATL is Lockheed Martin’s applied research and development facility that specializes in advanced computing technologies, and transitions them throughout the company, military services and service laboratories. Technology focus areas include autonomy and artificial intelligence, network-centric operations, cognitive computing, information exploitation and spectrum technologies.
Headquartered in Bethesda, Md., Lockheed Martin is a global security and aerospace company that employs about 116,000 people worldwide and is principally engaged in the research, design, development, manufacture, integration and sustainment of advanced technology systems, products and services. The corporation’s net sales for 2012 were $47.2 billion.
Meet DRC Team TROOPER: This is a DARPA-produced video of the Trooper Team for the DARPA Robotics Challenge.
2016-40/3982/en_head.json.gz/5451 | T-Mobile execs clarify throttling, Wi-Fi Calling policies
updated 04:35 pm EDT, Tue March 26, 2013
Only fixed caps will see throttling to 3G
T-Mobile CEO John Legere and other executives have clarified the carrier's policy on data throttling and Wi-Fi Calling in the wake of the company's iPhone announcement. During a Q&A session, they stated that throttling measures will only drop users to 3G speeds if they're on a fixed plan and exceed their data caps. Anyone on an unlimited data plan will be able to continue at full speed, as long as they aren't interfering with other customers.

Legere described the arrangement as a "fair use" policy aimed at keeping T-Mobile's network open to more customers; together the executives commented that T-Mobile already has customers who use large amounts of data without being throttled, in some cases over 50GB per month, but the extreme examples are typically taking place at times that aren't affecting other customers. "If there is ever a case where we're going to use a fair use policy, we're going to post it so you can have a look at it," added Legere. "It's not a number -- if someone is having a party at 3AM, I don't really care."
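Read literally, the policy described above reduces to a simple decision rule. A hypothetical sketch (my interpretation of the executives' statements, not T-Mobile's actual traffic-management logic):

```python
# Hypothetical sketch of the stated policy: fixed-cap plans drop to 3G
# speeds once they exceed their cap; unlimited plans keep full speed
# unless they are actively interfering with other customers.

from typing import Optional

def allowed_speed(plan: str, used_gb: float, cap_gb: Optional[float],
                  congesting_others: bool) -> str:
    if plan == "fixed" and cap_gb is not None and used_gb > cap_gb:
        return "3G"        # over a fixed cap: throttled
    if plan == "unlimited" and congesting_others:
        return "3G"        # "fair use": only when impacting other users
    return "full speed"

print(allowed_speed("fixed", used_gb=6.2, cap_gb=5.0, congesting_others=False))       # 3G
print(allowed_speed("unlimited", used_gb=50.0, cap_gb=None, congesting_others=False))  # full speed
```

Note the asymmetry: usage volume alone never throttles an unlimited customer in this reading, which matches Legere's "it's not a number" remark.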
T-Mobile marketing officer Mike Sievert has separately told Engadget that the iPhone 5 won't support the carrier's Wi-Fi Calling feature, at least not at launch. The company "loves its Wi-Fi Calling feature, and I'll have to leave it at that," according to Sievert. The option lets people outside the US receive calls from a US phone number so long as they're connected to a Wi-Fi hotspot. Several Android phones on T-Mobile already support the technology.
TheGreatButcher
Is T-Mobile's 3G network still incompatible with the iPhone? Another article mentioned that a modified iPhone 5 is coming to include compatibility with T-Mobile's LTE network, but didn't mention whether or not that compatibility would include their 3G network. In both cases, it seems possible iPhone users could find themselves throttled to EDGE.
Mac Elite
the answer to that depends on where you are at the moment, but they have been upgrading their network to allow the iPhone to be compatible with their 3G bands since at least January and likely have most big cities covered by now. YMMV. Long story short, I don't think unlimited users will find themselves throttled to 2G in most cases, and as stated above it appears to be quite hard to hit T-Mobile's limit at present.
DiabloConQueso
Grizzled Veteran
Note that the article says "3G speeds," not "cease using our LTE network infrastructure and switch over to our 3G network infrastructure."
It's possible to throttle someone down to 3G *speeds* while they remain connected to the LTE network. You don't have to actually jump over to the 3G infrastructure in order for this to happen.
When Comcast starts throttling your cable connection, they don't flop you over to a different technology like DSL or something, they just slow you down to DSL-like speeds. This is probably the same thing that T-Mobile is doing.
chrup
Fresh-Faced Recruit
I have a t-mobile S2, which has WiFi Calling. So I know what it really is as the description in the article is wrong.
If you have a weak t-mobile signal but a good WiFi signal where ever you are, it lets you connect to the t-mobile network via WiFi instead of going through a cell tower. It works great when I'm at home (weak signal) or at the office (no signal), except that the S2 is the worst phone I ever had ("Sorry the android.phone app crashed" when receiving phone calls) - WiFi or no WiFi. I'm using my old out-of-contract and legally unlocked iPhone 4 and 3G on t-mobile now and the signal is OK in my study, but there is no signal on the 1st floor or in the basement. WiFi calling would be really handy to have at home. You still pay for the minutes, but with the unlimited calls feature, what do I care?
And I pay $100/month for 4 lines +$18 for fees and taxes and such - vs. $340 for the same features with at&t. | 科技 |
2016-40/3982/en_head.json.gz/5465

Drone Display Shows Potential Use From Crimes To Crops
Wed, 06/25/2014 - 7:57am
By Dave Kolpack, Associated Press

GRAND FORKS, N.D. (AP) -- An unmanned aircraft the size of a push lawnmower was launched shortly after a report of a person being held at knifepoint. With red, green and white lights flashing below its rotors, the drone slowly circled the scene and relayed sharp images to those watching from afar on a digital screen.

The mock police scene that played out Tuesday kicked off an annual unmanned aircraft conference in Grand Forks, home to the first drone test site in the country to open for business. Another demonstration featured the Draganflyer X4ES recording evidence, such as skid marks and debris, from a two-car accident.

"The possibilities are endless," said pilot Jake Stoltz, who helped navigate the drone and its sensors Tuesday. The exhibition was meant to show the shape of things to come when drones reach the commercial market, he said.

Based at the Grand Forks Air Force Base, the test site is one of six in the country where experts are working on the safety of flying drones in civilian airspace and the public perception of having them above. The Federal Aviation Administration does not currently allow the commercial use of drones, but it is working to develop operational guidelines.

Stoltz touted the plane's ability to go from a crime scene to a crop scene. "We're already testing uses in agriculture," Stoltz said. "I'm not an expert, but I find it amazing that one day you will have farmers using them to check on the health of their crops."

At least 75 percent of the drone use in civilian airspace is expected to involve agriculture. When the Northern Plains Unmanned Aircraft Systems Test Site became operational last spring, it began testing how drones can check soil quality and the status of crops. Unmanned aircraft could help farmers by using images taken from above with infrared photography and other technology to show the heat or water content of their plants.

Michael Toscano, president and CEO of the Association for Unmanned Vehicle Systems International, said Tuesday that drones could help keep the family farm alive. "The average farmer today is 58 1/2 years old," he said. "One of the ways to keep the young people on the farms is using robotics, because this is cool technology and farmers feel it's a way to keep their sons and daughters on the farm."

Stoltz, who grew up in western North Dakota, agrees. "I think it's another exciting piece of technology that younger farmers will probably embrace, versus the farmer who has been doing it his way for 30 years already," he said.

Watching the drones in action could also ease the worries of some who perceive the aircraft as an invasion of privacy, said Tim Schuh, the Grand Forks police corporal who led the demonstration. "People come and see this little 5-pound helicopter and wonder why we were making such a big deal about it," Schuh said. "I think when people see a lot of the capabilities of the aircraft, it will put their minds at ease."

A survey unveiled at the conference Tuesday by UND professors Cindy Juntunen and Thomasine Heitkamp showed a high rate of acceptance for the drones. The survey of 728 participants in 16 northeastern North Dakota counties found that people's top concern about unmanned aerial systems is safety, not personal privacy. "That was a surprise," Juntunen said.
"It also may speak to the fact that we don't know how educated people are about what (drones) do." Aerospace Advertisement Advertisement View the discussion thread. Home | 科技 |
2016-40/3982/en_head.json.gz/5494 | Body scanners finding plenty of creative uses in U.S.
[Photo: Shoppers will know their sizes immediately by stepping into a scanning booth such as the one being developed at PNNL. A three-dimensional body image is printed out, allowing the shopper to know specific sizes to match different brands of clothing. Credit: Paul T. Erickson/Tri-City Herald/MCT]

[Photo: Passengers go through full body scanners at Chicago's O'Hare International Airport in 2010. Credit: Phil Velasquez/Chicago Tribune/MCT]
Rob Hotakainen - McClatchy Newspapers
WASHINGTON — If Doug McMakin's latest experiment is successful, it's going to save travelers some time and hassle at the airport someday soon.
They won't have to take off their shoes when they go through security, because a scanner will examine their feet and immediately detect whether they're security risks.
Thanks to McMakin's engineering work at the Pacific Northwest National Laboratory, the same technology already is in use at a handful of malls around the country, where clothing shoppers can step into machines and have their measurements instantly matched with different sizes and brands.
As questions are raised overseas about the safety of full-body scanners, engineers in Washington state are touting machines that they claim are safer and could ease airport lines and spot potential suicide bombers.
They're also trying to improve on the scanner technology to address not only security but other, more everyday applications, such as exposing household pests hidden behind walls.
Last month, the European Union banned the use of some body scanners at airports because of cancer fears. But there's one big difference: Those that were banned emit low levels of radiation, while the technology designed in Washington state does not.
Last year, the King of Prussia Mall near Philadelphia became the first in the nation to use scanning machines for shoppers. Here's how they work:
Without disrobing, shoppers can step into scanning booths at kiosks, and three-dimensional body measurements are matched with clothing information in a database. Out pop lists that can be sorted by brand, price, style and retailer, and shoppers can head to the racks at their favorite stores to pick out their purchases.
Company officials say the signals are much weaker than those that come from cell phones, but they record more than 200,000 points of reference for precise measurements. Radio waves bounce a signal off the skin, without using radiation or X-rays, and the entire process takes roughly 10 minutes.
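The matching step itself is conceptually simple: compare the scanned measurements against each brand's size chart and rank the closest fits. A toy sketch of that idea, with invented measurements and size-chart entries, since the company's actual database and matching method are proprietary:

```python
# Toy sketch of size matching: rank each brand's sizes by how closely
# their chart matches the scanned measurements. All numbers invented.

SIZE_CHARTS = {  # brand -> size -> (bust, waist, hip) in inches
    "Brand A": {"8": (35.0, 28.0, 38.0), "10": (36.5, 29.5, 39.5)},
    "Brand B": {"M": (36.0, 29.0, 39.0), "L": (38.0, 31.0, 41.0)},
}

def best_fits(scan, charts, top=3):
    """Return (brand, size, error) tuples, best fit first."""
    results = []
    for brand, sizes in charts.items():
        for size, dims in sizes.items():
            error = sum(abs(s - d) for s, d in zip(scan, dims))
            results.append((brand, size, round(error, 1)))
    return sorted(results, key=lambda r: r[2])[:top]

scanned = (36.2, 29.0, 39.3)  # from the booth's 3-D measurements
for brand, size, err in best_fits(scanned, SIZE_CHARTS):
    print(f"{brand} size {size}: total deviation {err} in")
```

The real system presumably weighs far more than three measurements (the booth records over 200,000 reference points), but the ranking principle is the same.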
After installing the scanner at the Pennsylvania mall, Unique Solutions Design Ltd. of Nova Scotia has put them in stores in Texas and Georgia, as well. Earlier this year, a Canadian investment group provided $30 million to get the scanners installed in more locations across the U.S.
Company officials expect the machines to boost sales, particularly among women, whose chief shopping complaint is that clothing sizes aren't consistent, according to retail surveys. The company said one survey found that 54 percent of consumers had difficulty finding clothes that fit, and that 28 percent of women disliked shopping because they felt uncomfortable trying on clothes in dressing rooms.
"This is a fantastic idea and is going to revolutionize the way people shop," said Tanya Shaw, the president and chief executive officer of Unique Solutions.
When people step into the "Me-Ality size matching station," they must stand still for 10 seconds while a vertical scanning wand goes to work, its 196 small antennas sending and receiving low-power radio signals.
Shaw said the company currently had only eight of its "Me-Ality" size-matching stations operating, but plans call for getting about 400 of them running at major regional shopping malls in the next three years, including yet-to-be-named locations in Washington state.
McMakin, the original project manager for developing the technology at the federal government research lab in Richland, Wash., has been working on the scanners since the 1980s.
He said one of the biggest challenges was finding a market for them. Having the right product at the right time doesn't hurt, either.
"We tried to license the technology in the '90s for the security applications, but the market wasn't ready for it at the time until, obviously, 9/11 happened," McMakin said. "That changed everything. ... But that's one of the major challenges: Even if you have a technology that's ready to go, is the market ready? And is anybody willing to invest the money to bring that technology to the market?"
He's working on an experiment that would allow authorities to use scanners to detect potential suicide bombers even before they reach an airport.
And while the idea remains in development, some entrepreneurs at the University of Oregon hope to use the scanner technology to help pest-control businesses see little critters right through the walls.
Scanners could be used at your health club, helping people lose weight and providing exact measurements of their ever-shrinking bodies.
"You can do that on a scale, but this would give you a much more precise look at how your body is actually changing," said Bruce Harrer, a commercialization manager at Pacific Northwest National Laboratory.
The scanners remain most popular at airports, with roughly 1,000 of them in use around the world, half of them in the United States.
About 60 percent of the scanners use the millimeter wave holographic body-scanning technology designed at Pacific Northwest National Laboratory to detect concealed objects. The remainder use "backscatter" X-ray technology, which has been banned at European airports, at least until the risks are better assessed. Pacific Northwest lab officials are confident that their technology is harmless and will become more popular as a result of the ban in Europe, even though the potential harm from backscatter scanners is unclear.
Taxpayers help fund the research.
The facility in Washington state is a Department of Energy Office of Science national laboratory that has an annual budget of nearly $1.1 billion and employs 4,800 people, who work on issues related to energy, the environment and national security. It's been managed by Ohio-based Battelle since 1965.
McMakin said the lab received $7.5 million in special funding from the Federal Aviation Administration to work on the scanner technology in the 1990s and that it got another $660,000 recently from the Department of Homeland Security. On the flip side, the lab has raised about $5 million in royalties and other income, splitting the proceeds with Battelle.
"Our strategy is not to be a profit center, although we'd like to not be a loss center either," Harrer said. "We'd like to at least cover the cost of what we do."
2016-40/3982/en_head.json.gz/5587

SEPT. 9 | KORG | PARTNERSHIP
Korg, Marshall Part Ways
Korg USA has announced it's parting ways with Marshall Amplification. The two companies have reached an agreement under which Korg, which has represented two of the industry's major amplifier brands for several decades, will no longer distribute Marshall products as of Oct. 1. Marshall will open its own branded distribution network, Marshall USA, to serve the U.S. market.
“Korg USA has had a reputation as a brand builder,” said Joe Castronovo, Korg USA president. “We’re extremely proud of the job we’ve done for Marshall over many years and wish them luck in the future.”
Korg will continue to distribute, market, service and support its parent company’s brands, Korg and Vox, as well as the recently added Lâg guitar brand.
“The Marshall brand has been well-served by the efforts of Korg USA, and we would like to thank them for many years of loyal service, support and friendship,” said Jon Ellery, Marshall’s co-managing director. “The time is now right for us to distribute and market our products in the U.S., taking the Marshall brand to the next level.” korg.com
2016-40/3982/en_head.json.gz/5625 | Nov 15 2013, 9:45 am ET
‘Invisible' bicycle helmet - an airbag for the head
by Henry Austin and Michelle Kosinski
LONDON - A new "invisible" bicycle helmet that uses technology similar to a vehicle airbag has been developed in Sweden. The Hövding device, worn around the neck, is designed to shoot a protective, inflatable nylon hood around the user's head within one tenth of a second of impact.

Designers Terese Alstin and Anna Haupt said they were tired of traditional hard plastic designs that were unfashionable and ruined their hair. "I don't want anything on my head," said Alstin. "I don't want my hair to be destroyed."

The pair began to work on the device in 2005 when they were studying Industrial Design at Sweden's University of Lund. Over the next seven years the pair engineered, refined and tested the collar, which can monitor a cyclist's movements more than 200 times a second using an inbuilt computer, sensors and gyro. If it senses a collision, a small gas canister held in the back of the collar inflates the protective cover within milliseconds.

"We had to simulate all known accidents," Alstin said, adding that they had enlisted the help of professional cyclists to help them develop it. "Everything from an icy road crash to getting hit by a car."

[Photo: The Hövding bike helmet]

Haupt added that the air pressure remains constant for several seconds, allowing a cyclist to withstand multiple head impacts during the same accident before it starts to deflate. "It's actually three or four times better in terms of shock absorbance," Alstin said. "And that's the most important factor. It covers more of the head - including the entire neck - than traditional helmets."

The pair have also come up with a series of fashionable color designs for the collar, which retails online for 399 Euros ($536).
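The sense-and-trigger scheme the designers describe (motion sensors polled roughly 200 times a second, with inflation fired within a tenth of a second of detecting an accident) amounts to a fast monitoring loop. A deliberately simplistic sketch follows; the real Hövding matches sensor data against trained accident patterns, whereas this illustration uses a bare acceleration threshold, and the threshold value is invented:

```python
# Simplistic sketch of a 200 Hz sense-and-trigger loop. The real device
# compares readings against recorded accident patterns; a single
# acceleration threshold is used here purely for illustration.

import math

SAMPLE_HZ = 200
TRIGGER_G = 6.0          # invented threshold, in g

def magnitude(ax, ay, az):
    return math.sqrt(ax**2 + ay**2 + az**2)

def monitor(samples):
    """samples: iterable of (ax, ay, az) in g, arriving at 200 Hz."""
    for i, (ax, ay, az) in enumerate(samples):
        if magnitude(ax, ay, az) > TRIGGER_G:
            t_ms = 1000 * i / SAMPLE_HZ
            print(f"Impact at ~{t_ms:.0f} ms -> fire gas canister")
            return True
    return False

# Normal riding, then a sudden spike on the third sample:
monitor([(0.1, 0.0, 1.0), (0.2, 0.1, 1.1), (5.0, 4.5, 3.0)])
```

At 200 Hz, each sample arrives 5 milliseconds after the last, so even a detection that needs several consecutive abnormal readings fits comfortably inside the quoted tenth-of-a-second trigger window.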
2016-40/3982/en_head.json.gz/5795 | Smartphone Sales Beat Sales Of Regular Cell Phones In 2013
· Friday, February 14, 2014
For some time now, we've seen signs that sales of smartphones were beginning to dominate the cell phone market not just in the United States but throughout the world. In a new report, it would appear that we finally reached that point in 2013:
Watching the sales of smartphones and feature phones in recent years has felt similar to watching a Ferrari race a horse. You know the Ferrari is going to win, it’s just a matter of when.
Well, for smartphones, it looks as if it finally happened. On Thursday, Gartner, the research firm, said that in 2013 worldwide smartphone sales surpassed the sales of feature phones (typically, this means a regular cellphone) for the first time.
In total, the number of smartphones sold was close to a billion, reaching a record 968 million for 2013. This was up 42 percent from 2012. About 839 million feature phones were sold.
During the fourth quarter of 2013, Gartner said that 58 percent of mobile phone sales were smartphones.
Anshul Gupta, principal research analyst at Gartner, said in the report that most of the new smartphone sales were in less mature markets, including Latin America, the Middle East and Africa, Asia and Eastern Europe. Toward the end of the year, China also contributed significantly to the increase in smartphone sales as devices like Apple's iPhone became available on China Mobile, China's largest mobile carrier.
While the iPhone has been dominant in the United States for several years, Apple struggled to gain traction in China, where people have opted for less expensive smartphones, particularly those operating Google’s Android platform that can cost a fraction of the cost of an iPhone.
“Mature markets face limited growth potential as the markets are saturated with smartphone sales, leaving little room for growth with declining feature phone market and a longer replacement cycle,” Mr. Gupta said. He added that there was a lack of “compelling hardware innovation” for existing customers to upgrade to new devices.
Leading the pack in smartphone sales was Samsung, which accounted for 58 percent of overall mobile phone purchases for the fourth quarter of last year, selling more than 83 million smartphones. This was up 44 percent from the year before.
While Apple was a distant second, selling 50 million smartphones during the fourth quarter, the company still broke earlier sales records. Apple sold a little over 43 million iPhones during the same period a year earlier, the report said.
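A quick bit of arithmetic shows the quoted figures hang together:

```python
# Quick checks on the quoted Gartner figures.
smart_2013, feature_2013 = 968, 839   # millions of units
growth = 0.42                          # 2013 vs. 2012 smartphone growth

implied_2012 = smart_2013 / (1 + growth)
share_2013 = smart_2013 / (smart_2013 + feature_2013)

print(f"Implied 2012 smartphone sales: ~{implied_2012:.0f}M")  # ~682M
print(f"Full-year 2013 smartphone share: {share_2013:.1%}")    # ~53.6%
```

A 42 percent jump implies roughly 682 million smartphones sold in 2012, and a full-year share of about 53.6 percent is consistent with smartphones crossing 50 percent for the year while hitting 58 percent in the fourth quarter alone.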
One expects that these sales trends won't continue forever. For one thing, as these phones become more ubiquitous and get into the hands of regular consumers, we are likely to see less eagerness to upgrade to the latest and greatest new smartphone than we have seen from early adopters. Partly, this is because we've reached a point where the "new" features that these phones offer are becoming less and less revolutionary, meaning that the compulsion to upgrade isn't quite as strong as it might have been in the past. Additionally, carriers are becoming less and less generous with their upgrade options, and some have even introduced plans that allow people to buy their phone outright without being tied to a contract. The price for that, of course, is that the consumer is paying close to full price for their phones rather than the heavily subsidized prices that you get if you tie yourself to a two-year contract. If you're doing that, you're far less likely to upgrade as quickly as you might have been in the past. Nonetheless, it would appear that we've entered the era of smartphone dominance. What comes along to shake the market up the way the iPhone did is something only time will tell.
PJ says: Friday, February 14, 2014 at 20:00
We can all thank Google for that.
If the cheapest smart phone would have cost the equivalent of $450 (the price of an unlocked iPhone 4S), this would never have happened. Sure, a cheap unlocked smart phone running Android isn’t quite the same experience as the three models Apple sells or an high end Android phone. But it still is a smart phone, and it allows people to do things that can’t be done with a regular cell phone. | 科技 |
2016-40/3982/en_head.json.gz/5809

Harsh space winds diminish hopes for life on red-dwarf planets
BY Talia Mindich June 3, 2014 at 4:58 PM EDT
An artist’s conception of a planet, with two moons, orbiting a red dwarf star. Image by NASA/Harvard-Smithsonian Center for Astrophysics/D. Aguilar
Space winds in the habitable zones of the galaxy's most common type of star may be too severe to harbor life, a new study finds.
Due to proximity, stellar winds constantly blowing from red dwarf stars might be so harsh as to strip the atmosphere of any rocky planet orbiting in the star’s habitable zone.
Red dwarf stars are smaller and cooler than the sun, meaning that a planet must be around 9 million to 18 million miles from its host star to be warm enough for liquid water. In contrast, Earth is about 93 million miles away from its sun.
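Those distances follow from simple inverse-square scaling: a planet gets Earth-like warmth at a distance that shrinks with the square root of the star's luminosity. A quick sketch (the luminosities here are back-solved from the quoted distances, not taken from the study):

```python
# Inverse-square scaling of the habitable-zone distance:
# d_hz ~ sqrt(L_star / L_sun) * d_earth. The luminosities below are
# back-solved from the article's distances, not from the study itself.

D_EARTH_MILES = 93e6

for d in (9e6, 18e6):
    L_ratio = (d / D_EARTH_MILES) ** 2
    print(f"{d/1e6:.0f}M miles -> star luminosity ~{L_ratio:.1%} of the Sun's")
# 9M miles  -> ~0.9% of solar luminosity
# 18M miles -> ~3.7% of solar luminosity
```

Luminosities of roughly 1 to 4 percent of the sun's are indeed typical of red dwarfs, which is why their habitable zones sit so punishingly close in.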
Extreme space conditions may also trigger aurorae — or Northern Lights — 100,000 times stronger than those on Earth.
“If Earth were orbiting a red dwarf, then people in Boston would get to see the Northern Lights every night,” said Ofer Cohen of the Harvard-Smithsonian Center for Astrophysics in a press release.
“On the other hand, we’d also be in constant darkness because of tidal locking, and blasted by hurricane-force winds because of the dayside-nightside temperature contrast,” Cohen added. “I don’t think even hardy New Englanders want to face that kind of weather.”
The research was presented Monday at the American Astronomical Society meeting in Boston.
2016-40/3982/en_head.json.gz/5812 | Nintendo Says It Has Sold 5 Million 3DS Consoles in Japan, Overcoming Slow Start
By Jay Alabaster, IDG News Service
Nintendo said Monday that it has sold 5 million 3DS handheld game consoles in Japan, which it says makes the device the fastest-selling game platform ever in the country.
The Kyoto-based firm said the console reached the landmark Sunday, less than a year after its launch on February 26, 2011. It said the number was based on estimates from data it obtained directly from its domestic distributors.
The 3DS was initially snatched up by gaming aficionados but then stumbled badly, forcing the company to suddenly slash its price worldwide less than six months after launch. Nintendo then launched an aggressive sales push for new game titles, a lack of which was a major reason the console initially failed to catch on.
Despite those efforts, Nintendo last month slashed its yearly sales target for the 3DS by 13 percent, to 14 million. The lower price has also cut deeply into its profit margins, and the company is on track for a massive loss in the current fiscal year through March.
Nintendo said the 3DS reached the five-million mark much faster than its predecessors. The smash-hit DS took 56 weeks to sell five million units, while the earlier Game Boy Advance needed 58.
In Japan, the 3DS competes with Sony's PlayStation Vita handheld, which is due to launch this week in North America and Europe.
The company is also gearing up for the launch of its next-generation Wii U home console this year, which it has said will be out in time for the holiday season in the U.S., Europe and Japan. The successor to the popular Wii console will feature controllers that are similar to portable consoles themselves, with touchscreens, motion detection and cameras.
At a glance: Though it's uglier and trickier to get the hang of than its predecessors, the 3DS manages to be a unique gaming machine that Nintendo fans will want.

Pros: Clever and intrepid social networking features; dual cameras can capture 3D pictures and video; portable 3D without special glasses.

Cons: Narrow "sweet spot" viewing field for 3D effect; low 0.3-megapixel cameras; only 5 hours battery life playing 3DS games.
2016-40/3982/en_head.json.gz/5822

The birth cry of a black hole
February 12, 2013

Might we someday predict – and then carefully observe – the birth of a black hole? Perimeter Faculty member Luis Lehner thinks it's possible.
Lehner and his collaborators have been studying the mergers in compact binaries – that is, binary 'star' systems where both stars are extraordinarily dense: either neutron stars or black holes. Lehner studied two kinds of mergers: one in which a neutron star orbiting a black hole gets destroyed and sucked in, and a second in which two neutron stars spiral together and collide, forming first a hypermassive neutron star and then collapsing to form a new black hole.

Either event is fantastically powerful. Neutron stars pack the mass of the sun into a sphere smaller than most cities. Conjure the familiar image of spacetime as a rubber sheet with the sun dimpling it like a bowling ball. A neutron star dimples it like a pinhead that weighs as much as a bowling ball. That 'dimple' is more like a well: steep-sided and deep. A black hole, to go one step further, is like a bottomless pit. The energy packed into either distortion of spacetime is tremendous, and when two such distortions merge, they set off ripples in the rubber sheet – gravitational waves.

Gravitational waves have long been predicted by Einstein's theory of general relativity, but they have never been observed. That may soon change – the first of a new generation of gravitational wave detectors came online in 2002 and more are being built and improved every year. Massive binary mergers, of the kind Lehner has been modelling, are thought to be the ideal source for the signal these detectors are working to capture.

They are also thought to be the source of gamma ray bursts: intense bursts of high frequency light that have been spotted all over the sky for 25 years. Short gamma ray bursts last less than two seconds – but in those two seconds, they outshine the rest of the universe combined. These bursts are thought to be powered by the way the two stars twist and tangle their magnetic fields as they spiral in, eventually creating fields trillions of times stronger than the Earth's magnetic field.

What Lehner and his colleagues have done is shown how the two signals – the gravitational wave and the electromagnetic radiation from the system – might be related. Their new model examines the magnetic fields both inside and outside the colliding stars – an improvement on previous models whose simplifications made them unable to cope with both the inside and outside regions simultaneously. The new model predicts that there should be a strong electromagnetic counterpart to the gravitational wave signal the neutron star merger puts out.

Having two signals – one gravitational and one electromagnetic – gives scientists two ways to observe the same event. That's a new and hot idea in the field that goes by the name 'multimessenger astronomy.' Lehner's work will be key as scientists work to understand to what extent multimessenger astronomy is possible.

The promise of multimessenger astronomy is more than just two sets of data. It allows one model to check the other. It allows us to look at different parts of the system – at gravitational waves from deep within the merging stars and electromagnetic signals from their surface, for instance. And – because gravitational waves ramp up slowly before neutron star mergers – they could give us warning that a neutron star merger is about to happen, allowing us to turn our telescopes and catch the gamma ray burst: the birth cry of a black hole.
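To put the 'pinhead that weighs as much as a bowling ball' image in numbers: packing one solar mass into a city-sized sphere implies a staggering average density. A quick calculation using standard textbook values (not figures from Lehner's work):

```python
# Average density of a neutron star, using standard textbook values
# (one solar mass in a sphere of ~10 km radius).

import math

M_SUN = 1.989e30      # kg
R_NS = 10e3           # m, a typical neutron-star radius

volume = (4 / 3) * math.pi * R_NS**3
density = M_SUN / volume
print(f"Mean density: {density:.1e} kg/m^3")           # ~4.7e17 kg/m^3
print(f"A teaspoon (~5 mL): {density * 5e-6:.1e} kg")  # ~2.4e12 kg
```

A teaspoon of such material would weigh over two billion tonnes, which is why the 'dimple' a neutron star makes in the rubber sheet is better pictured as a steep-sided well.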
FURTHER EXPLORATION

Read the related paper, "Gravitational and electromagnetic outputs from binary neutron star mergers," on arXiv
Check out Lehner et al.'s previous arXiv paper, "Intense Electromagnetic Outbursts from Collapsing Hypermassive Neutron Stars"
View a 2008 arXiv paper that this latest research builds upon, "Magnetized Neutron Star Mergers and Gravitational Wave Signals"
About Perimeter Institute
Perimeter Institute is the world’s largest research hub devoted to theoretical physics. The independent Institute was founded in 1999 to foster breakthroughs in the fundamental understanding of our universe, from the smallest particles to the entire cosmos. Research at Perimeter is motivated by the understanding that fundamental science advances human knowledge and catalyzes innovation, and that today’s theoretical physics is tomorrow’s technology. Located in the Region of Waterloo, the not-for-profit Institute is a unique public-private endeavour, including the Governments of Ontario and Canada, that enables cutting-edge research, trains the next generation of scientific pioneers, and shares the power of physics through award-winning educational outreach and public engagement. http://www.perimeterinstitute.ca/
Eamon O'Flynn, Manager, Media Relations | eoflynn@perimeterinstitute.ca | (519) 569-7600 x5071

[Figure: The magnetic field configurations (the blue lines) and current sheets (the orange regions) for several different configurations of merging neutron stars.]

[Video: The simulation, one of several done by the researchers, shows the last few milliseconds before the merger of two neutron stars. The different colours represent different strengths of the directional energy flux of the electromagnetic field one would measure at 80 kilometres from the centre of the system.]
2016-40/3982/en_head.json.gz/5853 | Atmospheric Sciences & Global Change Staff Awards & Honors
James Edmonds Elected AAAS Fellow
[Photo: James A. Edmonds]

Congratulations to Pacific Northwest National Laboratory's Dr. James A. Edmonds on being named an American Association for the Advancement of Science Fellow. The AAAS is the world's largest general scientific society and publishes the journal Science. The AAAS awards the distinction of Fellow to members whose efforts to advance science or its applications are distinguished. Edmonds will be recognized in February 2007 at the Fellows Forum during the AAAS national meeting in Boston.

Edmonds is a Laboratory Fellow in the Atmospheric Science and Global Change Division and a chief scientist at the Joint Global Change Research Institute, a collaboration with the University of Maryland. He was selected for his "distinguished contributions to the field of climate change economics, particularly modeling and analyzing interactions of energy, the economy, technology, carbon cycle and climate." Edmonds is the principal investigator for the Global Energy Technology Strategy Program to Address Climate Change, an international public-private research collaboration.

Edmonds is well known for his contributions to the field of integrated assessment of climate change and the examination of interactions between energy, technology, policy and the environment. His extensive work on the subject of global change is included in books, papers, and presentations. Since 1991, he has contributed significantly to the UN-convened Intergovernmental Panel on Climate Change, which shared the 2007 Nobel Peace Prize with former Vice President Al Gore. Edmonds served as a lead author for all four major assessments published by the IPCC and numerous interim assessment reports, as well as serving on the IPCC steering committee on new integrated scenarios. He frequently testifies before Congress and has briefed the Executive Branch of the United States Government.

During his 35 years in research and teaching, Edmonds has written or co-written hundreds of publications and presentations. He also serves on editorial boards, review panels, and advisory committees that have energy and/or environmental missions.
2016-40/3982/en_head.json.gz/5878 | The Generation That Doesn't Remember Life Before Smartphones
What it means to be a teenager in 2015
Jacqueline Detwiler
Down a locker-lined hallway at Lawrence Central High School in Indianapolis, Zac Felli, a junior, walks to his first class of the day. He wears tortoiseshell glasses and is built like he could hit a ball hard. He has enviable skin for a teenager, smooth as a suede jacket. Over one shoulder he carries a slim forest-green and tan messenger bag that would have been social suicide in 1997. But 1997 was the year Zac was born, so he wouldn't know anything about that.

A squat, taupe monolith flanked by parking lots, Lawrence Central smells like old brick and floor polish and grass. Its gleaming floors squeak if you move your foot a certain way. The school has existed on precisely this spot of land since 1963: maroon block letters over the door, tang of chlorine from the indoor pool. None of that has changed. Here's what has: After Zac turns the doorknob of Room 113 and takes his seat in Japanese III, he reaches into his shoulder bag, pushes aside his black iPhone 5S and Nintendo 3DS XL, and pulls out his Microsoft Surface Pro 3 tablet with purple detachable keyboard, which he props up on his desk using its kickstand. By touching a white and purple icon on his screen, he opens Microsoft OneNote, a program in which each of his classes is separated into digital journals and then into digital color-coded tabs for greater specificity. And then, without a piece of paper in sight and before an adult has said a word, he begins to learn.
Zac probably started developing memories around 1999, the year Napster upended the music industry by turning songs into sharable files that nobody owned. Or maybe in 2000, the year Google became Google. Regardless, he is part of the first generation of human beings who never really lived before the whole world was connected by pocket-sized electronic devices. These kids might never read a map or stop at a gas station to ask directions, nor have they ever seen their parents do so. They will never need to remember anyone's phone number. Their late-night dorm-room arguments over whether Peyton or Eli Manning won more Super Bowl MVPs will never go unsettled for more than a few seconds. They may never have to buy a flashlight. Zac is one of the first teenagers in the history of teenagers whose adult personality will be shaped by which apps he uses, how frequently he texts, and whether he's on Facebook or Instagram or Twitter or Snapchat. Or whatever comes after Snapchat. Clicking like, clicking download, clicking buy, clicking send—each is an infinitesimal decision in the course of the modern American teenager's life. They do this, collectively, millions of times a minute. But together these tiny decisions make up an alarming percentage of their lives. This generation is the first for whom the freedom to express every impulse to the entire world is as easy as it used to be to open your mouth and talk to a friend.

How does all that change the monotony and joy and pain and wonder and turmoil that is the average teenager's life? What is it like?

Like many of the other 2,350 students at Lawrence Central, Zac knows computers better than even last year's graduating class did. The students here use them constantly—up to two and a half hours a day, according to Lawrence Central's principal, Rocco Valadez. This year is the first that Lawrence Central is one-to-one, which in educational speak means that every student on campus has been provided with a leased Chromebook laptop computer. Valadez considers Zac one of his beta testers, one of ten or so students the administration turns to for reports and opinions on how the technology is working. Zac, incidentally, asked if he could use his own Surface instead of a Chromebook. Because Zac is a high-level user, Valadez obliged. ("I'm a Surface guy," Zac says.)

You hear two opinions from experts on the topic of what happens when kids are perpetually exposed to technology. One: Constant multitasking makes teens work harder, reduces their focus, and screws up their sleep. Two: Using technology as a youth helps students adapt to a changing world in a way that will benefit them when they eventually have to live and work in it. Either of these might be true. More likely, they both are. But it is certainly the case that these kids are different—fundamentally and permanently different—from previous generations in ways that are sometimes surreal, as if you'd walked into a room where everyone is eating with his feet.
[Photo credit: Christaan Felber]

An example: It's the penultimate week of classes at Lawrence Central, and the pressure has been released from campus like a football gone flat. The instructor of Japanese III, at the moment ensconced behind a computer monitor that is reflected in his glasses, switches on the announcements. The American tradition in this situation—end of school, little work to do, teacher preoccupied—is that the students would be passing notes, flirting, gossiping, roughhousing. Needing to be shushed. Instead, a boy to Zac's left watches anime. A girl in the front row clicks on YouTube. Zac is clearing space on his computer's hard drive, using a program called WinDirStat that looks like a boring version of Candy Crush—deftly, quietly, he moves small colored squares around to clean up the drive. Green, red, blue, purple. (When he types, he types evenly—none of that hinky freeze-pause-backspace thing that every adult with a hint of self-consciousness does when typing in front of anyone else.) Above a ziggurat of loaner Chromebooks at the front of the classroom hangs what's called a Promethean board, a panel that looks like a digital tablet the size of a Shetland pony. On the Promethean board, the day's announcements play, including a news segment on a London School of Economics study. The anchor begins: "Test scores increase by more than 6 percent at schools that ban smartphones …" At this the students in Japanese III—absorbed in private computational fiddling, phones out on their desks like pencil cases—let forth a chorus of snorts.

Otherwise, Room 113 is eerily quiet.

Leah
Leah Arenz, on the right, looks at One Direction photos on Tumblr.
Christaan Felber

"The primary motive that teenagers have when they're screwing around online is to connect with somebody else," says Larry Rosen, former psychology chair at California State University, Dominguez Hills, and author of Rewired, a book about how technology has changed the lives of the next generation. Lawrence Central senior Leah Arenz does this every day. Also every night. Sometimes even when she ought to be sleeping.

Leah, hair slicked back under a headband, is editor in chief of the school newspaper, often staying in the journalism classroom until midnight to close an issue. She's eighth in her class. When she speaks to you, she makes quick eye contact, and then just as quickly breaks it. Unless you ask her about the band One Direction. Then she becomes someone different altogether.

Leah has a secret Tumblr page she posts to twenty-five to fifty times a day. It's how she interacts with what she calls the Fandom, some 25 million acolytes of One Direction who repost, rehash, and relive the band's activities like a swarm of unpaid TMZ employees. Her friends at school do not know the address of her page—they don't even know about this other life at all. "I don't think they would understand or appreciate it the way people in the Fandom do," she says.

When Leah talks about the Fandom, she engages like a drivetrain, leans forward—almost out of her chair—and speaks so quickly you'd swear she was on some new drug. She cracks jokes. Her eyes meet yours. When she blogs, at her dual-screen laptop-monitor setup under a green-shuttered window, she holds her face inches away from her monitor, tabbing between Web pages so quickly she must be selecting photos using only some primordial subconscious boy sense. You get the idea, talking to her, that her Tumblr is more than a hobby. More like an Internet breathing tube. A silicon lung. "I'm weird," she says. "I like 1D. But there are millions of people online who like 1D like me."

For someone like Leah, who built her first website at fifteen and seems uninterested in climbing the capricious social hierarchy that exists in every high school, the community of the One Direction Fandom offers an entirely alternate social structure. It is a universe in which her skills at iMovie, GIF creation, and news collection, paired with the scope and anonymity inherent to the Web, render her an authority on teenage problems both minor and major. "A girl recently contacted me and said that her friend was dealing with social anxiety. She didn't know how to talk to her about it. Later tonight I'm going to think out a carefully worded answer," Leah says.
"I like 1D. But there are millions of people online who like 1D like me."
In this universe, she has great power. Having dinner with her father in an Applebee's near her home, Leah describes how, a few weeks ago, the Fandom was infuriated when one of its favorite One Direction songs, "No Control," had not been chosen as a single by the record label. The Fandom loved the song and was frustrated they weren't being heard, a familiar refrain among teenagers. And so Leah, along with thousands of other Directioners, as they're called, campaigned radio stations to demand airplay. Leah herself tweeted and blogged and reblogged incessantly, staying up at her computer until the stars emerged and receded over her home's suburban lawn. At 2 a.m., she and many, many others favorited tweets to a radio station in New Zealand until finally it played the forgotten song eleven times in a row.

By the time the band began the American leg of its tour, One Direction was playing "No Control" at every show, and Louis Tomlinson, Leah's favorite band member, thanked the teens for their input onstage. It's as if Beatlemania junkies in 1966 had had the ability to demand "Rain" be given as much radio time as "Paperback Writer," and John Lennon thought to tell everyone what a good idea that was. The fan–celebrity relationship has been so radically transformed that even sending reams of obsessive fan mail seems impersonal. Leah's mom, in fact, was one of the original devotees of X-Files fan fiction in the 1990s, but she never had the opportunity to achieve the kind of connection her daughter can. As David Weinberger, senior researcher at the Berkman Center for Internet & Society at Harvard University, explains, "You're not simply a consumer anymore. By connecting with other people, you become a participant in the life of the band."

In the Applebee's, Mark Arenz is shocked enough by Leah's story to take a fatherly feint at her sleep habits. "Is that what you were doing that late?" he says, then immediately relents. How much of an argument can a parent level against a teenage girl with a 4.7 GPA whose worst offense is staying up late to crush on a boy she'll never meet?

"Imagine what you guys could do in the Middle East," he says.
Christaan Felber

The Teachers

English teacher Paige Wyatt is one of Lawrence Central High School's eCoaches, a group of teachers who educate other teachers about Chromebooks, passwords, Promethean boards, and wise and unwise uses of YouTube. At twenty-seven, she understands much of what the students do. And when she doesn't, she asks.

Take the rules of Instagram. Wyatt just learned them today in the honors English class she teaches. "I was ribbing on this one boy a little bit. He was on Instagram and I asked him, 'What the heck is Instagram?' He said, 'I was just liking pictures. I can't be a ghost.' I said, 'What do you mean, a ghost?'"

Then the whole class chimed in. They explained the rules to Wyatt: You have to have more followers than people you follow. You have to comment on and like people's stuff. Otherwise you're a ghost. And being a ghost is bad.

A campus firewall called Lightspeed is supposed to block Instagram. But obviously, somehow, the students are using it. Because here's something about high school that will never change: The kids live by their own codes, and if adults make rules, the kids will find ways to break them.
In class, earbuds are more common than full-sized headphones, because you can listen to music and still hear the teacher talk—one bud in, one bud out.
"We talk to the kids about responsible use of technology. We talk about credible sources. We talk about being smart," Wyatt says. "A lot of them seem to know, 'Well, you'd never post that online. That's stupid.' But a lot of them also don't know. And a lot of them don't care."

For the most part, kids behave themselves. But then sometimes they watch movies—in class. Sometimes they take selfies—in the bathroom. Sometimes a student will send a teacher a rude message on an anonymous Web platform, or post his own phone number and home address on Facebook.

It's now the end of the year, and all the Chromebooks have been collected for the summer. "Our study halls are huge—there are forty to seventy kids and one faculty member," Wyatt says. "Normally they just plug in their headphones and keep themselves busy on their Chromebook, but now that we've collected most of them, we're worried that they'll be off-the-wall crazy without something to keep them busy."

Zac
Christaan Felber

With the school year nearly concluded and final concerts wrapped a week ago, there is nothing to do today in band class. The band director barely makes an appearance. Pairs of students flop all over the classroom on the risers, sharing earbuds attached to single phones. One girl, wearing pajamas, has curled up around a music stand like a cocktail shrimp and gone to sleep.

Devoid of a responsible adult, this class period has become one of those interstitial moments in teenage life, a rare, vital time when regulation of the adolescent schedule is up to the teen himself. To emerge whole after teenagerhood, one must craft a (hopefully) charismatic personality out of the preferences and skills discovered during times like these. Playing this guitar. Kicking that soccer ball. Reading a book on the history of Asia during the Tang Dynasty. Eventually, the teen becomes a complete person with well-defined interests who then locates other complete people to keep as friends. This has not changed, only now the potential interests to explore are limitless, and the resulting communities so far-flung and socially disconnected that "friends" are not friends the same way they used to be. A "like" is not really a like. The teens' brains move just as quickly as teenage brains have always moved, constructing real human personalities, managing them, reaching out to meet others who might feel the same way or want the same things. Only, and here's the part that starts to seem very strange—they do all this virtually. Sitting next to friends, staring at screens, waiting for the return on investment. Everyone so together that they're actually all apart.

In the band room, if you're standing at the conductor's podium, Zac is sitting on the left side of the risers, about where you'd usually find a pack of timpanists. He's taking a break, playing a video game he found on the blog of a computer programmer he follows. He discovers a lot of entertainment this way—down in the lightless caves of the Internet known only to young people. He has Facebook, Instagram, and Google accounts and four Tumblr blogs. He follows a goateed YouTube comedian named JonTron who reviews videos and games. And Reddit. "I don't know how much time I've wasted on Reddit," he says.

Behind Zac, on the floor, sits the group that 1997 Zac, if he existed, might have wanted to join. It is mixed—four boys and two girls. The girls are cute and friendly. The boys affect cool nonchalance. They talk about who's got a crush on whom and whether they should all go to a JV basketball game together. One of the girls tries to convince one of the boys to put on some of her makeup.

"Absolutely not," he says.

"It's only going to be us that sees it!"

"You sound like when boys ask for nudes," he says.

"We're not even going to take pictures!"

Zac pays no attention to this, not even when the group names a specific couple that has been "smashing"—2015-speak for having sex—for months. Neither does Zac notice the boys reading rude jokes from a cellphone propped on a music stand. He is deep in an entertainment rabbit hole of his choosing, and he has no need to amend his interests to fit in with the social noise around him.

On the one hand, this is an enviable turn of events for high schoolers. There's no dulling the edges of young personalities in an effort to be popular.
"If you have an interest in coral reefs or six-string bass guitars or whatever, you can now find a set of people who are just as interested, and you can explore any topic to whatever depth you want," says Weinberger, the researcher from Harvard. What a great time to be alive!
The kids live by their own codes, and if adults make rules, the kids will find ways to break them.
But the converse is also true. "If I'm talking to a stranger and I mention some smaller, lesser-known video game, or I say, 'Hey, have you heard of this band?' chances are they haven't heard of it," Zac says. "I think the only thing that I can truly maybe relate to someone with is something iconic, like Finding Nemo or something." There are fewer true shared experiences—fewer TV shows that everyone watches, fewer bands that everyone knows. There will be no I Love the 2010s version of the VH1 series I Love the '80s because there won't be enough nostalgia for communal culture to make such a thing worthwhile. Following this observation to its logical conclusion is fearsome: an entire generation that can go online and find legions of humans (user names, really) with similar interests but that barely knows how to connect to one another in the physical world. Friends, sitting around screens, not talking.

Zac's Surface sits on his knees, his cellphone on his thigh. He has moved on to playing a fighting game—Super Smash Bros.—on his Nintendo 3DS XL, which he holds in front of his face. Three gadgets vie for his attention within his immediate field of vision. A green message, sent through Japanese texting app Line, appears on the Surface Pro. He places the 3DS XL next to his phone while he responds in silence.

Lately, Zac has become concerned about his attention. He's been drifting off in class more than usual, daydreaming, tapping on things. When he brought his worries to his parents, they offered to make him an appointment to get tested for ADHD in the summer, after the school year was over. No point adding stress to finals week, they said.

Next to Zac, Chance Williams, a friend, plays a video game for a minute but then abruptly shoves his computer away and tries to interest Zac in a real-life game involving note cards. Chance is largish, spiky-haired, full of kinetic energy.

"I've got an invite to Emma's graduation party," he says.

Zac doesn't respond.

Anthony
Anthony watches videos of Olympic discus throwers on YouTube. Like many students, he keeps no books in his locker. Just a bottle of cologne.
Christaan Felber

Before the other kids file into Mr. Smith's Introduction to Engineering Design class, a science and technology course Lawrence Central offers in conjunction with a nearby technical center, Anthony Thomas is in his seat, looking around. "Some of the kids in this class are bad," he says.

Anthony is such a good kid that everyone at Lawrence Central describes him in exactly that way. "What a good kid," his teachers say. And he is. Anthony plays linebacker for the football team, where he displays the loping, easy athleticism of a large cat. He holds the door open for strangers. He has known he wanted to be a mechanical engineer since he was thirteen years old, when, after spending the day fixing things at his grandma and granddad's house, he searched "Hands-on careers" on Google and it sent back "engineering." He is mature, he is focused, and he is sixteen.

It's the last period of the day, and the other students in engineering design tumble into the classroom like a passel of river otters—flopping into chairs, whipping out stickered iPhones, unraveling earbuds. The earbuds are more common than full-sized headphones, Anthony tells me, because you can listen to music and still hear the teacher talk—one bud in, one bud out.
"Of course there were goofballs ten years ago, but I don't think the ratio was as high."
Anthony does not wear earbuds today. Neither does he look at his phone, even though that too is allowed. Today, the class is supposed to turn in a final assignment: Use the design software Autodesk Inventor to outline and laser-cut parts for a cellphone station that can hold at least five unique accessories. Playing with apps while trying to do this would only distract him, Anthony says. It's an obvious point but not one you expect from a high school student in 2015.

Once, Anthony says, he received a message on his phone that he felt he had to answer in class. But then he wandered from the messages app to the Colts football app to Instagram to playing Madden football. In class! When his teacher announced that it was time for the students to start on an assigned project, Anthony had to ask someone next to him what they were supposed to be doing. He was disappointed in himself for losing focus. "I didn't like that," he says.

Anthony now avoids distraction by keeping his iPhone in his bag. He knows that if he clicks its lone, enticing button, he'll slide down the rails to the carnival of games it contains. On his desktop monitor he operates only in Autodesk, not clicking on any of the other myriad sites quietly awaiting his attention below the still face of his design template.

You look at Anthony, and you see the way the system is supposed to work. The way Principal Valadez intends for it to work. And then you look at some of the other kids and you worry. Five of the other nine students are watching YouTube videos of a self-driving Mercedes-Benz. They debate the merits of rappers Gucci Mane and Soulja Boy. One makes a rude comment about a woman in a video on his screen.

At this, Anthony bristles. "Ay, man," he shouts with a disappointed shake of his head that not only actually silences the other kids but draws not a single snarky response. He looks around the classroom: "They don't even work," he whispers.

Will they fail?

"Yeah!" he says, with the incredulity of someone who can't imagine a person allowing such a thing to happen.

Anthony walks to the lab down the hall to cut out the pieces of his cellphone holder. Back in the classroom, another student—one of the five involved in the rude YouTube comment debacle—gets into a verbal altercation with the teacher, who, exasperated, throws him out of class.
You look at Anthony, and you see the way the system is supposed to work.
"Of course there were goofballs ten years ago, but I don't think the ratio was as high," the teacher, Jeff Smith, says afterward. He's taught in this district for twenty-four years. "A video game, you get killed or whatever, you push reset. You can reset a thousand times. For some of this stuff, there's no reset button, and when something doesn't work, they're over it."Anthony stays late to use the laser-cutter even though the bell has rung, eyes shining as he watches the pieces of his project materializing out of a blank sheet of cardboard. He stands with his hands clasped behind his back, smiling. "I probably shouldn't be this excited," he says. And then he runs off to catch his bus. He doesn't bother to stop at his locker. What would he need to keep in there?Brianna
Brianna McMonagle is allowed to use Instagram but not Snapchat.
Christaan Felber

"We were the mean parents," Tabitha McMonagle says, stroking her son's hair while he presses his face against her hip. Her daughter, Brianna, wasn't allowed to have a phone in elementary school, which put her in the minority. She didn't get one until the beginning of seventh grade. She wanted a phone, but Tabitha and her husband said no. Not until she had a reason, such as after-school sports, to make it necessary, they said.

Now, at fifteen, Bri has a smartphone, a powder-blue iPhone 5c in a LifeProof case with a yellow softball sticker on the back. She wants Snapchat, but her parents have put their foot down about that. "Too many kids make stupid decisions with apps like Snapchat," her mom says. "That's a quick picture that you think is gone as soon as you send it. But the reality is, it isn't gone. I can take a screen shot of Snapchat. I don't think my children are going to send inappropriate images necessarily, but a lot of children have."

Bri has Instagram now, but she didn't for a long time. That was a battle. "She asked for it repeatedly, and I didn't see the purpose," Tabitha says. "You have a phone! You can text your friends, you can call them."

In the end, Tabitha joined Instagram first and did a search for all her kids. She found one—Bri's older half-brother. "He had everything in there—his school, what grade he was in, his phone number," Tabitha says. "One day when he was visiting—he lives in another house—I pulled up the sex-offender map. I said, 'These are just the ones we know about. They can find you.' Well, that worked. He was like, 'How do I get all this off here?'"
"Too many kids make stupid decisions with apps like Snapchat."
Tabitha has concerns about giving the vast capabilities of the Internet to people whose brains aren't fully developed. "We're in a climate right now where there is so little tolerance for children's mistakes," she says. "I understand that teenagers are children. It's important to make them wait until they're mature enough to understand that there are some risks involved with putting that much of your own information out there."

If you don't, you end up with problems like Zac Felli's: He created his first Facebook account at twelve and included his cellphone number and email addresses. "I wanted people to be able to contact me," he says. "I'm still trying to get rid of the aftermath of ad lists and servers that I've been put on. I still get spam every day. Oh my gosh. It's ridiculous."

At her softball games, where she plays shortstop, Bri keeps her cellphone in her bag. Her teammates do too. Sports are no time for distractions. The resulting scene is as peaceful as a sepia-toned photo, and about as ancient: Bri's father crouches behind the umpire, shooting photos with a real camera. A Canon. What the camera sees: Bri's hay-yellow braid flopped over her shoulder, a spray of freckles across her nose, a permanent blush from all the sun. The girls, clay-dusted in uniform, cheer for their teammates against the fence, no phones anywhere. It could be 1997, another era completely.

Except on the sideline, where Bri's towheaded twin brothers, bored half into a stupor, look at exceptionally stupid ifunny.com photos on their phones. Nearby, a baby tries to grab an iPhone until a toddler brings her a flower instead. The baby tries to eat it.

Zac, Summer
Christaan Felber

The hardest ADHD test, in Zac's opinion, was the one with the stars. First, a technician strapped a motion sensor to Zac's forehead. Then Zac had to sit in front of a computer for twenty minutes staring at a white screen. Black stars with different amounts of points appeared every few seconds. The instructions: Click the space bar if you see a five-pointed star. Don't click it if you don't. "It was so hard," Zac says. "One: To keep paying attention. Two: To sit down and do nothing else for twenty whole minutes."

Many experts, Larry Rosen among them, will tell you that teens' constant multitasking is actually just task-switching, moving back and forth between different activities at a breakneck pace. Through multitasking, teens spend about seven and a half hours a day consuming about ten and three-quarter hours' worth of media. Every one of the students in this story, all of them good kids, sometimes receded into an alternate universe mid-conversation. They don't seem always capable of—or maybe interested in—doing a single thing for a long time. For example: the stars.

To many adults, this sounds calamitous. A generation of task-switchers who can't think deeply about anything. We're all doomed! But then ask teenagers if they feel drained by the relentless demands of the Internet, and they'll tell you they don't. "I wouldn't say I feel overwhelmed with all the technology, especially because my generation grew up utilizing it," Zac says. "Overwhelmed? No," Anthony says. "I was born with technology around." And so what if teens are multitasking all the time? We're not any better at it than they are, and the Internet isn't going away. Maybe they do get access to the entire world before they're ready. Can we expect them to choose the opposite?
Anthony Thomas helping another student log in to his computer in study hall.
Christaan Felber

In the short story "All Summer in a Day," by the science-fiction writer Ray Bradbury, a young girl from Earth named Margot has moved to Venus. There she finds that it rains constantly, the sun emerging from behind the clouds for only a few hours on a single day every seven years. Margot remembers sunlight from her time on Earth. She misses its heat and light. The other children, the children of Venus, who have never experienced anything but the incessant pinging of droplets, aren't bothered by the rain at all.

The test results say that Zac has mild ADHD. But he also has a 4.1 GPA, talks to his girlfriend every day, and can play eight instruments and compose music and speak Japanese. Maybe his brain is a little scrambled, as the test results claim. Or maybe, from the moment he was born, he's been existing under an unremitting squall of technology, living twice the life in half the time, trying to make the best decisions he can with the tools he's got.

How on earth would he know the difference?

This story appears in the December/January issue of Popular Mechanics. | 科技
2016-40/3982/en_head.json.gz/5900 | Mark Engebretson Joins VUE Audiotechnik Engineering Team
Jun. 21, 2012, by PSW Staff
VUE Audiotechnik design chief Mike Adams (left) with Mark Engebretson
VUE Audiotechnik has announced that Mark Engebretson has joined the company's San Diego-based engineering leadership group as an engineering consultant, overseeing development and patent efforts for several advanced audio technologies slated for deployment throughout the VUE Audiotechnik line over the next 24 months.
“Mark and I have worked together numerous times over the years and I couldn’t be more excited that we’ll be collaborating once again,” states VUE Audiotechnik design chief Michael Adams. “I have tremendous respect for Mark’s innate ability to develop and leverage seemingly esoteric concepts into industry-defining products. His reputation speaks for itself and his experience will be a huge asset as we pursue new performance standards with unique designs that challenge commonly accepted conventions.”
Engebretson notes, “I’m excited about what I see happening behind the scenes at VUE Audiotechnik. There’s a powerful combination of passion, experience, and a commitment to exploring all options in order to drive innovation forward. At VUE I can bring something totally new to the industry I love. It’s a level of creative freedom that I’ve not experienced in a very long time.”
Prior to joining VUE Audiotechnik, Engebretson held leadership positions at some of professional audio’s most respected companies. Most recently he was the vice president of R&D and chief design architect for QSC Audio, and prior to that, he led the JBL Professional Northridge, CA-based R&D department. Other highlights include a stint at Summit Laboratories, serving as vice president of product development for Altec Lansing, and the role of vice president of R&D for Paramount Pictures.
In 2002 Engebretson was honored at the 74th Academy Awards with a Scientific and Engineering Award for his contributions in the design and engineering of the modern constant directivity, direct radiator style motion picture loudspeaker systems.
| 科技
Topics: AV, Live Sound, News, AV, Business, Loudspeaker, Manufacturer, Audio,
Tags: Loudspeakers, Management, Personnel, Vue Audiotechnik | 科技 |
2016-40/3982/en_head.json.gz/5916 | SEMINAR: "Advanced Thermoelectric Materials for Addressing Grand Challenges: Global Energy and ThermJanuary 21 @ 11:00 AM - 12:00 PM - BRK 1001Abstract: Energy, water and food supplies are some of the grand challenges facing humanity. While energy is arguably the greatest grand challenge since solving the energy problem will help alleviate many of the other problems, energy also represents perhaps one of the greatest opportunities for technological innovation, entrepreneurial success, and beneficial advancements for civilization and the environment. This talk will discuss the energy situation, including the nature and magnitude of the problems, and outline some solutions already being implemented to address the energy problems. The importance of thermal management as a grand technological challenge will also be discussed. In particular, key features of thermal management relative to some of the broader energy issues will be noted. The focus will then narrow to advanced thermoelectric materials and unconventional applications. Rather than the more common pessimistic tenor, the outlook presented will be upbeat and positive.Bio: Dr. Dudis is a Principal Research Chemist in the Air Force Research Laboratory – Materials and Manufacturing Directorate. He is widely recognized as an expert in electronically conductive polymers, semicondcutive macromolecular materials, and nonlinear optical materials. His award winning work has been published in over 100 publications, and Dr. Dudis has given over 200 technical seminars and presentations nationally and internationally. He has served as the Research Leader for the Polymer Core Technology Area, and in that capacity had oversight of up to 60 researchers and a multi-million dollar budget. In addition to his work in developing new materials for photovoltaic, battery and fuel cell applications, he served as technology transition lead to facilitate commercialization of materials technologies for power and energy applications. Recently his research endeavors have focused on new materials for thermoelectric energy harvesting and thermal energy storage, but maintains strong interest in batteries, fuel cells, and photovoltaics. He has spoken widely to audiences on energy and societal issues. Additionally, he recently organized and teaches the Alternative Energy Sources course at Sinclair Community College, and is working with Sinclair to develop a sustainable energy education program. Dr. Dudis holds a bachelor degree from the University of Dayton, a Ph.D. from Case Western Reserve University and was a research associate at Texas A&M University, a post-doctoral fellow at Iowa State University – DOE Ames National Laboratory, served on the faculty of the US Naval Academy (Annapolis), was a visiting scientist at the Naval Research Laboratory (in the J. Karle group), and was a guest scientist at the University of Mons (Belgium).Contact DetailsTim Fishertsfisher@purdue.edu | 科技 |
2016-40/3982/en_head.json.gz/5940 | Massive Diplodocus Fossil Hits Auction Block Next Month
by Brett Smith for redOrbit.com - Your Universe Online
Anyone with a very large display space and over half-a-million bucks of disposable income should consider a trip to southern England next month where a 55-foot-long dinosaur skeleton is expected to be auctioned off.
The skeleton is of a Diplodocus longus named "Misty," which was discovered near the fossil hotbed Dana Quarry in Wyoming in 2009. Misty would be the first dinosaur ever put up for auction in Britain.
According to Errol Fuller, an author who curates the auction, there are around six of these skeletons in museums around the world. He added the sons of renowned fossil hunter Raimund Albersdörfer first discovered the fossil after their father sent them to dig an area near the Dana Quarry.
"He directed them to this area - just very close to, but not actually in the quarry - where he thought there might be some very worthless fragments and they came back at the end of the day and said they'd found an enormous bone," Fuller told CNN. "They quickly realized that there was going to be many other bones ... so they stopped work on the proper quarry."
The paleontology team needed nine weeks to excavate the entire skeleton. Fuller said the team named the dinosaur Misty because the fossil was discovered at a "mysterious quarry.”
Because the fossil was discovered on private, not public, land, it does not fall under the auspices of federal regulations.
"It's perfectly legal to bring it from America and legal to move it to any country in the world," Fuller said. "Museum workers will sometimes try to stop these things ... but almost every great fossil discovery was made by fossil collectors or dealers."
The auctioneer said the dinosaur could easily be stored and preserved “as long as the skeleton was reasonably, carefully handled and reasonably, carefully housed."
He speculated it would take two or three persons about a day to take down the skeleton from its scaffolding and another day to put it up again. He added the Summers Place auction house could arrange supervision of the process.
"It's been specially designed so that it can be dissembled and assembled again," Fuller said. "There's no piece so heavy that two people couldn't lift it."
He noted how the fossil’s metal scaffolding, or armature, is "a feat of engineering. It's maybe 60 feet long by 12 feet high and a colossal weight. There are quite a lot of safety considerations because you don't want pieces falling off."
The auction house has estimated the fossil will ultimately go for between $640,000 and $960,000, a paltry sum compared to a Ferrari that sold for nearly $27 million in August, Fuller said.
"If I was a rich man, I could actually have a fossil dinosaur ... that would impress my friends much more than a Ferrari and it would cost me just a fraction of ($27 million)," he mused. "That is really incredibly cheap if you compare it with a collector's car and you've got a much more spectacular, gob-smacking exhibition." | 科技 |
2016-40/3982/en_head.json.gz/5979 | $2.5 billion wagered in Illinois' first year of video gaming
ROCKFORD — The first year of legalized video gaming in Illinois has come and gone. Gamblers across the state wagered more than $2.5 billion and lost $205 million since the terminals went live last September.Video gaming terminals are now raking in nearly $30 million a month, making them more profitable than nine of the state's 10 riverboat casinos.And while Winnebago County has proven to be one of the most active areas for video gaming in the state — local gamblers have lost $18.9 million here — it's unclear how much of an effect the money being pumped into the machines has on the local economy.In fact, the majority of the money spent in a local terminal leaves the area in many cases.Unlike casinos, which can draw tourists and pump new money into a locality that otherwise would not have come, the video terminals are now readily available everywhere across the state. Because there's no difference in the five video poker or slot machines you'd find in a bar in Springfield or in the Chicago suburbs or in Rockford, there's no reason to expect a video gambler to travel. That means the vast majority of money spent at a local terminal is probably spent by locals.Since it's money gambled, it's safe to assume that money would have been spent elsewhere, said Bob Evans, professor of economics and political science at Rockford University."It would not be logical to assume this money would have been saved or invested," Evans said. "So the question is how they would have spent it. Is there a net increase of money flowing into the city, or is it just a diversion from one local source to another?"In other words, if a bar owner gains $5 because someone doesn't spend $5 on a movie, there is no gain to the local economy, Evans said.And a chunk of the money is lost to the area.Of the profit at each terminal, 25 percent is sent directly to Springfield for the state of Illinois' general fund. Another 35 percent goes to the terminal operator, which is typically a small company licensed by the Illinois Gaming Board to install and maintain the gaming terminals. The companies are based throughout the state and don't necessarily have a local connection.Of the money that stays local, 35 percent goes to the owner of the bar, restaurant or parlor that houses the terminal, and 5 percent goes to the local municipality.And bar and restaurant owners have enjoyed the financial boost. Many bars like Scanlan's in Rockford have added hours to accommodate gamblers who want to start earlier. Some bar owners like Luke Meyer have expanded to open small parlors that offer the slot and poker machines without the bar atmosphere. Even businesses traditionally outside of liquor and food, such as Sonco Pool Supplies in Loves Park, have added the terminals to supplement business.Some bars have added staff, but the gaming terminals are not labor-intensive, and the effect on jobs has been minimal.Unless you're a bar owner, any local benefit you'd see from video gaming would come from the 5 percent that goes to the municipality."I don't think anyone is calling this a vehicle for development," Evans said. "But what it does do is generate money for a city that's a substitute for taxation,"It's up to each municipality how to spend its 5 percent.The Rockford City Council has designated its cut will be used on equipment and vehicle purchases. Rockford had the highest betting totals in the state over the last year, generating $400,000 in local taxes. 
That will be used to help pay off leases on city squad cars, ambulances, trucks and fire engines.

Since May, when the amount being gambled first started to level off, Rockford has been getting an average of $48,000 a month in tax revenue.

Springfield comes in a close second and is followed by Loves Park, which netted $229,000 this past year from the gaming terminals. Since May, Loves Park — with a population of about 23,800 — has received about $28,000 a month.

That revenue hasn't been designated for a specific purpose, Loves Park Mayor Darryl Lindberg said.

Just half of that annual revenue is enough to pay the Loves Park portion of a multi-million dollar regional effort to build and upgrade amateur sports venues.

"We embraced video gaming early on because we knew it would be a pretty good revenue source," Lindberg said. "It's exceeding what we anticipated. It's something that's not going to count as an answer for everything, but we've been fortunate."

Greg Stanley: 815-987-1369; gstanley@rrstar.com; @greggstanley | 科技
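To make the arithmetic of the revenue split described above concrete, here is a minimal sketch in Python. It is an illustration only: the percentages come from this article, but the dollar figure in the example is hypothetical, and the statute's exact definition of terminal "profit" is not spelled out here.

```python
# Illustrative sketch of the video gaming revenue split described above.
# Percentages are from the article; the example profit figure is made up.

def split_terminal_profit(profit):
    """Divide a terminal's profit: 25% to the state general fund,
    35% to the terminal operator, 35% to the venue owner,
    and 5% to the local municipality."""
    return {
        "state_general_fund": round(profit * 0.25, 2),
        "terminal_operator": round(profit * 0.35, 2),
        "venue_owner": round(profit * 0.35, 2),
        "municipality": round(profit * 0.05, 2),
    }

print(split_terminal_profit(100_000))  # hypothetical $100,000 in profit
# {'state_general_fund': 25000.0, 'terminal_operator': 35000.0,
#  'venue_owner': 35000.0, 'municipality': 5000.0}

# Working backward, a municipality's 5% share implies total terminal
# profit of tax_revenue / 0.05; e.g., Rockford's reported $400,000 in
# local taxes suggests roughly $8 million in terminal profit citywide.
```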
2016-40/3982/en_head.json.gz/5982 | E-cigarette shops increase with demand in Rockford area
Vape shop owners see their product as a healthier option for smokers By Melissa WestphalRockford Register StarA no-smoking sign hangs on the wall at Vape-N-Juice, but clouds of vapor fill the store as owners Jeremy Petrocelli and Adam Fitzgerald talk about their new business.Vape-N-Juice is one of several electronic cigarette stores, also called vape shops, that have opened in Rockford and across the region in recent months. The store snagged prime real estate at the corner of East State Street and Alpine Road, and curious customers started stopping by before the doors were open.Business has been better than the guys could have expected after only two weeks of being open. Display cases are filled with battery packs, or “mods,” that power the e-cigarettes and tanks that hold juices — with or without nicotine — that come in 45 flavors such as guava, peanut butter and cinnamon roll.Petrocelli and Fitzgerald are former smokers who ditched cigarettes after switching to e-cigarettes about a year ago. They’re passionate about helping other smokers kick the habit even though little is known about the actual effects of e-cigarettes, according to the U.S. Food and Drug Administration.Places such as Chicago, Los Angeles and New York City have banned the indoor use of e-cigarettes, and the devices can’t be sold to minors. Store owners are resistant to further regulations, in many cases, because of the success they’ve had in using the products and quitting smoking.“They’re healthier, and they’re way more affordable,” Petrocelli said. “You can start off with a stronger nicotine level, and then lower it, possibly to zero. Then it’s just enjoying the act of vaping for flavor and habit.“I knew this would change the future for a lot of people, and we wanted to help people get that same effect of not smoking, getting healthier and saving money.”E-cigarettes turn nicotine, flavorings and other chemicals into an aerosol that is inhaled by the user. The FDA doesn’t regulate them but it has proposed rules to extend its authority over the products. The public can comment on the rules through Aug. 8.Health officials are concerned about the long-term effects of e-cigarettes and about minors trying the products because of the flavored juices and novelty of them. In Rockford, the stores fall under commercial/retail zoning regulations rather than stricter rules for tobacco shops. Competition is heating up locally, and tobacco shops carry the products, too.“Are e-cigarettes safer than regular cigarettes? No doubt. Almost nothing is more dangerous than smoking cigarettes,” said Larry Didier, tobacco programs coordinator for the Winnebago County Health Department.“That doesn’t mean they’re safe. There are a lot of questions about the safety of the products, and they’re being presented as alternatives to smoking regular cigarettes. People often continue to smoke regular cigarettes and use the e-cigarettes in places where they can’t smoke.”Didier said he supports restrictions on e-cigarettes in public places the same way regular cigarettes are restricted. Fitzgerald said he has been asked not to use the device in a restaurant, but he and Petrocelli don’t favor a blanket ban on indoor use.“It should be up to each business,” Petrocelli said. “I don’t think it should be regulated by the state or government. It should be a business-to-business decision.”Vape-N-Juice allows buyers to customize their own juices, bottles of which retail for $7 and come with child-proof caps. 
Cards with information from the Consumer Advocates for Smoke-free Alternatives Association sit prominently on one of the display cases.Education is a big part of the business, whether it’s helping customers buy new equipment or selling parts to hobbyists interested in building their own devices.Steve Conti of Rockford first tried an e-cigarette last year during a vacation in Tennessee. He had smoked for 20 years, but vaping helped him quit.He started brainstorming ways to open his own e-cigarette store locally before his vacation ended. He opened Drago Vaporz a month and a half ago at 929 S. Alpine Road, Suite 101, inside the Liberty Square retail and office plaza.Conti has six suppliers that provide the liquid, or “juice,” for vaping. He doesn’t make his own juices in-store; rather, he prefers to leave that work to people who’ve been in the business longer.“When I first saw this business in Tennessee, I saw customers streaming in constantly, so you see the money is there. But people are realizing this is an actual way to quit smoking,” Conti said.Conti said he started vaping with juice that contained 18 milligrams of nicotine. He has decreased the nicotine volume to 6 milligrams, and he sometimes vapes with no nicotine at all.“I don’t plan on quitting, but I’m able to bring the stuff way down. I know what’s in my juices. I know it’s healthier than cigarettes. We don’t know the long-term effects, but we do know the long-term effects of regular cigarettes.”When he first contacted the city last year about opening his store, Conti said, he was told he’d need a $5,000 tobacco license. But further research proved differently, and the space was already zoned commercial.Vapor Co. didn’t encounter any government regulation issues in any of the communities where it opened stores, General Manager Mark Dohse said. The four stores are in Rockford, Freeport, Belvidere and Monroe, Wisconsin.The Freeport store opened about two years ago inside a tobacco shop, Dohse said. It quickly expanded to its own storefront in the former Hostess bakery outlet at 1103 S. West Ave.The Rockford store was the most recent to open back in April, and it’s the fastest-growing store in the company.“We were lucky enough to be in kind of early. People know us and trust us,” Dohse said. “I think our brand is a little more professional, a little more established.”Dohse said the only pushback came from landlords who didn’t want to rent to an e-cigarette store.Melissa Westphal: 815-987-1341; mwestphal@rrstar.com; @mlwestphal | 科技 |
2016-40/3982/en_head.json.gz/6016 | Advertisement Advertisement Computational Biophysicist Klaus Schulten to Speak on Large-Scale Computing in Biomedicine and Bioengineering Thu, 03/20/2014 - 3:16pm Comments by ISC LEIPZIG, Germany – Dr. Klaus Schulten, a leading computational biophysicist and professor of physics at the University of Illinois at Urbana-Champaign will discuss “Large-Scale Computing in Biomedicine and Bioengineering” as the opening keynote address at the 2014 International Supercomputing Conference. ISC’14 will be held June 22-26 in Leipzig, Germany.
In his talk on Monday, June 23, Professor Schulten will discuss how the atomic perspective of living cells has assumed center stage through advances in microscopy, nanotechnology and computing. Schulten will share how decades of refinements of in silico, in vitro and in vivo technologies has opened a new era in life sciences. Researchers are now able to investigate living systems made up of millions of atoms involved in cell mechanics, viral infection, medical diagnostics and even the production of second-generation biofuels.
Schulten is the leader in the field of computational biophysics, having devoted over 40 years to establishing the physical mechanisms underlying the processes and organization of living systems — from the atomic scale up to the level of the entire organism. As of 2014, his work in biological physics has yielded over 625 publications, which have been cited over 67,000 times. Schulten is also the co-director of the NSF-funded Center for the Physics of Living Cells and his work has been honored with numerous awards, including the Distinguished Service Award of the Biophysical Society in 2013, and the IEEE Computer Society Sidney Fernbach Award in 2012. Schulten also received the prestigious Humboldt Award of the German Humboldt Foundation in 2004.
In addition to the opening address on Monday, each subsequent day of the conference will feature a remarkable keynote presentation.
On Tuesday, June 24, one of Japan’s leading HPC experts, Professor Satoshi Matsuoka, will deliver a keynote titled “If You Can’t Beat Them, Lead Them – Convergence of Supercomputing and Next Generation ‘Extreme’ Big Data.” In this thought-provoking talk, Matsuoka will share why he believes that supercomputer architectures will converge with those of big data and serve a crucial technological role for the industry. His assertion will be exemplified with a number of recent Japanese research projects in this area, including the JST-CREST “Extreme Big Data” project.
Matsuoka, a professor at the Global Scientific Information and Computing Center of Tokyo Institute of Technology (GSIC), is also the leader of TSUBAME supercomputer series and is currently heading various other projects such as the JST-CREST Ultra Low Power HPC and the JSPS Billion-Scale Supercomputer Resilience.
On Wednesday, June 25, Professor Thomas Sterling, a perennial favorite at ISC, will offer a vibrant summary of the past year in his keynote: “HPC Achievement & Impact 2014 – A Personal Perspective.” Sterling is set to track the improvements in microprocessor multicore and accelerator components as well as general system capabilities. On the topic of exascale, Sterling will talk about the international programs devoted to leading-edge HPC that will bridge the second half of the decade. His keynote address will end with an early summary of the emerging area of interests in “beyond exascale,” including superconducting logic, optical computing, neuromorphic and probabilistic computing.
Sterling is professor at the Indiana University’s School of Informatics and Computing and serves as chief scientist and associate director at the PTI Center for Research in Extreme Scale Technologies (CREST). He is currently engaged in research associated with the innovative ParalleX execution model for extreme-scale computing.
In the Thursday, June 26 keynote, Professor Karlheinz Meier, a European leader in neuromorphic computing will deliver a talk titled “Brain Derived Computing beyond von Neumann – Achievements and Challenges.” In the keynote, Meier will review the current projects around the world that are focused on neuromorphic computing and introduce the work in this area that will be conducted under the European Commission’s Human Brain Project (HBP). As an HBP co-director, Meier’s mission will be to develop neuromorphic hardware implementations with a very high degree of configurability.
Karlheinz Meier is a professor of experimental physics at Heidelberg University’s Kirchhoff Institute of Physics. In his role as the co-director of the European Human Brain Project, Meier leads a research group in neuromorphic computing. Funded by the European Commission, HBP is an ambitious 10-year, €1.19-billion project, with the intention of greatly advancing the understanding of the human brain using cutting-edge computer technologies. He has initiated and led two major European initiatives in the same field, but on a smaller scale — FACETS and BrainScaleS.
For detailed information on the ISC’14 program, click here
Advance registration is now open and by registering until May 15, attendees can save over 25 percent off the onsite registration rates.
About ISC’14
Now in its 29th year, ISC is the world's oldest and Europe's most important conference and networking event for the HPC community, offering a strong five-day technical program focusing on HPC technological development and its application in scientific fields, as well as its adoption in an industrial environment.
Over 300 hand-picked expert speakers and 170 exhibitors, consisting of leading research centers and vendors, will greet this year's attendees to ISC. A number of events complement the technical program, including Tutorials, the TOP500 Announcement, Research Paper Sessions, Birds of a Feather (BoF) Sessions, the Research Poster Session, Exhibitor Forums, and Workshops.
ISC’14 is open to engineers, IT specialists, systems developers, vendors, end users, scientists, researchers, students and other members of the HPC global community.
The ISC exhibition attracts decision-makers from automotive, defense, aeronautical, gas & oil, banking and other industries, as well as analysts, solution providers, data storage suppliers, distributors, hardware and software manufacturers, the media, scientists, and universities. By attending, they will learn firsthand about new products, applications and the latest technological advances in the supercomputing industry.
| 科技
2016-40/3982/en_head.json.gz/6042 | Shacknews 'Best of 2011' Awards
The Shacknews 'Game of the Year' awards have begun. This year, we've done things a little differently. Here's how we've decided the Best of 2011.
We put 2011 in the history books, and with that the time comes to recognize the best from the impressive collection of videogames released over the year. Before even getting to the nominees, though, we took a good look at the awards themselves. When we brought them back for 2010, we took a pretty standard approach, recognizing best of's for genres and platforms, special awards for unique attributes, and, ultimately, a game of the year. The result created confusion and dissatisfaction over what games qualified for which awards and, despite having so many of them, only calling out a few titles which then won multiple times.For 2011 we're taking a different tack. Rather than dilute things across a wide range, we put everything in contention for the grand prize of Game of the Year. Each member of the team cast a ballot of their top five games of the year, which we used on a weighted scale to determine our top ten games of the year.Starting this week, we'll be working through that list, beginning with five honorable mentions, one each day. We chose not to rank this second half of our top ten because the difference between their scores was rather insignificant. Each of these games found its way onto more than one list, but they didn't receive the same consensus as our top five award recipients.The following week we'll dive into the top five. We'll be awarding fourth, third, second, and first runners-up each day. Finally, the series culminates in the Shacknews Game of the Year, to be named Friday, January 20.While the order might be a source of debate, we're excited about how the awards turned out this year. 2011 offered a variety of excellent games and we feel like these represent the very best of them. There's sure to be a lot to talk about with each selection and the awards as whole when all is said and done. We look forward to being a part of those conversations with you.And with that, let the 2011 Shacknews Game of the Year Awards begin! Hope you enjoy them.Shacknews Game of the Year 2011: The Witcher 2: Assassins of Kings'Best of' First Runner-Up: Portal 2'Best of' Second Runner-Up: Batman: Arkham City'Best of' Third Runner-Up: The Elder Scrolls V: Skyrim'Best of' Fourth Runner-Up: Deus Ex: Human RevolutionHonorable Mention: BastionHonorable Mention: Super Mario 3D LandHonorable Mention: The Legend of Zelda: Skyward SwordHonorable Mention: Saints Row: The ThirdHonorable Mention: LittleBigPlanet 2 Chatty
| 科技
2016-40/3982/en_head.json.gz/6066 | State museum using high-tech methods to share parts of past
Technology that helps police investigate crimes today is allowing scientists at the Illinois State Museum to more thoroughly investigate the past and share what they learn with the public.Wednesday morning, members of the Illinois State Police crime scene services command set up a scanner to create three-dimensional views of the museum�s American mastodon � one of the largest specimens known.Jeffrey Saunders, the museum�s curator and chair of geology, excavated the mastodon in the 1970s.The mastodon, an ice-age elephant, is on display next to a giant ground sloth in the museum�s Changes exhibit.Brian Miller, forensic diagramming and animation supervisor for the Illinois State Police, said his office assists investigators by measuring and documenting crime scenes.The equipment used is a Leica C10 3D scanner that captures about 9 million points as it completes a 12-minute scan.It is one of the latest tools for law enforcement that ranges from the most basic � a tape measure � to equipment similar to that used by surveyors and 3D imaging scanners.Ultimately, the scans will be used to provide additional learning opportunities outside the museum.�We want something that can be manipulated online, virtually by the general public or boiled down to a model that could be downloaded and 3D printed,� said Chris Widga, assistant curator of geology.Classrooms could make use of the scans when studying elephant anatomy, for example, or when puzzling over hard-to-explain Ice Age mammals like the giant ground sloth.The sloth has long claws, even though it is not a carnivore.�It is totally foreign in terms of its anatomy,� Widga said.He said plans to create 3D scans of some of the museum�s larger specimens have been in the works for some time.Chris Young can be reached at 341-8487 or chris.young@sj-r.com.***On the �Net* Chris Widga�s blog �Backyard Paleo�: http://backyardpaleo.wordpress.com.* Smithsonian Institution 3D scan of a woolly mammoth: http://3d.si.edu/explorer?modelid=55 | 科技 |
2016-40/3982/en_head.json.gz/6263 | Look us up on Google — we can help make an observant life easier
If Google’s tax arrangements are any guide, its London offices may be an illusion. So it is hardly surprising that getting to your desired floor can prove tricky. The lifts into the internet giant’s colourful Tottenham Court Road premises are bafflingly complex, controlled from the outside, like a time machine. But once in, there’s no mistaking where you are. Google screens and products are everywhere, from the touch-screen sign-in to an oversized sparkling logo on the wall.
I’m visiting to learn about life as a Jewgler — shorthand for Jewish Google employee — a significant element in its global presence, given the two offices in Israel and the number of Jews working at its Mountain View, California, base. One of the Israeli offices has a kosher restaurant. In California, a succah is put up every year and a menorah lit at Chanucah. It’s not yet quite the same for Jewglers in London, where operations are split between two offices with a plan to move to a flagship site in Kings Cross. But in line with the company’s aim of making life easier for its employees, at least one British employee enjoys the daily provision of kosher meals. “I took in a sandwich on my first day,” recalls Stephen Rosenthal, who heads up the UK public affairs team. “They asked why I wasn’t eating their food. The next thing I knew, a Hermolis menu was on my desk.”
Reputedly one of the most creative office spaces in the country, the set-up doesn’t disappoint. “You can never say that something doesn’t exist at Google, because it might,” says Maxine Kohn, described as a “veteran” after seven years with the company.
Facilities include a state-of-the-art gym, massage rooms and a space-agey library with few books but endless cushions. When staff are not working — which, despite all the distracting toys, doesn’t appear to be that often — they can relax on the Green (leisure area), plant herbs in the allotment or chow down for free in the restaurant. And the plush outside space with views over London is an idyllic spot. “I have a one-year-old and a three-year-old and they adore it here — it’s like a fantasy world,” says David Grunwald, whose role is to take Google products into small businesses. It’s nice work if you can get it and, indeed, the Jewglers seem overjoyed to be there, bandying about buzz-words such as “mission”, “vision” and “ethos”. “It’s the best job in the world,” Ms Kohn enthuses.
Although it might not boast the active Jewish societies of City law and accountancy firms, Google has hosted a number of communal events over the past year or so for Young Jewish Care, World Jewish Relief and the Union of Jewish Students. And as a company with a finger in every possible pie, it naturally believes its tools can make being an active Jew easier.
Which is why it has started speaking Yiddish — well sort of, as its software will translate to or from the language. Aside from this being useful to help Jewgler James Rosenthal understand exactly what his Yiddish-speaking relatives are saying, it’s a way of preserving the language. For modern languages, including Hebrew, you can also now translate a spoken conversation, which helps, for example, when a Tel Aviv tech firm wants to collaborate with British companies. With one of two Google campuses for young tech start-ups being in Tel Aviv (the other is in east London), it’s certainly useful internally. Stephen Rosenthal (no relation) says that “Tel Aviv and London are two of the most exciting start-up hubs in the world. And Google is acquiring a lot of Israeli companies, like Waze. There are Israelis everywhere in Google.”
Jewglers talk of rabbis conducting shiurim with counterparts in Israel via Google Hangouts, Jewish-themed Google Doodles (including one marking the company’s barmitzvah) and a recipe function that allows users to type in the contents of their fridge to speedily identify a kosher-friendly meal. And will the new “augmented reality” Google glasses aid religious practice such as the laying of tefillin? “I look forward to wedding videos with the Israeli dancing seen from the groom’s perspective,” Grunwald jokes. James Rosenthal, whose wife is South African, notes that for families separated by continents, the hangout function enables them to spend a virtual Shabbat with relatives abroad — as his kids do. And with 100 hours of video content uploaded to YouTube every minute, Emma Stephany, who works on the site, points out that there is a Jewishly-relevant video for almost every taste. “It can be anything from finding out how to cook the perfect chicken soup, or researching your Jewish wedding band. Both of which I have done.”
Sacha Nehorai — who worked on David Cameron’s election campaign before joining Google as an events producer — highlights the use of Cultural Institute, “a catalogue of all the museums around the world. Auschwitz-Birkenau is on there, so if you’ve not got a chance to go yourself you can go online and read all about it.” Google also digitised the Dead Sea Scrolls. As with Yiddish, it’s seen as a way of preserving heritage. James Rosenthal says that a vastly improved search facility also aids Jewish research. “My children, who are six and three, often ask when festivals are or when Shabbat comes in, things I don’t have stored in my brain.” And whereas you could always search for, say Rosh Hashanah, and be presented with a series of links, with no accuracy filter, now a box appears with all the details, knowing where in the world you are and the year you are asking about.
Maps are also becoming more personalised and, since Google now owns Zagat, they double as a resource not just for locating kosher restaurants but also for finding up-to-date reviews. “Everyone wants to be Giles Coren,” says Stephen Rosenthal. “My wife and I were in Spain and we typed in ‘kosher shop near me’. It showed me how to get there, how long it would be to walk or drive, it showed me a shul. It just made our weekend.”

Google has been under fire of late for not doing enough to block explicit or abusive content. Likewise, critics claim too little action is taken over extremist material, including antisemitic or Holocaust denial websites. “Google is not the internet, we have an index of pages,” Stephen Rosenthal argues. “One of the founders is Sergey Brin, who grew up in a Jewish refusenik family in Russia, so his views on censorship are very strong and it’s part of the ethos of the company that freedom of speech has to be protected. We give the power to every one of our users to flag issues, so if you see content which is antisemitic, you can flag it. It gets assessed by one of our teams, by a human not by a computer, and if it breaches our guidelines it gets taken down. But we don’t clear pages before they go on — it would be impossible. We index a trillion web pages. We need the help of the community.”
The Jewglers acknowledge that the strictly Orthodox do not see Google in a positive light. But Stephen Rosenthal argues: “You set the limits on how you use it. There are so many things about it that bring value, be it doing shiurs online, or that someone who is bed-bound can effectively visit Yad Vashem.” Looking to the future, they ponder whether Google’s mooted self-driving cars would be Shabbat-friendly. But Stephen Rosenthal maintains that the ultimate gift to Jewish users would be a function whereby the desired answer to a question is typed in and Google finds a rabbi to support it. “We haven’t come up with that one yet,” he grins.
Source URL: http://www.thejc.com/lifestyle/lifestyle-features/109406/look-us-google-%E2%80%94-we-can-help-make-observant-life-easier
2016-40/3982/en_head.json.gz/6388 | 500-year-old South American female mummy casualty of ritual sacrifice, scientists say
By Brooks Hays | Updated Feb. 27, 2014 at 1:11 PM
MUNICH, Germany, Feb. 27 (UPI) -- The mystery of the unidentified mummy can be taken out of the cold case file and reclassified as "solved" -- thanks to the work of German scientists, who recently determined the age, origin and cause of death of a naturally mummified young woman who languished for decades in a Munich museum.
New research shows the mummy, recently rediscovered in museum archives, is more than 500 years old, hails from the ancient Incan Empire of South America, and perished as a result of a couple of sharp blows to the back of the head.
Scientists surmise that her murder was ritual sacrifice. The Incans regularly executed young women to appease the sun gods.
"We assumed she died in a ritual killing but we have no clear evidence from written sources," Professor Andreas Nerlich, a paleopathologist at Munich University, told BBC News. He and his colleagues have detailed their study of the previously ignored mummy in the latest issue of Plos One.
CT scans, injury reconstructions and DNA evidence helped the researchers piece together evidence of the young woman's early demise, as well as her place of origin. Until now, it was assumed the young woman was German -- one of the many well-preserved bodies recovered from Europe's peat bogs.
But the mummy's deformed skull is evidence of Incan head flattening practices, and DNA evidence indicated a diet of fish and corn, staples of the New World. The mummy's braided pigtails also featured llama hair, more evidence of her Incan past.
The young woman was not purposefully mummified, but was well preserved by her resting place in the arid climate and salty sands of the desert -- most likely northern Chile's Atacama Desert.
The mysterious mummy made its way to Germany at the end of the 19th century, after Princess Therese of Bavaria acquired two mummies on a trip to South America. Back in the south of Germany one of the mummies was lost while the other found its way into the Bavarian State Archaeological Collection in Munich. That's where the young woman rested for decades.
Analysis by the German researchers also determined that the woman suffered from Chagas disease, a parasitic infection common among poorer villagers who lived in modest mud huts. She would have had difficulty breathing and digesting food.
"She might have been chosen as a victim for a ritual murder, because she was so ill and it might have been clear that she might have lived only for a relatively short period," Nerlich told Discovery News.
Not everyone's convinced the woman was sacrificed. Emma Brown, professor of archaeological sciences at the University of Bradford in the United Kingdom -- who did not participate in the study -- says the woman's murder could have just as easily been the result of European conquest.
"This individual is older than the usual profile of ritually killed females, who are typically around the age of 13 or 14," Brown told the BBC. "It is important to recognise the historical context of this mummy. The radiocarbon dates cover the period of the Spanish conquest of the Americas."
Sources: BBC News, Discovery News
2016-40/3982/en_head.json.gz/6415 | Space Shuttle Endeavour Launch Remains on Track | October 27, 2009 10:43 AM
All systems are go for Friday's scheduled launch of the space shuttle Endeavour's mission to the International Space Station. VOA's Jessica Berman reports the mission comes amid uncertainty at the US space agency NASA over the direction it will take under the incoming administration of President-elect Barack Obama.

NASA officials have given the green light for launch Friday night of the space shuttle Endeavour, barring any bad weather. For now, the head of the mission management team, LeRoy Cain, says everything looks good for lift-off. "Everybody is 'go' to proceed on toward launch on Friday of STS-126. So, we're ready to go. The vehicle and crew and the ground teams have prepared very hard for this mission."

Endeavour's seven astronauts will be bringing extra bedrooms, another bathroom and a kitchenette to increase the living space aboard the orbiting scientific outpost, which currently houses three permanent astronauts. The expansion is designed to make room for a total crew of six astronauts to live and work aboard the space station beginning in the middle of next year. The shuttle is also delivering a recycling system which will turn waste water and urine into drinkable water on the space station.

During the 15-day mission, Endeavour's crew will perform four complicated space walks to repair and lubricate a massive solar array joint which is essential for the panels to track the sun. They will also perform preventive maintenance on the second solar joint.

With an aging space shuttle fleet that is increasingly in need of repair and due to be retired in 2010, NASA officials are looking toward the incoming administration of President-elect Barack Obama for the future of the space agency. The agency has planned between ten and twelve more shuttle flights, but Cain says it is possible the program could be extended. "However many more times we are directed to fly as a matter of policy, we will do it with flight safety being at the very top of the list in terms of things of importance," he said.

Meanwhile, NASA officials say there is a forty percent chance that a cool front of rain and heavy clouds moving toward the launch pad in Florida could postpone Friday's lift-off until Saturday.
2016-40/3982/en_head.json.gz/6543 | The MiceChat Network
Disney's better Internet Mouse trap - CNN/Money 10/27/06
cellarhound
NEW YORK (CNNMoney.com) -- That real estate slowdown you keep reading about? It certainly isn't taking place in the virtual world.
Sellers of Internet properties are having no trouble finding buyers. Big media companies are tripping over themselves in an online land grab.
Google agreed to plunk down $1.65 billion for YouTube earlier this month. Also this month, Viacom announced that it was buying Quizilla, a Web site catering to teens. That deal comes on the heels of Viacom's purchase of online video site Atom Entertainment for $200 million in August.
Sony bought online video firm Grouper for $65 million in August. GE's NBC Universal ponied up $600 million for iVillage, a network of sites that focuses on women, in March. Time Warner's AOL has bought three firms with social networking or online video ties this year. (Time Warner also owns CNNMoney.com.)
And, of course, News Corp. shelled out $590 million last year for Intermix, parent of the social networking phenomenon MySpace.
But one big media firm has been notably absent from the merger wave sweeping across the Web: Walt Disney. And while Wall Street speculates about who might want to buy privately held social networking company Facebook, news site Digg and other online upstarts, Disney's name is rarely mentioned among the list of possible suitors.
Still, does Disney need to make a splashy Internet deal?
Steve Wadsworth, president of Walt Disney Internet Group, said that the company isn't averse to making an online acquisition but that it has to be at the right price. He adds that Disney has looked at nearly every Internet company that has sold recently, but that, ultimately, it was more sensible not to pursue deals.
"To pay what is, on the surface, significant amounts of money doesn't make sense for us. For others it may make sense. We are concerned about valuations," he said.
Disney's strategy is a stark contrast to its media rivals. But it could turn out to be the most prudent and sound. After all, Disney seems to be the one media firm that still remembers the last time there was a rush to embrace the Web and how painful that turned out.
In the late 1990s, Disney spent more than $2 billion on Web search firm Infoseek, changed the name to Go.com and then set up a tracking stock for this Internet unit. After the dot-com bubble burst, Disney shut down most of the site's operations in 2001 and converted the tracking stock back into Disney common shares.
"One of the things that were learned in the late '90s is that you have to have a sustainable business model to justify the investments that companies have made and are making online," he said.
Wall Street likes the strategy
And investors apparently aren't too upset by the fact that Disney doesn't have its own MySpace or YouTube. Shares of Disney have surged 33 percent this year, and the stock has vastly outperformed most of its major media rivals.
Plus, it's not as if Disney is shunning the Internet. It has just chosen, so far, to focus more on establishing a strong presence through Web sites for its existing brands such as ABC, ESPN and, of course, Disney.
"Disney has more of a build mentality. I think that it is their strategy - to extend brands rather than buy new ones. I don't think they need to mimic what News Corp. and others have done," said David Joyce, an analyst with Miller Tabak & Co.
Full story: Disney's better Internet Mouse trap (CNN/Money)

Check out my other blog: The Buena Vista Files
2016-40/3982/en_head.json.gz/6550 | Emil Dimantchev
About

Emil is a climate policy researcher and a graduate student at MIT. He currently researches U.S. climate policy and its impacts on air pollution and human health at MIT’s Selin Group and MIT’s Joint Program on the Science and Policy of Global Change.
He spent five years as a climate policy analyst at Thomson Reuters (previously, Point Carbon) where he authored policy analyses on European and global carbon pricing legislation for lawmakers and Fortune Global 500 companies. His research has been featured in policy hearings in the EU Parliament and Commission and frequently quoted in the media including the Guardian, BBC, the Wall Street Journal, Politico, CarbonBrief, Climate Home, Carbon Pulse, the Telegraph, and others.
Something about Emil you didn’t know: he regrets having to write this in the third person.
Get in touch on Twitter: https://twitter.com/EDimantchev
2016-40/3982/en_head.json.gz/6552 | Consciousness Studies/The Neuroscience Of Consciousness
"All parts of the brain may well be involved in normal conscious processes but the indispensable substratum of consciousness lies outside the cerebral cortex, probably in the diencephalon" Penfield 1937.
"The brain stem-thalamocortical axis supports the state, but not the detailed contents of consciousness, which are produced by cortex" Baars et al 1998.
It is recommended that readers review The Philosophical Problem before reading the sections on the neuroscience of consciousness.
One of the most exciting discoveries of neuroscience is that nearly all of the brain performs functions that are not part of conscious experience. In everyday life we are usually unaware of breathing or heartbeats, yet there are parts of the brain dedicated to these functions. When we pick up a pencil we have no experience of the fine control of individual muscles, yet large areas of cortex and cerebellum implement this. Things do not appear as greyscale images that subsequently have colour poured into them, although just such a colour-adding step is performed in the visual cortex. Most of the brain is non-conscious, but how is the "ghost in the machine", the mind, created by and linked into the mostly non-conscious brain?
Although most of the processes in the brain are non-conscious there can be little doubt that the output of sensory processes contribute to experience. For example, although we do not experience the process of adding colour to visual data in cortical area V4 we do experience coloured forms and although we have little inkling of the hugely complex creation of words in the temporal/frontal lobes we do experience verbal thoughts. Our experience is an integrated output of most of the brain processes that deal with sensation as well as dreams, thoughts and emotions. But how and where does this experience occur?
The signals that compose phenomenal consciousness have not been elucidated. Perhaps the least likely signals for this role are electrical impulses in nerve fibres because they are distributed unevenly in time and space and can even be absent for relatively long periods. Furthermore, electrical impulses across the membranes of neurons have an all or nothing character; they cannot be easily superimposed on one another. There are many other possibilities however, such as: the electrical fields on the dendrites of neurons, the fields of chemicals spreading out from synapses, the radio-frequency emissions of action potentials, events in the microtubules in cells, the depolarisations of glia, the varying fields measured by EEG devices, the quantum superposition of brain states etc...
Phenomenal consciousness could exist in the dendritic field of ten neurons receiving 100,000 synapses or as an oscillation of fields over the whole brain. The substrate of phenomenal consciousness could be staring us in the face as a state of the whole brain or be like a needle in a haystack, lurking in a tiny region of brain, unsuspected and undiscovered.
Given that there is no widely accepted theory of phenomenal consciousness Crick (1994) and Crick and Koch (1998) approached the problem of the location of the substrate of consciousness by proposing that scientists search for the Neural Correlates of Consciousness. These neural correlates consist of events in the brain that accompany events in conscious experience.
Crick, F. (1994). The Astonishing Hypothesis. New York: Scribners.
Crick, F. & Koch, C. (1998). Consciousness and Neuroscience. Cerebral Cortex, 8:97-107. http://www.klab.caltech.edu/~koch/crick-koch-cc-97.html
Neuroanatomy

General layout of the CNS
The Central Nervous System (CNS) consists of the spinal cord, the brain and the retina.
The CNS consists of two major groups of active cells, the neurons and the glia. The neurons conduct short impulses of electricity along their membranes, called 'action potentials', and encode data as frequency-modulated signals (i.e. different intensities of stimulation are converted into different rates of firing). The glia modify the connections between neurons and can respond to neuron activity by a change of voltage across their membranes. Glia also have many other roles such as sustaining neurons and providing electrical insulation.
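Rate coding of this kind can be caricatured in a few lines of code. The sketch below is purely illustrative and is not drawn from the text above: it assumes a hypothetical neuron whose firing rate rises linearly with stimulus intensity above a threshold and saturates at a ceiling standing in for the refractory limit; the threshold, gain and maximum-rate values are invented for the example.

```python
# Minimal sketch of rate coding: stimulus intensity -> firing rate.
# All constants are illustrative assumptions, not physiological measurements.

def firing_rate(intensity, threshold=0.2, gain=200.0, max_rate=500.0):
    """Map a stimulus intensity (arbitrary units) to a firing rate in Hz.

    Below `threshold` the model neuron is silent; above it, the rate grows
    linearly with intensity and saturates at `max_rate`, a stand-in for the
    ceiling imposed by the refractory period.
    """
    if intensity <= threshold:
        return 0.0
    return min(gain * (intensity - threshold), max_rate)


for stimulus in (0.1, 0.3, 0.8, 2.0):
    print(f"intensity {stimulus:4.1f} -> {firing_rate(stimulus):6.1f} Hz")
```

Real neurons show richer dynamics (adaptation, bursting, noise), but the frequency-modulation principle is the same: intensity is carried by the rate of impulses, not by the amplitude of individual action potentials.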
Neurons have three principal parts: the cell body, the dendrites and the axon. Impulses flow from the cell body to the axon. The axon can be over a metre long and bundles of axons form nerve fibres. Where an axon makes contact with the dendrites or cell body of another neuron there is a special sort of junction called a synapse. Transmission of data across synapses is usually mediated by chemical signals.
Areas of the brain where there are many cell bodies have a beige/grey tinge and are called grey matter. Areas that contain mainly nerve fibres are called white matter. Masses of grey matter outside of the surface of the cerebral cortex or the cerebellum are called nuclei.
The brain is of central interest in consciousness studies because consciousness persists even when the spinal cord is sectioned at the neck.
The brain can be divided into five distinct divisions or 'vesicles' on the basis of embryological development. These are the myelencephalon, metencephalon, mesencephalon, diencephalon and telencephalon (see the illustration below).
Myelencephalon: Medulla oblongata.
Metencephalon: pons and cerebellum.
Mesencephalon: midbrain (the tectum, containing the superior colliculus and inferior colliculus), red nucleus, substantia nigra, cerebellar peduncles.
Diencephalon: thalamus, epithalamus, hypothalamus, subthalamus.
Telencephalon: corpus striatum, cerebral hemispheres.
These divisions tend to obscure the physical anatomy of the brain which looks like a rod of spinal cord with a swelling at the top due to the thalamus and corpus striatum. Around the top of the rod is a globe of deeply indented cerebral cortex and at the back there is the puckered mass of cerebellum. The physical anatomy is shown in greater detail in the illustration below where the thalamus and corpus striatum have been splayed out to show more detail.
The thalamus is a complex organ with numerous nuclei. These are listed below:
Type of nucleus: constituent nuclei (standard abbreviation), with functions where given:

- Reticular: reticular nucleus
- Intralaminar: centromedian (CM) - arousal, attention, motivation, pain; parafascicular (PF); central lateral (CL); paracentral (PC)
- Midline: paraventricular (PV); rhomboid
- Nonspecific: pulvinar; lateral dorsal (LD); anteromedial (AM); anteroventral (AV); lateral posterior (LP); medial dorsal (MD)
- Specific thalamic nuclei (sensory relays): lateral geniculate (LGN) - vision; medial geniculate (MGN) - hearing; ventral posterior (VP) - general sensation
- Specific thalamic nuclei (motor): ventral anterior (VA); ventral lateral (VL)
The location of these nuclei is shown in the illustration below:
The cerebral hemispheres consist of a thin layer of nerve cell bodies on the surface (the cerebral cortex) with a mass of white, interconnecting fibres below (the cerebral medulla). Each hemisphere is divided into four principle lobes as shown in the illustration below:
The cortex is a set of interconnected processors. The general layout of the cortex with the location of the processors is shown in the illustration below:
The pathways in the brain tend to preserve the topography of the sense organs so that particular groups of cells on the retina, cochlear or body have corresponding groups of cells in the thalamus or cortex. The retina is said to have a topological mapping onto the thalamus so that the projection of the optic nerve is said to be retinotopic.
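A toy model can make the idea of a topological (retinotopic) mapping concrete. The log-polar form used below is a common modelling idealisation from the vision literature, not something asserted by this text, and its parameters are invented for illustration: cortical distance grows with the logarithm of eccentricity, so central vision claims a disproportionate share of the map.

```python
# Hypothetical log-polar sketch of a retinotopic map (illustrative only).
import math

def retina_to_cortex(x, y, a=0.5, k=15.0):
    """Map a retinal position (x, y, in degrees of visual angle) onto a
    model 'cortical' coordinate (u, v, in mm).

    `a` and `k` are free parameters chosen for the example; real mappings
    are estimated empirically from imaging data.
    """
    eccentricity = math.hypot(x, y)           # distance from the fovea
    polar_angle = math.atan2(y, x)            # angle on the retina
    u = k * math.log(1.0 + eccentricity / a)  # cortical distance ~ log(eccentricity)
    v = k * polar_angle                       # polar angle maps roughly linearly
    return u, v


for ecc in (0.5, 2.0, 10.0, 40.0):
    u, _ = retina_to_cortex(ecc, 0.0)
    print(f"{ecc:5.1f} deg from fovea -> {u:6.2f} mm across the model map")
```

Because the mapping is continuous, neighbouring points on the retina land on neighbouring points of the model cortex, which is exactly what 'retinotopic' means.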
Nerve fibres that go to a part of the brain are called afferents and fibres that come from a part of the brain are called efferents.
The cortex and thalamus/striatum are intimately linked by millions of connecting fibres and there is also a direct connection from the motor cortex to the spinal cord.
Sensory pathways
Information from the sense organs travels along the appropriate sensory nerve (optic, auditory, spinal etc.) and once in the brain is divided into three principal paths that connect either with the thalamus, the cerebellum or the reticular formation.
There are thalamic nuclei for each broad type of sensation and these have reciprocal connections with specific areas of cortex that deal with the appropriate mode of sensation. The large mass of nerve fibres that mediate the connection between the thalamus and cortex are known as the thalamo-cortical and cortico-thalamic tracts. There tend to be more sensory nerve fibres returning from the cortex to the thalamus than connect from the thalamus to the cortex so it is difficult to determine whether the cortex is the destination of sensory data or a region that supplies extra processing power to thalamic nuclei.
The cerebellum mediates reflex control of complex movements and receives input from most of the sense organs.
The reticular formation is a group of loosely distributed neurons in the medulla, pons and mesencephalon. It receives a large amount of autonomic input and also input from all the sense organs. The intralaminar nuclei of the thalamus are the principal destination of reticular output to higher centres. In the most primitive vertebrates the reticular formation performs most of the higher control functions of the animal. The reticular formation is implicated in the maintenance of sleep-wake cycles and activates the higher centres. This activity has attracted the label ascending reticular activating system (ARAS) to describe how the activity of higher centres is controlled by reticular input. This title is unfortunate from the point of view of consciousness studies because it implies that conscious experience is a result of activating the cortex when it could be due to turning on or off particular systems all the way from the reticular formation to the cortex. Destruction of the reticular formation leads to coma.
Motor and output pathways
Motor control of the body below the skull is accomplished by three principle routes.
The motor cortex of the frontal lobes and related cortex in the parietal lobes can control movement directly via nerves known as the cortico-spinal tract (also called the pyramidal tract). The activity of the motor cortex is modified and controlled by a loop that passes through the corpus striatum, the substantia nigra and the subthalamic nucleus and returns to the cortex. These controlling nuclei are, along with the amygdala, known as the basal ganglia.
The cerebellum and the corpus striatum provide complex reflex control of the body through nerves that travel through the red nucleus and form the rubro-spinal tract.
The vestibular nucleus, which processes signals related to balance and posture, has direct connections with the periphery via the vestibulo-spinal tract.
Apart from the routes for controlling motor activity there are also other outputs from the brain, for instance the autonomic nervous system is intimately linked with the reticular formation which has areas that control blood pressure, respiratory rhythm etc.
Topological mapping and cortical columns
[Figure: Sensorimotor homunculus]
The cerebral cortex has a highly convoluted surface that provides a large area of tissue. The parts of the cortex that are used for motor and sensory functions are organised so that different areas correspond to different zones of the body. This topological organisation is shown classically by a drawing of the sensorimotor homunculus such as that shown on the right.
Within a given area of the cortex there are further subdivisions. For example, the occipital cortex corresponds to the eyes of the sensorimotor homunculus and it is further organised so that areas of the retina have corresponding areas on the cortex. This mapping of the layout of the retina onto the cortex is known as topological mapping. It results in a corresponding mapping of the receptive field of the eye onto the cortex. The mapping is like an image on the surface of the brain tissue and the visual scene that is presented to a subject can be recovered by using fMRI along with computer analysis (Miyawaki et al. 2008).
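Miyawaki et al. combined many multiscale local image decoders; the sketch below is not their method, only a much-simplified stand-in for the general idea, with the dimensions, simulated 'voxel' responses and noise level all invented for the example: learn, by least squares, a linear map from activity patterns back to pixel values.

```python
# Greatly simplified sketch of linear image decoding from simulated voxel
# activity. This is NOT the Miyawaki et al. (2008) method; it only
# illustrates the idea of learning a map from brain activity to pixels.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_voxels, n_train = 100, 400, 500        # assumed toy sizes

# Pretend the brain applies an unknown linear encoding: image -> voxels.
encoding = rng.normal(size=(n_voxels, n_pixels))

train_images = rng.integers(0, 2, size=(n_train, n_pixels)).astype(float)
train_voxels = train_images @ encoding.T + 0.1 * rng.normal(size=(n_train, n_voxels))

# Fit the decoder (voxels -> pixels) by ordinary least squares.
decoder, *_ = np.linalg.lstsq(train_voxels, train_images, rcond=None)

# Reconstruct a held-out binary 'image' from its simulated voxel pattern.
test_image = rng.integers(0, 2, size=n_pixels).astype(float)
test_voxels = test_image @ encoding.T + 0.1 * rng.normal(size=n_voxels)
reconstruction = test_voxels @ decoder

accuracy = np.mean((reconstruction > 0.5) == (test_image > 0.5))
print(f"pixels recovered correctly: {accuracy:.0%}")
```

The real problem is far harder: voxel responses are nonlinear, noisy and slow, which is why the original work decodes local image patches at several scales and combines them.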
The human cortex is fairly deep, containing 100-200 neurons from the surface to the white matter. It is divided into six histological and functional layers. These layers can be further subdivided. In 1957 Mountcastle used microelectrode measurements to show that activity of small zones of cortex about 0.1 to 1 mm in diameter corresponded to particular points in the receptive field. These functional columns of cortical tissue are called cortical columns.
The diagram above shows the organisation of ocular dominance columns. Each column represents a particular part of the receptive field of a single eye. The columns for left and right eyes are linked together in lines. The lines of ocular dominance form a pattern like a fingerprint on the surface of the cortex.
The same part of cortex can have overlapping columns for different functions. For instance there are columns that react to particular orientations of edges at particular places in the visual field. These columns tend to be located together on the cortex forming a pinwheel of columns that cover all orientations at a particular receptive field position.
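An idealised pinwheel is simple enough to write down. The sketch below assumes, purely for illustration, that the preferred orientation at a point equals half the polar angle around the pinwheel centre, so that a half-turn around the centre runs through every orientation once (orientation being defined modulo 180 degrees); real maps are noisier and are typically measured with optical imaging.

```python
# Toy orientation-preference pinwheel (an illustrative idealisation).
import math

def preferred_orientation(x, y):
    """Preferred edge orientation (0-180 deg) at model-cortex position (x, y)
    for an idealised pinwheel centred at the origin: orientation is half the
    polar angle, so every orientation occurs once per half-turn."""
    polar_angle = math.degrees(math.atan2(y, x)) % 360.0
    return (polar_angle / 2.0) % 180.0


# Walk around the pinwheel centre and report the preferred orientation.
for step in range(8):
    theta = math.radians(step * 45.0)
    x, y = math.cos(theta), math.sin(theta)
    print(f"at {step * 45:3d} deg around the centre -> "
          f"prefers {preferred_orientation(x, y):5.1f} deg edges")
```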
There are also topologically arranged columns for colour, spatial frequency etc.
Yoichi Miyawaki, Hajime Uchida, Okito Yamashita, Masa-aki Sato, Yusuke Morito, Hiroki C. Tanabe, Norihiro Sadato and Yukiyasu Kamitani. (2008). Visual Image Reconstruction from Human Brain Activity using a Combination of Multiscale Local Image Decoders. Neuron, Volume 60, Issue 5, 10 December 2008, Pages 915-929
Retrieved from "https://en.wikibooks.org/w/index.php?title=Consciousness_Studies/The_Neuroscience_Of_Consciousness&oldid=2740068"