Investing in Clean, Safe Nuclear Energy
Description: President Obama announces more than $8 billion in loan guarantees for two new nuclear reactors as part of the Administration's commitment to providing clean energy and creating new jobs.
Speakers: President Obama, Steven Chu
Duration: 10:42
Topic: Energy Economy, Loans, Energy Policy
Credit: Video courtesy of WhiteHouse.gov

PRESIDENT BARACK OBAMA: Good morning, everybody.
AUDIENCE MEMBERS: Good morning.
PRESIDENT OBAMA: Before I begin, let me just acknowledge some of the people who are standing behind me here. First of all, two people who've been working really hard to make this day happen, Secretary Steven Chu, my energy secretary – Steven Chu – (applause) – and my White House adviser on everything having to do with energy, Carol Browner. (Applause.)
I want to acknowledge the outstanding governor of Maryland, Martin O'Malley, as well as his lieutenant governor, Anthony Brown. (Applause.) We've got Mark Ayers from the building trades and Billy Hite from the UA Plumbers and Pipefitters. Give them a big round of applause. (Applause.)
Gregory Jaczko, who's – the Nuclear Energy Commission, is here. Where is he? (Applause.) Ed Hill, president of IBEW International. (Applause.) And I want to thank Chuck Graham and everybody here at Local 26 for their great hospitality. (Applause.)
Thank you for the warm welcome. Thanks for showing me around. I was just mentioning that I got a chance to pull the first fire alarm since I was in junior high. (Laughter.) And I didn't get in trouble for it. This is an extraordinarily impressive facility, where workers are instructed on everything from the installation of sophisticated energy hardware and software to the basics of current and resistance. We need to look no further than the workers and apprentices who are standing behind me to see the future that's possible when it comes to clean energy. It's a future in which skilled laborers are helping us lead in burgeoning industries. It's a future in which renewable electricity is fueling plug-in hybrid cars and energy-efficient homes and businesses. It's a future in which we're exporting homegrown energy technology instead of importing foreign oil. And it's a future in which our economy is powered not by what we borrow and spend, but what we invent and what we build. That's the bright future that lies ahead for America, and it's one – it's a future that my administration is striving to achieve each and every day.
We've already made the largest investment in clean energy in history as part of the recovery act, an investment that is expected to create more than 700,000 jobs across America – manufacturing advanced batteries for more fuel-efficient vehicles, upgrading the power grid so that it's smarter and it's stronger, doubling our nation's capacity to generate renewable energy. And after decades in which we have done little to increase the efficiency of cars and trucks, we've raised fuel economy standards to reduce our dependence on foreign oil while helping folks save money at the pump.
But in order to truly harness our potential in clean energy, we're going to have to do more, and that's why we're here. In the near term, as we transition to cleaner energy sources, we're going to have to make some tough decisions about opening up new offshore areas for oil and gas development. We'll need to make continued investments in advanced biofuels and clean coal technologies, even as we build greater capacity in renewables like wind and solar. And we're going to have to build a new generation of safe, clean nuclear power plants in America.
That's what brings us here. Through the Department of Energy, under the leadership of Nobel prize-winning physicist Steven Chu – although, as just a quick side note, when he was talking to some of the instructors here and they were talking about currents and this and that and the other, I indicated to him that he could have saved a lot of money; instead of getting a Ph.D., he could have come here and learned some of the same stuff. (Laughter, applause.) But, you know, the instructors here were just keeping up – they were right there with him.
But through the Department of Energy and Secretary Chu's leadership, we are announcing roughly $8 billion in loan guarantees to break ground on the first new nuclear plant in our country in three decades. The first new nuclear power plant in nearly three decades. (Applause.) It's a plant that will create thousands of construction jobs in the next few years, and some 800 permanent jobs – well-paying permanent jobs – in the years to come. And this is only the beginning. My budget proposes tripling the loan guarantees we provide to help finance safe, clean nuclear facilities, and we'll continue to provide financing for clean energy projects here in Maryland and across America.
Now, there will be those that welcome this announcement, those who think it's been long overdue. But there are also going to be those who strongly disagree with this announcement. The same has been true in other areas of our energy debate, from offshore drilling to putting a price on carbon pollution. But what I want to emphasize is this: Even when we have differences, we cannot allow those differences to prevent us from making progress. On an issue that affects our economy, our security and the future of our planet, we can't keep on being mired in the same old stale debates between the left and the right, between environmentalists and entrepreneurs.
See, our competitors are racing to create jobs and command growing energy industries. And nuclear energy is no exception. Japan and France have long invested heavily in this industry. Meanwhile, there are 56 nuclear reactors under construction around the world: 21 in China alone; six in South Korea; five in India. And the commitment of these countries is not just generating the jobs in those plants; it's generating demand for expertise and new technologies.
So make no mistake: Whether it's nuclear energy, or solar or wind energy, if we fail to invest in the technologies of tomorrow, then we're going to be importing those technologies instead of exporting them. We will fall behind. Jobs will be produced overseas instead of here in the United States of America. And that's not a future that I accept.
Now, I know it's been long assumed that those who champion the environment are opposed to nuclear power. But the fact is, even though we've not broken ground on a new power plant – new nuclear plant in 30 years, nuclear energy remains our largest source of fuel that produces no carbon emissions. To meet our growing energy needs and prevent the worst consequences of climate change, we'll need to increase our supply of nuclear power. It's that simple. This one plant, for example, will cut carbon pollution by 16 million tons each year when compared to a similar coal plant. That's like taking 3.5 million cars off the road.
On the other side, there are those who have long advocated for nuclear power, including many Republicans, who have to recognize that we're not going to achieve a big boost in nuclear capacity unless we also create a system of incentives to make clean energy profitable. That's not just my personal conclusion; it's the conclusion of many in the energy industry itself, including CEOs of the nation's largest utility companies. Energy leaders and experts recognize that as long as producing carbon pollution carries no cost, traditional plants that use fossil fuels will be more cost-effective than plants that use nuclear fuel.
That's why we need comprehensive energy and climate legislation, and why this legislation has drawn support from across the ideological spectrum. I raised this just last week with congressional Republican leaders. I believe there's real common ground here. And my administration will be working to build on areas of agreement so that we can pass a bipartisan energy and climate bill through the Senate.
Now, none of this is to say that there aren't some serious drawbacks with respect to nuclear energy that have to be addressed. As the CEOs standing behind me will tell you, nuclear power generates waste, and we need to accelerate our efforts to find ways of storing this waste safely and disposing of it. That's why we've asked a bipartisan group of leaders and nuclear experts to examine this challenge. And these plants also have to be held to the highest and strictest safety standards to answer the legitimate concerns of Americans who live near and far from these facilities. That's going to be an imperative.
But investing in nuclear energy remains a necessary step. What I hope is that with this announcement, we're underscoring both our seriousness in meeting the energy challenge and our willingness to look at this challenge not as a partisan issue but as a matter that's far more important than politics, because the choices we make will affect not just the next generation but many generations to come.
The fact is, changing the ways we produce and use energy requires us to think anew, it requires us to act anew, and it demands of us a willingness to extend our hand across some of the old divides, to act in good faith and to move beyond the broken politics of the past. That's what we must do. That's what we will do.
Thank you very much, everybody. Appreciate it. (Applause.)
Nest Releases Open Source Version of Thread Networking Protocol
By Jaikumar Vijayan | Posted 2016-05-11
Nest's goal is to accelerate development of connected home products based on its Thread networking protocol for the Google subsidiary's home automation products.
Nest on May 11 released OpenThread, an open-source version of the networking protocol that allows the company's smart thermostat and other home automation products to connect to each other and to the Internet of things.
Nest's goal is to speed up the development of connected home products by eliminating the need for developers to create their own specialized protocols. Nest, an Alphabet-owned subsidiary, relied on a proprietary protocol called Thread to network its products.
The protocol, which is maintained by the Thread Group, is designed to give developers of smart home products an easy way to connect and control their products while making it easy for consumers to set up such devices in their homes.
Developers can sign up to become members of the organization for annual membership fees ranging from $2,500 for a basic membership to $100,000 to become a sponsor with a seat on the board of directors. More than 230 organizations are currently members of the Thread Group.
"As more silicon providers adopt Thread, manufacturers will have the option of using a proven networking technology rather than creating their own," Nest said in a statement announcing the availability of Open Thread. "And consumers will have a growing selection of secure and reliable connected products to choose from."
Because Thread is built on IPv6 and open source standards, millions of installed 802.15.4 wireless devices can be updated relatively easily to run Thread. Releasing an open-source version of the protocol will spur that process and will drive wider deployment of the networking standard, Nest predicted.
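To make concrete what "built on IPv6" means for developers — every Thread node is reachable at an ordinary IPv6 address, so application code can address it with standard networking tools — here is a minimal, hypothetical sketch using only Python's standard library. The address and port are illustrative assumptions, not values defined by Thread or OpenThread, and the send assumes the host already has a route onto the mesh (for example, via a border router).

```python
# Hypothetical sketch: Thread nodes carry ordinary IPv6 addresses, so plain
# socket code can address them. The address and port below are made-up
# examples, not values defined by the Thread specification or OpenThread.
import ipaddress
import socket

NODE_ADDR = "fd11:22::1"  # assumed mesh-local-style IPv6 address
PORT = 5683               # assumed UDP port for the node's application

addr = ipaddress.ip_address(NODE_ADDR)
print(f"{addr} is a valid IPv6 address: {isinstance(addr, ipaddress.IPv6Address)}")

# Sending a datagram uses the same API as for any other IPv6 host; this
# assumes the sending machine has a route to the Thread mesh (e.g. via a
# border router), otherwise the OS may report the network as unreachable.
with socket.socket(socket.AF_INET6, socket.SOCK_DGRAM) as sock:
    sock.sendto(b"hello, thread node", (NODE_ADDR, PORT))
```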
Several technology vendors will contribute to the development of OpenThread. They include Qualcomm, Texas Instruments, Dialog Semiconductor and ARM. In addition, Thread-capable radios and development kits from multiple vendors will be capable of running OpenThread, Nest said in its release.
Nest will distribute OpenThread on the GitHub source code repository. Developers will have access to sample code and to support on Nest’s discussion forum as well as on the Stack Overflow site for programmers, the company said.
Nest’s introduction of OpenThread represents another effort to expand its sphere of influence in the connected home market and the broader IoT market. Last year, Nest announced a product integration initiative under which the company has committed to ensuring its products would interact with intelligent home technologies such as smart lighting and heating systems from other vendors.
Some think the company's goal is to eventually position its technologies as hubs that can be used to not just interact with other home automation systems, but also to control them.
But after being acquired by Google for over $3 billion in 2014, Nest has struggled to live up to early performance expectations. In recent months, Nest has been in the news more for the exodus of employees from the company and the reportedly abrasive management style of CEO Tony Fadell than for its technology.
Your New Car May Connect You to Greater Cyber-Risk
By Wayne Rash | Posted 2016-02-27
Smith said that much of the problem is that, like many other IoT devices, the computers in cars are designed with the assumption that they're internal devices that aren't connected. Now they are, and the designers have to deal with the learning curve that requires.
"They're doing better than when I first started," Smith said. "They're taking security seriously."
Unfortunately, not all is rosy inside IoT land. "A lot of it, the more severe stuff, tends to be based on wireless communications," Smith explained. "There are usually not a lot of barriers to getting into the trusted system."
I thought about the car I'd purchased just two days before and its ability to get weather radar and Yelp reviews. Smith said that the worst vulnerabilities are centered around cellular communications and other types of wireless as well. Wireless communications can also include on-board WiFi hotspots and on-board diagnostic systems. But, at least, most of the car companies aren't totally clueless when it comes to security.
"I'm seeing the automotive industry doing a lot more threat modeling," Smith said. Unfortunately, there's no good way for people who buy and drive connected cars to do much about the security since there aren't any antivirus or anti-malware packages out there for cars. On the other hand, some carmakers are paying attention, even to the extent of offering over-the-air updates.
I thought back to the conversation I'd had with a member of my carmaker's support team. "You need to go to a local dealer and get your car's software updated," she said. She'd been checking my car online, and apparently didn't like what she'd seen. For other vendors, notably Tesla, the updates are pushed to the car if there's a WiFi network available.
Smith said that cars, like other Internet of things (IoT) devices, could be a lot more secure than they are. "There's not a whole lot you can do without security standards," he said. Much of the problem is that the folks who design car systems weren't used to thinking about security first. "They had the mentality that the vehicle was trusted," Smith said. "They assumed that the cellular network was secure."
Smith advocates for greater openness on the part of the manufacturers, explaining that by allowing anyone to examine the basic code, automotive systems are much more likely to be secure since there are more eyes to spot problems. He pointed to Tesla, which runs a HackerOne project that allows owners and researchers to notify the company of apparent security breaches.
"GM has a vulnerability exposure process" in which revealing holes in the company's security is encouraged, Smith said. He also suggested paying attention to the Open Garages Website, where car and IoT security researchers discuss vulnerabilities and fixes.
Smith also said that the companies need to be more open, if only because it makes it easier to find problems and fix them.
A consortium of 30 industry and academic partners will transform two onetime Merck & Co. manufacturing plants in Europe into early drug discovery venues, in a €196 million ($265 million) project intended to translate basic research into new treatments ready for clinical trials.
The European Lead Factory will set up shop at BioCity Scotland in Newhouse, in space Merck closed in 2010, and in Oss, the Netherlands, at a site the company shut down a year later. The consortium will study small molecules that will come both from in-house research and the corporate collections of seven pharma giants.
Merck and the other six pharmas—AstraZeneca, Bayer, Johnson & Johnson-owned Janssen Pharmaceutica, Lundbeck, Sanofi, and UCB Pharma—will contribute “at least” 300,000 chemical compounds by donating their chemical collections to Lead Factory. Another 200,000 compounds will come from academic researchers and small- and medium-sized enterprises.
Together, these sources will form a new Joint European Compound Collection of up to half a million compounds, accessible to all project partners and to public organizations that offer new targets for drug discovery screening. Target proposals will be selected through competitive calls.
“This project will not only advance the chances of success in the discovery of new medicines by European researchers, but also add value by building research capacity in Europe,” Michel Goldman, executive director of the Innovative Medicines Initiative (IMI), which is supporting the project, said in a statement.
Repurposing drugs that failed to reach the market for their original indications has grown in acceptance within biopharma. IMI last month launched StemBANCC, an academic–industry partnership intent on discovering new drugs using human induced pluripotent stem cells. And last year, NIH’s year-old National Center for Advancing Translational Sciences launched the Discovering New Therapeutic Uses for Existing Molecules pilot program, which matches researchers with 58 compounds that have undergone significant industry R&D, including safety testing in humans, from eight participating pharmas.
In 2011, the U.K.’s Medical Research Council solicited proposals by researchers for repurposing 22 AZ compounds. And a few weeks back, AZ gave access to 250,000 of its chemical compounds to Germany’s Lead Discovery Centre (LDC). LDC will screen the combined library to identify promising new targets, against which drugs will be advanced with in vivo proof-of-concept. “LDC and AstraZeneca will agree on a project-by-project basis on individual licensing terms for successful lead projects of high interest,” the partners said at the time.
At the Lead Factory, a new European Screening Centre will aid public contributors of the new targets in developing tests. Scientific management of the screening center will be overseen by Netherlands-based nonprofit Top Institute Pharma. Both the sites in Scotland and the Netherlands will run state-of-the-art facilities for compound logistics and high-throughput screening, to respectively handle the 500,000-strong compound library and to evaluate new compounds that are active against the novel targets.
Ten universities are members of Lead Factory. Five come from the Netherlands: Foundation Top Institute Pharma (Stichting Top Instituut Pharma), Leiden University, Radboud University Nijmegen, Stichting Het Nederlands Kanker Instituut, and University of Groningen. The other five are Germany’s Universität Duisburg-Essen and Max Planck Society; Technical University of Denmark; and the U.K.’s University of Dundee and University of Leeds.
Of the Lead Factory’s budget, €91 million ($121.9 million) will be provided as in-kind contributions from participating companies belonging to the European Federation of Pharmaceutical Industries and Associations (EFPIA). Another €80 million ($107.2 million) will come from the European Commission’s Seventh Framework Programme for Research (FP7), with the remaining €25 million ($33.5 million) to be contributed by non-EFPIA participants.
Fujitsu Australia LifeBook T2010 (3.5G)
Pros: Excellent handwriting recognition, active digitiser tablet, fast networking and 3.5G capabilities are built in, good battery life, bi-directional hinge
Cons: Screen orientation wasn't saved and had to be reset each time in tablet mode, Omnipass consumes half the CPU while it's running, QuickPoint pointing device is uncomfortable to use, no optical drive
Bottom line: For the travelling business professional, the T2010's 3.5G and tablet features, spill-resistant keyboard and overall build quality are strengths. It performs everyday applications and handwriting well, but it has a few quirks.
With a SIM card facility and 3.5G antenna built in, as well as WACOM touch-screen technology, this sub-2kg tablet-convertible notebook is very easy to use and fully equipped to connect to the Internet using a mobile data plan.

Physically, the LifeBook T2010 has a 12.1in touchscreen with a 1280x800 resolution and a sturdy, bi-directional hinge. That means you can turn it either way without any fear of turning it the wrong way. The notebook's base measures 30 by 22 centimetres, and is 2cm thick — it doesn't house an optical drive. Its 3.5G antenna sticks out of the right-hand side of the screen, and this can also be used as a grip when the screen is in a tablet position.

There's not much to the base: it has two USB 2.0 ports, a mini-FireWire port and a PC Card slot for expansion. You'll also find an SD/MS memory card slot, a D-Sub port and a gigabit Ethernet port. On the inside it has Bluetooth 2.0 and 802.11a/b/g/draft-n networking (with MIMO) as well as a 160GB hard drive.

For such a cutting-edge model, it's surprising that an ExpressCard slot has been omitted, but Fujitsu is perhaps betting that business users might have requirements for older add-in cards instead. An ExpressCard slot might not be required unless you want to use e-SATA storage or a non-USB-based TV tuner.

The transition between an on-the-road ultra-mobile notebook and an office-dwelling machine can be made swiftly if the optional docking station is purchased, which gives the T2010 an optical drive, as well as facilities to easily connect external peripherals and a monitor.

Its keyboard is spill resistant and features full-sized keys, which are soft and have plenty of travel. They're easy to type with, but the Delete key is not in the expected position on the keyboard. The palm-rest area is adequate and its middle portion is actually the removable 6-cell lithium ion battery. There's no space for a touchpad, so Fujitsu has installed a TrackPoint-like device instead, which it calls Quick Point. Unfortunately, it's a little uncomfortable to use, mainly due to the button design — the left and right buttons are stiff, shallow and slope rearwards.

Handwriting recognition was almost perfect — even without any training, it recognised printed and cursive text — and the pen was responsive and accurate. If you're new to tablets, then this one will leave a good impression, and once you get the hang of Vista's tablet functions for inserting text into documents and Web browser fields, you'll never want to use it like a regular notebook again.
Fujitsu has employed active digitiser technology from WACOM, so only the pen will have an effect on the touchscreen, not your hands. This means you can write very comfortably by leaning on the screen, and it also means that you can 'hover' the pen over screen items to view tool-tips.

While it was very accurate at selecting even the smallest icons in the system tray, as well as links in Web pages, it wasn't accurate at the extreme corners of the screen. For example, the pen needs to be placed slightly outside the screen's perimeter in order to hit the 'close window' button in maximised windows. We also found its gestures to be a little hit-or-miss. Using pen flicks, you should be able to scroll and navigate backwards and forwards in Windows, but this didn't always work perfectly in our tests.

The screen's orientation feature never remembered our preferred position. We had to change the orientation each time we switched on the T2010 in tablet mode.

On the screen's bezel, there is a dedicated button for changing the screen's orientation in tablet mode. As well, there are other buttons for invoking the Task Manager, for scrolling, and for bringing up the Fujitsu Menu, from which you can adjust brightness and power options among other things.

As for performance, the T2010 has an Intel Core 2 Duo U7600 ultra-low voltage CPU, which runs at 1.2GHz, and it offers plenty of grunt for running productivity applications and handwriting recognition. Additionally, there is 2GB of RAM installed, which means you can load plenty of programs simultaneously and switch between them at your leisure. Of course, the T2010 relies on integrated Intel graphics, so it isn't designed to run 3-D applications effectively.

In our WorldBench 6 tests, the T2010 scored 54, which is a low result. However, this result was influenced by the 3dsMax 3D rendering tests, which the CPU took a very long time to complete. In the file compression, office application, Web browsing, and even the media encoding and multitasking tests, the T2010 performed like a typical dual-core notebook.

We did notice one quirk with the machine's performance that could hinder its speed, not to mention its battery life: the Omnipass application — more specifically, a task called 'secureapp.exe', which is used in conjunction with the fingerprint reader to remember and give you quick access to log-in information for Web sites, as well as encrypted files — was always using 50 per cent of the CPU. If you don't want to use Omnipass, you can kill this process, but if you rely on Omnipass, then this CPU drain will be annoying.

Away from an outlet, the T2010 will last at least three to four hours if you use the 'power saver' battery plan. Using this plan, the machine will still be capable of recognising handwriting without slowing down the rest of the system. After the machine has been running for a while, its cooling fan will kick into gear, and this can be quite audible at its highest speed.
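As an aside, this sort of per-process CPU drain is easy to confirm for yourself. The snippet below is a small illustrative sketch rather than part of the review: it uses the third-party psutil library to sample how much CPU a named process is consuming. Only the process name 'secureapp.exe' comes from the review; everything else is an assumption for demonstration purposes.

```python
# Illustrative sketch (not from the review): sample the CPU usage of a named
# process with the third-party psutil library. Only the process name comes
# from the review text; the rest is assumed for demonstration.
import psutil

SUSPECT = "secureapp.exe"  # process name mentioned in the review

for proc in psutil.process_iter(attrs=["pid", "name"]):
    name = proc.info.get("name")
    if name and name.lower() == SUSPECT:
        # Sample over one second. Note that psutil reports usage as a percent
        # of a single core, so a process pegging one core of a dual-core CPU
        # can read close to 100 here even though Task Manager shows ~50%.
        usage = proc.cpu_percent(interval=1.0)
        print(f"{name} (pid {proc.info['pid']}) is using about {usage:.0f}% CPU")
```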
Fossil Specimens of Marine Creatures (2/5)
Needlefish
Age: 95 million years
Period: Cretaceous
Location: Haqel, Lebanon
The eyes, fins, gills, digestive systems, reproductive systems—in short, all the features of all the guitar fish that have ever lived throughout the course of history—have been fully formed, unique and ideally structured. In addition, these structures’ present-day forms are identical to what they were tens of millions of years ago.
According to Darwinist claims, however, these fossils should present a diametrically opposite picture. The fossil record should be full of "half-needlefish." The fact that fossils do not fit the Darwinian picture, and actually exhibit structures that argue the exact opposite, is an expression of the dire straits into which the theory of evolution has fallen.
Squid (with its pair)
Location: Haqel, Lebanon
Darwin knew that his theory could be verified only by the fossil record, for which reason he pinned great hopes on paleontological research. In one part of his book he said:
". . . if my theory be true, numberless intermediate varieties, linking closely together all the species of the same group, must assuredly have existed. . . . Consequently evidence of their former existence could be found only amongst fossil remains . . ." (Charles Darwin, Origin of Species, p. 179)
Yet no intermediate-form fossil has been found over the 150 or so years since Darwin’s day. So his claims have never been verified and confirmed. Fossils have buried Darwin’s theory of evolution, whose invalidity is now a proven fact. One such fossil is this 95-million-year-old fossil squid, identical to living present-day specimens.
Age: 206 to 144 million years
Period: Jurassic
Location: Solnhofen, Bavaria, Eichstatt, Germany
The shrimp pictured is some 200 million years old. Shrimps, having remained unchanged for all that time, tell us that no evolutionary process ever happened.
The fossil record deals one of the heaviest blows to the theory of evolution, because:
1. Evolutionists maintain that living things progress from the primitive to the more advanced by undergoing a constant succession of small changes. Fossil findings, however, prove that living things undergo not the slightest change over even hundreds of millions of years.
2. Evolutionists maintain that all living things are supposedly descended from a common ancestor. Yet to date, not a single fossil has been unearthed that can be regarded as the forebear of any other living thing.
3. Evolutionists say that life forms are descended from one another, via intermediate forms. Yet from among all the millions unearthed as the result of research over the last 150 years, not a single intermediate form fossil has ever been discovered to indicate this.
Catshark
The fossil record reveals that living things do not change, as long as they remain in existence. The 95-million-year-old catshark pictured is one of those life forms that have not altered over millions of years. This means that evolution—which maintains that living things are in a constant state of change and progress from the primitive to the more developed—is invalid. In actuality, evolution's claims regarding the origin of life do not reflect the facts, as is expressed in the book Integrated Principles of Zoology, jointly authored by three evolutionist biologists:
"Many species remain virtually unchanged for millions of years, then suddenly disappear to be replaced by a quite different . . . form. Moreover, most major groups of animals appear abruptly in the fossil record, fully formed, and with no fossils yet discovered that form a transition from their parent group." (C.P. Hickman [Professor Emeritus of Biology at Washington and Lee University in Lexington], L.S. Roberts [Professor Emeritus of Biology at Texas Tech University], and F.M. Hickman, Integrated Principles of Zoology, St. Louis: Times Mirror/Moseby College Publishing, 1988, p. 866)
One of the characteristics of guitarfish, members of the sub-order Rhinobatoidei, is their guitarlike body shape. They generally live at the bottom of tropical seas, close to the shoreline.
The fossil pictured shows that guitarfish have remained the same for 95 million years, condemning evolutionists to a profound silence. These creatures, which have survived unchanged for tens of millions of years, demonstrate that evolution never happened to them and that they were created by Almighty God.
Location: Solnhofen Formation, Bavaria, Germany
One common tactic that evolutionists employ is to distort or carefully conceal those fossils that represent indisputable proof of Creation. Although the fossil record shows that evolution never took place, they determinedly ignore this fact.
The American paleontologist S. M. Stanley describes how facts revealed by the fossil record are ignored by the Darwinist dogma that dominates most of scientific world:
"The known fossil record is not, and never has been, in accord with gradualism. What is remarkable is that, through a variety of historical circumstances, even the history of opposition has been obscured. . . . as the biological historian William Coleman has recently written, "The majority of paleontologists felt their evidence simply contradicted Darwin's stress on minute, slow, and cumulative changes leading to species transformation." . . . but their story has been suppressed." (S. M. Stanley, The New Evolutionary Timetable: Fossils, Genes and the Origin of Species, N.Y.: Basic Books Inc., 1981, p. 71)
However, these Darwinist efforts to silence dissent are now of no avail. It is no longer possible to conceal the fact of Creation revealed by fossilized shrimp like this one pictured, some 200 million years old.
This is a fossil with both negative and positive slabs.
The coelacanth is a large fish some 150 centimeters in length, whose body is all covered by thick scales reminiscent of armor. It is a member of the class of bony fishes (Osteichthyes), of which the earliest fossils are found in strata belonging to the Devonian Period (417 to 354 million years ago). For years, evolutionists portrayed fossils belonging to this vertebrate as belonging to an intermediate form, until the capture of a live coelacanth invalidated such claims. Research into the fish's anatomy again inflicted a major defeat on Darwinists.
In an article in Nature magazine, an evolutionist paleontologist named Peter Forey said this:
"The discovery of Latimeria [coelacanth] raised hopes of gathering direct information on the transition of fish to amphibians, for there was then a long-held belief that coelacanths were close to the ancestry of tetrapods. . . . But studies of the anatomy and physiology of Latimeria have found this theory of relationship to be wanting and the living coelacanth's reputation as a missing link seems unjustified." (P. L. Forey, Nature, Vol. 336, 1988, p. 727)
The latest information regarding the complex structure of the coelacanth continues to pose difficulties for evolutionists. This problem was expressed in Focus magazine:
"According to fossils, fish emerged some 470 million years ago. The coelacanth emerged 60 million years after that. It is astonishing that this creature, which would be expected to possess very primitive features, actually has a most complex structure." (Focus, April 2003)
For evolutionists insisting on a gradual process of evolution, the appearance of the coelacanth with its complex structure naturally came as a major surprise. Yet there is nothing surprising about this at all. Any rational person is able to understand that God creates all living things, together with their complex structures, in the form and at the time He so desires, and in a single moment. The entities flawlessly created by God are all means by which His might and power can be appreciated.
Location: Hjoula, Byblos, Lebanon
Evolutionists cannot point to even one of the countless stingray fossils unearthed as evidence for their claims. No stingray with supposedly primitive, semi-developed features belonging to two different life forms has ever been encountered. Every stingray fossil discovered belongs to a creature that was identical to stingray alive today and had exactly the same characteristics. This goes to show the invalidity of the claim that species are descended from one another and that life forms developed by way of small, gradual changes.
The 100-million-year-old stingray fossil pictured proves once again that living things did not evolve, but were created.
The crayfish pictured is 95 million years old, and there is no difference between it and crayfish living today. These invertebrates, which have undergone not the slightest change in the intervening 95 million years, show that evolutionists' claims are fantasies, products of the imagination, and that scientific data and findings do not support them in any way.
Due to their materialist perspectives, Darwinists have a habit of making various assumptions and adorning them with Latin words and scientific terms difficult for ordinary members of the public to understand, presenting them as if they were scientific facts. The fact is, however, that the evidence showing the invalidity of evolution is perfectly clear. Even by a child of primary school age can easily understand it. One of these pieces of evidence is the fossil record. The absence of any difference between living things that existed hundreds of millions of years ago and specimens alive today totally undermines the concept of evolution.
Period: Carboniferous
Location: St. Louis Formation, St. Louis, Missouri, USA
Sea urchins are free-moving, spiny invertebrates. Their entire bodies are covered in spines. A roughly 300-million-year-old sea urchin defies all evolutionist claims regarding the origins of life.
But sea urchins are by no means the only living things to invalidate evolutionists' claims. The fossil record is full of fossils of plants and animals that have undergone no changes. There is no evidence of any half-developed or deficient forms, despite the passage of very lengthy geological ages. Evolutionists have no rational and scientific answer for how or why living things have remained unaltered for so long. Yet for people who have not been taken in by Darwinist preconceptions, the answer is clear: Living things never evolved, but were all created by our Lord, God.
Despite all the findings and evidence, Darwinists refuse to admit that Darwinism has been defeated by scientific findings. They still blindly espouse claims first put forward under the primitive level of scientific knowledge in the 19th century. They turn their backs on all the scientific data out of ideological concerns and various preconceptions, and resort to hoaxes, distortions and irrational and illogical explanations.
However, the millions of fossils unearthed over the last 150 years make it impossible for them to defend the theory of evolution any longer. Each and every fossil shows that living things have remained unchanged for millions of years—in other words that they never evolved, and that Creation is the origin of life. One such fossil is the 95-million-year-old stingray pictured here.
Age: 37 to 23 million years
Period: Oligocene
Location: Carpathian Mountains, Rowne, Poland
These fish, members of the Perciformes (perchlike fishes) order, are classified under the family Serranidae. This roughly 30-million-year-old fossilized sea bass, identical in terms of its appearance and structural characteristics to those fish living today, is one of the proofs that invalidate the theory of evolution.
Like all their other theses, Darwinists' claims regarding the "evolution" of fish are nothing more than fairy tales, with no scientific foundations. When we examine the evolutionist literature, we never encounter even a claim regarding any potential intermediate forms. Evolutionists have no fossil findings they can use to support the idea that invertebrate organisms developed into fish.
According to the fossil record, life forms emerged independently of one another, each one in a single moment, and with no line of familial descent between them. Fish, for example, did not emerge from invertebrate life forms, as evolutionists maintain, and neither did they later turn into reptiles.
In his 1991 book Beyond Natural Selection, the American paleontologist R. Wesson describes what the fossil record tells us about the emergence of life:
"The gaps in the fossil record are real, however. The absence of a record of any important [evolutionary] branching is quite phenomenal. Species are usually static, or nearly so, for long periods . . ." (R. Wesson, Beyond Natural Selection, Cambridge, MA: MIT Press, 1991, p. 45)
Sand Fish
Despite having been scientifically discredited, the theory of evolution is kept constantly on the agenda of certain circles. Accompanied by drawings of imaginary half-man, half-ape creatures of no scientific validity, reports headlined "Missing Link Found!" announce every new fossil discovery. Captions read, "Our ancestors were microbes," "We are no different from apes," "Did we come from space?" and "Evolution in test tubes". The theory of evolution is constantly depicted as having solid evidence to support it, one that can explain every aspect of human life.
The fact is, however, that fossils demonstrate that such reports and the claims associated with them are mere nonsense. As with the 95-million-year-old sand fish fossil pictured, all fossils reveal that living species have not changed at all over millions of years—in other words, that they never evolved. Faced with this reality, evolutionist propaganda is seen as nothing more than helpless posturing.
The fossil in the illustration is a mirror-image one, traces of which can be seen on both surfaces of the split rock.
These fish, members of the order Anguilliformes (true eels), are classified under the family Congeridae (marine eels). The fossil in the picture proves that eels have not undergone the slightest alteration over 95 million years. It's just one of the other millions of fossil species that undermine Darwinism. Fossil research over the last 150 years or so has revealed not one single fossil to support evolutionists' theories. On the other hand, countless fossils prove that living things appeared suddenly, complete with all the features they possess, that they have not changed over millions of years—and that they were created, rather than having evolved.
Lady Fish (Elopidae)
The evolutionist fossil expert David Pilbeam admits that fossil findings invalidate the theory of evolution:
"If you brought in a smart scientist from another discipline and showed him the meagre evidence we've got he'd surely say, 'forget it; there isn't enough to go on.'" (Richard E. Leakey, The Making of Mankind, Barcelona: Sphere Books Limited, 1982, p. 43)
It is meaningless for Darwinists to refuse to see the groundlessness of their theory. Fossil discoveries have demolished the theory of evolution. The Elopidae fossil pictured, aged 95 million years, is one of these findings that defeat evolutionists' claims.
Pipefish (Syngnathoidei)
Age: 5 to 1.8 million years
Period: Pliocene
Location: Marecchia River Formation, Poggio Berni, Rimini, Italy
One of the most important features of pipefish is the long, tubular structure on the end of their mouths. With their structures that have remained unchanged for millions of years, these marine fish, members of the suborder Syngnathoidei, are a challenge to the theory of evolution. Even if Darwinists continue to make every effort to distort the facts or resort to hoaxes, they can no longer conceal the facts revealed by the fossil records. Fossils state that living things did not evolve, but were created.
Location: Solnhofen Formation, Germany
With their characteristics that have gone unaltered for millions of years, horseshoe crabs, members of the subphylum Chelicerata, are among those life forms regarded as "living fossils," even by evolutionists. Horseshoe crabs living in the Jurassic Period, approximately 150 million years ago, are identical to those living along seacoasts today. This lack of differences demolishes evolutionist claims and once again proves that the thesis of the evolution of living things is a ridiculous myth.
Science irrefutably reveals that living things are the work of Almighty and All-Powerful God.
In claiming that all species multiplied by having evolved from one another over long periods of time, Darwinists never stop to consider that almost all the main categories of species known today emerged suddenly and at the same time in the geological age known as the Cambrian Period, 530 to 520 million years ago. They fail to understand that none of the living things whose remains are preserved in the fossil record underwent any change, and that this fundamentally demolished the theory of evolution.
Yet even if evolutionists refuse to think and understand, fossil findings such as the 95-million-year-old sand fish pictured here reveal the invalidity of evolution for all to see.
The fossil pictured has both positive and negative slabs.
Tomorrow's World: Are These The Innovations Of The Future?
For those brought up in the era before on-demand TV and before even Channel 4 flickered into life, programmes such as Tomorrow's World offered an intriguing glimpse of an otherwise-unseen future packed full of useful technology and innovation.
The enthusiasm of presenters, including the great Raymond Baxter and Maggie Philbin, was often overshadowed by bearded inventors who looked as though they'd been held hostage in some dark rooms for a very long time, but the programme had a charm and truly foresaw many everyday items that we now take for granted. Where would we be without ATMs, home computers, digital cameras, compact discs (themselves in turn superseded by new technologies), global positioning systems (GPS) and even the Channel Tunnel?
Of course, for every idea that came to fruition, many didn't see the light of day or are still being talked about and developed. We've come up with our own list of innovations that we'd like to see entering the world of the mass consumer in the years to come and some that may prove perhaps more controversial.
The flying car
Often seen as the way forward to solve the problems of increasingly clogged-up city roads, the idea of the flying car is something that refuses to disappear in spite of the impracticalities and inherent dangers of letting people loose in the skies. Harrison Ford (Blade Runner) and Bruce Willis (The Fifth Element) both offered glimpses of city life with such vehicles. Hmm, perhaps a good reason to keep our feet on the ground, after all. For those keen to get higher than the rest of us, here's one of the latest prototypes - the Terrafugia Transition flying car on display at the New York International Auto Show last month.
Traffic management systems
Failing the successful introduction of flying cars, traffic systems in cities clearly need a new solution for reasons of both environment and congestion. Already vehicle-sharing companies are gaining in popularity, and networks incorporating cars with bikes are familiar in many European cities, with London seeing a huge rise in two-wheeled transport thanks to the introduction of 'Boris Bikes'. Eco-friendly cars and buses also populate the streets, but the ability to keep traffic moving is still an issue, and the creation of an active traffic management system that is 'aware' of changing situations and can act on them to keep traffic flowing is still some way off.
Personal robots
Long a staple of science fiction and comedy - Woody Allen's Sleeper a case in point - the idea of a personal servant that doesn't moan, doesn't need paying and will satisfy every whim is something that has never disappeared. In fact, things have got better with age as the Asimo robots demonstrate. Still a long way off mass production, these incredibly advanced automatons are almost childlike in their interactions. The latest models can now run faster, balance on uneven surfaces, hop on one foot, pour a drink and even almost "think" on their own. One for every home? Quite possibly, but let's hope they're not just making the daily cuppa.
A cure for cancer
Ok, so this is the 'Big One' and not your usual tech innovation, but along with HIV and other stubborn diseases or viruses, it's a disease that is still killing millions. Leading scientists and medics have often predicted time scales for cures, but to eradicate these illnesses in the same way that made smallpox a distant memory is taking longer, in spite of many 'advances' and innovations.
Defeating ageing
One widely held belief about people who enjoy going under the knife in the search for eternal youth is that they aren't fooling anybody, except perhaps themselves. But that doesn't mean the quest to defeat ageing is some sort of false idolatry; the long-established and successful cosmetics industry is a clear sign of what people want. But aside from looking good, science is helping to combat the effects of ageing with developments such as artificial limbs with tactile sensations, retinal implants to help restore vision and a 'spinning heart' with no pulse or blood clots. How soon before parts can be mail ordered?
Energised
Anyone who's watched Star Trek ("Beam me up, Scotty") or Harry Potter (port keys) will have hoped that one day they will be able to transport themselves from one place to materialise in another. Complete science fiction/wizardry or is there real potential? We think this is the longest shot in our list, but really want to believe it will happen.
Personalised medicos
The idea of self-treatment for medical purposes is nothing new but doctors are always on the lookout for innovations that can make their jobs easier and improve the health and welfare of their patients without the need for time-consuming trips to their GP or hospital. Miniature medical equipment is one such advance and it may not be long before we see tiny implants that can test, diagnose and even alert medics to potential illnesses or problems. There have been reports of researchers in the Netherlands who have created a pill that can be sent to a particular part of the body where the attached medicine is needed. That could be one NHS reform to attract universal approval.
Driverless transport
Another favourite amongst inventors and engineers, the mass-market driverless car is not as far away as you might think and could prove to be a vital innovation for the urban dweller. For example, General Motors has produced EN-V, the electrically-networked vehicle designed to support those in urban areas where space is limited and driving is often for short distances at low speed. Three bubble-shaped models - Jiao, Xiao and Miao - have been developed, which can all be programmed via the HMI (they don't have steering wheels) to take you where you want to go, or perhaps to drop the kids off at school and drive themselves back home. What's not to like about that?
Tomorrow's World?
An Asimo in every home?
There's little this fella struggles with. Considering he can now walk up stairs, run and fully interact with us, what would you have him doing?
King Tutankhamun’s Tomb: Radar Scans Search For Secret Chambers
Exclusive images and footage of radar scanning being carried out in King Tutankhamun’s tomb have been released as scientists try to identify whether secret chambers lie hidden behind its walls.
Archaeologists and experts from around the world have been invited to examine and analyse the new data from the scans at a conference in Cairo in May.
The exploration was prompted by a theory by British Egyptologist Nicolas Reeves that undiscovered chambers lie behind the tomb's western and northern walls and that they may well contain the tomb of Queen Nefertiti, one of pharaonic Egypt's most famous figures.
Kenneth Garrett/ National Geographic/ Egyptian Ministry of Antiquities
National Geographic technicians Eric Berkenpas and Alan Turchik prepare the radar unit to scan the tomb’s walls
The work is being carried out by a team sponsored by the National Geographic Society and has seen the walls scanned at five different heights, switching between two radar antennae with frequencies of 400 and 900 megahertz.
“One was for depth perception and one was for feature perception,” said National Geographic electrical engineer Eric Berkenpas, who was accompanied by Alan Turchik, a mechanical engineer.
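For a rough sense of why two frequencies are useful — this is general ground-penetrating-radar reasoning, not a specification of the equipment used, and the quote does not say which antenna served which purpose — lower frequencies have longer wavelengths, which typically penetrate deeper but resolve less detail, while higher frequencies give finer resolution over shallower depths. The free-space wavelengths follow from λ = c / f:

```python
# Back-of-the-envelope check (general physics, not equipment specifications):
# wavelength = speed of light / frequency. Longer wavelengths generally
# penetrate further; shorter wavelengths resolve finer features. Inside rock
# the wavelengths shorten further, but the relative comparison holds.
C = 299_792_458  # speed of light, m/s

for freq_hz in (400e6, 900e6):
    wavelength_m = C / freq_hz
    print(f"{freq_hz / 1e6:.0f} MHz -> free-space wavelength of about {wavelength_m:.2f} m")
# 400 MHz -> ~0.75 m; 900 MHz -> ~0.33 m
```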
Preliminary scans whose results were announced last month suggested two open spaces with signs of metal and organic matter.
If chambers - whether containing Nefertiti's tomb or not - are discovered behind the western and northern walls covered in hieroglyphs and bas-reliefs in Tut's tomb, it would be among the biggest discoveries in Egyptology since Howard Carter first found the king's 3,300-year-old burial chamber and its treasures in 1922.
However, antiquities Minister Khaled el-Anani, who was appointed to his post last week, urged caution.
He said Egypt's "scientific credibility" and the preservation of its antiquities were at stake, adding: "We will rely only on science going forward. There are no results to share at the current stage, but only indications. We are not searching for hidden chambers, but rather we are scientifically verifying whether there are such rooms."
Mohamed Abd El Ghany / Reuters
The golden mask of King Tutankhamun is displayed inside a glass cabinet at the Egyptian Museum in Cairo, Egypt
"We are looking for the truth and reality, not chambers."
Another radar scan will be carried out at the end of the month. It will be done vertically from atop the hill above the tomb, using equipment with a range of about 40 meters.
Harvard University Egyptology professor Peter Der Manuelian, who is not involved in the project, said the Valley of the Kings is "notorious for containing fissures, cracks" that complicate interpreting the scans. "So the more scans we do, and from different angles and directions, inside and outside the tomb, the better," he told The Associated Press.
Even if the spaces are rooms, they could be undecorated small rooms for holding embalming materials, he said - or, more dramatically, "the beginning of a larger floor plan."
POOL New / Reuters
The Nefertiti bust, which is kept at a museum in Berlin
"We'll have to be patient. In the meantime, kudos to Nick Reeves for pointing out the presence of these anomalies and for sharing them with the world."
Reeves' theory was prompted by the unusual structure of Tut's tomb. It is smaller than other royal tombs and oriented differently. Furthermore, his examination of photos uncovered what appear to be the outlines of a filled-in doorframe in one wall.
He has speculated that Tutankhamun, who died at age 19, may have been rushed into an outer chamber of what was originally Nefertiti's tomb. Nefertiti was one of the wives of Tut's father Akhenaten, though another wife Kia is believed to be Tut's mother.
"We have a theory, and now what we're trying to do is test it. And, I , if I am right, fantastic, if I am wrong, I've been doing my job, I've been following the evidence trail, and seeing where it leads," Reeves told the AP.
El-Anani said Egyptologists and Valley of the Kings experts will discuss on May 8 the findings of the scans in a previously scheduled conference devoted to King Tut to be held at Egypt's new national museum near the Giza Pyramids outside Cairo.
There, they can discuss the findings. The outcome, he said, will guide what course of action Egypt takes.
The Valley of the Kings was one of the main burial sites for ancient Egypt's pharaohs, located among the desert mountains across the Nile River from Luxor, the site of the monumental temples of Thebes, one of the pharaonic capitals.
Tut's was the most intact tomb ever discovered in Egypt, packed with well-preserved artifacts. But he was a relatively minor king ruling for a short period at a turbulent time.
Nefertiti was the primary wife of the Pharaoh Akhenaten, who unsuccessfully tried to switch Egypt to an early form of monotheism. Akhenaten was succeeded by a pharaoh referred to as Smenkhare. Reeves believes Smenkhare and Nefertiti are the same person, with the queen simply changing her name during her rule.
Not long after Tut died in 1323 B.C., his family was overthrown by a general, ending the 18th Dynasty that had been in power for 250 years.
John Darnell, professor of Egyptology at Yale University, said Tut's tomb is "somewhat anomalous due to its small size ... But the question is: Was Tutankhamun's tomb small, or do we have only a portion of a larger tomb?"
The latest scans were carried out over 12 hours along five different levels of the walls, producing 40 scans. The data will be analyzed by U.S.-based experts, but the results will not be known for at least another week.
"Technology is beginning to open doors that were permanently locked, or seemed permanently locked or maybe we did not know it existed," said Terry D. Garcia, chief science and exploration officer for National Geographic. "It is creating a revolution ... and it is going to result in the 21st century being the greatest in exploration in the history of mankind and we are just scratching the surface."
The mystery is also a golden opportunity for Egypt to boost its deeply damaged tourism industry by drawing world attention to its wealth of pharaonic antiquities.
But any benefit from the discoveries may be slow coming, with Egypt still facing turmoil, including a deadly fight against Islamic militants in the Sinai.
Pharaonic sites were once Egypt's main draw. But cities like Luxor have suffered heavily from the plunge in tourism. Now, visits to Egypt's beaches have also been devastated since the crash of a Russian airliner in October over the Sinai Peninsula that killed all 224 people onboard.
Russia said it was downed by an explosive device and suspended all flights to Egypt. Britain suspended all flights to Sharm el-Sheikh, the Egyptian Red Sea resort from which the doomed aircraft took off shortly before it crashed.
Autism Link Found In Gene Mutations From Father: Study
By Amir Khan | 04/05/12 AT 3:34 PM
[Photo: A mother helps her autistic son to correct the direction in a relay race during a therapy session at the Stars and Rain School for autistic children in Beijing. Autism may be caused by mutations in genes inherited from fathers, according to three new studies. Credit: Reuters]
The number of Americans diagnosed with autism has skyrocketed since 2002 -- it now affects one in every 88 children. One of the biggest barriers to finding an effective treatment is that the cause of autism isn't known. But biotechnology advances are shedding new light on the possible role of genetics.
"Ten years ago [it was like] we were looking through binoculars, then we were looking at autism through a microscope, and now it's like looking at it in high definition," Andy Shih, vice president of scientific affairs for the advocacy group Autism Speaks, told CNN.
Now three separate studies, all published on Thursday in the journal Nature, have identified several gene mutations, many inherited from fathers, that may play a role.
"There's no one gene that causes autism," James Sutcliffe, co-author of one study and an associate professor of molecular physiology at Vanderbilt University, told CNN. "But what these studies do show is that several genetic mutations increase the risk of getting autism, and different mutations may affect people in a different way."
The genes are likely inherited through the paternal line because as a man ages, his sperm cells are constantly dividing, which provides more opportunity for mutations to occur. Women, on the other hand, are born with all of the egg cells they will have for life.
Each study found multitudes of mutated genes in autistic children. But among the three, only two genes were found in more than one patient, suggesting to researchers that they may be the biggest risk factors.
"That is like throwing a dart at a dart board with 21,000 spots and hitting the same one twice," Dr. Matthew State, co-author of the second study and a professor of psychiatry at Yale, told the New York Times. "The chances that this gene is related to autism risk [are] something like 99.9999 percent."
But some researchers aren't considering this the breakthrough the study authors claim.
"This is a great beginning, and I'm impressed with the work, but we don't know the cause of these rare mutations, or even their levels in the general population," Aravinda Chakravarti, a researcher at the Institute of Genetic Medicine, told the New York Times. "I'm not saying it's not worth it to follow up these findings, but I am saying it's going to be a hard slog."
Approximately 15 percent of all autistic children will have some form of these genetic mutations, according to one of the studies. But researchers said diagnosing autism by the gene mutations is still a long way off.
"The genes highlighted are clearly the most convincing susceptibility genes that have been identified so far, but they only explain a small picture of autism," Dr. Mark Daly, coauthor of Sutcliffe's study and chief of the Analytic and Translational Genetics Unit at Massachusetts General Hospital, told ABC News.
Future research will focus on identifying other mutations that could play a role, researchers said.
"Now we have a real path forward," State told Time Magazine. "As you start to accumulate these individual genes that you know are related, that opens the door to understanding the biology. Once you know the gene, you can examine the protein it makes. Then, even in people who don't have the damaged gene, adding that protein may help because a deficit in that protein might result in their autism via a different pathway."
Masahiro Sakurai / 20 Feb 2008 GDC 2008: Sakurai on Super Smash Bros. Brawl
The director of Brawl talks Wii voice chat, downloadable content, why there aren't more crossover characters and Smash DS. By Matt Casamassina The name Masahiro Sakurai is well known to die-hard Nintendo fans. A 13-year HAL Laboratory veteran, he helped shape the course of the Super Smash Bros. franchise and has more recently just completed work on the enormously anticipated Wii fighter Super Smash Bros. Brawl. We had the chance to catch up with Sakurai at the Game Developers Conference 2008 and naturally we had a flood of questions for him. In the paragraphs below, Sakurai speaks to topics ranging from why Brawl doesn't include voice support to the future of the franchise. IGN: How big was Sora at Brawl's development and are you an exclusive Nintendo developer? Masahiro Sakurai: First of all, I'm very impressed that Sora is so well known. Really, Sora is at its core just two people. When making Brawl, what basically happened was through a connection with Nintendo we rented out an office in a portion of Tokyo and had a lot of staff come in. However, it's really important to realize that Sora really is just two people and that everybody else is not part of Sora technically speaking. It's kind of strange answering this question here in the Nintendo booth because I'm not sure this is the best place to do it, but no, I have no particular ties to them and, of course, any company that comes along to me afterwards and says, "Hey, we have an interesting project for you," I'm going to look at that and going to help with that job. So, no. IGN: Do you think Mr. Iwata would track you down if you tried to make a game for a non-Nintendo system? Masahiro Sakurai: Ah, I'm not so sure about that. I would imagine he would probably understand, but I suppose the possibility exists. IGN: Now that Brawl is complete, how do you feel about the game? What are you most proud of and is there anything you would change if you had more time? Masahiro Sakurai: As far as tailoring or trying to get more down now that it's finished, looking back on it, I could have had another year or another two years and it wouldn't have changed the fact that I never have enough time to adjust or refine -- that's just the way it is for me. Thinking about the typical development process, the trend seems to be that when you finish a game that you've been working on for so long and breaking your back over, you kind of don't want to look at it anymore. You're done with it. But seeing Smash, I really, truly feel that this is a game that I'm really not tired of yet. I'm really enjoying it. I feel relieved in knowing that this game has that kind of staying power for myself and I'm very proud of that. Masahiro Sakurai IGN: What would be that one specific feature you wish you could have added? Masahiro Sakurai: I feel like anything I've considered I'm probably going to think about when -- I mean if -- another game was to come out in the series and so I'd like to keep a lid on that for now. and not disclose anything I've been thinking about there. IGN: Brawl seems like the perfect game to introduce a Wii headset for voice chat. Why have you avoided this? Masahiro Sakurai: Well, when I first started making Smash Bros. Brawl, I thought it would be wonderful if online battles between friends had voice chat and potentially keyboard based chat as well. 
But there are all sorts of rules and regulations regarding communication on the Wii platform and so it was apparent to me that it just wouldn't come together, we weren't going to be able to do it, so we decided to cancel that feature. I'm very sorry about that. But if you're really desperate for it, you could set up Skype by your game station and go at it with a friend if you like.
Conservationist's claims that European eels are 'critically endangered' challenged by trader
Peter Wood, who owns the Gloucester-based UK Glass Eels, says that since the IUCN designation in 2010, the market has shrunk by 50 per cent
By Lewis Smith
Saturday 9 August 2014 23:00 BST
[Photo: Common eel elvers have enjoyed a resurgence in recent years. Credit: Alamy]
The conservation body that assesses the level of threat to animal species has been reported to the Advertising Standards Agency (ASA) by a company that sells eels – for describing the fish as "critically endangered".
The classification, by the International Union for Conservation of Nature (IUCN), suggests the European eel is at greater risk of extinction than giant pandas, blue whales and mountain gorillas, all considered to be merely "endangered".
Peter Wood, who owns the Gloucester-based UK Glass Eels, which sells eels as food, said that since the designation in 2010 the European market for eels had shrunk by 50 per cent.
"There's a lot at stake," he said. "The eel classification is damaging. It's not a balanced view and it's not evidence-based. The giant panda is 'endangered', but it's at far greater risk than the eel – there are only 2,000 of them left. For the eel, we are talking about a population of hundreds of thousands of millions."
In a letter to the ASA, he wrote: "It [the IUCN] has a strong bias towards conservation. Its funding is dependent on a conservation agenda and it is supported in part by conservation NGOs such as WWF."
Eels suffered a steep decline from the early 1980s with a severe drop in the number of glass eels – young eels, known as elvers, or glass eels, from their transparent appearance – reaching European shores from the Sargasso sea, where they are believed to spawn. Over the past few years, numbers have started rising again. On the river Severn, 30 million were caught this year compared to one million in 2009.
A spokeswoman from the ASA said: "The complainant is concerned that the IUCN's advertising, specifically their Red List, which lists the European eel as critically endangered, is misleading."
The IUCN did not respond to a request for comment, but its website says there have been "substantial declines" of 90 to 95 per cent "in recruitment of the European eel across wide areas of its geographic range" during the past 45 years. However, it admits that there is a shortage of information about the state of the eel population's health.
Putting the Data Together
Although languages may differ from one country to another, the weather recognizes no national border. The United Nations set up an agency called the World Meteorological Organization (WMO) to see that the weather information is collected and transmitted with uniformity from one part of the world to another. The WMO consists of 130 nations. Each is required to collect data at certain times and transmit the information in a universal code. The times for collection are based on Universal Time (UT), or Greenwich Mean Time (GMT), which is the local time at the Greenwich Observatory in England. The radiosonde data is collected at 1200 GMT (noon at Greenwich) and 0000 GMT (midnight). Local times depend on the longitude of other locations. Every 15 degrees of longitude away from Greenwich is another hour different. Greenwich is used as a standard because it's located at zero degrees longitude, the prime meridian.
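To make the 15-degrees-per-hour rule concrete, here is a small illustrative sketch in Python; the function name and the sample longitudes are our own, and real civil time zones follow political boundaries rather than this simple solar-time arithmetic.

def nominal_offset_hours(longitude_deg_east):
    # Nominal solar-time offset from GMT: 15 degrees of longitude per hour.
    # Longitudes east of Greenwich are ahead of GMT, west are behind.
    return longitude_deg_east / 15.0

# Radiosonde data is collected worldwide at 0000 and 1200 GMT.
for place, lon in [("Greenwich", 0.0), ("New York (approx.)", -74.0), ("Tokyo (approx.)", 139.7)]:
    offset = nominal_offset_hours(lon)
    local_noon_launch = (12 + offset) % 24
    print(place, round(offset, 1), "hours from GMT;",
          "the 1200 GMT collection falls near", round(local_noon_launch, 1), "local time")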
Weather-Speak
GMT—Greenwich Mean Time—is the local time in Greenwich, England, and is measured on the prime meridian (zero degrees longitude). Meteorological and navigational clocks are set to GMT so there will be uniformity in reporting the time of data observation and collection.
After an observation is taken, it's immediately transmitted by teletype, radio, or satellite. The data is collected at several worldwide centers and then processed and retransmitted to the multitude of users. At these centers, such as in Washington, D.C., the data is analyzed and put into super computers that draw charts and project future patterns.
The data is transmitted in a special code and plotted in a particular format on weather maps.
Excerpted from The Complete Idiot's Guide to Weather © 2002 by Mel Goldstein, Ph.D. All rights reserved including the right of reproduction in whole or in part in any form. Used by arrangement with Alpha Books, a member of Penguin Group (USA) Inc.
2016-40/3983/en_head.json.gz/1941 | Business The Economy Your Money Companies Technology Work Commercial Property Comment All Business The Economy
Unlocking the blockchain enigma
Most simply and accurately described as a digital ledger, the technology, still in its infancy, cannot be ignored
Mon, Feb 15, 2016, 09:00
John Holden
The cryptocurrency known as bitcoin and the digital ledger upon which its transactions are recorded, the blockchain, are now being taken seriously by more than just e-pirates. Photograph: Chris Ratcliffe/Bloomberg
“Don’t file this piece until you’re confident you could walk into any bar in the world and explain the blockchain clearly to a complete stranger.” The editor’s final remarks upon commissioning the following words were a little disconcerting. But the idea of getting to work in a bar was attractive. Unfortunately a comprehensive guide to the blockchain isn’t technically possible. This isn’t a cop out. The technology is still in its infancy. So no one knows for certain what is coming next. What is certain, however, is that the cryptocurrency known as bitcoin and the digital ledger upon which its transactions are recorded, the blockchain, are now being taken seriously by more than just e-pirates. The blockchain has been described in a variety of ways: a trust machine, a downloadable movie being created in real time, or more simply and accurately, as a digital ledger. One of the best explanations of why anyone should care came from American entrepreneur and former child actor Brock Pierce. At a talk he gave in 2014 hosted by Wired magazine, Pierce likened it to the internet. To paraphrase: “While the internet has transformed communications, the blockchain has the potential to transform the value chain. It is the internet of value.”
Of course he would say that. Apart from starring in The Mighty Ducks, as a grown-up Pierce has helped establish a number of digital currencies. But those with no vested interest in the technology are also coming out of the crypto-closet.
Yes, but what is it?
Essentially, the blockchain refers to a shared, public digital ledger where information of any kind can be recorded and stored securely. Anyone who wants to can inspect it, but no single user controls it. Similar to how information is verified and maintained on Wikipedia by its users, participants in a blockchain system work collectively to maintain the ledger and make sure it is up to date. Any amendment can only take place by general agreement and is subject to whatever rules and regulations apply to that particular blockchain. So while the "internet of value" is a fair analogy, despite being referred to in the singular, there are many blockchains as opposed to one "internet".
The blockchain and cryptocurrency
The blockchain can securely store any kind of information but, thus far, it has principally been used as a digital currency platform. With conventional currency, the security of one's savings is guaranteed by whichever financial institution one chooses to bank with, all of which are overseen by a central regulatory authority. In the case of cryptocurrencies – bitcoin, litecoin, ethereum, or any of the hundreds of others that have emerged since the pseudonymous Satoshi Nakamoto first released the technology back in 2009 – the cryptography, or complex mathematical coding, supporting blockchain technology is the central bank or arbitrary governor of the financial system.
How does it work?
Any information added to a blockchain is converted into a piece of code, known as a hash (a series of letters and numbers). The hash is the language of the blockchain. All data recorded on any ledger is translated into hashes. The details of a transaction or record are stored together as blocks of hashes, and each block is permanently linked to its predecessor, hence the name "blockchain". Visually it could be likened to a tree where each branch can be traced back to the plant's roots. All information pertaining to a transaction is logged – time, date, number of participants, transaction amount. However, the identity of the parties involved can remain anonymous. This is why it has been a popular method of payment for illicit activity online.
While adding information to a blockchain is straightforward, it is virtually impossible to go back and edit or delete previously recorded data. This is the principal reason why this technology is now being taken seriously. It has the potential to become the world's most secure information storage facility. Each node or connection point in a network owns a full copy of the blockchain. Transactions are verified by so-called "bitcoin miners" who maintain the ledger in return for a fee. The mathematical principles underpinning the technology ensure these nodes automatically and continuously agree about the current state of the ledger and every transaction in it.
Why are we still talking about bitcoin?
As a further clarification, bitcoin is but one "brand" of cryptocurrency on the blockchain. It is the most widely adopted, mainly because it was the first and, according to some commentators, was designed in the same philanthropic spirit as open-source projects such as Linux or Mozilla. There are hundreds of other cryptocurrencies, and each has its own raison d'être. Anyone with a computer can potentially be a "miner", and there are both individuals and private companies participating in this new industry.
The bitcoin currency has 21 million units of exchange. When it was first launched in 2009, 50 bitcoins were released every 10 minutes. Currently, 25 bitcoins are rewarded for mining a successful block. This reward – by virtue of the bitcoin blockchain's code design, which is based on an asymptotic curve – will be split in half to 12.5 later this year, then 6.25, then 3.125, and so on. Supply will diminish in the long term up until the last block is mined. This is a specific characteristic of bitcoin. Other cryptocurrencies are tailored differently – perhaps to match the needs of a particular industry or simply based on the design preferences of the creator. Mining is, for all intents and purposes, a type of lottery.
The more lottery tickets – or in this case, computer processing power – you have, the more successful your mining is likely to be. While it’s true anyone with a computer can theoretically mine for bitcoin, a bog standard laptop is the digital equivalent of one lotto line in competition with companies buying up hundreds of thousands of lottery tickets for each and every draw. Miners are motivated to keep the blockchain consistent and unalterable by this reward scheme. ADVERTISEMENT
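To make the hash-linking and reward-halving ideas above concrete, here is a minimal illustrative sketch in Python. It is not Bitcoin's actual code: the field names are our own, the proof-of-work puzzle that turns mining into a lottery is omitted, and the figure of 210,000 blocks per halving era is the real network's parameter rather than something stated in this article.

import hashlib
import json

def make_block(data, previous_hash):
    # Bundle some data with the hash of the previous block. Changing anything
    # in an earlier block changes its hash and breaks every link after it,
    # which is what makes the ledger tamper-evident.
    block = {"data": data, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block({"note": "first block"}, previous_hash="0" * 64)
second = make_block({"from": "A", "to": "B", "amount": 5}, genesis["hash"])
third = make_block({"from": "B", "to": "C", "amount": 2}, second["hash"])

def block_reward(block_height, era_length=210000, initial_reward=50.0):
    # Reward for mining a block: 50, then 25, then 12.5 and so on, halving once
    # per era, so the total ever issued approaches a fixed cap instead of growing forever.
    return initial_reward / (2 ** (block_height // era_length))

total_supply = sum(block_reward(era * 210000) * 210000 for era in range(33))
print(round(total_supply))  # roughly 21,000,000 units, approached asymptotically

The ledger here is only three linked blocks, but it shows the property described above: recompute any earlier block and every later hash stops matching.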
What are the potential threats to the security of the blockchain?
Ask any convert and they will say there aren't any, due to the ingenuity of its mathematical foundations. The more people trying to outsmart it, the more difficult the mathematical equation underpinning the technology becomes. Anyone old enough to remember the movie Ghostbusters II might recall how the evil force, known as Vigo, feeds upon negative human emotion. The more negative people feel, the stronger he gets. While this analogy is both juvenile and tenuous at best, it (kind of) serves as an illustration of the methodology behind blockchain security.
Nevertheless, dramatic market value fluctuations as well as theft from people's digital "wallets" have occurred. So cryptocurrencies are still susceptible to some of the same flaws inherent in our current method of exchange.
There is another potential threat, which even the most zealous supporters of the blockchain will acknowledge. A "51% attack" refers to any single entity contributing the majority of a network's mining hashrate in order to gain total control of the network and thus be able to manipulate a blockchain. Anyone with a vested interest in a particular blockchain, however, is not likely to do this because they would be undermining the market, and their own interests in it, that the blockchain in question was built upon.
Stereotypes 'change' brains
5,000 boys and 40 girls completed Level 3 engineering apprenticeships last year, Professor Gina Rippon said
Science is losing out because of the mistaken belief that "men are from Mars and women from Venus", a leading neuroscientist has claimed.
Professor Gina Rippon said it was time to debunk the myth that gender differences are hard-wired into our brains.
In reality, there was no significant difference between the brains of a girl and boy in terms of their structure and function, she stressed.
But experiences and even attitudes could change the "plastic" brain on a physical level, causing its wiring to alter.
It was this that led girls and boys from an early age to head in different directions, said Prof Rippon, from Aston University.
While girls tended to gravitate towards fields of communication, people skills and the arts, boys were more likely to become scientists and engineers.
Even when girls went into science, they mostly chose careers at the "softer" end of the subject, such as biology, psychology and sociology, rather than physics and maths.
Speaking ahead of this year's British Science Festival, taking place at the University of Birmingham next week, Prof Rippon said: "We're stuck in the 19th century model of the 'vacuum packed' brain, the idea that we're born with a brain that gives us certain skills and behaviours. "The brain doesn't develop in a vacuum.
"What we now know is that the brain is much more affected by stereotypes in the environment and attitudes in the environment, and that doesn't just change behaviour, it changes the brain."
Last year, 5,000 boys in the UK completed Level 3 engineering apprenticeships, but only 40 girls, Prof Rippon pointed out.
Boys taking physics A level also vastly outnumbered girls. But Prof Rippon insisted this was nothing to do with innate differences in the way the brains of girls and boys worked. Rather, it was likely to be the result of their brains being altered by experience.
One of the most often quoted examples of gender difference is spatial ability - the ability to understand the relationships between different objects in space.
Boys are said to be naturally more spatially gifted.
But if girls aged six to eight are given the tile-matching puzzle game Tetris, their brain wiring changes and their spatial ability improves, Prof Rippon said.
She added: "It's quite clear that spatial cognition is very much involved with experience, whether or not you have experience of manipulating objects as opposed to just observing them.
"This goes back to 'toys for boys'.
"From a very early age, boys have a lot more experience with manipulating objects."
Research had shown that as women attained greater access to education and power, gender differences began to disappear.
Prof Rippon was also dismissive of evolutionary psychologists who claimed the way men and women thought was largely the result of natural selection.
"The idea that women like the colour pink because it made them better able to pick berries - it's nonsense," she said.
Ill-conceived attempts to "fix" the problem of girls not going into science were likely to backfire, Prof Rippon argued.
One infamous example of this was the European Commission's Science Is A Girl Thing video released in 2012 which was swiftly dropped "because it was so awful".
"It showed girls in lab coats testing lipstick and giggling a lot," Prof Rippon said.
She added: "Science is something everybody should engage with.
"Let's not make science girly.
"Let's make science interesting to anyone." | 科技 |
'Next generation' broadband to transform East Lancashire businesses
EAST Lancashire business leaders will discover how 'next generation' broadband can transform their businesses at a networking event later this month.
James Roberts, Engineering Manager at 6G internet, will talk about the importance of connectivity at the next meeting of Hive, Blackburn and Darwen’s powerful network of business leaders, at Blackburn Rovers’ Ewood Park.
6G is next generation broadband which utilises ‘air fibre’ technology to deliver speeds of up to 120mbps.
Blackburn recently became the first town in the UK to receive 6G’s new superfast broadband and the Simonstone-based firm believes its ground-breaking technology can help to drive growth within the local economy, as well as attracting fresh investment to the area.
Mr Roberts said: "In today's digital age the internet is absolutely essential to business so fast, reliable connections are increasingly important.
"It speeds up the way that firms do business, making them more competitive, and by accessing cloud computing services, which enables people to work remotely, they can also become more cost effective too.
"The problem is a lot of companies are still using broadband services that are delivered over an aging copper wire network that was originally designed in Victorian times.
“The average internet speeds in the Blackburn area are 3-4 mbps, which is simply not acceptable in 2014.
“It means businesses in the area are at a competitive disadvantage compared to other areas of the country so more investment is needed to improve connectivity.
“We’ve spent five years, and millions of pounds, developing a solution which is not reliant on the existing infrastructure and can deliver speeds of up to 120 mbps, and potentially even faster in the very near future.
“We believe this can really help to put Blackburn on the map and ultimately help to attract new investment to the area going forwards by enabling businesses to become more agile and competitive.”
In keeping with the theme of ‘Connectivity’, Mark Williams – aka Mr LinkedIn - will also be speaking at the event.
Mark is the UK’s foremost LinkedIn trainer and expert and has inspired and educated thousands of LinkedIn users from a diverse range of industries and roles.
Having opened their new state-of-the-art £7m STEM centre in September 2013, Blackburn College will also be talking to Hive about young people and technology, with the aim of introducing the audience to some of the technologies being tested by the College’s students.
Khalid Saifullah, chairman of Hive and director of Star Tissue UK, said: “Hive is looking to highlight growth opportunities for our members and the wider business community in Blackburn and Darwen. The introduction of superfast broadband would be a catalyst for growth for businesses in a whole variety of sectors in the two towns.
“Any business that has a desire to transform the town, improve the prospects of local people and has a desire to grow and succeed must get involved in this network: together as Hive we will inspire prosperity into Blackburn and Darwen Hive’s ‘Connectivity’ event is taking place at Ewood Park on Tuesday January 28 at 10am and includes a buffet lunch.
To find out more about Hive visit www.business-hive.co.uk.
Facts About Arsenic
By Traci Pedersen, Live Science Contributor | July 28, 2016 05:00pm ET
A crystal cluster of orpiment, an arsenic sulfide found near volcanic hot springs.
Credit: lphoto | Shutterstock
From the time of the Roman Empire all the way to the Victorian era, arsenic was considered the "king of poisons" as well as the "poison of kings." History is riddled with accounts of both royalty and commoners carrying out assassinations for personal gain using the odorless, tasteless — in other words, poison-perfect — compounds of arsenic. But even with its reputation as a lethal substance, arsenic still holds a very important place in the natural world.
A natural chemical
In the periodic table of the elements, arsenic is No. 33. An arsenic atom has 33 electrons and 33 protons with five valence electrons (those that can participate in forming chemical bonds with other electrons) in its outer shell. Arsenic is a crystalline metalloid found in the Earth's crust, but in its free form it is quite rare. The element is typically found in minerals, such as arsenopyrite, realgar and orpiment, according to the Minerals Education Coalition. Arsenopyrite (FeAsS), an iron arsenic sulfide, also called mispickel, is the most common mineral from which arsenic is obtained, according to Los Alamos.
Arsenic was known as early as the fourth century B.C., when Aristotle referred to one of its sulfides as "sandarach," or red lead, according to Chemicool. Albertus Magnus, a German philosopher and alchemist, first isolated the element in 1250. The word arsenic comes from the Persian word "zarnikh," meaning "yellow orpiment," which the Greeks adopted as "arsenikon," according to the Los Alamos National Laboratory. The word is also related to the Greek word "arsenikos," meaning "masculine" or "potent." The Latin word for it became "arsenicum."
Under standard atmospheric pressure, arsenic sublimes, or changes directly from the solid state to the gaseous state without becoming a liquid. However, it will turn into a liquid when put under high pressure.
Arsenic has a number of forms, or allotropes. The most common is metallic gray, followed by yellow and then black. Gray arsenic, the only form used in industry, is the most stable of the three and the strongest conductor of electricity. Arsenic occurs naturally in the environment in both organic (arsenic atoms bonded with carbon) and inorganic (no carbon) forms. Inorganic arsenic, the most abundant type, occurs with many other elements, particularly sulfur, oxygen and chlorine. Inorganic arsenic is the type associated with more adverse health effects for humans.
Electron configuration and elemental properties of arsenic. Credit: Greg Robson/Creative Commons; Andrei Marincas/Shutterstock
Just the facts
Atomic number (number of protons in the nucleus): 33
Atomic symbol (on the periodic table of the elements): As
Atomic weight (average mass of the atom): 74.92160
Density: 5.776 grams per cubic centimeter
Phase at room temperature: solid
Melting point: 1,502.6 degrees Fahrenheit (817 degrees Celsius)
Boiling point (sublimation point at standard pressure): 1,117.4 F (603 C)
Number of isotopes (atoms of the same element with a different number of neutrons): 33; 23 whose half-lives are known; 1 stable
Most common isotopes: As-75 (100 percent natural abundance)
Dangers of arsenic
Even when there is no foul play involved, arsenic still poses a danger, as lethal levels may be leaked into people's water, food or air supply. The most urgent concern is drinking water, and for some places, the risk of arsenic contamination is particularly high. In 2001, the Environmental Protection Agency (EPA) adopted a lower standard for arsenic in drinking water. The new arsenic standard of 10 parts per billion (ppb) replaced the old standard of 50 ppb.
Bruce A. Stanton, a professor in the department of microbiology and immunology at the Geisel School of Medicine at Dartmouth College in New Hampshire, said that in many states, "arsenic can be found in well water at levels above the EPA standard of 10 ppb."
"Well-water arsenic is above the EPA standard in as many as one in five wells in New Hampshire and many other states, including Maine, Michigan, California, New Mexico, Arizona, Colorado and Nevada," he told Live Science. As for food, the Food and Drug Administration (FDA) recently turned its attention toward rice, as it tends to absorb arsenic more readily than other crops do. And since rice is a staple in the diets of many infants and young children, the FDA has been closely monitoring rice for safety, ensuring that infant rice cereal stays under 100 parts per billion (ppb) for levels of inorganic arsenic.
One study, published in the Nutrition Journal, suggested that other types of foods, including white wine, beer and Brussels sprouts, may be linked to higher levels of arsenic in humans as well. The FDA has also taken steps to monitor apple juice.
Arsenic poisoning can cause all sorts of health problems. A large dose can cause immediate sickness and death, while long-term exposure is associated with higher rates of skin, bladder and lung cancers, as well as heart disease, according to the FDA.
"At levels found in well water in the U.S. (10-100 ppb), ingestion of well water containing arsenic reduces IQ and has many other adverse health effects, include birth defects," said Stanton. "The good news is that arsenic can be detected in well water by inexpensive tests and we can protect ourselves from arsenic exposure in well water by filtration (Zero Water tabletop water filters) and other methods."
An essential nutrient
Paracelsus, a 16th-century Swiss German philosopher and toxicologist, once famously said, "all things are poison, and nothing is without poison. Only the dose permits something not to be poisonous."
But can a toxic element really be necessary for life? A growing body of evidence says yes. Some toxic metals, in trace amounts, might actually be essential nutrients, according to an analysis in the journal EMBO Reports. In fact, scientists have found that the body needs arsenic, at a level of 0.00001 percent, to grow and maintain a healthy nervous system, according to Chemicool.
As early as 82 B.C., the Roman dictator Lucius Cornelius Sulla attempted to end a rash of arsenic poisonings by passing the Lex Cornelia, the first known law against poisoning, according to Dartmouth College. In 1836, a British chemist named James Marsh finally developed a test that could detect minuscule amounts of arsenic in both food and human remains. The epidemic of arsenic poisoning finally began to wane.
Although unproven, there is a persistent rumor that Napoleon Bonaparte was slowly poisoned with arsenic by someone in his cortege, which eventually led to his death in 1821. And while it is generally recognized today that he died of an advanced case of gastric cancer, many still believe that arsenic played a role.
Perhaps the most famous of arsenic poisoners were the Borgias, a power-grabbing family in Italy who, partly due to their strategic arsenic poisonings of the wealthy and prominent, soon became the most powerful family during the Renaissance period. In the Victorian era, white arsenic, or arsenic trioxide (As2O3), was widely available and sold in grocery stores. Women would eat or rub arsenic mixed with vinegar or chalk into their skin as a complexion-enhancer, trying to make their skin paler to show they did not work in the fields.
Agricultural use
A chunk of arsenopyrite, the most common source of arsenic.
Credit: Oreena/Shutterstock
Since arsenic is such a strong toxin, farmers, as well as the U.S. government, in the early 20th century believed it would be a good idea to make rodent poisons and crop pesticides out of the substance. It took several decades for everyone to realize what a terrible idea it was to spray this carcinogenic chemical on the food supply. In the 1980s, all of these arsenate pesticides were finally banned, but some of the residue still lingers in the soil today, according to The Lead Group, Inc.
Starting in the 1940s, arsenic-treated wood preservatives, such as chromated copper arsenate (CCA), were widely used to prevent rotting in lumber. Although these preservatives still aren't officially banned, manufacturers voluntarily stopped the production of arsenic-treated wood products in 2003, according to the EPA.
Medical use
In 1786, a British physician named Thomas Fowler presented his arsenic-based, cure-all tonic known as Fowler's solution. The tonic was commonly used to treat skin conditions, such as psoriasis. Unfortunately, it became apparent that people who used the product had a significantly higher risk of developing cancer, particularly on the exact spot the solution was applied. Its use was phased out between the 1930s and 1950s, according to Dartmouth College.
In 1910, German pharmacologist Paul Ehrlich developed the arsenic-based drug Salvarsan, also known as arsphenamine, as a treatment for syphilis, a disease that was endemic and incurable at the time. The drug was incredibly effective and remained the top medicine for curing syphilis until penicillin became available in the 1940s, according to Chemical & Engineering News. Ehrlich's development of Salvarsan was the first step toward targeted chemotherapy. Today, arsenic trioxide is a very effective drug used to treat people with acute promyelocytic leukemia, according to Stanton.
Industrial use
Arsenic is sometimes alloyed with lead to form a harder, more durable metal. Some areas of use include car batteries and bullets. Until recently, arsenic was commonly used in glassmaking. However, due to pressure from the EPA and environmentalists, most glass manufacturers have slowed down or stopped using arsenic.
According to the Los Alamos National Laboratory:
Arsenic is often used as a doping agent for solid-state devices such as transistors.
Gallium arsenide is used in lasers that convert electricity into coherent light.
Arsenic compounds, such as Paris green, calcium arsenate and lead arsenate, were used as insecticides and in other poisons.
Arsenic is used in pyrotechnics to give additional color to the flame.
Arsenic improves the sphericity of lead shot.
Dartmouth — Arsenic: A Murderous History
Los Alamos National Laboratory: Arsenic
Chemicool: Arsenic Element Facts
Author Bio Traci Pedersen, Live Science Contributor
Traci Pedersen is a freelance author who has written extensively on themes of science, psychology, religion and alternative health for a variety of publications. She has also written 14 science chapter books and numerous teacher resource books for the elementary classroom. She is constantly brainstorming how to turn age-old topics into new and exciting stories.
Apple: older devices won't get all features of iOS 6
updated 02:41 am EDT, Wed June 13, 2012
Hints that iPhone 3GS lifespan is limited
Apple has added a few qualifiers to its claim that iOS 6 is supported on older devices such as the iPhone 3GS and the current iPod touch. While it was inherently understood that some features, such as Siri and FaceTime, would naturally require the latest and most powerful devices, the company has now set out a precise list of limitations on which features of iOS 6 will -- and won't -- work on a given device. In some cases, the limitations seem arbitrary. For example, the Safari feature Offline Reading List is said to require the iPhone 4 or iPad 2 or later, leaving out both the 3GS and the fourth-generation iPod touch, though there appears to be no clear technical reason why those devices couldn't implement the undemanding feature for storing selected articles and sites for offline reading. One possible explanation is that the older devices generally have much more limited storage space than most modern iPhone and iPad units, perhaps an area that Apple was concerned about in terms of widespread user experience.
The inclusion of the iPhone 3GS as compatible with iOS 6 has already raised questions about how Apple drew its lines. The original iPad, for example, is completely unsupported and yet has a newer and more powerful processor than the iPhone 3GS. The rationale for the decision to exclude the 2010 iPad may simply stem from the fact that the 3GS is still in production and Apple wanted to allow it to run the latest OS.
Of the iOS 6 features highlighted in Monday's WWDC keynote address, the low-end iPhone 3GS and fourth-gen iPod Touch will be unable to access Shared Photo Streams, and will even be left out of the VIP and Flagged "smart mailboxes" in Mail. Due to technical limitations, the two devices also won't be able to use features such as Siri, FaceTime over cellular, and the turn-by-turn navigation and "Flyover" features of Maps. The iPhone 3GS and iPod Touch (fourth generation) will still be able to enjoy the enhanced Notification Center, Facebook integration, lockscreen call-answering and other options, the new Passbook app, smart app banners, Guided Access, the expanded iMessage and more.
Owners of the iPad 2 are not left out of much by comparison, but will miss out on Siri and FaceTime over cellular. The iPhone 4 will be prohibited from the 3D "Flyover" and turn-by-turn GPS parts of Maps, and can't use Siri or FaceTime over cellular. All but the iPhone 4S are excluded from compatibility with the "Made for iPhone" hearing aids.
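The limitations scattered through the preceding paragraphs are easier to take in as a single lookup table. The Python mapping below is simply our restatement of the exclusions reported in this article; the device and feature labels are informal rather than Apple's official identifiers, and the iPhone 4S-only hearing-aid support is left out.

# iOS 6 features reported above as unavailable on each older device.
ios6_missing_features = {
    "iPhone 3GS": ["Shared Photo Streams", "VIP and Flagged mailboxes", "Offline Reading List",
                   "Siri", "FaceTime over cellular", "Maps turn-by-turn navigation", "Maps Flyover"],
    "iPod touch (4th generation)": ["Shared Photo Streams", "VIP and Flagged mailboxes",
                                    "Offline Reading List", "Siri", "FaceTime over cellular",
                                    "Maps turn-by-turn navigation", "Maps Flyover"],
    "iPhone 4": ["Siri", "FaceTime over cellular", "Maps turn-by-turn navigation", "Maps Flyover"],
    "iPad 2": ["Siri", "FaceTime over cellular"],
}

def supports(device, feature):
    # True if the article does not list the feature as excluded for that device.
    return feature not in ios6_missing_features.get(device, [])

print(supports("iPhone 4", "Maps Flyover"))         # False
print(supports("iPad 2", "Offline Reading List"))   # True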
The fact that the limitations go beyond just obvious technical concerns may be a signal from Apple that although the iPhone 3GS has lasted far longer than most expected, its time as an in-production unit may be drawing to a close. The phone could conceivably be phased out as soon as the introduction of the next model iPhone, widely expected for the fall, or be kept around until sales naturally drop off.
The iPhone 3GS is currently seen as a crucial "entry-point" iPhone capable of giving new customers the iOS experience at a "free with contract" price point, and thus it was important that it be seen as running the latest version of iOS. Whenever the next iPhone is introduced, however, Apple will then have four models to support if it intends to keep the iPhone 3GS around.
facebook_Jeff 06/13, 03:04am Sounds like fragmentation to me.
chas_m 06/13, 03:49am I see your point, but I'm not quite convinced it's the same thing. Fragmentation on Android = a bunch of third parties that decide whether you can ever upgrade your device or not. Obviously that's not an issue in iOS.
Fragmentation on Android means MOST users are on VERY old versions of the OS. Again, not an issue on iOS.
Fragmentation on Android means that developers must write custom versions of their apps almost every device it's going to be on. Again, not an issue on iOS -- if the hardware is too old to support the experience, it just isn't made available for that hardware, end of story.
If they keep the 3GS as a production phone long past the introduction of the next iPhone, then I think you may have a case for (relatively small but still legitimate) charge of fragmentation. Doubt that's going to happen though.
global.philosopher 06/13, 04:31am I can't see why facetime or turn by turn could not work on the iPad 2 or iphone 4.
It is not fragmentation...it is unnecessary obsolescence.
wrenchy 06/13, 04:38am Buy an new iPhone.
It's what Apple want to hear.
revco 06/13, 05:17am ...if you have a 4th gen iPod touch. Looks like they have the least to gain from the iOS upgrade.
sixcolors 06/13, 06:17am I can just see the meeting where this got decided... "So what about current and past customers?" someone says.
Scott Forstall just raises his middle finger and carries on with his lame presentation on how to f*** customers...
Grendelmon 06/13, 09:31am WTF? Thx, Apple!
Eldernorm 06/13, 09:35am If you have a problem with Apple not giving most of the features to all the older equipment, then buy a new android phone. Many still are being sold with old OS systems that WILL NOT be updated. I give Apple a passing grade for at least trying to keep the older systems at least partly current. Siri needs dual mikes to filter noise so anything less than iPhone 4s does not work. Ok, I understand that.
Inbloom 06/13, 09:52am By the way. The iPhone 4 has dual mics. Having dual mics was a selling feature of the iPhone 4 when Apple released it in 2010.
Stuke 06/13, 12:33pm ... "The iPhone 4 will be prohibited from the 3D "Flyover" and turn-by-turn GPS parts of Maps, and can't use Siri or FaceTime over cellular." ...
So, I'll get Siri on the iPhone 4 but can only use it when connected via WiFi ??
global.philosopher 06/13, 09:20pm I expect more than just a pass from Apple!
Apple asks court to add 6 more Samsung devices to patent suit
By Jeremy Kirk, IDG News Service
Apple is seeking to add six Samsung products to a patent infringement lawsuit between the electronics giants scheduled to start in U.S. District Court for the Northern District of California in March 2014.
The motion, filed Friday, asked the court to add Samsung's Galaxy S III, the Galaxy Note II, the Galaxy Tab 8.9 Wi-Fi, the Galaxy Tab 2 10.1, the Galaxy Rugby Pro—all of which have been released in the last two months—as well as the Galaxy S III Mini, which is due to be released soon in the U.S.
"In short, Apple has acted quickly and diligently to determine that these newly-released products do infringe many of the same claims already asserted by Apple and in the same way that the already accused devices infringe," the motion said.
Apple filed the suit against Samsung on February 8. Samsung responded by filing a counterclaim in the same court, alleging that Apple violates feature patents and two UMTS (Universal Mobile Telecommunications System) patents.
Last week, the Korean company filed a motion to add the iPad mini, the fourth-generation iPad, and the latest version of the iPod touch to its cross-complaint. The iPhone 5 has already been added to the counter claim.
On August 24, a jury in the Northern California court found that Samsung infringed on Apple patents including those that covered multitouch software, hardware designs and application icon designs, awarding Apple over US$1 billion in damages.
Send news tips and comments to jeremy_kirk@idg.com. Follow me on Twitter: @jeremy_kirk
National Planetarium situated at the hill of Kuala Lumpur Lake Garden, surrounding by National Mosque, National Museum and KL Bird Park, National Planetarium is the place where the journey to space begin. Combination of the Islamic architecture and astronomy, makes the National Planetarium resembles a mosque, but combined with a futuristic look.
National Planetarium History
The year 1994 heralded a new era in space science and technology for Malaysia with the official opening of the National Planetarium. The event not only focused the nation's attention on the overall design and architecture and unique facilities of the Planetarium but also brought to light the government's serious commitment to the development of space science and technology. The National Planetarium started as the Planetarium Division in the Prime Minister's Department in 1989. The construction of the National Planetarium complex began in 1990 and was completed in 1993. A soft launch to the public began in May 1993 and it was officially opened by the former Prime Minister, Tun Dr. Mahathir Mohamad, on 7 February 1994. In July 1995, after one and a half year of smooth operation, the Division was transferred to the Ministry of Science, Technology and the Environment.
Taking three years to build, the National Planetarium complex houses a tilted dome theatre at its very heart. The unique design of the theatre posed a great challenge to the project architects, engineers and building contractors. The exacting requirements of the planetarium and the large-format film projectors posed even more formidable challenges. Many difficulties had to be overcome and it took two years for all equipment and exhibits to be completely commissioned. In spite of the difficulties, the National Planetarium has been able to carry out several successful educational projects with public schools and the public. Now the National Planetarium plays a crucial role in promoting space science to society at large and leading Malaysia towards the development of space science.
Given the complexity of coordinating many different suppliers, and the fact that no one in the country had any previous experience in numerous aspects of the work involved, the successful completion of the complex and the commissioning of all the equipment in it with minimum delay was an outstanding achievement.
National Space Agency
National Planetarium,
53, Jalan Perdana, 50480 Kuala Lumpur, MALAYSIA
Tel : +603-2273 4303 Fax : +603-2273 5488
Eugene Robinson: Warm enough for you?
Skeptics and deniers can make all the noise they want, but a landmark new report is unequivocal: There is a 95 percent chance that human-generated emissions of carbon dioxide and other greenhouse gases are changing the climate in ways that court disaster.
That's the bottom line from the Intergovernmental Panel on Climate Change, which Monday released the latest of its comprehensive, every-six-years assessments of the scientific consensus about climate change. According to the IPCC, there is only a 1-in-20 chance that human activity is not causing dangerous warming. You may like those betting odds. If so, let's get together for a friendly game of poker, and please don't forget to bring cash.
The squawking from naysayers has recently been all about a supposed "pause" in global warming. They say there has been no detectable warming in the past 15 years and claim that any temperature rise that scientists attribute to human activity is really part of some grand natural cycle -- probably nothing to worry about, and, in any event, nothing we can control.
One look at the data indicates that the skeptics' view is wishful thinking, at best. It is true that if you look at the period 1998-2013, there is very little warming. But that is because 1998 was an extreme outlier -- a sharp spike on the graph. That year was much warmer than the preceding or subsequent few years. If you plot global temperatures over a longer time period, covering 50 or 100 years, you get a line that jiggles up and down but generally trends upward at an alarming slope. Look closely and you'll notice that 2005 and 2010 were both a bit warmer than 1998.
Why is this happening? Because "the atmospheric concentrations of carbon dioxide, methane, and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years," according to the executive summary of the 2,000-plus-page IPCC report. By processes well-known to science and reproducible in the laboratory, these gases trap solar heat. Since the Industrial Revolution, when humans began burning fossil fuels on a vast scale, the concentration of carbon dioxide in the atmosphere has increased by an astounding 40 percent. Unless all the physics textbooks are wrong, this causes warming.
Atmospheric science is difficult because there are so many variables involved. Heat can be trapped in the depths of the oceans, thus mitigating its effect on surface temperatures -- for a time. Volcanic eruptions spew particles into the atmosphere that block some measure of sunlight. El Nino and La Nina changes in Pacific Ocean currents are associated with seasonal or yearly temperature fluctuations. Nevertheless, the infamous "hockey stick" graph showing global temperatures rising over time, first slowly and then sharply, remains valid.
"Continued emissions of greenhouse gases will cause further warming and changes in all components of the climate system," the IPCC summary says. "Limiting climate change will require substantial and sustained reductions of greenhouse gas emissions."
Prospects for those reductions are iffy. The sluggish economy, higher automotive fuel economy standards and the substitution of natural gas for coal in many power plants have helped keep U.S. emissions in check. But China is by far the world's biggest emitter, and while there are signs that Chinese leaders now consider climate change important, it is unclear how the economy can continue its rapid growth without relying on coal.
The IPCC report says it is "very likely" that heat waves will become more frequent and more intense. It is also "very likely" that extreme rainfall events, such as the deluge in Colorado last month, will become more common; that "Arctic sea ice cover will continue to shrink and thin" in coming decades; that "Northern Hemisphere spring snow cover will decrease" and that "global glacier volume will further decrease."
Sea levels will continue to rise because of warming -- water expands as it heats -- and because of glacial melting. This has implications for coastal populations not just in places such as Calcutta or Dhaka but also in rich and powerful cities such as New York: Witness the massive flooding and storm-surge damage caused by Superstorm Sandy.
"A large fraction of anthropogenic climate change resulting from CO2 emissions is irreversible on a multi-century to millennial time scale," the IPCC says. In other words, this is the world we have made. Get used to it. But it is within our power to adapt to climate change and keep it from getting much worse. The first step in Carbonoholics Anonymous is admitting we have a problem.
Eugene Robinson's email address is eugenerobinson@washpost.com.
Washington Post Writers Group
2016-40/3983/en_head.json.gz/2166 | NINTENDO DOWNLOAD HIGHLIGHTS NEW DIGITAL CONTENT FOR NINTENDO SYSTEMS
Provided by Games Press Thursday, June 23rd 2011 at 3:02PM BST
June 23, 2011 This week's Nintendo Download includes the following featured games: Nintendo eShop and Nintendo DSiWare™ • Pro Jumper! Guilty Gear Tangent!? – Chimaki from Guilty Gear 2 stars in this quirky platformer. Make use of his run, jump and towel attacks to collect apples and successfully bypass the legions bent on his destruction. (For Nintendo 3DS™/Nintendo DSi™) WiiWare™ • The Mystery of Whiterock Castle – Experience a thrilling search for a missing princess. Find all the hidden objects in 10 meticulously detailed settings as you get to know the various inhabitants and rooms of the castle. (For Wii™) Nintendo eShop and Nintendo DSiWare • Stratego: Next Edition – Demonstrate your strategic skills on the battlefield against opponents who will do everything in their power to outsmart you. Rise to the challenge and become the greatest general of all time. (For Nintendo 3DS/Nintendo DSi) Also new this week: • Tennis (Nintendo eShop) • Delbo (Nintendo eShop, Nintendo DSiWare) To view this week's Nintendo Download in its full graphical version, please visit: http://www.news2know.net/nintendo/download-062311.php In addition to video games available at retail stores, Nintendo also offers a variety of content that people can download directly to their systems. Nintendo adds new titles to the Nintendo eShop for the Nintendo 3DS™ system, to the Nintendo DSi™ Shop for the Nintendo DSi system and to the Wii™ Shop Channel for the Wii console at 9 a.m. Pacific time on Thursdays. The Nintendo eShop is a cash-based service and features games, applications and videos in both 2D and 3D. Users can add money to their virtual wallets using a credit card or by purchasing a Nintendo 3DS Prepaid Card at a retail store and entering the code from the card. The Wii Shop Channel offers games and applications and uses Wii Points™, which can be purchased via the Wii Shop Channel. The Nintendo DSi Shop offers games and applications and uses Nintendo DSi Points™, which can be purchased in the Nintendo DSi Shop. A Nintendo Points Card™ can be purchased at retail locations. All points from one Nintendo Points Card must be redeemed in either the Wii Shop Channel or the Nintendo DSi Shop. They are not transferable and cannot be divided between the two systems. Remember that Wii, Nintendo 3DS and Nintendo DSi feature parental controls that let adults manage some of the content their children can access. For more information about this and other features, visit http://www.nintendo.com/wii, http://www.nintendo.com/3ds or http://www.nintendo.com/ds/systems/dsi. Games Press is the leading online resource for games journalists. Used daily by magazines, newspapers, TV, radio, online media and retailers worldwide, it offers a vast, constantly updated archive of press releases and assets, and is the simplest and most cost-effective way for PR professionals to reach the widest possible audience. Registration for the site and the Games Press email digest is available, to the trade only, at www.gamespress.com. | 科技 |
2016-40/3983/en_head.json.gz/2273 | Supercomputing time awarded to design transformational lithium air battery
(Nanowerk News) The Department of Energy announced today that 24 million hours of supercomputing time out of a total of 1.6 billion available hours at Argonne and Oak Ridge National Laboratories have been awarded to investigate materials for developing lithium air batteries, capable of powering a car for 500 miles on a single charge. Through the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, a research team including scientists from Oak Ridge National Laboratory, Argonne National Laboratory and IBM will use two of the world's most powerful supercomputers to design new materials required for a lithium air battery. Lithium-ion batteries, used in today's emerging plug-in hybrid electric vehicles, currently have a range of approximately 40 to 100 miles.
The calculations will be performed at both Oak Ridge and Argonne, which house two of the world's top ten fastest computers.
"Computation and supercomputing are critical to solving some of our greatest scientific challenges," said Secretary Chu. "This year's INCITE awards reflect the enormous growth in demand for complex modeling and simulation capabilities, which are essential to improving our economic prosperity and global competitiveness."
The INCITE program provides a collection of unique computational resources that enable scientists and engineers to conduct cutting-edge research in weeks or months rather than the years needed previously. The use of scientific modeling can accelerate scientific breakthroughs in areas such as climate change, alternative energy, life sciences, and materials science.
Oak Ridge National Laboratory Director Thom Mason said the battery project was the result of two visits to Oak Ridge in 2009 by IBM's vice president of research. "From those discussions, it became apparent that our partnership had many of the unique capabilities needed to tackle a scientific problem as important and challenging as increasing by more than a factor of 10 the energy stored in batteries for transportation."
"Argonne is committed to developing lithium air technologies," said Argonne Director Eric Isaacs. "The obstacles to Li-air batteries becoming a viable technology are formidable, but the modeling and simulation capabilities of DOE's supercomputers will help us accelerate the innovations required in materials science, chemistry and engineering."
About INCITE
The Innovative and Novel Computational Impact on Theory and Experiment program promotes cutting-edge research that can be conducted only with state-of-the-art supercomputers. Leadership Computing Facilities at Argonne and Oak Ridge national laboratories operate the program. The Leadership Facilities award sizeable allocations (typically tens of millions of processor hours per project) on supercomputers to researchers from academia, government and industry. The projects address "grand challenges" in science and engineering such as developing new energy solutions and gaining a better understanding of climate change resulting from energy use. The INCITE program, which was recently recognized by HPCwire for high-performance computing collaborations between government and industry, also includes several supercomputing allocations for companies such as General Electric and IBM to help develop technologies beneficial to the U.S. economy. Source: Oak Ridge National Laboratory | 科技 |
2016-40/3983/en_head.json.gz/2289 | Search Field: You are hereHome > Climate Information > Science Papers and Publications > BAMS State of the Climate BAMS State of the Climate International report confirms 2015 was Earth’s warmest year on record
Strong El Niño boosted long-term warming trends
The State of the Climate in 2015 report will be published in the August 2016 issue of the Bulletin of the American Meteorological Society.
A new State of the Climate report confirmed that 2015 surpassed 2014 as the warmest year since at least the mid-to-late 19th century.
Last year’s record heat resulted from the combined influence of long-term global warming and one of the strongest El Niño events the globe has experienced since at least 1950. The report found that most indicators of climate change continued to reflect trends consistent with a warming planet. Several markers such as land and ocean temperatures, sea levels and greenhouse gases broke records set just one year prior.
These key findings and others are available from the State of the Climate in 2015 report released online today by the American Meteorological Society (AMS).
The report, led by NOAA National Centers for Environmental Information, is based on contributions from more than 450 scientists from 62 countries around the world and reflects tens of thousands of measurements from multiple independent datasets (highlights, full report). It provides a detailed update on global climate indicators, notable weather events and other data collected by environmental monitoring stations and instruments located on land, water, ice and in space.
“This ‘annual physical’ of Earth’s climate system showed us that 2015’s climate was shaped both by long-term change and an El Niño event,” said Thomas R. Karl, L.H.D., Director, NOAA National Centers for Environmental Information. “When we think about being climate resilient, both of these time scales are important to consider. Last year's El Niño was a clear reminder of how short-term events can amplify the relative influence and impacts stemming from longer-term global warming trends.”
The report’s climate indicators show patterns, changes, and trends of the global climate system. Examples of the indicators include various types of greenhouse gases; temperatures throughout the atmosphere, ocean, and land; cloud cover; sea level; ocean salinity; sea ice extent; and snow cover.
Report highlights include these indications of a warming planet:
Greenhouse gases highest on record. Major greenhouse gas concentrations, including carbon dioxide (CO2), methane and nitrous oxide, rose to new record high values during 2015. The annual average atmospheric CO2 concentration at Mauna Loa, Hawaii, the location of the world’s longest direct measurements of CO2, was 400.8 parts per million (ppm), which surpassed 400 ppm for the first time. This was 3.1 ppm more than 2014, and was the largest annual increase observed in the 58-year record. The 2015 average global CO2 concentration was not far below, at 399.4 ppm, an increase of 2.2 ppm compared with 2014.
Global surface temperature highest on record. Aided by the strong El Niño, the 2015 annual global surface temperature hit record warmth for the second consecutive year, easily surpassing the previous record set in 2014 by more than 0.1°C (0.2°F). This exceeded the average for the mid- to late 19th century — commonly considered representative of preindustrial conditions — by more than 1°C (1.8°F) for the first time. Across land surfaces, record to near-record warmth was reported across every inhabited continent.
Sea surface temperatures highest on record. The globally averaged sea surface temperature was also the highest on record, breaking the previous mark set in 2014. The highest temperature departures from average occurred in part of the northeast Pacific, continuing anomalous warmth there since 2013, and in part of the eastern equatorial Pacific, reflective of the dominant El Niño. The North Atlantic southeast of Greenland remained colder than average and was colder than 2014.
Global upper ocean heat content highest on record. Globally, upper ocean heat content exceeded the record set in 2014, reflecting the continuing accumulation of thermal energy in the upper layer of the oceans. Oceans absorb over 90 percent of Earth’s excess heat from global warming.
Global sea level highest on record. Global average sea level rose to a new record high in 2015 and was about 70 mm (about 2¾ inches) higher than the 1993 average, the year that marks the beginning of the satellite altimeter record. Over the past two decades, sea level has increased at an average rate of 3.3 mm (about 0.15 inch) per year, with the highest rates of increase in the western Pacific and Indian Oceans.
Extremes were observed in the water cycle and precipitation. A general increase in the water cycle, combined with the strong El Niño, enhanced precipitation variability around the world. An above-normal rainy season led to major floods in many parts of the world. But globally, areas in “severe” drought rose from 8 percent in 2014 to 14 percent in 2015.
The report also documents key regional climate and climate-related events:
The Arctic continued to warm; sea ice extent remained low. The Arctic land surface temperature tied with 2007 and 2011 as the highest since records began in the early 20th century, representing a 2.8°C (5°F) increase since that time. Average sea surface temperatures across the Arctic Ocean during August in ice-free regions ranged from near normal to 8°C above average. Increasing temperatures have led to decreasing Arctic sea ice extent and thickness. In February 2015, the maximum sea ice extent reached its largest footprint of the year, and was the smallest in the 37-year satellite record, at 7 percent below the 1981–2010 average. Minimum sea ice extent in September 2015 was 29 percent less than the 1981–2010 average and the fourth lowest value on record.
Arctic animals and fish were impacted by changing climate. As a consequence of sea ice retreat from warming oceans, vast walrus herds in the Pacific Arctic are hauling out on land rather than on sea ice, raising concern about the energy requirements of females and young animals. Increasing temperatures in the Barents Sea are linked to a shift in fish populations: warmer-water fish are moving farther north, and long-standing Arctic species, such as gelatinous snailfish and polar cod have been almost pushed out of the area.
Widespread harmful algal bloom impacts northeast Pacific. In the late spring and summer 2015, a widespread harmful algal bloom (HAB), stretching off the west coast of North America from central California to British Columbia, Canada, resulted in significant impacts to marine life, coastal resources and the human communities that depend on these resources. Unusually warm surface water in the Pacific is considered a factor regarding the severity and early onset of the bloom.
Antarctic temperatures were colder than average; sea ice extent was highly variable. Temperatures across Antarctica were lower than the 1981–2010 average for most of the year. Antarctic sea ice extent and area had large variability during the year, shifting from record high levels in May to record low levels in August.
Global ice and snow cover decline. Preliminary data indicate that 2015 was the 36th consecutive year of overall alpine glacier retreat across the globe. Across the Northern Hemisphere, late-spring snow cover extent continued its trend of decline, with June the second lowest in the 49-year satellite record. Below the surface, record high temperatures at the 20-meter (65-feet) depth were measured at all permafrost observatories on the North Slope of Alaska, with temperatures increasing by up to 0.66°C (1.2°F) per decade since 2000.
Tropical cyclones were well above average overall. There were 101 tropical cyclones across all ocean basins in 2015, well above the 1981–2010 average of 82 storms. The eastern/central Pacific had 26 named storms, the most since 1992. The western North Pacific and north and south Indian Ocean basins also saw high activity. Similar to 2013 and 2014, the North Atlantic season was quieter than most years of the last two decades with respect to the number of storms. The overall strength of the North Atlantic storms were weaker as well, just 68 percent of the 1981–2010 median seasonal value, with Major Hurricane Joaquin accounting for nearly one-half that value.
The State of the Climate in 2015 is the 26th edition in a peer-reviewed series published annually as a special supplement to the Bulletin of the American Meteorological Society. “The State of the Climate report continues to be critically important as it documents our changing climate. AMS is proud to work with so many from the science community to make this publication happen," said Keith Seitter, Executive Director of AMS. The journal makes the full report openly available online.
Links associated with the State of the Climate in 2015 report:
Full report (BAMS)
Permanent link for report materials (NCEI)
Visual highlights (NOAA Climate.gov)
BAMS State of the Climate Home
Past Reports
2016-40/3983/en_head.json.gz/2295 | Nature Geoscience Share on FacebookTweetShareShareEmailRedditAllNews'Nature Geoscience' - 8 News Result(s)A Violent Sun And A Sky Full Of Laughing Gas Could Have Led To Life On EarthWorld News | Sarah Kaplan, The Washington Post | Tuesday May 24, 2016 A coronal mass ejection hits Earth's atmosphere like a cannonball. The cloud of solar matter, created during storms on the sun, will fizzle through the thinnest parts of Earth's protective magnetic field and send waves of light dancing across the polar night skies. Cleaner Air Could Actually Make Global Warming WorseWorld News | Chelsea Harvey, The Washington Post | Tuesday March 15, 2016 Two new studies, both released Monday in the journal Nature Geoscience, address the powerful influence of aerosols - fine particles or drops of liquid often released by industrial activity - on the climate, and suggest that as nations around the world work to reduce this type of air pollution, we will begin to see more rapid warming than expected. ... Why Earth Became Habitable and Not Venus?World News | Indo-Asian News Service | Wednesday July 22, 2015 Information about the improbable evolutionary path that enabled Earth and not Venus to sustain life has been found, shows research. New maps show smallest planet Mercury is even smallerWorld News | Reuters | Monday March 17, 2014 Detailed maps of Mercury's cliffs and ditches show the solar system's innermost and smallest planet Mercury has lost much more real estate due to cooling over four billion years than scientists thought, according to a report published on Sunday. New ozone-depleting gases found in atmosphereWorld News | Agence France-Presse | Monday March 10, 2014 Worried scientists said today that they had found four new ozone-destroying gases in the atmosphere, most likely put there by humans in the last 50-odd years despite a ban on these dangerous compounds. Zircon crystal: Scientists find oldest piece of EarthWorld News | Reuters | Monday February 24, 2014 Scientists using two different age-determining techniques have shown that a tiny zircon crystal found on a sheep ranch in western Australia is the oldest known piece of our planet, dating to 4.4 billion years ago. Magma clue to Earth's super-volcanoesWorld News | Agence France-Presse | Sunday January 5, 2014 Geologists on Sunday reported insights into super-volcanoes, the brooding, enigmatic giants of Earth's crust whose eruptions are as catastrophic as they are rare. Prehistoric Indian Ocean mini-continent foundIndia News | Agence France-Presse | Monday February 25, 2013 Scientists said on Sunday they had found traces of a micro-continent hidden underneath the Indian Ocean island of Mauritius. More News on Nature Geoscience»................................ Advertisement ................................................................ Advertisement ................................RSSNews AlertsMobileAppsAppleAndroidWindowsFacebookTwitterGoogle +LinkedInAbout UsArchivesAdvertiseFeedbackDisclaimerInvestorComplaint RedressalOmbudsmanCareersService TermsChannel Distribution© Copyright NDTV Convergence Limited 2016. All rights reserved.TweetAdd to Flipboard Magazine. | 科技 |
2016-40/3983/en_head.json.gz/2313 | Kiosk company offers software to reduce power usage | New Hampshire Contact us
Customers of Franklin-based Advanced Kiosks include cable network BET. (Courtesy photo)
By DOUG ALDEN / New Hampshire Union Leader
FRANKLIN - A New Hampshire company is hoping to take the efficiency of electronic kiosks to new levels - lower levels. Advanced Kiosks of Franklin is touting its GreenTimer system as a simple way to reduce power usage by programming the system in advance to be on, off or in hibernation mode in synch with the hours the kiosks are in service. The idea is to take a little strain off the power grid, in turn increasing efficiency and the lifespan of the delicate machinery. "A self-service terminal shouldn't be on when it's 9 p.m. and your lobby closed at 6. There's no reason for it," said company president and owner Howard Horn.The GreenTimer software has been standard in kiosks manufactured by the company since October. It is also available now to any business or individual wanting to upgrade existing kiosks, digital signage or other public computer systems. The technology is free to anyone who wishes to download it with a system that runs on Windows 7. With Earth Day being observed on Monday, Horn said the timing was fitting to get the word out about GreenTimer, which the company estimates could save from $80 to $120 a year in energy costs for every kiosk. It also reduces the physical stress on the computers, which leads to less maintenance.Horn said environmental sustainability has been a priority for the company since he built his first kiosk in 2003. Advanced Kiosks uses recycled materials for shipping and also recycled steel for the actual products, which are finished with powder-coating that is easier on the environment than traditional paint. Horn said the idea that led to the development of GreenTimer was about reducing power usage. People are more likely to utilize something that's at no charge rather than having to buy it, so Advanced Kiosk will continue offering it for free with the hope that enough units will be upgraded to actually make a bit of environmental difference."At the end of the day it's a good thing," Horn said. While the technology behind GreenTimer itself isn't new, Advanced Kiosks says it has come up with a user-friendly way to install and operate it, hopefully attracting more users."Being 'on' all night is a cost after a year. It adds up," Horn said. "A lot of people don't care about that. That's why we're not selling it."Advanced Kiosks has had a wide range of customers, from Harvard University to an airport in the Cayman Islands. Advanced Kiosks' products include custom-built, interactive computer systems designed to meet the specifics requested by the buyer. Horn said his business, which has grown to six employees, sold about 400 machines last year. Horn said the most significant development within the industry that allowed GreenTimer to be possible was the vast improvements made in hibernation, when the computer is still functioning but using minimal power. The technology has evolved to the point that a computer in hibernation mode can snap out of it by a simple touch to the interactive screen. As most anyone who ever operated an early model laptop can attest, "hibernate" mode was a great way to conserve battery power but not so good at saving time. It would often take a full reboot to get the system running again, but Horn said those glitches were resolved years ago. Now, a unit in a hotel lobby can be a resource for a guest looking for local information in the middle of the night. "What they're being used for surprises me some days," Horn said. "There are so many things you can do.".. | 科技 |
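The article does not publish GreenTimer's internals (it is closed-source software for Windows 7), but the schedule-then-hibernate idea it describes can be sketched in a few lines. Everything below (the lobby hours, wattages and electricity rate) is an illustrative assumption, not Advanced Kiosks' actual configuration; it simply shows how a preset weekly schedule plus a hibernate state can land in the $80 to $120 per year savings range the company cites.

```python
from datetime import datetime, time

# Illustrative weekly schedule: lobby open 8:00-18:00 Mon-Fri (an assumption,
# not Advanced Kiosks' defaults).
SCHEDULE = {weekday: (time(8, 0), time(18, 0)) for weekday in range(5)}  # Mon-Fri

def desired_power_state(now: datetime) -> str:
    """Return 'on' during service hours, 'hibernate' otherwise."""
    hours = SCHEDULE.get(now.weekday())
    if hours is None:
        return "hibernate"              # closed all day (weekend)
    open_t, close_t = hours
    return "on" if open_t <= now.time() < close_t else "hibernate"

def annual_savings_usd(active_watts=150, hibernate_watts=5,
                       off_hours_per_day=14, rate_per_kwh=0.15) -> float:
    """Rough yearly savings from hibernating instead of idling overnight."""
    saved_kwh = (active_watts - hibernate_watts) / 1000 * off_hours_per_day * 365
    return saved_kwh * rate_per_kwh

print(desired_power_state(datetime(2013, 4, 22, 21, 30)))  # -> 'hibernate'
print(round(annual_savings_usd(), 2))                      # roughly $111 per year
```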
2016-40/3983/en_head.json.gz/2355 | Testers Wanted for New Tool to Make It Easier to Petition the White House
Hallie Golden
By Joseph Marks
The White House’s We the People petition website is much discussed in government transparency circles but it has only grabbed the public’s attention a few times since its 2011 launch.
That could change with a new project called the We the People Write API that’s being developed now by the White House’s digital staff.
APIs, or application programming interfaces, are tools that allow information to stream automatically from one computer system to any other computers that plug into them. The first phase of this project, the Read API, allowed non-government developers to automatically gather data from the White House petition site, such as where signatures on a particular petition were coming from and how quickly they were coming in.
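As a rough illustration of what "reading" from such an API looks like, the hedged Python sketch below polls a petitions endpoint and prints signature counts. The base URL, path and field names are assumptions for illustration only; the Read API's own documentation defines the actual contract, parameters and any key requirements.

```python
import requests

# Hypothetical/illustrative endpoint; check the official We the People
# developer documentation for the real base URL, paths, and key requirements.
BASE = "https://petitions.whitehouse.gov/api/v1"

def top_petitions(limit=5):
    """Fetch petition metadata from the Read API and report signature counts."""
    resp = requests.get(f"{BASE}/petitions.json", params={"limit": limit}, timeout=10)
    resp.raise_for_status()
    for item in resp.json().get("results", []):
        print(item.get("title"), "-", item.get("signatureCount"), "signatures")

top_petitions()
```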
The Write API will allow developers to send signatures to We the People petitions from outside websites so signers won’t have to visit the White House site or even know much about it.
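Writing signatures would presumably be a POST in the opposite direction. The sketch below is a guess at what a third-party site's submission call could look like; the endpoint path, field names and API-key handling are all hypothetical, since the article describes the Write API before its public release.

```python
import requests

# All names below (endpoint path, payload fields, api_key parameter) are
# illustrative assumptions about a Write API submission, not a documented contract.
def submit_signature(petition_id, first_name, last_name, email, api_key):
    payload = {
        "petition_id": petition_id,
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
    }
    resp = requests.post(
        "https://petitions.whitehouse.gov/api/v1/signatures.json",
        json=payload,
        params={"api_key": api_key},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. a receipt or token while the signature is validated
```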
That means third-party petition sites such as Change.org could host petitions but still take advantage of their own popularity -- Change.org, for instance is significantly more popular than We the People -- and still benefit from We the People’s main selling point: Petitions that receive more than 100,000 signatures in one month are promised a response from White House officials.
The White House invited developers to be part of a beta test for the Write API in a blog post Monday. Sign up starts at 2 p.m. today.
The Write API could also be used by advocacy sites with strong supporter bases. Conservative groups could collect signatures on a petition to delay the Obamacare launch via postings on Breitbart.com for instance. Liberals groups could collect signatures on a petition to raise taxes on the richest Americans via Daily Kos.
Nonpartisan groups could also host petitions on popular sites. A petition urging harsher penalties for animal cruelty, for instance, could be hosted by the American Society for the Prevention of Cruelty to Animals.
We the People surged into public notice when petitions supporting and opposing new gun control measures went viral soon after a gunman shot 20 children and six adults at Sandy Hook Elementary School in Newtown, Conn., in December 2012 and when thousands of petitioners sought permission for their states to secede from the union following President Obama’s November reelection.
Joseph Marks covers government technology issues, social media, Gov 2.0 and global Internet freedom for Nextgov. He previously reported on federal litigation and legal policy for Law360 and on local, state and regional issues for two Midwestern newspapers. He also interned for Congressional Quarterly’s Homeland Security section and the Associated Press’s Jerusalem Bureau. He holds a bachelor’s degree in English from the University of Wisconsin and a master’s in international affairs from Georgetown. | 科技 |
2016-40/3983/en_head.json.gz/2455 | Issues / 2007 / January 2007 / Solid-State Lighting: A Systems Engineering Approach
Feature open
Solid-State Lighting: A Systems Engineering Approach
Ian Ashdown
Solid-state lighting is more than just another light source. It is a lighting system that involves optics, thermal management, electrical power conversion, electronic drive circuitry and network control.
Molded acrylic optical components using refraction and total internal reflection can function as non-imaging collection optics with LEDs.
If you hear the term “solid-state lighting,” you will most likely think of light-emitting diodes (LEDs). Like other light sources, these semiconductor devices convert electrical energy into visible light. From their humble beginnings in the 1960s as red indicator lamps, LEDs are now competitive with incandescent and compact fluorescent lights in terms of their energy efficiency. As their performance continues to improve, solid-state lighting is expected to supplant incandescent and fluorescent lamps in providing “white light” illumination for homes and offices. It will also provide energy-efficient backlighting for LCD televisions and computer displays.
The key component of solid-state lighting is, of course, the LED. These devices exhibit spontaneous light emission (or electroluminescence) when electrical current flows across the junction of p-doped and n-doped semiconductor materials. This process can be extremely efficient, with internal quantum efficiencies close to unity in some cases.
The spectral power distribution of an LED is determined by the bandgap energy of its semiconductor material, which can be tailored through alloy composition to span the spectrum from ultraviolet through infrared. Commercial high-flux LEDs suitable for solid-state lighting are based on aluminum-indium-gallium nitride (AlInGaN) alloys for blue and green emissions (roughly 440 to 550 nm), and aluminum-indium-gallium phosphide (AlInGaP) alloys for amber (585 to 595 nm) and red (615 to 645 nm) emissions. The internal quantum efficiencies of these materials are roughly 50 percent for AlInGaN and close to 100 percent for AlInGaP.
Examples of solid-state lighting fixtures: Tempura track light (left) and LumeLEX track light (right).
Typical AlInGaN light-emitting diode construction.
From an optics perspective, the most interesting feature of these alloys is their high refractive indices: ns ≈ 2.5 for AlInGaN and ns ≈ 3.4 for AlInGaP. When they are encapsulated in transparent media such as epoxy or silicone with a refractive index of ne, the photons generated within the p-n junction can exit the semiconductor only if their angle of incidence is less than the critical angle given by Snell's law: θc = sin⁻¹(ne/ns). Photons with incidence angles outside of this escape cone undergo total internal reflection, and are typically absorbed by the substrate or electrodes—which results in heat generation. This can lead to external quantum efficiencies (EQEs) of only a few percent.
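A quick calculation makes the escape-cone problem concrete. The sketch below applies the critical-angle formula quoted above and estimates the fraction of isotropically emitted photons that fall within a single top-surface cone; the encapsulant index of 1.5 is an assumed typical value for epoxy or silicone, and Fresnel reflection and internal absorption are ignored.

```python
import math

def escape_cone(n_semiconductor, n_encapsulant):
    """Critical angle and the solid-angle fraction of isotropically emitted
    photons that fall inside one top-surface escape cone (Fresnel losses ignored)."""
    theta_c = math.asin(n_encapsulant / n_semiconductor)
    fraction = 0.5 * (1.0 - math.cos(theta_c))   # fraction of a full sphere, one cone
    return math.degrees(theta_c), fraction

for name, n_s in [("AlInGaN", 2.5), ("AlInGaP", 3.4)]:
    theta, frac = escape_cone(n_s, n_encapsulant=1.5)   # epoxy/silicone, assumed ~1.5
    print(f"{name}: critical angle ≈ {theta:.1f}°, escape fraction ≈ {frac:.1%}")
```

Even before Fresnel and absorption losses, only a few percent to roughly ten percent of the generated photons lie inside the cone, consistent with the low EQE figures quoted above.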
There are several approaches to overcoming this problem. One of the most obvious is to use an encapsulant with a high refractive index. However, while there are optically clear polymers with ns ≈ 1.7 available, they need to withstand intense actinic radiation from blue LEDs, transient thermal gradients as high as 3000° C/cm, and temperatures in excess of 200° C. These hostile conditions usually require high-flux LEDs to be encapsulated in a soft silicone gel with a hard acrylic or silicone dome lens.
A more useful approach is to roughen the semiconductor surface with a chemical etchant. This scatters the incident photons and so increases their probability of escape, resulting in radiant power increases of two to three times. (This is similar in principle to moth-eye antireflection coatings with their graded refractive indices.) In a variation of this technique, engineers at OSRAM Opto Semiconductors etch a pattern of hexagonal cones into the AlInGaN surface, then apply a reflective metallic coating before removing the semiconductor film from the substrate and bonding it upside down on another substrate. The metallized film acts as a mirror to redirect photons towards the top surface, with a 70 percent increase in EQE for their commercial OSTAR LEDs.
Another approach is to etch a periodic 2D pattern into the semiconductor surface. This can be done using e-beam lithography, but holographic photolithography is better suited for mass production. Recent work has demonstrated that the LED power can be more than doubled with such techniques.
A closely related approach involves periodic 2D arrays of holes that are spaced on the order of a wavelength of visible light (400 to 700 nm). These photonic lattices can be generated by patterning the substrate or the top surface of the LED die, again using e-beam or photolithography. These lattices can also be fabricated separately and bonded to the die surface. In addition to doubling the device EQE, the emitted light is partially collimated towards the surface normal.
This property has been successfully exploited by Luminus Devices with their PhlatLight LEDs for projection systems and LCD television panel backlighting. Compared with conventional LEDs that exhibit essentially Lambertian luminous intensity distributions, photonic lattice LEDs feature higher luminances (as much as ten times improvement) and require collection optics with smaller numerical apertures.
There are other approaches to increasing light extraction from LED dies, ranging from shaped chips and distributed Bragg reflectors to surface plasmon and microcavity LEDs. These, however, are dependent on LED fabrication techniques, and offer fewer opportunities for innovative optical design solutions.
High-flux photonic lattice LEDs offer partially collimated emissions and high surface luminance for LCD television and projector applications.
Phosphors
White light LEDs, such as those used for cell phone display backlighting and in LED flashlights, consist of a blue AlInGaN LED with a yttrium-aluminum-garnet (YAG) phosphor doped with cerium ions. The LED optically pumps the phosphor, which emits yellowish light. The combination of yellow and blue light appears as “cool” white light to the human eye, with color temperatures ranging from roughly 5000 to 8000 K. Proprietary red-emitting phosphors may be added to produce “warm” white LEDs with color temperatures in the range of 2800 to 3500 K. (The equivalent fluorescent lamp colors are labeled “daylight” and “warm white,” respectively.)
At present, the phosphors are either mixed into the LED encapsulant or conformally coated onto the LED die. This can be problematic for high-flux LEDs, where power levels can exceed 100 W/cm2, and radiant heating can cause phosphor particle temperatures to exceed 250° C. YAG phosphors at this temperature exhibit a 40 percent decrease in efficiency from operation at 100° C, which leads to both lower LED power output and color shifts due to changes in the ratio of yellow to blue light.
Phosphor development for LEDs is an active field, especially the development of quantum dot phosphors with tunable spectral power distributions. Surprisingly, little research has been done on incorporating phosphors into optical components rather than applying them directly to the LEDs.
One such approach is to incorporate the phosphors in a remotely mounted matrix. Without the extreme temperature concerns, this matrix can be a standard optical epoxy. In addition to providing more opportunities for optical design, this configuration can increase the light extraction efficiency by 75 percent or more.
Another approach is to irradiate a phosphor layer applied to a reflective base. While the resultant device may not be as compact as a conventional phosphor-coated LED, improvements in light extraction efficiency of 150 percent have been reported.
Collection optics
LEDs have often been described as point light sources, but this is mostly a matter of scale. While they are much smaller than most incandescent, fluorescent and high-intensity discharge lamps, high-flux LEDs still have luminous areas of 1 mm² or more. With the trend toward ever smaller LED-based projectors and small architectural lighting fixtures, even this small a luminous area complicates the design of efficient collection optics.
The situation is further complicated by the typically non-uniform luminance distribution of most high-flux LEDs, which have electrodes with complex configurations designed to distribute the current flow through the p-n junction as evenly as possible, and gold wire bonds that may obstruct the LED emissions. Rather than model these physical characteristics, it is possible to measure the four-dimensional near-field luminance distribution with an imaging goniophotometer. One example is the Radiant Imaging SIG-300, whose output is a photometrically calibrated rayset that can be used with most optical design software.
A deterministic diffuser can thoroughly mix light from red, green, and blue LEDs to produce a batwing-shaped luminous intensity distribution.
While designing non-imaging collection optics for high-flux LEDs can be difficult, it is an interesting challenge. For example, the small source size makes it practical to use ultraviolet-curable micro-optics for both LEDs and LED arrays. These optical devices are particularly useful in that they can be mass-produced at low cost and incorporated into the LED packages. Various mastering techniques, including direct laser beam writing, gray-scale photolithography, photoresist reflow and single-point diamond turning, allow the designer to use refractive or diffractive elements as required. Examples of such products for consumer devices are being manufactured by Heptagon, but micro-optics for solid-state lighting are in the pipeline.
Architectural applications of solid-state lighting require a range of luminous intensity distributions, from narrowly focused spots to broad washes of light. Some applications, such as indirect lighting of office ceilings, require so-called “batwing” distributions that provide even illumination despite the cosine falloff typical of linear sources. Other applications, such as cove lighting and wall washing, require distributions that are narrow in one direction but broad in the other. These are reasonably easy to achieve with linear fluorescent lamps, but much more difficult with a linear array of point sources.
There are two approaches to this problem. RPC Photonics has developed deterministic diffusers based on microlens arrays that can provide tailored intensity distributions from individual LEDs and LED arrays, including off-axis distributions that are difficult to achieve with conventional optics. Using direct laser beam writing for the masters, these arrays can be mass-produced using injection-molded plastics. Batwing distributions with excellent color mixing from red, green and blue LED arrays can be achieved over very short distances.
The second approach uses holographic diffusers such as those manufactured by Physical Optics Corporation. While these UV-cured embossed plastic diffusers offer fewer opportunities for tailored intensity distributions, they are still well-suited for linear LED arrays. In particular, they can produce batwing intensity distributions that are ideally suited for indirect lighting applications.
For applications requiring more precise control, such as projection systems, the Fraunhofer Institute for Applied Optics and Precision Engineering has prototyped injection-molded PMMA optical components that produce intensity distributions from LEDs with good beam homogenization, sharp cutoff characteristics and excellent flux utilization.
A non-rotationally symmetric tapered mixing rod can thoroughly mix light from red, green and blue LEDs.
In addition to using phosphor-coated blue LEDs, white light can be generated by blending the emissions of red, green and blue LEDs. This approach has several advantages in that temperature and age-related color shifts can be eliminated, and the user can specify the desired color temperature—something that has never before been possible in architectural lighting without the use of energy-inefficient color filter wheels.
Unfortunately, thoroughly mixing the light from an array of colored LEDs can be exceedingly difficult. The conventional approach has been to use light pipes, but these typically require unreasonably long lengths to provide sufficient mixing. This problem has recently been addressed by Optical Research Associates with a novel light pipe design that is both conveniently short (about 18 mm) and easy to manufacture.
Photometry and colorimetry
The quasimonochromatic emissions of LED sources—and this includes phosphor-coated blue LEDs because of the narrow peak in the blue region of the spectrum—have led to considerable difficulties in obtaining accurate measurements using photometers with spectral responsivities that emulate human vision. Lab-quality spectroradiometers are often required for precise measurements of LED and LED array photometric quantities.
There are currently a number of technical committees involved in the development of national and international standards for solid-state lighting photometry and spectro-radiometry, including the Commission Internationale de l’Eclairage (CIE), the Illuminating Engineering Society of North America (IESNA) and the National Electrical Manufacturers Association. As one example, the IESNA is finalizing an ANSI (American National Standards Institute) standard for the electrical and photometric measurements of solid-state lighting products. There is already a technical standard on colorimetry published by the CIE that is germane to solid-state lighting.
Lighting designers are particularly interested in developing solid-state lighting products that can render colors as we would perceive them under incandescent or daylight light sources. Fluorescent lamps are characterized according to their CIE Color Rendering Index, but this metric has been shown to correlate poorly with how we perceive colors under solid-state illumination. The National Institute of Standards and Technology (NIST) has proposed a Color Quality Scale that addresses this issue.
LCD display backlighting
In addition to providing general illumination for architectural applications, high-flux LEDs are being used to provide backlighting for LCD televisions and computer displays. Efficient coupling of multicolor LED emissions into flat panel light guides to produce even luminance distributions is a challenging design problem. Whereas there are successful design solutions for cold cathode fluorescent lamps, it is considerably more difficult to mix colors from a linear array of red, green and blue LEDs. There is likely much more work that can be done in this area.
Another challenging design problem involves the need to maintain the correct intensity and color balance of the mixed LED illumination under varying operating conditions. Broadcast television standards have strict requirements for the display color gamut—which translates into strict limits on the LED peak wavelengths and consequent low manufacturing yields. This problem can be addressed by taking advantage of the peculiarities of human color vision, but at increased complexity and manufacturing costs.
An even more challenging problem is the design of an inexpensive tristimulus colorimeter for the backlighting control system. While there are integrated circuits available from companies like TAOS and Hamamatsu Photonics, their spectral responsivities are designed to approximate the CIE color matching functions for human color vision. This makes them unsuitable for LED backlighting control, as it becomes difficult to distinguish between changes in LED intensity and peak wavelength shifts. It would be much better to have color sensors with flat spectral responses that are band-limited to the spectral power distributions of the red, green and blue LEDs. This can be accomplished with thin-film interference filters, but the high manufacturing costs and need to collimate the incident light may make it an unattractive solution.
To create a practical colorimeter, designers must take into account semiconductor packaging, optical filter design, color vision theory, electronic feedback design, LED spectroradiometry and device characterization. In short, nothing less than a systems engineering approach will suffice.
Many problems, many solutions
Although some basic design problems in solid state lighting have been addressed, there is much more work to be done in refining the solutions such that solid-state lighting becomes commercially competitive with today’s incandescent and fluorescent lighting systems. There is also the strong possibility that innovative solutions involving optics, thermal management, electrical power conversion, electronic drive circuitry and network control are still to be discovered.
Ian Ashdown is a senior research scientist with TIR Systems Limited.
>> A. Zukaukas et al. Introduction to Solid-State Lighting, New York, N.Y., Wiley-Interscience (2002).
>> J. Cho et al. “Recent developments in patterned structure light-emitting diodes,” Proc. SPIE, 5941, 11-8 (2005).
>> M. Zoorob et al. “Photonic quasicrystals boost LED emission characteristics,” LEDs Magazine, 21-4 (August 2006).
>> H. Rudmann et al. “Design and fabrication technologies for ultraviolet-replicated micro-optics,” Opt. Eng. 43(11): 2575-82 (2004).
>> D.J. Schertler et al. “LED luminaire with controlled light distribution,” Proc. SPIE 6337, 63371E (2006).
>> P. Schreiber et al. “Microoptics for homogeneous LED illumination,” Proc. SPIE 6196, 61960P (2006).
>> S. Robinson et al. “Polychromatic optical feedback: control, stability, and dimming,” Proc. SPIE 6337, 633714 (2006).
>> W. J. Cassarly et al. “Non-rotationally symmetric mixing rods,” Proc. SPIE 6342, 63420Q (2006).
>> D. Nesterenko et al. “Design and analysis of tapered waveguides as collimators for LED backlighting,” SID 05 Digest, 1388-91 (2005).
>> I. Ashdown et al. “Binning and filtering: the six-color solution,” Proc. SPIE 6337, 63371A.
Publish Date: 01 January 2007 Add a Comment
Please enable JavaScript to view the comments powered by Disqus. Download PDF | 科技 |
2016-40/3983/en_head.json.gz/2553 | Imagination Nebula - Albert Einstein Quote Huge Motivational Poster - 55x39
Vincent Van Gogh The Starry Night, Huge Art Poster Print
Vincent Van Gogh Turquoise Almond Branches in Bloom, San Remy Art Poster Print
New York City Times Square from Above Huge Art Print Poster
Imagination Keep Your Eyes on the Stars and Your Feet on the Ground Art Print Poster
Title: Imagination Nebula - Albert Einstein Quote Huge Motivational Poster
Size: 55 x 39 inches (99 x 140 cms) SKU: 837727 Our Price: $16.99
Details: This huge poster shows a picture of a colorful nebula. At the bottom it has a quote from Albert Einstein which says: Imagination. Imagination is more important than knowledge. Knowledge is limited, imagination encircles the world. Albert Einstein (March 14, 1879 – April 18, 1955) was a theoretical physicist who is widely regarded as one of the most influential scientists of all time. His many contributions to physics include the special and general theories of relativity, the founding of relativistic cosmology, the photon theory and wave-particle duality, and the quantum theory of atomic motion in solids. Einstein is best known for his theories of special relativity and general relativity. He received the 1921 Nobel Prize in Physics. He is often regarded as the father of modern physics. | 科技 |
2016-40/3983/en_head.json.gz/2591 | Quantum Day 21st Century Science, Technology, and Medical News
Neuroprosthetic Algorithm Leads To Better Thought Controlled Objects Such As Bionic Arms and Legs
Moving objects simply by thinking about them is the domain of a branch of science called neuroprosthetics. A new algorithm called ReFIT has now improved the speed and accuracy of neuroprosthetic devices and could lead to the development of better bionic arms and legs.
The motor system of the human body, which is responsible for movement, is controlled by the brain. Hand and body movements first arise as thought impulses that travel down the nervous system, which in turn activates the specific muscles to initiate movement.
Basically, all movements by the body are thought controlled. Picking up a spoon, moving and clicking a computer mouse, or even typing on the keyboard are movements that primarily originate as thought patterns in the brain.
Using this concept, a branch of science called neuroprosthesis was developed. It is a combination of neuroscience and biomedical engineering . The foundation of neuroprosthesis is the development of devices that can replace or even enhance functions of the body that was lost during disease or injury.
Bionic Arms and Legs
It is important to note that even when a patient has lost control of his limbs, the brain and nervous system can still generate the signals needed to initiate movement. It is this signal that a neuroprosthesis uses to move the artificial arms or legs.
It is similar to the concept of the Six Million Dollar Man. Steve Austin's bionic arms and legs are artificial robotic limbs but are still controlled by Austin's brain.
Video: Brain-Computer Interfaces (Krishna Shenoy, Stanford University)
The development of these artificial neural interfaces is at the core of neuroprosthesis. The National Institute of Neurological Disorders and Stroke define these as "artificial extensions to the body that restore or supplement function of the nervous system lost during disease or injury, and implantable neural stimulators that provide therapy. Neural interfaces are used to allow disabled individuals the ability to control their own bodies and lead fuller and more productive lives..."
Neural interfaces also utilize computer programs to interpret and process the brain signals. What is of primary importance to these programs is the algorithm. The algorithm is the part of the computer system that decides what exactly the brain wants to do with the interface.
Going Beyond The Human Body
If brain signals can be used to move artificial limbs and extensions, it also can be utilized to move other things that are not attached to the body.
In the movie (and the book) Firefox, the featured Soviet fighter had a weapons system controlled by the thoughts of the pilot. A neural link is established with the plane through the pilot's helmet.
Just like life imitating art, scientists have developed neural interfaces that not only controls limbs but also other extensions such as cursor movements of a computer.
Developing Better Algorithms To Improve Speed and Accuracy
In recent years, neuroscientists and neuroengineers working in prosthetics have begun to develop brain-implantable sensors that can measure signals from individual neurons, and after passing those signals through a mathematical decode algorithm, can use them to control computer cursors with thoughts. The work is part of a field known as neural prosthetics.
A team of Stanford researchers have now developed an algorithm, known as ReFIT, that vastly improves the speed and accuracy of neural prosthetics that control computer cursors. The results are to be published November 18 in the journal Nature Neuroscience in a paper by Krishna Shenoy, a professor of electrical engineering, bioengineering and neurobiology at Stanford, and a team led by research associate Dr. Vikash Gilja and bioengineering doctoral candidate Paul Nuyujukian.
In side-by-side demonstrations with rhesus monkeys, cursors controlled by the ReFIT algorithm doubled the performance of existing systems and approached performance of the real arm. Better yet, more than four years after implantation, the new system is still going strong, while previous systems have seen a steady decline in performance over time.
"These findings could lead to greatly improved prosthetic system performance and robustness in paralyzed people, which we are actively pursuing as part of the FDA Phase-I BrainGate2 clinical trial here at Stanford," said Shenoy.
Sensing mental movement in real time
The system relies on a silicon chip implanted into the brain, which records "action potentials" in neural activity from an array of electrode sensors and sends data to a computer. The frequency with which action potentials are generated provides the computer key information about the direction and speed of the user's intended movement.
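Conceptually, the decode step is a mapping from binned firing rates to a cursor velocity that is then integrated into a position. The toy sketch below uses a plain linear map with random placeholder weights and simulated Poisson spike counts; the actual system learns its parameters from training data and uses a more sophisticated decoder, so this only shows the shape of the computation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, dt = 96, 0.05       # 96-channel array and 50 ms bins (typical, but assumed here)

# Toy linear decoder: firing rates -> 2-D cursor velocity. The real system
# learns these weights; here they are random placeholders.
W = rng.normal(scale=0.1, size=(2, n_channels))
baseline = rng.uniform(5, 20, size=n_channels)           # spikes/s at rest

def decode_step(spike_counts, position):
    rates = spike_counts / dt
    velocity = W @ (rates - baseline)                     # toy velocity units
    return position + velocity * dt, velocity

pos = np.zeros(2)
for _ in range(20):                                       # one second of control
    counts = rng.poisson(baseline * dt)                   # simulated spike counts
    pos, vel = decode_step(counts, pos)
print("final cursor position:", pos.round(2))
```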
The ReFIT algorithm that decodes these signals represents a departure from earlier models. In most neural prosthetics research, scientists have recorded brain activity while the subject moves or imagines moving an arm, analyzing the data after the fact. "Quite a bit of the work in neural prosthetics has focused on this sort of offline reconstruction," said Gilja, the first author of the paper.
The Stanford team wanted to understand how the system worked "online," under closed-loop control conditions in which the computer analyzes and implements visual feedback gathered in real time as the monkey neurally controls the cursor toward an onscreen target.
The system is able to make adjustments on the fly while guiding the cursor to a target, just as a hand and eye would work in tandem to move a mouse-cursor onto an icon on a computer desktop. If the cursor were straying too far to the left, for instance, the user likely adjusts their imagined movements to redirect the cursor to the right. The team designed the system to learn from the user's corrective movements, allowing the cursor to move more precisely than it could in earlier prosthetics.
To test the new system, the team gave monkeys the task of mentally directing a cursor to a target — an onscreen dot — and holding the cursor there for half a second. ReFIT performed vastly better than previous technology in terms of both speed and accuracy. The path of the cursor from the starting point to the target was straighter and it reached the target twice as quickly as earlier systems, achieving 75 to 85 percent of the speed of real arms.
Video: Moving A Cursor Using Thought (Neural Prosthesis)
"This paper reports very exciting innovations in closed-loop decoding for brain-machine interfaces. These innovations should lead to a significant boost in the control of neuroprosthetic devices and increase the clinical viability of this technology," said Jose Carmena, associate professor of electrical engineering and neuroscience at the University of California Berkeley.
A smarter algorithm
Critical to ReFIT's time-to-target improvement was its superior ability to stop the cursor. While the old model's cursor reached the target almost as fast as ReFIT, it often overshot the destination, requiring additional time and multiple passes to hold the target.
The key to this efficiency was in the step-by-step calculation that transforms electrical signals from the brain into movements of the cursor onscreen. The team had a unique way of "training" the algorithm about movement. When the monkey used his real arm to move the cursor, the computer used signals from the implant to match the arm movements with neural activity. Next, the monkey simply thought about moving the cursor, and the computer translated that neural activity into onscreen movement of the cursor. The team then used the monkey's brain activity to refine their algorithm, increasing its accuracy.
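The two-stage training described here can be sketched as: fit a decoder on arm-controlled data, then run it under neural control and refit it against corrected velocity estimates. In the sketch below the data are synthetic, the decoder is a bare least-squares map rather than the Kalman-filter-style decoder used in this line of work, and the "point the decoded velocity at the target" rule is a simplified stand-in for the intention-estimation step; it is meant only to illustrate the refitting idea.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_channels = 500, 96

def fit_decoder(rates, velocities):
    """Ordinary least squares: velocity ≈ rates @ W (bias column folded in)."""
    X = np.hstack([rates, np.ones((len(rates), 1))])
    W, *_ = np.linalg.lstsq(X, velocities, rcond=None)
    return W

def predict(W, rates):
    X = np.hstack([rates, np.ones((len(rates), 1))])
    return X @ W

# Stage 1: fit on arm-controlled reaches (synthetic stand-in data).
true_map = rng.normal(size=(n_channels, 2))
rates = rng.poisson(10, size=(n_samples, n_channels)).astype(float)
arm_vel = rates @ true_map * 0.01 + rng.normal(scale=0.5, size=(n_samples, 2))
W1 = fit_decoder(rates, arm_vel)

# Stage 2: during neural control the animal is always trying to reach the target,
# so re-aim each decoded velocity at the target (keeping its speed) and fit again.
cursor = rng.normal(size=(n_samples, 2))
target = np.zeros(2)
decoded = predict(W1, rates)
direction = target - cursor
direction /= np.linalg.norm(direction, axis=1, keepdims=True)
intended = np.linalg.norm(decoded, axis=1, keepdims=True) * direction
W2 = fit_decoder(rates, intended)        # the refined (ReFIT-style) decoder
```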
These diagrams trace the accuracy of various trial scenarios of the ReFIT algorithm developed at Stanford. On the left is a real arm. In the middle, the monkey uses ReFIT, and on the right, the monkey uses the old algorithm. Note the tendency of the old algorithm to overshoot the target and, conversely, how the ReFIT traces closely resemble those of the real arm.
Credit: Vikash Gilja, Stanford University
The team introduced a second innovation in the way ReFIT encodes information about the position and velocity of the cursor. Gilja said that previous algorithms could interpret neural signals about either the cursor's position or its velocity, but not both at once. ReFIT can do both, resulting in faster, cleaner movements of the cursor
An engineering eye
Early research in neural prosthetics had the goal of understanding the brain and its systems more thoroughly, Gilja said, but he and his team wanted to build on this approach by taking a more pragmatic engineering perspective. "The core engineering goal is to achieve highest possible performance and robustness for a potential clinical device, " he said.
To create such a responsive system, the team decided to abandon one of the traditional methods in neural prosthetics. Much of the existing research in this field has focused on differentiating among individual neurons in the brain. Importantly, such a detailed approach has allowed neuroscientists to create a detailed understanding of the individual neurons that control arm movement.
The individual neuron approach has its drawbacks, Gilja said. "From an engineering perspective, the process of isolating single neurons is difficult, due to minute physical movements between the electrode and nearby neurons, making it error-prone," he said. ReFIT focuses on small groups of neurons instead of single neurons.
By abandoning the single-neuron approach, the team also reaped a surprising benefit: performance longevity. Neural implant systems that are fine-tuned to specific neurons degrade over time. It is a common belief in the field that after six months to a year, they can no longer accurately interpret the brain's intended movement. Gilja said the Stanford system is working very well more than four years later.
"Despite great progress in brain-computer interfaces to control the movement of devices such as prosthetic limbs, we've been left so far with halting, jerky, Etch-a-Sketch-like movements. Dr. Shenoy's study is a big step toward clinically useful brain-machine technology that have faster, smoother, more natural movements," said James Gnadt, PhD, a program director in Systems and Cognitive Neuroscience at the National Institute of Neurological Disorders and Stroke, part of the National Institutes of Health.
For the time being, the team has been focused on improving cursor movement rather than the creation of robotic limbs, but that is not out of the question, Gilja said. Near term, precise, accurate control of a cursor is a simplified task with enormous value for paralyzed people.
"We think we have a good chance of giving them something very useful," he said. The team is now translating these innovations to paralyzed people as part of a clinical trial.
The work was carried out at the Stanford School of Engineering and published in Nature Neuroscience.
Scientist: Climate change impacting Tahoe clarity
Dr. Charles Goldman
About Charles Goldman UC Davis research at Lake Tahoe began with Goldman who, in 1959, formed the Tahoe Research Group and began regularly monitoring Lake Tahoe, according to the college. Goldman has since successfully combined effective research and social action with his pioneering studies of lake eutrophication (the dense growth of algae and other organisms). More than 40 years of extensive, internationally renowned, long-term investigations provided clear evidence for the onset of cultural eutrophication in Lake Tahoe and have served as the underlying basis for nearly all major policy decisions regarding water quality in the Tahoe Basin, including exportation of sewage and solid waste, strict control on building, installation of major erosion control projects, protection of wetlands, establishment of water quality thresholds, control of nonpoint source pollution, controls on dredging, and many others. Related Media INCLINE VILLAGE, Nev. — Climate change is helping warm Lake Tahoe’s cobalt blue waters, which could complicate efforts to maintain its famed clarity, said one of America’s most respected scientists and an expert on the Sierra lake. Over the last 30 years, Lake Tahoe has warmed about a 1 degree Fahrenheit, a result of the lake not releasing as much heat back into the atmosphere due to carbon dioxide blanketing, said renowned limnologist Charles Goldman. “These are serious times, and this carbon dioxide question is just enormously serious,” Goldman said during his well-attended “The Impact of Climate Change and Global Warming on Inland Waters of the World” presentation Friday at the Tahoe Center for Environmental Sciences building in Incline Village. “... There is probably no problem that has faced humanity to the degree that this has since the ice ages.” According to a UC Davis 2010 report about the effects of climate change on Lake Tahoe in the 21st century, Secchi disk depth transparency has declined by 10 meters since 1968, while the rate of algae growth measured by carbon-14 is increasing at about 5 percent annually. Clarity is measured by the depth at which 10-inch white Secchi disks are no longer visible. Yet there are occasional years when lake clarity improves. According to UC Davis’ 2012 State of the Lake Report, the annual average lake clarity showed a “marked improvement” from 2010, increasing by 4.5 feet. “Such large year-to-year functions are not unusual in the long term record, and are part of the reason we advocate focusing on the long term changes rather than annual or even short term changes when trying to evaluate effectiveness of management programs,” the report stated. The report indicates winter clarity has improved over the past decade, which may be due to recent efforts to reduce urban stormwater flows to the lake, but more studies are needed to confirm it. Still, summer clarity continues to decline, with 2011 marking the second-worse summer on record, and the cause may be related to impacts of climate change, the report stated. Lake Tahoe mixes to the bottom about every fourth year, Goldman said Friday, bringing nutrients up to the surface that promote algae growth. 
This deep mixing also takes oxygen from the surface and distributes it throughout the lake to support aquatic life, according to the 2008 Science Daily article, “Global warming could radically change Lake Tahoe in 10 years.” In that article, Geoffrey Schladow, director of TERC, said if greenhouse-gas emissions continue at current levels, deep mixing could become less frequent and even non-existent. That would impact not only aquatic life, he said, but could cause currently locked up phosphorous in the lake-floor sediments to be released, which would fuel algae growth at the lake’s surface. When Goldman came to Lake Tahoe in the late 1950s, he said the nitrogen to phosphorus ratio was 1:1, which wasn’t sufficient for normal plant growth. “That’s why Tahoe was so beautifully clear,” he said. Today, that ratio has moved to 40:1 because of so much nitrogen in the atmosphere, Goldman said. While lake clarity is impacted by the influx of phosphorus and nitrogen, it’s also affected by fine sediment particles, which come from land disturbance and urbanization, according to the 2010 UC Davis report. Political decisions can also have a large impact, Goldman said, evident of global lake decisions such as the one to divert river inflows into the Aral Sea for irrigation purposes, causing one of the largest lakes in the world to practically dry up. “Keep in mind that the younger generation are really going to inherit part of the mess we left them, and it’s really up to us to inspire their environmental enthusiasm,” Goldman said, in the closing of his presentation. Join the Conversation
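The emphasis on long-term change over single-year swings can be illustrated with a short calculation on made-up Secchi-depth readings (the numbers below are invented for the sketch; the real annual values come from the UC Davis monitoring program):

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1968, 2012)
# Invented annual average Secchi depths (metres): slow decline plus noise.
secchi = 31.0 - 0.1 * (years - 1968) + rng.normal(scale=1.5, size=years.size)

# A single year can look like an improvement while the long-term slope still
# points down, which is why the researchers read the trend, not one season.
slope, intercept = np.polyfit(years, secchi, deg=1)
print(f"long-term trend: {slope:+.2f} m per year")
print(f"last year-to-year change: {secchi[-1] - secchi[-2]:+.2f} m")
```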
MOD Report: End Of The Line For British UFOs? Not So Fast Says Nick Pope
By Thomas Horn | raidersnewsupdate.com
RNU.com (Raiders News Update) – A 400-page report, entitled "Unidentified Aerial Phenomena in the UK" and marked "Secret: UK Eyes Only", is being released under the Freedom of Information Act by the U.K. MoD. It concludes that "No evidence exists to suggest that the phenomena seen are hostile or under any type of control, other than that of natural physical forces." It then adds: "There is no evidence that 'solid' objects exist which could cause a collision hazard." Nick Pope, who worked in this department for the MoD from 1991 to 1994, says he doesn't want to be critical, but he disagrees with some of the conclusions, "particularly where they relate to subjects such as electrically-charged atmospheric plasmas or temporal lobe disorders, where there is no scientific consensus."
When I asked Pope if there could be other dissenters from the official investigation, he responded, "I don't think you should expect a 'minority report', but as ever, the devil's in the details." He further reminded that whereas he was involved with the MoD's investigation into Unidentified Aerial Phenomena and took part in early discussions that led to the commissioning of the study, he had left the MoD's UFO Project by the time this report was made.
This is not the first time a report by the MoD dismissed the idea of UFOs. In 1950, Sir Henry Tizard commissioned a study to assess the reality of UFOs. The Flying Saucer Working Party scientifically studied UFOs and similarly concluded that they were ordinary objects, natural phenomena, planetoids, or hoaxes. But in 1952, a series of high-profile UFOs were tracked on radar and seen by RAF pilots, and the investigation was reopened. To date, Britain's most famous UFO sighting is the Rendlesham Forest incident, where, in 1980, military personnel witnessed what could only be considered a UFO event. The investigation turned up credible witnesses, strange animal behavior and radiation readings in three holes in the ground which formed an equilateral triangle, as well as other unexplainable elements.
Friedman Thinks Ignorance is Bliss in the UK Report
I also wanted Stanton Friedman's take on the soon-to-be-released MoD report. He provided particular amusement from the initial statements that "Until today the Condon Report produced by the University of Colorado for the US Air Force in 1969 was believed to be the only substantial government funded study of the UFO phenomenon published in English," to which Stanton countered, "The Condon report covered 117 cases of which 30% could not be identified. Project Blue Book Special Report 14 was funded by the US government and covered 3201 cases of which 21.5% could not be identified, completely separate from the almost 10% for which there was insufficient information. It contains over 240 tables of data, a statistical comparison between UNKNOWNS and Knowns, which showed the probability that the UNKNOWNS were just missed KNOWNS was less than 1%. There were also Blue Book/Grudge reports 1-12, at least 5 Theses re[garding] UFOs at the Air War College, the Congressional Hearings of 1968, etc. Incidentally, condign [deserved; adequate] is a real word, not just a random choice of letters. Should be a barrel of laughs as the nonsense of the noisy negativists almost always is. It also appears that no TOP SECRET data was included or TOP SECRET Code Word info. If UFOs were responsible for the crash or disappearance of British aircraft, presumably such data would have been withheld."
Like Pope, Stanton, a nuclear physicist, is also critical of the Plasma explanations. "One wonders," he says, "if the authors were aware of Jim McDonald's detailed scientific critique of Phil Klass' nonsense about plasmas explaining UFOs."
Elsewhere, a flurry of UFO-related events is sure to turn up additional heat on the MoD's report, and may even question the timing of the release. Some scheduled events include:
A Toronto Press Conference on UFO Disclosure
UN Affiliate to Present Case For Extraterrestrial Encounters
McMinnville UFO Festival & Jesse Marcel Disclosure
Article from: http://www.raidersnewsupdate.com/MoD.htm
ORNL Explores Proteins In Yellowstone Bacteria For Biofuel Inspiration
Studies of bacteria first found in Yellowstone's hot springs are furthering efforts at the Department of Energy's BioEnergy Science Center toward commercially viable ethanol production from crops such as switchgrass.
The current production of ethanol relies on the use of expensive enzymes that break down complex plant materials to yield sugars that are fermented into ethanol. One suggested cheaper alternative is consolidated bioprocessing, a streamlined process that uses microorganisms to break down the resistant biomass.
"Consolidated bioprocessing is like a one-pot mix," said Oak Ridge National Laboratory's Richard Giannone, coauthor on a BESC proteomics study that looked at one microorganism candidate. "You want to throw plant material into a pot with the microorganism and allow it to degrade the material and produce ethanol at the same time."
The BESC study focused on Caldicellulosiruptor obsidiansis, a naturally occurring bacterium discovered by BESC scientists in a Yellowstone National Park hot spring. The microorganism, which thrives at extremely high temperatures, breaks down organic material such as sticks and leaves in its natural environment, and scientists hope to transfer this capability to biofuel production tanks.
In a paper featured on the cover of the Journal of Proteome Research, the BESC team conducted a comparative analysis of proteins from C. obsidiansis grown on four different carbon sources, ranging from a simple sugar to more complex substrates such as pure cellulose and finally to switchgrass. The succession of carbon substrates allowed researchers to compare how the organism processes increasingly complex materials.
"This progression helps us look at how proteins change given different carbon substrates," Giannone said. "One of the goals is to identify new proteins that we haven't seen before. If an unknown protein doesn't show up on the simple sugars or cellulose, but it shows up on the switchgrass, then we can say something about that gene or protein–that it responds to something the switchgrass is providing."
The researchers found that growth on switchgrass prompted the organism to express an expanded set of proteins that deal specifically with the hemicellulose content of the plant, including hemicellulose-targeted glycosidases and extracellular solute-binding proteins. Acting together, these two sub-systems work to break down the plant material and import the resulting sugars into the cell. The scientists went on to show that once inside the cell, the organism "switches on" certain enzymes involved in pentose metabolism in order to further process these hemicellulose-derived sugars into usable energy.
"By comparing how C. obsidiansis reacted to switchgrass, relative to pure cellulose, we were able to pinpoint the specific proteins and enzymes that are important to plant cell wall deconstruction–a major roadblock to the production of advanced biofuels," Giannone said.
The team's measurement of the full complement and abundance of C. obsidiansis proteins, called proteomics, can now be combined with other technologies such as genomics, transcriptomics and metabolomics in order to obtain a 360-degree, system-wide view of the organism. Instead of studying discrete proteins, these systems biology-type approaches provide more thorough insight into the day-to-day operations of bioenergy-relevant organisms and thus better equip researchers to make recommendations about their use in ethanol production.
"One goal for us at the BioEnergy Science Center is to take these 'omic' technologies and integrate the data so we can draw definitive conclusions about a system," Giannone said.
Coauthors on the paper are Hamburg University of Technology's Adriane Lochner and Garabed Antranikian, and ORNL's Martin Keller, David Graham and Robert Hettich. The full publication is available here: http://pubs.acs.org/doi/abs/10.1021/pr200536j.
BESC is one of three DOE Bioenergy Research Centers established by the DOE's Office of Science in 2007. The centers support multidisciplinary, multi-institutional research teams pursuing the fundamental scientific breakthroughs needed to make production of cellulosic biofuels, or biofuels from nonfood plant fiber, cost-effective on a national scale. The three centers are coordinated at ORNL, Lawrence Berkeley National Laboratory and the University of Wisconsin-Madison in partnership with Michigan State University.
ORNL is managed by UT-Battelle for the Department of Energy's Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov.
New Solution For Monitoring Cryptic Species
Ecologists have at last worked out a way of using recordings of birdsong to accurately measure the size of bird populations. This is the first time sound recordings from a microphone array have been translated into accurate estimates of bird species' populations. Because the new technique, reported in the British Ecological Society's Journal of Applied Ecology, will also work with whale song, it could lead to a major advance in our ability to monitor whale and dolphin numbers.
Developed by Deanna Dawson of the US Geological Survey and Murray Efford of the University of Otago, New Zealand, the technique is an innovative combination of sound recording with spatially explicit capture-recapture (SECR), a new version of one of ecologists' oldest tools for monitoring animal populations.
Birds communicate by singing or calling, and biologists have long counted these cues to get an index of bird abundance. But it is much harder to work out the actual density of a bird population because existing methods need observers to measure either the distance to each bird, or whether they are within a set distance from the observer. This is straightforward if birds are seen, but difficult when birds are heard but not seen.
According to Dawson: "We devised a way to estimate population density of birds or other animals that vocalize by combining sound information from several microphones. A sound spreading through a forest or other habitat leaves a 'footprint'. The size of the footprint depends on how quickly the sound attenuates. Mathematically, there is a unique combination of population density and attenuation rate that best matches the number and 'size' of the recorded sounds. We used computer methods to find the best match, and thereby estimate density."
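Dawson's description reads like a model-fitting problem: for each candidate pair of population density and attenuation rate, predict how many songs the array should record and how loud they should be on average, then keep the pair that best matches the observations. The grid search below is only a schematic of that logic, with invented observations; it is not the SECR likelihood the authors actually use:

```python
import numpy as np

def predict(density, attenuation_db_per_m, source_db=90.0, threshold_db=50.0):
    """Crude predictions for one candidate (density, attenuation) pair:
    how many songs the array should detect and their mean received level."""
    audible_radius_m = (source_db - threshold_db) / attenuation_db_per_m
    area_ha = np.pi * audible_radius_m ** 2 / 10_000.0
    n_detected = density * area_ha
    # Mean level for birds spread uniformly over the audible disc
    # (mean distance over a disc of radius R is 2R/3).
    mean_level = source_db - attenuation_db_per_m * (2.0 / 3.0) * audible_radius_m
    return n_detected, mean_level

obs_count, obs_level = 14, 68.0   # invented observations from the array

best = None
for density in np.linspace(0.1, 5.0, 50):            # singing males per hectare
    for atten in np.linspace(0.05, 0.5, 46):          # dB lost per metre
        n, level = predict(density, atten)
        err = (n - obs_count) ** 2 + (level - obs_level) ** 2
        if best is None or err < best[0]:
            best = (err, density, atten)

print(f"best match: {best[1]:.2f} males/ha, attenuation {best[2]:.2f} dB/m")
```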
Dawson and Efford developed the method by recording the ovenbird "“ a warbler more often heard than seen "“ in deciduous forest at the Patuxent Research Refuge near Laurel, Maryland, USA. They rigged up four microphones close to the ground in a square with 21 meter-long sides. Over five days, they moved the microphones to 75 different points across their study area and recorded ovenbirds singing.
They chose the ovenbird as the species from which to develop the method because of its concise, distinctive song and because the males sing from the lower layers of the forest.
The new acoustic technique gives a more accurate estimate of bird numbers than using nets to capture birds, which can be stressful for the birds as well as time consuming for the researchers.
As well as helping assess populations of cryptic bird species such as the ovenbird, the new technique might be applied to measuring hard-to-reach populations of marine mammals, such as whales and dolphins. Developing ways of estimating whale and dolphin numbers acoustically is seen as critical for understanding these species' populations.
Recording the sounds has other benefits, too. "Sound intensity and other characteristics can be measured from the spectrogram – the graph of the sounds – to improve density estimates. Archiving the sounds also makes it possible to re-examine them, or to extract additional information as analytical methods evolve," says Dawson.
Deanna K. Dawson and Murray G. Efford (2009). Bird population density estimated from acoustic signals. Journal of Applied Ecology, doi: 10.1111/j.1365-2664.2009.01731.x, published online 27 November 2009.
SETI Institute Projects and Programs (Listed Chronologically)
FIRST Monday, April 06 2015 - 9:57 pm, PDT A new instrument that combines two high-resolution telescope techniques – adaptive optics and interferometry – has for the first time distinguished and studied the individual stars in a nearby binary star system, demonstrating promise for eventually picking out planets around other stars.
http://www.seti.org/seti-institute/press-release/novel-instrument-able-probe-close-binary-stars-may-one-day-image CAMS Monday, April 06 2015 - 9:55 pm, PDT CAMS is an automated video surveillance of the night sky in search of meteor showers to validate the IAU Working List of Meteor Showers. http://cams.seti.org/ Gemini Planet Imager Monday, April 06 2015 - 9:54 pm, PDT The instrument, called the Gemini Planet Imager (GPI), was designed, built, and optimized for imaging faint planets next to bright stars and probing their atmospheres, and studying dusty disks around young stars. It is the most advanced such instrument to be deployed on one of the world’s biggest telescopes – the 8-meter Gemini South telescope in Chile. http://www.seti.org/seti-institute/press-release/worlds-most-powerful-planet-finder-gemini-planet-imager-first-light-images Planetary Lake Lander Monday, April 06 2015 - 9:48 pm, PDT The Planetary Lake Lander project that will develop an adaptive probe as well as exploration strategies to explore the lakes of Titan, while monitoring the impact of deglaciation on terrestrial lake habitat and biodiversity in the Chilean Andes. In turn, results from this investigation are expected to provide insights into habitability and life potential on Mars during similar geological periods when glaciers were still present at the surface. http://pll.seti.org/?page_id=5 MID-IR SPECTROSCOPY FOR SMALL SATELLITES (MIRSSS) Thursday, November 06 2014 - 9:28 am, PST We propose a feasibility study for a small spaceborne mid-Infrared telescope/spectrometer designed to detect and characterize the delivery and survival of extraterrestrial organic matter to the atmosphere (and hence to the Earth). This instrument would detect the dilute levels of organics present in the upper atmosphere by using astronomical mid-infrared light sources along a viewing path that grazes the Earth atmosphere at an altitude of 80 to 100 km (where meteoritic debris is known to accumulate). This would provide a very long path length and allow, for the first time, the detection and characterization of the material that meteors are delivering by the detection of the mid-IR absorbances in the molecular bands. This would provide the first measurement of the amounts of delivery and the types of compounds being delivered and allow characterization of the amounts/types of prebiotic molecules that would become available for the origin of life. It would also provide a better understanding of the organics being formed in solar nebula. Lipids and the Origin and Evolution of Life Thursday, November 06 2014 - 9:26 am, PST This work will support experiments designed to better understand the how lipids relate to the origin of life. It looks at events leading up to the origin of life (prebiotic chemistry) and at the subsequent evolution of life after the last common ancestor, LUCA (and hence, at how one might extrapolate back from current life toward the LUCA). This coop will include work on the formation of amphiphilic vesicles and the role they may have played in supporting and containing chemistry and the origin of life. It also includes studying the use of lipids in microorganisms and how such usage evolved and how that information can be linked to the study of biogeochemical analysis of samples containing lipids as biosignatures. 
Understanding laser-induced breakdown spectroscopy (LIBS) on Mars - New approaches for quantitative elemental analysis and mineral classification Thursday, November 06 2014 - 9:21 am, PST We propose to study advanced spectral processing methodologies for the extraction of mineralogical information from complex geological materials, and to investigate the atmospheric-mineral matrix coupling effects under Martian environmental conditions. For this, we will make use of Washington University in St. Louis unique laboratory facilities which will allow us perform high-resolution multispectral analysis of standard samples and mixtures representative of Martian mineralogy. Science Processing Support for the Extended Kepler Mission Thursday, November 06 2014 - 9:19 am, PST In December 2001, Kepler became the 10th mission selected for flight by NASA’s Discovery Program, and the first such oriented to achieve goals under NASA’s Origins theme. The Kepler Mission seeks to determine the prevalence of Earth-size and larger planets orbiting solar-like stars in the solar neighborhood, and to characterize the stellar properties favoring the development of planetary systems. It achieves this goal through transit photometry by monitoring ~156,000 main-sequence stars continuously and simultaneously for at least 3 1/2 years, to detect signatures of transiting planets in the flux time series of their host stars. In April 2012, the NASA Astrophysics Senior Review recommended extending Kepler for an additional 4 years of science operations, through September 2016. This proposal seeks to support the operation and evolution of the science pipeline to support Kepler’s Extended Mission Observations of the Pluto System During the New Horizons Encounter Epoch Thursday, November 06 2014 - 9:15 am, PST Observations of the Pluto System During the New Horizons Encounter Epoch Observations and Dynamics of Ring-Moon Systems Thursday, November 06 2014 - 9:13 am, PST The Cassini data sets have provided remarkable new insights about the processes at work among the rings and small moons of Saturn. Guided by these discoveries, we will seek out and investigate related phenomena in the ring-moon systems orbiting Jupiter, Uranus, Neptune and Pluto. We will employ data from Voyager, Cassini and New Horizons, complemented by the best publicly available data from HST and from the W. M. Keck Telescope. We will apply new and powerful image analysis techniques that should enable us to obtain significant new results from old data. Pages1 | 科技 |
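The Kepler entry above turns on transit photometry: a crossing planet leaves a small periodic dip in a star's flux time series. A few lines of Python on a synthetic light curve show the idea (the period, depth and cadence are invented; this is not pipeline code):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0.0, 90.0, 0.02)                   # days, roughly 30-minute cadence
flux = 1.0 + rng.normal(scale=2e-4, size=t.size)

true_period, duration, depth = 12.3, 0.25, 8e-4  # invented planet
flux[(t % true_period) < duration] -= depth      # carve in the transits

def folded_depth(trial_period):
    """Mean dip of the points that would sit in transit at this trial period."""
    in_window = (t % trial_period) < duration
    return 1.0 - flux[in_window].mean()

trials = np.arange(1.0, 20.0, 0.01)
depths = np.array([folded_depth(p) for p in trials])
print(f"deepest folded dip at ~{trials[np.argmax(depths)]:.2f} d "
      f"(true period {true_period} d)")
```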
What Is It Like to Work With Elon Musk?
Slate | Quora | Aug. 14, 2013, 10:30 AM
By Quora Contributor
SpaceX CEO Elon Musk unveils the Falcon Heavy rocket at the National Press Club in Washington, D.C., on April 5, 2011. Photo by Nicholas Kamm/AFP/Getty Images.
This question originally appeared on Quora.
Answer by Dolly Singh, former head of talent acquisition at SpaceX:
After working for Elon for more than five years at SpaceX as the head of talent acquisition, there are many potential answers to this question. Any answer I might give will be completely colored by my own experiences, so full disclaimer: This is not an unbiased piece free of personal narrative. It is said that you cannot dream yourself a character; you must hammer and forge one yourself. If any leader and any company has done that, and continues to do that, it is SpaceX. To try and capture in words what working with Elon is like, I'd like to share some specific memories, particularly of one really rough day and its epic aftermath. On Aug. 2, 2008, eight months after I joined the company, SpaceX launched its third flight of the Falcon 1 launch vehicle. Falcon 1 was the predecessor to the Falcon 9 launch vehicle that the company flies today.
It was a defining moment for the company. Elon had a couple years prior stated in the press that his $100 million personal investment in the company would get us up to three tries, and if we couldn’t be successful by the third flight, we may have to admit defeat. In addition to the pressure created by this narrative in the press, the lobbyist armies of our competitors (the largest, most powerful defense contractors in the world) had been in overdrive in Washington, D.C., trying to undermine SpaceX and damage our credibility by painting us as too risky and inexperienced in order to protect their multibillion-dollar interests in the space-launch business.
SpaceX executed a picture-perfect flight of the first stage (the portion of the flight that gets the vehicle away from Earth's gravity and where the vehicle experiences maximum dynamic pressure, or basically where the conditions on the vehicle are physically the harshest), clearing some of the highest-risk points of the mission.
However, shortly after the first-stage flight, immediately following stage separation (when the first stage of the vehicle detaches and falls away from the second stage of the vehicle that continues its journey to space), we lost the vehicle and mission. SpaceX Vice President of Propulsion Tom Mueller, the modern-day godfather of rocket science and one of the most brilliant scientific minds on the planet, and his team had done such a great job redesigning the vehicles' engines systems that they were even more efficient and powerful than in some ways projected.
We turned off the first-stage engine and then proceeded to separate the vehicle stages; however, when the stages uncoupled, there was still a little leftover "kick," or thrust, in the first-stage engine. Our first stage literally rear-ended our second stage immediately after we had tried to separate the two sections of the vehicle. It was a devastating, emotional experience.
I stood around with the then-350 or so employees, and we cheered the vehicle on as it took off. As we were watching the mission clock and knew that the stages were about to separate, the video feed was cut. The company is on a 20-second viewing delay from the mission-control team, as we were projected to the external press feed, which is delayed in case of major mission anomalies. So when we lost video, we knew something had gone wrong in a big way.
Elon and about seven or eight of the most senior technical people at SpaceX were commanding the mission from a trailer in the back of the Hawthorne factory. We all waited anxiously for the trailer door to open and for someone to tell us something. The mood in the building hung thick with despair. You have to keep in mind that by this point SpaceX was six years old, and many people have been working 70- and 80-plus-hours a week, swimming against extremely powerful currents—like difficult barriers in technology, institution, politics, and finance—by sheer force of their blood and sweat.
They had all given so much. They were mentally and physically exhausted and really needed a win in order to replenish their spiritual wells—and give them the faith to keep following this man up a treacherous mountain that had depleted the hopes and resources of the many others who had come to conquer it.
This night would forever impact the future of the company. It had the potential to send the company into a downward spiral, from which we may not have ever recovered. A failure in leadership would have destroyed us not only from the eyes of the press or potential consumers; it would have destroyed us internally.
When Elon came out, he walked past the press and first addressed the company. Although his exact words escape me in how he started off, the essence of his comments was that:
We knew this was going to be hard. It is, after all, rocket science. He then listed the half-dozen or so countries that had failed to even successfully execute a first-stage flight and get to outer space, a feat we had accomplished successfully that day.
Elon had (in his infinite wisdom) prepared for the possibility of an issue with the flight by taking on a significant investment (from Draper Fisher Jurvetson, if I recall correctly), providing SpaceX with ample financial resources to attempt two more launches and giving us security until at least flight No. 5 if needed. Finally, that we need to pick ourselves up and dust ourselves off, because we have a lot of work to do. Then he said, with as much fortitude and ferocity as he could muster after having been awake for 20-plus hours by this point, that “For my part, I will never give up, and I mean never,” and that if we stick with him, we will win. I think most of us would have followed him into the gates of hell carrying suntan oil after that. It was the most impressive display of leadership that I have ever witnessed. Within moments, the energy of the building went from despair and defeat to a massive buzz of determination as people began to focus on moving forward instead of looking back. This shift happened collectively, across all 300-plus people in a matter of not more than five seconds. I wish I had video footage, as I would love to analyze the shifts in body language that occurred over those five seconds. It was an unbelievably powerful experience.
What happened in the days and weeks following that night is nothing short of a series of miracles:
Within a matter of hours, the SpaceX team identified the likely cause of the launch failure. Typically, turnaround time from others in the launch business can range from weeks to months for failure investigations. Our team combed through every ounce of data to make sure we understood exactly what went wrong as quickly as possible.
By Aug. 6, 2008, we announced the results of our investigation and came 100 percent clean with our supporters and customer community in order to make sure we could retain their trust in this difficult time.
In seven weeks, we had another rocket fully manufactured, integrated, and on location ready to fly again. No one else could have done this in less than six months with unlimited human and financial resources; SpaceX did it in six weeks, with less than 400 people and on a restricted financial diet. On Sept. 28, 2008, SpaceX flew its Falcon 1 launch vehicle from Kwajalein Atoll in the South Pacific and executed its first 100 percent successful launch. It became the world’s first privately built rocket to achieve Earth orbit—an accomplishment of truly epic portions and a task previously completed by only six of the mightiest nations in the history of the world. It was a much-needed and much-deserved victory for the entire SpaceX team and, as it hopefully will turn out, the future of humanity overall.
So for those who ask the question, this is in my opinion the true character of Elon Musk. Undeterred in the face of all odds, undaunted by the fear of failure, and forged in the battlefields of some of the most terrifyingly technical and capital-intensive challenges that any human being could choose to take on. Somehow he comes out alive, every time—with the other guy's head on a platter.
Working with him isn’t a comfortable experience; he is never satisfied with himself, so he is never really satisfied with anyone around him. He pushes himself harder and harder, and he pushes others around him the exact same way. The challenge is that he is a machine and the rest of us aren’t. So if you work for Elon, you have to accept the discomfort. But in that discomfort is the kind of growth you can’t get anywhere else and worth every ounce of blood and sweat.
'Marsquake' May Have Shaken Up Red Planet
February 21, 2012 11:12am ET
Scientists have found evidence of relatively recent quakes on the surface of Mars by studying boulders that fell off cliffs, leaving tracks behind.
Credit: NASA/HiRISE image
The surface of Mars appears to have been shaken by quakes relatively recently, hinting at the existence of active volcanoes and perhaps reservoirs of liquid water on the Red Planet, a new study suggests.
Using photographs snapped by NASA's Mars Reconnaissance Orbiter, researchers analyzed the tracks made by boulders that fell from a Martian cliff. The number and size of these boulders — which ranged from 6.5 to 65 feet (2 to 20 meters) in diameter — decreased over a radius of 62 miles (100 kilometers) from a point along Mars' Cerberus Fossae faults.
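That falloff, more and larger mobilized boulders near the source and fewer far away, is the kind of pattern that can be located by fitting. The sketch below invents boulder positions and grid-searches for the point they cluster around; it is an illustration of the reasoning, not the method used in the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented boulder-fall positions (km), clustered around a "true" source.
true_source = np.array([40.0, -12.0])
r = rng.exponential(scale=30.0, size=300)        # counts decay with distance
theta = rng.uniform(0.0, 2.0 * np.pi, size=300)
boulders = true_source + np.column_stack((r * np.cos(theta), r * np.sin(theta)))

def mean_distance(candidate):
    return np.linalg.norm(boulders - candidate, axis=1).mean()

# Grid-search for the point the boulder falls cluster around; a count that
# decreases smoothly away from one point suggests shaking from an epicenter.
grid = np.linspace(-100.0, 100.0, 101)
best = min(((mean_distance(np.array([x, y])), x, y) for x in grid for y in grid),
           key=lambda item: item[0])
print(f"estimated source near ({best[1]:.0f}, {best[2]:.0f}) km, "
      f"true source at ({true_source[0]:.0f}, {true_source[1]:.0f}) km")
```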
"This is consistent with the hypothesis that boulders had been mobilized by ground-shaking, and that the severity of the ground-shaking decreased away from the epicenters of marsquakes," the study's lead author Gerald Roberts, of the University of London, said in a statement.
The dirt patterns created by the toppled Martian rocks weren't consistent with how boulders would scatter if they were deposited by melting ice, researchers said. Rather, they resembled the boulder falls seen after a 2009 earthquake near L'Aquila, in central Italy.
Based on the area covered by the displaced rocks, Roberts and his team estimate the quake's magnitude at 7 — about the same as the 2010 Haitian temblor that killed perhaps 300,000 people. The marsquake may have been powered by movements of magma related to the nearby Elysium Mons volcano, the researchers added.
Because Martian winds have not yet erased the boulder tracks, the researchers further concluded that the quake occurred in the relatively recent past, within the last few percent of the Red Planet's 4.5-billion-year history.
There's even a possibility that quakes are shaking the Martian surface today, they added. If that's the case, it would be good news for scientists interested in searching for life on the Red Planet. If active volcanos still exist on Mars, their heat could melt pockets of the planet's subterranean ice, perhaps forming reservoirs of liquid water hospitable to life as we know it.
The study will be published in the March 1 edition of the Journal of Geophysical Research-Planets.
Jobs, Flash and Elitism: Why Apple Doesn't Care About the Digital Underclass
By Navneet Alang
April 29, 2010
Jobs’ reasons for barring Flash sound reasonable enough. But what does it say about Apple’s values?
As you will have likely heard by now, today Steve Jobs wrote a long post on why Apple refuses to integrate Adobe’s Flash into its mobile products.
For people who have heard the incessant chatter about Adobe and Apple’s feud, it was nice to get an explanation straight from the horse’s mouth. Jobs, in his carefully worded note, outlined six reasons Apple chooses not to implement Flash, and would rather stick with standards like HTML5, CSS and Javascript. They were as follows:
Flash is proprietary. HTML5 and other standards are open, and open is better.
Lots of web video is available in non-Flash formats now because companies are changing, mainly because of Apple.
Flash is not secure enough. This is well-documented.
Flash drains too much power, and battery life is key for mobile devices.
Flash is designed for a mouse and keyboard era of the PC, not the touch generation.
Relying on a third-party development tool means that people are dependent on Adobe for upgrades. Apple does not want another layer between apps and the user experience.
And you know, from a business perspective, all of these sound pretty level-headed to me. To Jobs and his team at Cupertino, the user experience is paramount, and Apple will do what it has to so that owners of iPhones, iPods and iPads get the best experience they can. And let’s be honest – if Apple has a specialty, it’s producing good user experiences. Fair enough, Steve.
Why Are People Going Open?
Still – simmering underneath Jobs’ diatribe against Flash was a set of values about what the internet should look like and who is using it.
After all, when Jobs points out how much video content is now available in more open formats – YouTube, Netflix, the New York Times etc. – he skirts around the issue that these companies have had to switch so that they work on Apple products. Apple may not own the mobile market, but they are the arbiters of cool in the tech world, and in order to be seen as forward-thinking and appeal to that oh-so-important Apple demographic, the New York Times and others almost had to make that move.
Jobs' claim that these companies' actions represent a shift in thinking may be true. But it's also a little like a supermodel walking into a crowded bar and, half an hour later, loudly exclaiming to her friends "see, all those ladies who say guys never come up to them are just doing something wrong". It's kinda' missing the point that, well, in the tech world, it's Apple's milkshake that brings a lot of players to the yard – and in doing so, it's Apple that gets to make decisions that affect everybody.
So Who Gets Priority In This New Open Web?
But just as importantly, there's the question of who all this 'open' content is aimed at. I mean, it's much easier for the New York Times to pay a team to rejigger their website for HTML5 than it is for some small independent website that used some free tools to do so.
And here’s the thing: both of my parents, who are immigrants from India, sometimes like to connect with the things they know by streaming music online, whether classical, Bollywood or religious. But they can’t do that on their ‘hand-me-up’ iPhone because almost all of it is in Flash – and Apple has decided that they don’t matter. To Apple, they should be content to enjoy the ‘mainstream’ things they like. When those smaller sites can catch up, fine. But until then, unless you exist firmly in the middle of society, you’re outta luck.
But so much of the Internet’s potential is that you don’t have to abide by the mainstream. Indeed, one of the great things about the web is that immigrants, outsiders, queers, geeks and loners could finally find themselves and their cultures represented on a medium that wasn’t just about ‘the norm’. And sure, you could argue that Apple is a publicly traded company and as such has only a commitment to profit and its shareholders. But when a company, by the admission of its own CEO, is attempting to steer the direction of web, there’s something a little sad about the fact that it’s the mainstream and the privileged who are being looked after.
Apple: Not Just a Business
You know, a while ago here on Techi, I argued that I would eventually buy an iPad because its ease-of-use and accessibility was the perfect thing for my 70-ish father. In a way, that still holds. But there’s just something a little off-putting about the strident, self-righteous nature with which Jobs has insisted on his vision. Apple designs some great stuff, and usability is a big deal – but to have the ‘it just works’ mentality be aimed at squarely at the mainstream means that, for a large swath of society, Apple is also saying that right now, this stuff isn’t really for you.
Nothing about this is black and white. Apple’s commitment to efficient, open web standards is commendable, and the faster all those scattered sites around the web can move to open, accessible web technologies, the better. It will be a fairer, better web when that happens. Yet, by placing the emphasis on what the web might be and not what it is right now, Apple are also saying that this new slick, open web is either for those who can afford it or who are part of the mainstream.
As a business decision, it’s not a bad one. But as a social and cultural one – well, it’s just a little disappointing.
Written by Navneet Alang
Navneet Alang is a technology-culture writer based in Toronto. You can find him on Twitter at @navalang.
For me it’s not really about a device’s ability to “run Flash,” but about the pitfalls of using something like Adobe’s Flash-based AIR as a cross-platform development environment. This inevitably leads to a lowest-common-denominator experience, as we’ve all seen before. Remember Java? And I’m sure you’ve seen the Ars Technica piece detailing the UI inconsistencies in their CS5 product, pieces of which were developed using AIR. Jobs himself says this is the most important argument, and I admire him for defending the user experience. Because Adobe isn’t going to do it.
Tim Maly
There are zero, ZERO, mobile devices that play Flash available on the market. There is no phone in existence that you could hand to your Bollywood-loving parents that would allow them to stream the music they want to hear if it’s all trapped behind Flash. People and companies are switching to non-Flash because it’s the only way to reach the mobile market. Has been for years.
Flash is a proprietary platform that requires access to a very, very expensive suite of tools to author content. HTML5, CSS, AJAX, etc are all open and editable in text editors by people with the knowledge to do so. And while the tools for non-mainstream users are lagging, consider the existence of platforms like Tumblr and Posterous or WordPress that allow anyone with no HTML knowledge to publish images and text to the world. Super-user-friendly video editing is not far behind.
Arguing that people should stick with Flash to allow the edges of society to flourish is like arguing that we should all stay on Windows because most people are already on Windows. It’s a weird argument from convenience. If you want to talk about Apple’s elitism and mainstream-focus, let’s have a conversation about the App store and the secretive rejection policy that no one understands or can predict. But on the web side? Insisting on a global open standard that will have a wide range of free tools and implementations is the most fringe-friendly Apple could be.
AceTaikula
‘There are zero, ZERO, mobile devices that play Flash available on the market.’
You need to get out more. Android phones have been running Flash for months. The user has the option, in the ‘Settings’ menu, of allowing or not allowing Flash programs to run. The same for pop-ups and the like. I can watch YouTube video on my phone.
Jobs could have given Apple customers a similar option. He didn’t. So much for his love of everything ‘open’ and his commitment to ‘the user experience.’
milo minderbinder
What is disappointing is the elitist attitude of the author thinking that his parents are entitled to an iPhone. If they need their Bollywood music so badly, they should go download it on a PC that uses Flash or go buy it on a CD.
cleawalford
I have to agree with Adam.
don’t get mad because apple decides not to support flash, they all seem like valid points to me. never really liked flash in the first place, but that is besides the point. it’s apple product and if they don’t want to support flash it is their business model not yours. it’s a gamble that they are willing to take, and i guessing not to many people will really miss it. and i for one welcome our new apple overlords…..
.mike
Welcome your new Google overlords. Apple is repeating history: fast out of the gate, then loss of market dominance to a competitor willing to sell to all the people Apple can’t be bothered with. Which is most of the people in the world.
Joe Howard
Apple is progressive towards standards it sees as beneficial to the widest possible audience. It is also early to adopt a position for progress. I remember everyone up in arms because Apple decided to ship computers without floppy drives and 56k modems. For mobile I think that adopting Flash “as is” is trying to push 12lbs of crap into a 10lb bag. Why lower the user experience trying to force it. As stated, content publishing will change, and more importantly, should change to more open source platforms and years later we can all look back and say “plain Flash video?” like we did with the floppy drive.
I definitely agree with you Joe, I remember thinking that it was crazy not to include the floppy drive in computers. This is more or less the same thing, floppy drives grew to be outdated and unnecessary – the same direction Flash is headed.
The exclusion of floppy drives forced the tech world to move forward, as will the exclusion of flash. It’s just something your parents, grandparents and anyone else is going to have to get used to if they want to keep up to date with it all.
The exclusion of floppy drives didn’t ‘force the tech world to move forward.’ Apple was emerging from a near-death experience and wasn’t selling enough computers to ‘force’ anything. It doesn’t sell enough computers today to force anything, either, though Steve Jobs obviously thinks he’s in a different position with mobile devices.
Personal computers at the time had reached an awkward stage where one could store data on 3.5 diskettes or on zip drives or on CD-Rs or on Mini-Disks, with USB storage cards on the horizon and web storage becoming a realistic option. No one really knew which devices were going to win out. Apple punted on the whole question. The candy-coloured iMacs were targeted at college students who might need their computers another 2-4 years. The machines left it up to the students to hook up whatever storage devices they found useful. The punt made news but, design-wise, it was still a punt.
Flash will pass, certainly.
So will Apple.
That was a reply to Joe Howard.
All technologies are transitional. One supports present technologies while preparing for the emerging technologies. Failing to support popular technologies entails a decision, as the writer points out, about which potential customers are worth pleasing and which are OK to alienate.
Anti-trust ethics and laws enter the picture once a venture reaches a certain size. So no, Steve Jobs does not have an automatic right to support what he wants and lock the door on everything else. Not after his enterprise starts operating at a certain scale. (See ‘Microsoft,’ subheading ‘Browsers’.)
@milo:
“What is disappointing is the elitist attitude of the author thinking that his parents are entitled to an iPhone.”
This goes into my list of “most retarded things I’ve seen/heard recently”.
Elitist attitude? Do you know what “elitist” means?
Elitist: the belief or attitude that some individuals, who supposedly form an elite — a select group of people with intellect, wealth, specialized training or experience, or other distinctive attributes — are those whose views on a matter are to be taken the most seriously or carry the most weight.
They aren’t wanting the iPhone to be restricted to their own interests; they’re wanting to be shown some consideration as people who are outside of the mainstream/elite.
On top of that, are you saying that someone’s personal taste and sentimentalities are determining factors in whether or not they deserve the honor and privilege of having an iPhone? With that “entitled to an iPhone” bit? Give me a break. Who’s the elitist here?
All that being said, I actually agree with most people here about the adaptation of open standards as opposed to Flash. But I don’t think it’s appropriate to be insulting his parents, saying that they don’t *deserve* an iPhone because they like different music, and therefore demonstrating exactly the Apple-elitist stand that the author is trying to point out. | 科技 |
Scandinavians invented ice skating in 3000 BC
The earliest skates were made of bone, usually horse or cow.
By Roger Highfield, Science Editor
1:11PM GMT 24 Dec 2007
The oldest form of human-powered transport was ice skating, which was invented in northern Europe around five thousand years ago.

Archaeological evidence suggests that the first skates, made of animal bones, date back to 3000 BC, helping people travel more widely during frozen winters in Finland and marking the start of the evolution of more sophisticated skates. Constructed of trimmed horse or cow bones, pierced at one end and strapped to the foot with leather thongs, they were not powered by the classic skating motion but used in tandem with a long stick; skaters straddled the stick and poled themselves along.

In the Biological Journal of the Linnean Society of London, Dr Frederico Formenti and Professor Alberto Minetti of Oxford University lay out the evidence supporting the idea that the birth of ice skating took place in Southern Finland, where the number of lakes within a given area is the highest in the world.

"In Central and Northern Europe, five thousand years ago people struggled to survive the severe winter conditions and it seems unlikely that ice skating developed as a hobby," says Dr Formenti. "As happened later for skis and bicycles, I am convinced that we first made ice skates in order to limit the energy required for our daily journeys."

In experiments on an ice rink by the Alps, the team measured the energy consumption of five retired professionals while skating on bones, showing they were relatively slow, reaching around 2.5 mph. However, through mathematical models and computer simulations of 240 ten-kilometre (six mile) journeys, their research shows that in winter the use of bone skates on frozen lakes – around 60,000 in Finland – would have limited the energy requirements of Finnish people by 10 per cent as they zipped about. "In order to better adapt to the severe conditions imposed by the long lasting winters, Finnish populations could benefit more than others from developing this ingenious locomotion tool."

Other research by the team shows that the energy cost of skating on ice decreased dramatically through history, as bone gave way to iron and then steel, with modern ice-skating only using 25 per cent of the effort associated with the use of bone skates. "Moreover, for the same metabolic power, nowadays skaters can achieve speeds four times higher than their ancestors could."

The researchers conclude: "Ice skates were probably the first human powered locomotion tools to take the maximum advantage from the biomechanical properties of the muscular system: even when travelling at relatively high speeds, the skating movement pattern required muscles to shorten slowly so that they could also develop a considerable amount of force."
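The comparison across skate types reads naturally as a cost-of-transport calculation. In the sketch below, only the "25 per cent of the effort" and "four times the speed" relations come from the article; the baseline speed and the journey length are illustrative assumptions:

```python
# Only the two quoted relations are taken from the article: modern skates use
# about 25 per cent of the effort of bone skates and reach roughly four times
# the speed. The baseline speed and journey length are illustrative assumptions.
bone = {"relative_effort": 1.00, "speed_kmh": 4.0}        # roughly 2.5 mph
modern = {"relative_effort": 0.25, "speed_kmh": 4.0 * 4.0}

journey_km = 10.0
for label, skate in (("bone skates", bone), ("modern skates", modern)):
    hours = journey_km / skate["speed_kmh"]
    print(f"{label:13s}: {hours:4.1f} h for {journey_km:.0f} km, "
          f"{skate['relative_effort'] * 100:.0f}% of the bone-skate effort")
```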
NAI’s Marc Groman on Setting and Keeping High Standards in Online Advertising
By themakegood ⋅ January 6, 2014
Marc Groman is President and CEO of the Network Advertising Initiative (NAI), the leading self-regulatory association exclusively focused on third-party advertising online and in mobile. As President & CEO of NAI, Marc Groman leads the organization's growth and ongoing efforts to develop and maintain high standards for Interest-Based Advertising. The Makegood recently spoke with Marc about his past two years at NAI.
The Makegood: Congratulations on making it to two years as CEO and President of the Network Advertising Initiative! Can you think back on the day you signed on? What made you decide to join NAI in the first place?
I joined the NAI because I believe deeply in the value of self-regulation, which is particularly critical in online advertising because of the rapid pace in which Internet technology is evolving. I recognized then, as I do today, that there is no better example of self-regulation than what the NAI has developed to date – a code which encompasses the three more important elements of self-regulatory success: high standards, robust enforcement and accountability for all parties involved.
The Makegood: Now that it’s been two years, how have your goals for the organization changed as you gained more experience with NAI?
My primary objectives are the same today as they were two years ago: to ensure that NAI members honor the highest standards for online advertising, and to lead a team that serves as a partner in compliance, but also as enforcers when necessary. These objectives support the high-level goals of industry self-regulation: to ensure the overall health of the online advertising ecosystem. One goal that has evolved over time is my increased focus on championing the role of responsible third parties in the policy and privacy dialog taking place across the globe today. Responsible third parties, like responsible large companies, should have an active voice in debates about consumer privacy. In addition, coming from the FTC, I have always placed a high value on consumer education. In fact, I have made it a goal to ensure the NAI is providing consumers with the information needed to make effective decisions about online privacy and online advertising. Finally, as the landscape has evolved significantly over the last two years, ensuring our self-regulatory Code of Conduct stays up to date and able to meet today’s technology challenges is a constant point of focus.
The Makegood: During your time at NAI, what have been some of your biggest challenges and how have you overcome them?
Doing everything we can to promote policies that advance the most meaningful solutions is, always, by far the greatest challenge.
The ongoing policy debate around Do-Not-Track is a strong case in point. There are many different stakeholders involved in this debate, and they all have different goals and objectives for the same policy and tools. Further, they all bring a different understanding of the core issues surrounding Do-Not-Track. These differences, combined with some very real technical challenges and the fundamental reality of how the Internet works, has made it difficult to develop a standard that works for all parties. My concern is that in the end, the focus on what’s meaningful has been “lost for the trees.”
Another big challenge and concern at the moment is around the future of the HTTP cookie. It is my personal opinion that eliminating the cookie or irrationally scapegoating it as a larger problem than it is could lead to unintended consequences for the entire ecosystem. The NAI is addressing these challenges by actively participating in debates and discussions that are taking place around these issues in order to ensure the voice of responsible parties is being fairly and accurately represented as we all work toward common solutions.
The Makegood: What would you say are some of your biggest accomplishments as CEO and President?
I consider the biggest accomplishment to be that our members are honoring the highest standards for online advertising, and that our team is helping them to do so. Complying with the NAI Code of Conduct is no easy feat, and I'm extraordinarily proud of the efforts of the NAI team and its members to uphold best practices and act as a shining example of the power of self-regulation. In addition, in 2013 we held the first annual NAI summit, which was an overwhelming success. In addition to being sold out, our keynoter, FTC Commissioner Maureen Ohlhausen, gave the NAI and its members high praise. Over the last two years, it's been clear that there is increased respect for the NAI from a wide variety of stakeholders, even those who have previously criticized the organization. We have a new website, a new consumer education resource hub, we developed a new and updated Code of Conduct and Mobile Code this year, and since I joined, we have increased membership 30 percent.
The Makegood: What do you have in mind for the next two years? How will you take NAI even further?
In the next two years, the NAI will provide guidance and expand the Code to cover rapidly evolving online advertising technology and changes in policy. We will continue to build our membership and be a strong voice for responsible parties in the online ecosystem. Finally, we will continue to refine and improve our already robust compliance program with new tools to help us monitor and enforce compliance in the mobile space.
The Makegood: Thank you, Marc
2016-40/3983/en_head.json.gz/2965 | DOE award will not affect S&T agreement
Although Missouri lost out on being awarded a federal grant to develop small modular reactors (SMRs), the decision will not affect Missouri University of Science and Technology's agreement with Westinghouse Electric Company.

The U.S. Department of Energy (DOE) announced Nov. 20 that it had awarded a five-year grant to support a new project to design, license and help commercialize SMRs in the United States to Babcock and Wilcox Company, headquartered in Charlotte, N.C. The company will work with the Tennessee Valley Authority and Bechtel, headquartered in San Francisco, Calif., on the project.

Andrew Careaga, director of communications at Missouri S&T, said the multi-year master research and collaboration agreement among the Rolla campus, the University of Missouri-Columbia campus and the UM system with Westinghouse still stands.

"Our agreement was not contingent on this particular project," Careaga said, "even though it was a big part of it."

The agreement with Westinghouse will support the development of multiple cutting-edge research projects at Missouri S&T and the other institutions that will benefit the Westinghouse SMR project.

"We're disappointed Westinghouse didn't get the award for the first round, but we're still very involved, working with Ameren and Westinghouse in pursuing the small modular reactors program in Missouri," Careaga said.

Westinghouse and Ameren Missouri were among several partnerships seeking up to more than $450 million in federal funding for the reactors through the DOE grant program.

Both Missouri U.S. senators issued statements shortly after the announcement Nov. 20.

"I'm deeply disappointed in today's announcement," said Sen. Claire McCaskill. "This project would be a tremendous opportunity for Missouri jobs and American energy security. I plan to keep working with the folks at Ameren and Westinghouse to pursue new opportunities, and to continue working across the aisle to expand innovation and strengthen security in American energy."

Sen. Roy Blunt issued this statement: "Missouri's central location, key infrastructure and universities with nationally recognized nuclear engineering programs make it the best location for this project. I'm disappointed the Obama administration did not heed my calls to strongly consider this application, and I'll continue to support Ameren and Westinghouse as they continue to pursue this important energy and economic development opportunity for our state."

As part of the grant award agreement, the DOE will invest up to half of the total project cost, with Babcock and Wilcox and its industry partners matching this investment by at least one-to-one. The specific total will be negotiated between the DOE and Babcock and Wilcox.
The DOE investment will help Babcock and Wilcox obtain Nuclear Regulatory Commission licensing and hopefully achieve commercial operations by 2022.

The project will be based in Tennessee and will support additional suppliers and operations in Indiana, Maryland, North Carolina, Ohio, Pennsylvania, and Virginia, according to the DOE.

In the same announcement of the grant award, the DOE also stated that it plans to issue a follow-on solicitation open to other companies and manufacturers, focused on furthering SMR efficiency, operations and design. Westinghouse and Ameren said they plan to pursue federal support in this second phase of the project.

According to the DOE, SMRs — which are approximately one-third the size of current nuclear power plants — have compact, scalable designs that are expected to offer a host of safety, construction and economic benefits. SMRs can also be made in factories and transported to sites where they would be ready to "plug and play" upon arrival, reducing both capital costs and construction times, according to the DOE. The smaller size also makes these reactors ideal for small electric grids and for locations that cannot support large reactors, offering utilities the flexibility to scale production as demand changes.
2016-40/3983/en_head.json.gz/2980 | TheyFly.com
Henoch Prophecies
Official Contact Report 215, Saturday, February 28, 1987, 2.09 am
© Billy Meier/FIGU 2002–2004
Quetzal: …Before I give you a clear account of the prophecies of Henoch, I would like to point out that prophecies are always changeable and can be changed for the better if man makes positive changes in his thoughts, feelings and actions, leading to that which is better and positively progressive. Prophecies always rest upon specific causes; these again result in certain effects, whereby these effects can be changed at any time if only the preceding causes are changed in their form. Therefore, it is possible that negative or evil prophecies do not have to be fulfilled if the preceding causes will be purposely changed in a manner that positivity and good develop instead of negativity and evil. However, this does not apply to predictions, as these rest upon events that cannot be changed, are inevitable and surely and definitely will occur in the future. Predictions rest upon a preview and thus on a direct viewing of the future, and have to do with neither prophecy nor with calculation of probability. So when I make a portion of Henoch's prophecies for the third millennium known to you, it does not mean that they have to be fulfilled, because the prerequisite of fulfilment in each case would be that the already existing causes continue to exist as also continue to be created in the future so that a fulfilment of the prophecies can come to pass. Thus, provided that human beings of Earth will become reasonable, the possibility exists that by a reasonable change in the way of thinking as well as a reasonable development in feeling and an equally reasonable way of acting, everything changes for the better and positive, whereby prophecies do not have to be fulfilled. However, if this transformation does not occur, a very evil, wicked and negative time lies ahead for the Earth and its entire population in the coming new millennium.
Billy: Since the Second World War, the thoughts, feelings and actions of the human being of Earth have changed much towards the positive and good, but all that achieved is not enough in my opinion, as the great transformation towards the better has not been achieved yet, neither by the mighty of this world nor all of mankind of Earth itself. In the years gone by, you have made many predictions and calculations of probability as well as mentioned prophetic facts concerning the economic, military and political situation on Earth, whereby I was requested to spread this information—which I indeed have done. Governments and newspapers, radio stations as well as TV stations and many private persons worldwide were informed by me. But the entire effort did not achieve anything, because up to now mankind has carried on in the old manner and has paid no attention whatsoever to prophecies, predictions and calculations of probability. And the same will most likely be the case in the future, when I receive permission from you in the coming time to spread the prophecies of Henoch for the third millennium. But, nevertheless, I feel that Henoch's message for the future must be made known and distributed, because somehow it may bear fruit yet.
Quetzal: You apparently never give up hope. Your optimism is honourable and deserves to be heard by human beings, but the way things have developed throughout this century there is not too much hope that human beings of Earth will come to their senses and heed your words. This will then be the case only when the prophecies prove to be true or, even worse, have already come to pass. Probably only then will the time come when the defamations against you will end in regard to your contacts with us, although they will long continue to be vehemently disputed by your enemies as well as by pathological know-it-alls and critics who dismiss them as swindle, lies and fraud. The full truth about our contacts with you will be proven in the distant future, and then mankind will accept our help we offer through you—even when they erroneously assume we come from the seven-star system known to human beings of Earth as the Pleiades. [The Plejaren claim to live in the Plejares, an altered time-space configuration about 80 light-years beyond the Pleiades. MH]
Billy: Semjase and Ptaah already explained this to me. But tell us now what the new millennium will bring to human beings of Earth and the planet Earth according to the prophecies of Henoch.
Quetzal: I will do that in a moment, but I would like to explain before beginning that I am not authorised to give exact indication of years in an official manner.
Quetzal: Thus, I will offer Henoch’s prophecies in an understandable form. So listen, then:
163. If the human being of Earth continues to live in the same way as he has done up to now—forming his thoughts and feelings in the same manner, indulging in the same actions as he has hitherto—then the words of the prophecies of Henoch could not be any clearer.
164. The point in time at which these prophecies will begin to be fulfilled will be when a Pope will no longer reside in Rome.
165. All of Europe will then fall victim to a terrible punishment by evil powers.
166. The Christian religion will collapse and the churches and monasteries will end up in ruins and ashes.
167. Monstrous forces will be created by science and will be released by the military forces and armies as well as by terrorists, causing great destruction.
168. Millions and even billions of people will be killed by acts of terrorism, by wars and civil wars; and finally, in some parts of the world, every third human being, and, in other places, every fourth human being, will lose his or her life.
169. The nations of the East will rise against the nations of the West, the West against the East.
170. Many deaths will be inflicted upon the people by fighter and bomber aircraft, and bombs and rockets will destroy and annihilate smaller and larger villages and cities.
171. The people will be completely powerless against all this and will live through 888 days of Hell on Earth, suffering hunger and plagues which will claim even more lives than the war itself.
172. The time will be severe as never before experienced on Earth. Ultimately, nothing can be bought or sold any longer.
173. All provisions will be rationed; and if a human being steals even a small piece of bread, he/she will have to pay for it with his/her life.
174. Many waters will mix with human blood and turn red, as once in the past the Nile in Egypt turned red with blood.
175. And it will be that the fanatics of Islam will rise up against the countries of Europe and all will shake and quiver.
176. Everything in the West will be destroyed; England will be conquered and thrown down to the lowest level of misery.
177. And the fanatics and warriors of Islam will retain their power for a long time.
178. However, not only Europe will be affected but ultimately all the countries and peoples of the Earth, as the great horror expands to a war that will encompass the entire world.
179. After the turn of the millennium, the papacy will exist only a short period.
180. Pope John Paul II is the third from last in this position.
181. After him, only one additional pontificate will follow.
182. Then a Pontifex Maximus follows who will be known as Petrus Romanus.
183. Under his religious rule, the end of the Catholic Church will come, a total collapse becoming inevitable.
184. That will be the beginning of the worst catastrophe that will ever have befallen the human beings and the Earth.
185. Many Catholic clerics, priests, bishops, cardinals and many others will be killed and their blood will flow in streams.
186. But also the reformed version of Christianity will become just as infinitely small, as does Catholicism.
Horrifying weapons and a possible world war
187. Due to the fault of scientists, enormous power will be seized by the power-hungry and their military, their warriors and terrorists, and power will be seized as well through laser weapons of many types, but also via atomic, chemical and biological weapons.
188. Also concerning genetic technology, enormous misuse will occur, because this will be unrestrainedly exploited for the purposes of war, not lastly due to the cloning of human beings for warring purposes, as this was practised in ancient times with the descendants of Henoch in the regions of Sirius.
189. However, this will not be all of the horrors; as besides the genetic technology and the chemical weapons, far worse and more dangerous and more deadly weapons of mass destruction will be produced and will be used.
190. The irresponsible politicians will unscrupulously exercise their power, assisted by scientists and obedient military forces serving them, who together hold a deadly sceptre and will create clone-like beings which will be bred in a total lack of conscience and will be scientifically manipulated to become killer machines. Division by division and devoid of any feelings, they will destroy, murder and annihilate everything.
191. The USA will set out against the Eastern countries ahead of all other financial states and simultaneously it will have to defend itself against the Eastern intruders.
192. In all, America will play the most decisive role, when in the guise to strive for peace and to fight against terrorism it invades many countries of the Earth, bombs and destroys everything and brings thousandfold deaths to the populations.
193. The military politics of the USA will likewise know no limits, as neither will their economic and other political institutions which will be focused on building and operating a world police force, as it is the case already for a long time [sic].
194. But that will not be enough, and, in the guise of a so-called peaceful globalisation, American politics will aspire to gain absolute control of the world concerning supremacy in economy.
195. And this will point towards the possibility that a Third World War could develop from it, if human beings as a whole will not finally reflect upon reason, become reasonable and undertake the necessary steps against the insane machinations of their governments and military powers as well as their secret services, and call a halt to the power of the irresponsible who have forsaken their responsibility in all areas.
196. If this does not happen, many small and great nations will lose their independence and their cultural identity and will be beaten down, because the USA will gain predominance over them and with evil force bring them down under its rule.
197. At first, many countries will howl with the wolves of the US, partially due to fear of American aggressions and sanctions, as will be the case with many, many irresponsible [ones] in Switzerland and Germany but also of other countries. In part, others will join in because they will be forced somehow to do so or will be misled by irresponsible promoters of American propaganda.
198. Finally, many Asian, African and European states will rise up against the American hegemony, once they recognise that the United States of America is only taking advantage of them for purposes of war, conquest and exploitation.
199. In this way, many countries will become puppet states of America before reason and realisation will emerge in the responsible ones of governments and in many of the population, resulting in a turning away from the USA.
200. However, the great war will hardly be avoidable because the human beings of Earth will probably not accept the directions towards the better, therewith towards true love, true freedom and real peace, striving instead only towards wealth, pleasure and riches and for all manner of material values and unrestricted power.
201. Thus, huge and deadly formations of tanks will roll across the countries while fighter planes and rockets sweep through the air and bring death, ruin, destruction and annihilation to countries and people.
202. And the bad time will come, during which also the militaries and armies will cause out-of-control, great, wild and evil devastations and unimaginable massacres of the people, who will be forced to watch the wild activities powerlessly.
203. Your and our warnings will be cast into the wind by the majority of Earth humanity, even though quite a bit could still be saved and the very worst could be prevented if your and our words of love, peace, freedom, and wisdom would be acted upon.
204. But even that, unfortunately, is doubtful.
205. If the Third World War will actually happen—as calculations and observations appear to indicate to be probable now and also during the approaching few decades—then, as now, the civilian population will above all have to bear the brunt of the enormous suffering in tremendous numbers in this entire catastrophe and, last but not least, the fault of the irresponsible scientists who by cloning will create human machines for military purposes, devoid of conscience and feelings, and will create immensely deadly and all-annihilating computer-like weapons.
206. At the same time, the danger could become reality that the human combat machines, the military clones, will gain their independence and under their own management will bring death, devastation, destruction and annihilation to the human beings of Earth and to the planet.
207. The entire planet will become an arena of unparalleled suffering, which will never have existed before on Earth up to that time.
208. The cruel happenings will last about 888 days and cause civilisation to collapse.
209. Yet, the terrible scenario will continue, and epidemics and various diseases as well as enormous famine will be spread among the people, while the economy of the world will totally collapse and there will be no possibility to produce any goods.
210. …all foods and medications … will be rationed.
211. The insanity of war will extend not only across the land, but the disaster will equally be spread to the oceans, into the atmosphere, even into outer space.
212. But there will also be settlements under the ocean that will be developed in the course of the future and these will be attacked and destroyed, claiming the lives of many thousands of people.
213. However, a certain maelstrom of destruction will also originate from the undersea facilities; because in the cities at the bottom of the ocean, groups of submarine pirates will be formed which will burst upwards from the depths of the ocean and will become involved in destructive actions of combat with naval units on the surface.
214. And at this time, the possibility could become reality that extraterrestrial forces intervene against the Western industrialised countries, because these will be responsible for the extreme and enormous disaster of the coming evil times.
215. These extraterrestrial forces will give up their anonymity and their state of secrecy and will assist those who are being terrorised by the irresponsibly acting Western countries, should this possibility become reality.
216. In addition, apocalyptic natural catastrophes will occur which will cause all of Europe to shake and tremble; but Europe will continue to exist, even after having suffered enormous destruction.
Destruction in North America
217. Far in the West, it will be different; the United States of America will be a country of total destruction.
218. The cause for this will be manifold.
219. With its global conflicts which are continuously instigated by it and which will continue far into the future, America is creating enormous hatred against itself, worldwide, in many countries.
220. As a result, America will experience enormous catastrophes which will reach proportions barely imaginable to people of Earth.
221. The destruction of the WTC, i.e., the World Trade Center, by terrorists will only be the beginning.
222. Yet all the apocalyptic events will not only be brought about due to the use of unbelievably deadly and destructive weapons—such as chemical, laser and others—and by cloned murder machines; but in addition to this, the Earth and nature, maltreated to the deepest depths by the irresponsible human beings of Earth, will rise up and cause destruction and bring death onto the Earth.
223. Enormous firestorms and gigantic hurricanes will sweep over the USA and bring devastation, destruction and annihilation, as this from time immemorial never before will have happened [sic].
224. Not only will America, but also all other Western industrial countries which still live at the beginning of the new millennium in the delusion that they could dominate and rule over underdeveloped nations, i.e., Third World countries, will not only soon lose influence over these but must defend themselves against them.
225. According to the prophecies of Henoch, the truth about industrialised countries is that they only seem to appear to be true civilisations, but in fact they are not; because more and more, at the end of the 20th century and at the beginning of the third millennium, they will disregard all true love, true freedom and true wisdom as well as true peace along with all values of humaneness and all values of men's and women's true being.
226. But not even all the terrible happenings will hinder the USA in continuing to proceed with its actions against all countries.
227. Even when the North American continent will be stricken by the most terrible catastrophe which has ever been recorded, evil military powers will wreak havoc with computerised and nuclear, biological and chemical weapons, whereby it will also happen that computerised weapons become independent and cannot be controlled any longer by human beings.
228. Overall, this is the most important part of Henoch's prophecies.
Epidemics, conflicts and disasters
Billy: There is still more to … it. At least that is what you told me.
Quetzal: 229. You are untiring; so I will point out a few more important facts of the prophecies.
230. As of now, new epidemics have spread among the people of Earth; however, as Henoch prophesied, quite a number of further epidemics will follow.
231. Not only AIDS will occur worldwide in the 1990s, but also epidemics such as the so-called "mad cow disease", i.e., BSE, out of which different strains of Creutzfeldt–Jakob syndrome will develop, lasting well into the new millennium.
232. Also, an epidemic known as Ebola will cause many deaths, as well as other unknown epidemics and diseases which will sporadically arise in epidemic proportions and will be new to the human being, causing great concern.
233. However, most of the evil will be brought about by politics.
234. France and Spain become involved with each other in armed conflicts, and even before World War Three will have broken out.
235. Yet France will not only engage in armed conflicts with Spain, for within her great unrests will arise, leading to upheavals and civil war, as [will be] the case in Russia and Sweden.
236. Especially in France and Sweden, machinations as well as dictatorial regulations of the European Union will cause much unrest and many uprisings; but also crimes committed by gangs and organised criminal elements in these countries will cause unavoidable civil wars.
237. In addition, significant tensions will arise between the native citizens and immigrants from foreign countries, who as a rule also observe religious beliefs different from those of the native populace.
238. And in the end, this will lead to severe conflicts.
239. Hatred against strangers, foreigners and people of different religious beliefs will be the order of the day, as well as the rise of neo-Nazism, terrorism and right-extremism.
240. Conditions similar to civil war will be in England, Wales and Northern Ireland and claim many lives.
241. The Soviet Union will be dissolved in this decade or at the latest by the beginning of the next.
242. The man decisive for this action will be Mikhail Gorbachev.
243. But this will not lead to rest, because the new Russia will continue its longstanding conflict with China over Inner Mongolia, with the result that Russia will lose a portion of this territory to China.
244. And China becomes dangerous, especially to India, as also at this time China maintains uneasy relations with her.
245. China will attack India; and if biological weapons are used, around 30 million human beings will be killed in the area of and around New Delhi alone.
246. However, this will not be the end yet—because the effect of biological bombs and missiles, etc., used cannot be controlled at that time, and terrible epidemics unknown up to that point in time will arise and will spread quickly to many areas.
247. Also Pakistan will allow herself to be misled to instigate a war against India, which will be especially dangerous in view of the fact that both countries are developing atomic weapons.
Wars and devastation in Europe and North America
248. Yet Russia will not rest and will attack Scandinavia, and in doing so will embroil all of Europe. And months before that, a terrible tornado will have swept across northern Europe, causing great devastation and destruction.
249. It must still be stated that the Russian attack will occur during the summer, in fact, starting from Arkhangelsk. Denmark will not be dragged into the war, due to the insignificance of this country.
250. Yet Russia will not be satisfied with this action of war, as her will for expansion will be ravenous.
251. And consequently Russia will launch a military attack against Iran and Turkey and will conquer these two countries in bloody fighting, causing enormous destruction.
252. In the Russian expansionist mentality will also be included the drive to gain control of the Middle Eastern oil deposits as well as to gain control of the southeastern region of Europe. Therefore, she will also invade the Balkans and conquer these countries there in enormous battles, causing ruthless and devastating destruction with many deaths.
253. This will be at the time that tremendous natural disasters will hit Italy and its people, causing severe hardship.
254. But this will also be at the time when Vesuvius could become active again and could spread tremendous havoc.
255. At the same time, a war will shake Italy and claim many human lives as well as cause great destruction.
256. Destruction of war will descend on the northern countries as strong military forces will invade them from the East and will pillage and murder as well as use bombs and missiles, like hail coming down, and hitherto unknown weapons of laser- and computer-controlled types which will destroy and annihilate everything, whereby the first target will be Hungary and after that will follow Austria and northern Italy.
257. Switzerland will also be severely affected, but will not be the actual target; this will be France and Spain.
258. However, the main objective of the aggressors will be to bring all of Europe under their military control, and for that purpose France will be selected to be the headquarters. France will not only be invaded by the aggressors from the outside, but will also be conquered from within as a result of collaborative forces and other forces.
259. This can be envisioned as being the many foreigners of a different religion living in France at that time, and specifically Islam, which will be this force working from within.
260. Once France has fallen, a war to conquer Spain and England will take place. Subsequently, an alliance with the forces of the aggressors will be formed, which will invade Scandinavia.
261. For all these French-based military operations, the weapons of mass destruction stored in the arsenals of France will be used and cause evil devastation, destruction and annihilation.
262. The aggressors from the East will force the French Army to join their military forces and lead a war of conquest against the northern countries of Europe, invading and conquering Sweden and Norway. Subsequently, these northern countries will be annexed by Russia.
263. Military forces will also attack Finland, whereby many will be killed and an enormous destruction will be caused, after which the complete dissolution of the country will occur and the Russian forces will settle in it for a very long time.
264. This, then, will also be the time, during which the German citizenship will even fight in a civil war and, at the same time, attack a large army from East Germany.
265. But the country and its population will be able to free themselves from the yoke of the aggressors again; it’s just that they will still fall into the hands of the invading military forces.
266. After a certain time, then, the nations will rise up against the aggressors and intruders; therefore, a European-wide liberation struggle will break out.
267. At the same time, as a civil war rages in Germany, an enormously bloody revolution will break out in England which will claim more lives than will be claimed by the civil war in Germany.
268. And because England and Ireland have been at war for a long time already, due to the IRA and the police and military forces of England, the result will be (because this feud will continue up to that time) that this revolution will spread out to all of Ireland, especially affecting Northern Ireland.
269. Many lives will be lost during a civil war in Wales, where differences between various parties will arise before the Third World War.
270. Welsh and English forces will clash especially near Cymru, and claim many lives and cause great destruction.
271. But death, destruction and annihilation will not only rage in Europe but also in America, where much suffering will have to be endured and many deaths as well as destruction and annihilation will be.
272. America and Russia will have the most terrible weapons of mass destruction at their disposal—a fact which is already the case to a certain extent today—and will clash with violent force against each other at that time of conflict, whereby Canada will also be dragged into this conflict.
273. The source of this conflict will substantiate the Russian attack on the American State of Alaska and against Canada.
274. This conflict will result in mass killings of human beings as well as devastating destruction, annihilation and epidemics, etc., which mankind of Earth will never have seen and experienced up to that time.
275. Not only nuclear, biological and chemical weapons will be used en masse, but also enormously deadly systems of computer-controlled weapons that are only in the beginning stages of development today, or will be invented and constructed during the third millennium.
Worldwide natural catastrophes
276. As already mentioned, enormous natural catastrophes and rolling walls of fire and violent hurricanes will rage all across America, while, in addition, all the terrible effects of war will bring thousandfold deaths, destruction and annihilation.
277. America’s largest cities will be absolutely destroyed, and firestorms will cause great disaster and misery.
278. Severe earthquakes and volcanic eruptions will also belong to that time, and these will cause much suffering and misery and deaths besides enormous destruction and devastation, as all of nature and the planet itself will rise up against the insanity of human beings on Earth.
279. However, tornadoes, earthquakes and volcanic eruptions will not only rage in America, but also in Europe and in the rest of the world.
280. These activities have already begun at the present time, also during the past decades—with the exception that they will become increasingly more devastating in the future.
281. And man of Earth is guilty for the most part today, as also in the future it is man who will destroy the entire environment—all of nature, the atmosphere, water and all the resources of the planet.
282. And through this, a shifting of weight inside the Earth takes place, caused for example by the creation of gigantic lakes by damming and by creating hollow caverns due to the exploitation of petroleum and gas, etc.
283. And thereby unnatural inner-Earth movements are created, which also lead to unnatural tectonic effects and cause earthquakes and volcanic eruptions, which also in turn cause enormous climatic changes, resulting in horrendous tornadoes of devastating proportions which in the end will set their destructive energies free on the entire world.
284. All of this will lead to increasingly horrible floods and unusually massive snowfalls which will advance to the southern countries and finally even to the equatorial regions, because through the insanity of human beings the Earth has begun unnoticeably to spin [strangely] as a consequence of atomic explosions inside and on the surface of the Earth.
285. And this will be the reason that the planet will slowly but surely enter an extraordinary spinning orbit around the Sun, while the first phase is already occurring, which causes a change in climate, leading to a new ice age.
Civil wars and anarchy in America
286. Yet the misery on Earth will continue, as two terrible civil wars will break out in America, whereby one will follow the other.
287. Afterwards, the United States of America will break apart and deadly hostility will prevail among her, which then leads to the division into five different territories; and it cannot be prevented that sectarian fanatics will play a dictatorial role.
288. Anarchy will be the worldwide condition that will prevail and torment human beings over a long period of time, as human beings will also be tormented by the many epidemics and diseases, many of them new and unknown to human beings and for this reason incurable.
289. Due to this fact, the bodies of many human beings will slowly and miserably decay, while unbearable pain will also occur as well as blindness and terrible respiratory problems that lead to suffocation.
290. The consciousness of many human beings will become impaired and succumb to feeblemindedness and insanity.
291. And all these gruesome occurrences will be traceable to biological and chemical weapons, which are the cause of not fast, but gruesome and slow deaths; and this will also occur due to the use of ray and frequency weapons which are already being developed today.
292. Finally, the words of Henoch may be specifically mentioned, which include that mankind of Earth, in pursuit of technology for mass destruction and greed for power, hatred, vengeance and riches, will ignore all values of Creation and will trample upon all values of love, wisdom, freedom and peace, as ancestors of the Henoch lineage have done before, to plunge the world into screaming misery, death, destruction and annihilation and into the most severe catastrophes mankind of Earth will ever have experienced.
Download PDF of Henoch Prophecies by Chris Lock here.
Editor's Note: The information here is just a fraction of that given to Billy Meier.
The Henoch Prophecies are copyright © 2002–2004 Billy Meier/FIGU (Freie Interessengemeinschaft für Grenz und Geisteswissenschaften und Ufologiestudien [Free Community of Interests in Fringe and Spiritual Sciences and Ufological Studies), Semjase Silver Star Center, CH-8495 Schmidrüti ZH, Switzerland. For more information, visit FIGU.
The complete Henoch Prophecies, as well as more predictions from the 251st Contact, can be found in Guido Moosbrugger's 2004 book, And Still They Fly! (the second edition of And Yet They Fly!, 2001). To obtain a copy of the book "And Still They Fly!" by Guido Moosbrugger visit FIGU Shop or to purchase the eBook visit They Fly! Online Store.
2016-40/3983/en_head.json.gz/3005 | Intersil Announces Resignation of Dave Bell as President & CEO
MILPITAS, Calif. --(Business Wire)-- Intersil Corporation (NASDAQ Global Select: ISIL), a world leader in the design and manufacture of high-performance analog and mixed-signal semiconductors, today announced that its Board of Directors has appointed Board member James Diller as interim President and Chief Executive Officer following the resignation of Dave Bell as President and Chief Executive Officer and a Director of the Company, effective immediately.
As interim President and Chief Executive Officer, Mr. Diller will work closely with Intersil's current executive team and Board of Directors to oversee the Company's ongoing operations and strategic initiatives. The Board has formed a search committee chaired by Board member Donald Macleod to consider candidates for the permanent President and Chief Executive Officer role. The Board will engage a leading executive search firm to assist with the process.
Intersil also announced that Donald Macleod has been named Chairman of the Board, effective immediately. Gary Gist, the Company's former Chairman, has stepped down from that position and will remain a member of the Board of Directors.
"On behalf of the Board of Directors, I want to thank Dave for his many contributions to Intersil," said Donald Macleod. "Since joining the Company, he has successfully led the business through a challenging economic environment, while ensuring the Company is well-positioned to capitalize on its analog heritage to grow the business across our core Industrial & Infrastructure, Consumer, and Personal Computing market segments. Going forward, it is imperative that Intersil continues to aggressively focus on driving our top growth initiatives while adapting to changing market conditions."
Mr. Diller has been a member of the Company's Board of Directors since May 2002. He has significant expertise in the semiconductor industry, having served as Chief Executive Officer of two public companies, including Elantec Semiconductor, Inc. and PMC-Sierra, Inc., of which he was a founder. Additionally, he has served on the boards of directors of four public companies in the sector, and is currently Chairman of the Board of Directors of Avago Technologies Limited. He holds a Bachelor of Science in Physics from the University of Rhode Island.
A member of Intersil's Board of Directors since September 2012, Mr. Macleod brings to his new role as Chairman of the Board more than 30 years of industry experience in both the United States and Europe and proven business and financial acumen. Notably, Mr. Macleod served as Chairman, President and Chief Executive Officer of National Semiconductor Corporation from 2009 until its acquisition by Texas Instruments in September 2011. Previously, he served in a variety of executive positions at National Semiconductor Corporation, including Chief Operating Officer and Chief Financial Officer. Mr. Macleod also currently serves on the Board of Directors of Avago Technologies. He holds both a Bachelor of Science in Economics and an honorary Doctor of the University from the University of Stirling in Scotland. He is a member of the Institute of Chartered Accountants of Scotland.
About Intersil
Intersil Corporation is a leader in the design and manufacture of high-performance analog, mixed-signal and power management semiconductors. The Company's products address some of the fastest growing markets within the industrial and infrastructure, personal computing and high-end consumer markets. For more information about Intersil or to find out how to become a member of its winning team, visit our website and career page at www.intersil.com.
Intersil Corporation press releases and other related comments may contain forward-looking statements as defined in Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934, in connection with the Private Securities Litigation Reform Act of 1995. Such forward-looking statements are based upon Intersil's management's current expectations, estimates, beliefs, assumptions and projections about Intersil's business and industry. Words such as "anticipates," "expects," "intends," "plans," "predicts," "believes," "seeks," "estimates," "may," "will," "should," "would," "potential," "continue," "goals," "targets" and variations of these words (or negatives of these words) or similar expressions, are intended to identify forward-looking statements. In addition, any statements that refer to projections or other characterizations of future events or circumstances, including any underlying assumptions, are forward-looking statements. These forward-looking statements are not guarantees of future performance and are subject to certain risks, uncertainties and assumptions that are difficult to predict. Therefore, the Company's actual results could differ materially and adversely from those expressed in any forward-looking statements as a result of various risk factors. Intersil does not adopt and is not responsible for any forward-looking statements and projections made by others in this press release. Intersil's Annual Report on Form 10-K, subsequent Quarterly Reports on Form 10-Q, recent Current Reports on Form 8-K and other Intersil filings with the U.S. Securities and Exchange Commission (which may be obtained for free at the SEC's (News - Alert) web site at http://www.sec.gov) discuss some of the important risk factors that may affect Intersil's business, results of operations and financial condition. These forward-looking statements are made only as of the date of this communication and Intersil undertakes no obligation to update or revise these forward-looking statements. | 科技 |
2016-40/3983/en_head.json.gz/3103 | News China Telecom Split in Two Entities - 2001-12-11 October 30, 2009 12:37 AM
China is breaking up its dominant telephone company, China Telecom, into two smaller entities to increase competition. The announcement comes on China's first day as a member of the World Trade Organization.

The official Xinhua News Agency says the State Council, China's cabinet, approved plans to break the near-monopoly held by fixed-line phone company China Telecom. Xinhua said Tuesday that rival northern and southern companies will be set up to increase competition. The move comes on the day China formally joins the World Trade Organization. It is part of a broad effort to liberalize one of the country's most promising sectors before foreign competitors are allowed in.

Foreign Ministry Spokeswoman Zhang Qiyue says at a news conference Tuesday that "China's WTO entry is an inevitable result of China's economic opening and modernization." She also says "it provides a good opportunity for future market reforms." Ms. Zhang says that "at the same time, China faces a very serious and difficult task as it carries out its WTO obligations."

As a condition of its WTO membership, Beijing agreed to allow foreign operators to take up to a 50 percent share in some types of telecommunications companies two years from now. Fixed-line phone services will open more gradually to foreign competition, with foreign companies allowed a 49 percent share after six years.

Xinhua says that one of the two companies created by the break-up of China Telecom will merge with two other companies and will operate in the north. That company will be called China Netcom. The other company keeps the name China Telecom and will operate in the south.
2016-40/3983/en_head.json.gz/3204 | Study: 1 in 3 W.Va. homes does not have a computer
Eric Eyre
CHARLESTON, W.Va. -- West Virginia's push to expand high-speed Internet might be more complicated than making broadband service available by stringing copper wire or fiber on poles to people's homes.

A new federal study shows slightly more than 35 percent of West Virginia households don't own a computer -- the second-lowest ranking of any state in the survey.

The low computer ownership numbers help explain why many West Virginians don't sign up for high-speed Internet service, even where it's available.

The study -- called "Exploring the Digital Nation" -- shows that 59 percent of West Virginia households subscribe to high-speed Internet. That's the eighth-lowest Internet adoption rate among the 50 states, although West Virginia's ranking has improved from past years.
"The report is clearly, in my opinion, a report on age groups and their habits as much as it is on the subject of adoption rates," said Lee Fisher, who serves on the West Virginia Broadband Deployment Council. "So in those states where an aging population, like in West Virginia, is an issue, I don't believe you will ever have the adoption rates that people seem to shoot for until the demographic changes."Nationally, 70 percent of homes are hooked up to the Internet."Even with our improved 'take rate' up in the 60-percent range, we are still way behind most of the country," said Dan O'Hanlon, chairman of the Broadband Deployment CouncilThe study by the National Telecommunications and Information Administration cited several reasons why people don't sign up for Internet service: lack of interest; it's too expensive; and they don't have a computer.
Mississippi had 35.5 percent of homes without computers, the lowest ownership rate in the nation, followed by West Virginia's at 35.4 percent. By contrast, 85 percent of homes have computers in Washington state, the highest percentage in the nation, according to the study.To increase computer ownership, O'Hanlon suggested the state work with nonprofit groups, such as Mission West Virginia, that provide refurbished computers to homes that don't have them."The report actually shows us there are things the Broadband Council can do to raise our rate of broadband use in West Virginia," he said.Frontier Communications, West Virginia's largest broadband provider, has spent tens of millions of dollars in recent years to make high-speed Internet available across the state.
West Virginia also received a $126.3 million grant to extend high-speed fiber to public facilities, though homes and business haven't been included in the project.The Broadband Deployment Council distributed $2 million last year for projects designed to bring wireless Internet to homes in rural communities.The council turned down a handful of "demand promotion" projects intended to increase the number of people who subscribe to high-speed Internet.
At the time, state law required the Broadband Deployment Council to award money for such projects in remote areas without Internet service. Council members said it didn't make sense to spur people to sign up for broadband if the service wasn't available.State lawmakers have since passed a bill that allows the broadband council to distribute money for projects that increase the demand for broadband anywhere in West Virginia.Fisher said state leaders must do more to promote the use of broadband technology "as an economic development tool.""Until West Virginia finds this person or group of persons to not only talk about it every day and fund it every year, I don't think West Virginia will ever be at the top of any of these surveys," said Fisher.The federal report, which examined U.S. Census data, also details how and why people go online.The report found that 34 percent of Internet users searched for jobs, while 35 percent shopped for health insurance plans. About a third of Internet users ages 25 to 44 went online for news, compared to 8 percent of users 65 and older.
The report found almost all people who used the Internet at home did so with a high-speed broadband connection. In West Virginia, 3 percent of Internet users still had dial-up connections through phones -- the third-highest rate among the 50 states, according to the study."The data show that Americans depend on the Internet use to engage in a wide range of activities," said Lawrence Strickling, U.S. assistant secretary of communications, in a release. "It underscores the need for us to continue our efforts to ensure all Americans have access to broadband."Reach Eric Eyre at ericeyre@wvgazette.com or 304-348-4869.
2016-40/3983/en_head.json.gz/3211 | (0) Amazon Kindle Faces Unexpected Demand – Company.
Amazon Apologizes for Low Availability of Kindle
[03/20/2008 01:48 PM] by Anton Shilov

Despite skeptical expectations across the industry, the Amazon Kindle, a device that lets people read electronic books and texts, is facing unexpected demand that Amazon cannot meet. As a result of the unprecedented demand, the company's chief executive had to apologize on the front page of the world's largest online store.

"We had high hopes for Kindle before its launch, but we didn't expect the demand that actually materialized. We sold out in the first 5.5 hours and have been scrambling to increase our manufacturing capacity ever since. We've been shipping on a first-come, first-served basis, but many customers have to wait as long as six weeks after ordering. We hope to be able to announce to you within the next few weeks that we're back in stock and that when you order a Kindle, we'll ship it to you that very same day," Jeff Bezos, the founder and chief executive officer of Amazon, said in his open letter to customers.

Many industry observers as well as high-ranking executives had said that the Amazon Kindle electronic book reader would not become successful due to various limitations. Moreover, Steve Jobs, chief executive of Apple, the company best known for its portable digital players that play back music and videos as well as for the iTunes media store, said that people do not read anymore at all.

"It doesn't matter how good or bad the product is, the fact is that people don't read anymore. 40% of the people in the U.S. read one book or less last year. The whole conception is flawed at the top because people don't read anymore," said Steve Jobs in an interview, without revealing the sources of his information regarding people's unwillingness to read.

However, both the observers and Mr. Jobs appear to have been wrong: the Amazon Kindle is in demand, which means that numerous customers are still interested in reading.

The Amazon Kindle is a portable device that measures 7.5" x 5.3" x 0.7" (19.05cm x 13.4cm x 1.77cm; height x width x thickness), weighs 10.3 ounces (292 grams) and features a black-and-white 6" screen with 600x800 resolution. The device features EV-DO connectivity to download content anywhere in the USA; however, it is unclear whether users in Europe will get GPRS/EDGE/HSDPA technologies. The amount of memory inside the Kindle is unknown, but the device also supports Secure Digital (SD) cards, which allow it to store a considerable amount of content. Battery life is unspecified, but considering the use of flash memory and an electronic paper screen, it is highly likely that the device can work for a long time.

Kindle customers can wirelessly shop the Kindle Store and download or receive new content – all without a PC, Wi-Fi hot spot, or syncing. However, it is impossible to browse the Internet using the device. More than 90,000 books are now available in the Kindle Store, including best sellers and new releases, which are $9.99 unless marked otherwise. The Kindle is available for $399 at Amazon.com.
2016-40/3983/en_head.json.gz/3296 | Nautical chart
A 1976 United States NOAA chart of part of Puerto Rico
A nautical chart of the Warnemünde harbor shown on OpenSeaMap
A nautical chart is a graphic representation of a maritime area and adjacent coastal regions. Depending on the scale of the chart, it may show depths of water and heights of land (topographic map), natural features of the seabed, details of the coastline, navigational hazards, locations of natural and human-made aids to navigation, information on tides and currents, local details of the Earth's magnetic field, and human-made structures such as harbours, buildings and bridges. Nautical charts are essential tools for marine navigation; many countries require vessels, especially commercial ships, to carry them. Nautical charting may take the form of charts printed on paper or computerized electronic navigational charts. Recent technologies have made available paper charts which are printed "on demand" with cartographic data that has been downloaded to the commercial printing company as recently as the night before printing. With each daily download, critical data such as Local Notice to Mariners is added to the on-demand chart files so that these charts will be up to date at the time of printing.
Sources and publication of nautical charts
Nautical charts are based on hydrographic surveys. As surveying is laborious and time-consuming, hydrographic data for many areas of sea may be dated and not always reliable. Depths are measured in a variety of ways. Historically the sounding line was used. In modern times, echo sounding is used for measuring the seabed in the open sea. When measuring the safe depth of water over an entire obstruction, such as a shipwreck, the minimum depth is checked by sweeping the area with a length of horizontal wire. This ensures that difficult to find projections, such as masts, do not present a danger to vessels navigating over the obstruction.
Nautical charts are issued under the authority of national hydrographic offices in many countries. These charts are considered "official" in contrast to those made by commercial publishers. Many hydrographic offices provide regular, sometimes weekly, manual updates of their charts through their sales agents. Individual hydrographic offices produce national chart series and international chart series. Coordinated by the International Hydrographic Organization, the international chart series is a worldwide system of charts ("INT" chart series), which is being developed with the goal of unifying as many chart systems as possible.
There are also commercially published charts, some of which may carry additional information of particular interest, e.g. for yacht skippers.
Chart correction
The nature of a waterway depicted by a chart may change, and artificial aids to navigation may be altered at short notice. Therefore, old or uncorrected charts should never be used for navigation. Every producer of nautical charts also provides a system to inform mariners of changes that affect the chart. In the United States, chart corrections and notifications of new editions are provided by various governmental agencies by way of Notice to Mariners, Local Notice to Mariners, Summary of Corrections, and Broadcast Notice to Mariners. In the U.S., NOAA also has a printing partner who prints the "POD" (print on demand) NOAA charts, and they contain the very latest corrections and notifications at the time of printing. To give notice to mariners, radio broadcasts provide advance notice of urgent corrections.
A good way to keep track of corrections is with a Chart and Publication Correction Record Card system. Using this system, the navigator does not immediately update every chart in the portfolio when a new Notice to Mariners arrives, instead creating a card for every chart and noting the correction on this card. When the time comes to use the chart, he pulls the chart and chart's card, and makes the indicated corrections on the chart. This system ensures that every chart is properly corrected prior to use. A prudent mariner should obtain a new chart if he has not kept track of corrections and his chart is more than several months old.
Various Digital Notices to Mariners systems are available on the market such as Digitrace, Voyager, or ChartCo, to correct British Admiralty charts as well as NOAA charts. These systems provide only vessel relevant corrections via e-mail or web downloads, reducing the time needed to sort out corrections for each chart. Tracings to assist corrections are provided at the same time.
The Canadian Coast Guard produces the Notice to Mariners publication which informs mariners of important navigational safety matters affecting Canadian Waters. This electronic publication is published on a monthly basis and can be downloaded from the Notices to Mariners (NOTMAR) Web site. The information in the Notice to Mariners is formatted to simplify the correction of paper charts and navigational publications.
Various methods exist for the correction of electronic navigational charts.
Limitations
In 1973 the cargo ship MV Muirfield (a merchant vessel named after Muirfield, Scotland) struck an unknown object in waters charted at a depth of greater than 5,000 metres (16,404 ft), resulting in extensive damage to her keel.[1] In 1983, HMAS Moresby, a Royal Australian Navy survey ship, surveyed the area where Muirfield was damaged, and charted in detail a previously unsuspected hazard to navigation, the Muirfield Seamount.
The dramatic accidental discovery of the Muirfield Seamount is often cited as an example of limitations in the vertical geodetic datum accuracy of some offshore areas as represented on nautical charts, especially on small-scale charts. A similar incident involving a passenger ship occurred in 1992 when the Cunard liner Queen Elizabeth 2 struck a submerged rock off Block Island in the Atlantic Ocean.[2] More recently, in 2005 the submarine USS San Francisco ran into an uncharted sea mount about 560 kilometres (350 statute miles) south of Guam at a speed of 35 knots (40.3 mph; 64.8 km/h), sustaining serious damage and killing one seaman. In September 2006 the jack-up barge Octopus ran aground on an uncharted sea mount within the Orkney Islands (United Kingdom) while being towed by the tug Harold. £1M worth of damage was caused to the barge and delayed work on the installation of a tidal energy generator prototype. As stated in the Mariners Handbook and subsequent accident report "No chart is infallible. Every chart is liable to be incomplete".[3]
Map projection, positions, and bearings
A pre-Mercator nautical chart of 1571, from Portuguese cartographer Fernão Vaz Dourado (c. 1520–c.1580). It belongs to the so-called plane chart model, where observed latitudes and magnetic directions are plotted directly into the plane, with a constant scale, as if the Earth's surface were a flat plane (Portuguese National Archives of Torre do Tombo, Lisbon)
The Mercator projection is used on the vast majority of nautical charts. Since the Mercator projection is conformal, that is, bearings in the chart are identical to the corresponding angles in nature, courses plotted on the chart may be used directly as the course-to-steer at the helm.
The gnomonic projection is used for charts intended for plotting of great circle routes. NOAA uses the polyconic projection for some of its charts of the Great Lakes, at both large and small scales.[4]
Positions of places shown on the chart can be measured from the longitude and latitude scales on the borders of the chart, relative to a geodetic datum such as WGS 84.
A bearing is the angle between the line joining the two points of interest and the line from one of the points to the north, such as a ship’s course or a compass reading to a landmark. On nautical charts, the top of the chart is always true north, rather than magnetic north, towards which a compass points. Most charts include a compass rose depicting the variation between magnetic and true north.
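The relationship between compass readings and chart bearings is simple arithmetic once the local variation is known. The sketch below is illustrative only (it is not from this article), the example figures are hypothetical, and it assumes the common sign convention of easterly variation positive, westerly negative.

    # Converting between magnetic (compass) and true (chart) bearings.
    # Assumed convention: easterly variation positive, westerly negative.

    def magnetic_to_true(magnetic_deg: float, variation_deg: float) -> float:
        """True bearing = magnetic bearing + variation."""
        return (magnetic_deg + variation_deg) % 360.0

    def true_to_magnetic(true_deg: float, variation_deg: float) -> float:
        return (true_deg - variation_deg) % 360.0

    # Hypothetical example: a landmark bears 075 degrees by compass where the
    # compass rose shows 10 degrees westerly variation (-10).
    print(magnetic_to_true(75.0, -10.0))   # 65.0 degrees true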
However, the use of the Mercator projection is not without its drawbacks. Mercator's technique was to make the lines of longitude parallel. On the real globe, the lines of longitude converge as one goes north or south away from the equator, until they meet at the pole, so drawing them parallel on the chart exaggerates east-west distances at higher latitudes. Mercator's solution, imperfect as it might be, was to increase the distance between lines of latitude in the same proportion; on a Mercator projection map, a square is a square no matter where you are on the chart, but a square on the Arctic Circle is much bigger than a square at the equator, even though both occupy the same number of degrees on the globe. The result is that scale on a nautical chart depends on latitude. In practical use, this is less of a problem than it sounds: one minute of latitude is, for practical purposes, a nautical mile, so distances in nautical miles can be measured against the latitude gradations printed on the side of the chart, at the latitude of the area being measured.[5]
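As an illustration of the scale behaviour described above, the following sketch uses the standard spherical Mercator formulas (a simplification; real charts are built on an ellipsoidal datum such as WGS 84, and none of this code comes from the article). The east-west stretch at a given latitude is sec(latitude), which is why the latitude scale beside the area being measured is the one used for distances.

    # Spherical Mercator sketch: projection and the latitude-dependent scale factor.
    import math

    EARTH_RADIUS_M = 6371000.0            # mean radius of a spherical Earth model
    METRES_PER_NAUTICAL_MILE = 1852.0     # one minute of latitude, for practical purposes

    def mercator_xy(lat_deg: float, lon_deg: float):
        """Project latitude/longitude onto the Mercator plane (metres at the equator)."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        x = EARTH_RADIUS_M * lon
        y = EARTH_RADIUS_M * math.log(math.tan(math.pi / 4 + lat / 2))
        return x, y

    def east_west_stretch(lat_deg: float) -> float:
        """How much the chart exaggerates east-west distances at this latitude: sec(lat)."""
        return 1.0 / math.cos(math.radians(lat_deg))

    print(east_west_stretch(0.0))              # 1.0 at the equator
    print(east_west_stretch(66.5))             # about 2.5 near the Arctic Circle
    print(60 * METRES_PER_NAUTICAL_MILE)       # metres in one degree of latitude (~111 km)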
Electronic and paper charts
Portion of an electronic chart of the Bering Strait
Conventional nautical charts are printed on large sheets of paper at a variety of scales. Mariners will generally carry many charts to provide sufficient detail for the areas they might need to visit. Electronic navigational charts, which use computer software and electronic databases to provide navigation information, can augment or in some cases replace paper charts, though many mariners carry paper charts as a backup in case the electronic charting system fails.
Labeling nautical charts
Automatically labeled nautical chart
Nautical charts must be labeled with navigational and depth information. There are a few commercial software packages that do automatic label placement for any kind of map or chart.
Details on a nautical chart
Pilotage information
Detail of a United States NOAA chart, showing a harbour area
The chart uses symbols to provide pilotage information about the nature and position of features useful to navigators, such as sea bed information, sea marks and landmarks. Some symbols describe the sea bed with information such as its depth, materials as well as possible hazards such as shipwrecks. Other symbols show the position and characteristics of buoys, lights, lighthouses, coastal and land features and structures that are useful for position fixing. The abbreviation "ED" is commonly used to label geographic locations whose existence is doubtful.
Colours distinguish between man-made features, dry land, sea bed that dries with the tide and seabed that is permanently underwater and indicate water depth.
Depths and heights
Use of colour in British Admiralty charts
Depths which have been measured are indicated by the numbers shown on the chart. Depths on charts published in most parts of the world use metres. Older charts, as well as those published by the United States government, may use feet or fathoms. Depth contour lines show the shape of underwater relief. Coloured areas of the sea emphasise shallow water and dangerous underwater obstructions. Depths are measured from the chart datum, which is related to the local sea level. The chart datum varies according to the standard used by each national Hydrographic Office. In general, the move is towards using Lowest Astronomical Tide (LAT), the lowest tide predicted in the full tidal cycle, but in non-tidal areas and some tidal areas Mean Sea Level (MSL) is used.
Heights, e.g. a lighthouse, are generally given relative to Mean High Water Spring (MHWS). Vertical clearances, e.g. below a bridge or cable, are given relative to Highest Astronomical Tide (HAT). The chart will indicate what datum is in use.
The use of HAT for heights, and LAT for depths, means that the mariner can quickly look at the chart to ensure that they have sufficient clearance to pass any obstruction, though they may have to calculate height of tide to ensure their safety.
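A minimal sketch of the clearance arithmetic that follows from these datum conventions (charted depths referred to chart datum, charted vertical clearances referred to HAT) is shown below. The numbers in the example are hypothetical, and real passage planning adds safety margins on top of this.

    # Under-keel and overhead clearance from charted values plus the predicted tide.
    def depth_of_water(charted_depth_m: float, height_of_tide_m: float) -> float:
        """Actual depth = charted depth below chart datum + height of tide above the datum."""
        return charted_depth_m + height_of_tide_m

    def under_keel_clearance(charted_depth_m: float, height_of_tide_m: float,
                             draught_m: float) -> float:
        return depth_of_water(charted_depth_m, height_of_tide_m) - draught_m

    def overhead_clearance(charted_clearance_m: float, hat_above_datum_m: float,
                           height_of_tide_m: float) -> float:
        """A clearance charted above HAT only grows as the water level falls below HAT."""
        return charted_clearance_m + (hat_above_datum_m - height_of_tide_m)

    # Hypothetical example: 5.0 m charted sounding, 2.3 m of tide, 6.5 m draught.
    print(under_keel_clearance(5.0, 2.3, 6.5))   # 0.8 m of water under the keel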
Tidal information
Tidal races and other strong currents have special chart symbols. Tidal flow information may be shown on charts using tidal diamonds, indicating the speed and bearing of the tidal flow during each hour of the tidal cycle.
See also

Automatic label placement
Admiralty chart
Bathymetric chart
European Atlas of the Seas
Calder, Nigel (2008). How to Read a Nautical Chart. McGraw-Hill Professional. ISBN 978-0-07-159287-1.

References
1. Calder, Nigel. How to Read a Navigational Chart: A Complete Guide to the Symbols, Abbreviations, and Data Displayed on Nautical Charts. International Marine/Ragged Mountain Press, 2002.
2. British Admiralty. The Mariner's Handbook. 1999 edition, page 23.
3. Marine Accident Investigation Branch (2007). Report Number 18/2007.
4. See, for example, NOAA 14860 - Lake Huron 1:500,000 and NOAA 14853 Detroit River 1:15,000.
5. Nautical charts on sailingissues.com
Online version of Chart No.1 with "Symbols, Abbreviations and Terms" used in nautical charts
Nautical Charts – chapter from the online edition of Nathaniel Bowditch's American Practical Navigator
Portolan Chart of Gabriel de Vallseca, 1439
Guides to Nautical Chart Symbols
Nautical chart symbols for aids to navigation
The short film "Reading Charts (April 6, 1999)" is available for free download at the Internet Archive
Retrieved from "https://en.wikipedia.org/w/index.php?title=Nautical_chart&oldid=741153116" Categories: HydrographyInfographicsMap typesAids to navigationNautical chartsNautical terminologyHidden categories: Articles needing additional references from August 2013All articles needing additional referencesCommons category with local link same as on WikidataWikipedia articles with GND identifiers Navigation menu
2016-40/3983/en_head.json.gz/3308 | Page:Popular Science Monthly Volume 71.djvu/35
THE GREAT JAPANESE VOLCANO ASO
By ROBERT ANDERSON
ASO-SAN, or Mount Aso, is a living volcano in the heart of the island Kiushiu, Japan, whose peaks rise to a height of several thousand feet out of a gigantic bowl. This bowl, which is many miles across, is an ancient crater surpassing in size all other known craters nearer than the moon. Some 5,000 people, grouped in half a hundred villages on the old floor, are living to-day, tilling the volcanic soil and trading in this vast crater, round about the base of the new and ever-active cone that has risen in it.
Kiushiu is the most southern of the four main islands in the Japanese archipelago. It is about 17,000 square miles in extent and is therefore larger than Vancouver Island, or almost equal in area to Massachusetts and New Hampshire combined. It is built up of very ancient rocks, both sedimentary and igneous, belonging to the paleozoic and mesozoic eras, as well as of younger rocks, and upon these as a foundation has been erected in more recent times, partly during the age of man, a superstructure of volcanic materials which now covers many thousand square miles, or about one half the area of the island. It contains twenty volcanoes, counting two that are just off the coast to the south, of which eight are now active. Among them Aso-san is on far the largest scale, though now it is in a decadent stage and is surpassed in activity by two or more of the others. Japan through all past ages has been a land of extraordinary geological activity, possessed of a vital energy which, continuing in force up to modern times, has been emphasized by the changes in level of its coasts and heralded by its ever-vigorous volcanoes. It is far from being a land solely of volcanoes and volcanic formations, as is sometimes thought, for these assume insignificance when compared with the wide areas and great thicknesses of strata that are representative of almost every stage of the geological column. But that it is a country of great volcanoes there can be no doubt. They have flourished ever since the beginning of its geological history and to-day there are 164 independent volcanic cones, or colonies of related cones, scattered through the Japanese islands, including the Kuriles and the Liu Kiu chains. Of this number 54 are now actively grumbling and nursing their wrath and occasionally losing all control. Fuji-san and Aso-san are the kings, although others surpass them in destructive activity. The first is famed for the height and regularity of its cone as one among the preeminently symmetrical and beautiful volcanoes of the
Retrieved from "https://en.wikisource.org/w/index.php?title=Page:Popular_Science_Monthly_Volume_71.djvu/35&oldid=4324323" Category: Validated Navigation menu | 科技 |
2016-40/3983/en_head.json.gz/3356 | Posts Tagged 'national aquarium conservation officer' We’ve Hired Our First-Ever Chief Conservation Officer
Published May 23, 2013 in Aquatic Life, Conservation, News
Tags: chief conservation officer, Conservation, eric schwaab, national aquarium conservation, national aquarium conservation officer, national marine fisheries service, national oceanic and atmospheric administration, NOAA, towson university
We’re excited to announce Eric Schwaab as our first-ever Senior Vice President and Chief Conservation Officer (CCO). With a realignment of priorities that emphasizes an updated conservation mission, Schwaab’s appointment represents the Aquarium’s new dedication to serve as a national leader in ocean preservation and environmental stewardship.
“With the confirmation of Eric Schwaab as our Chief Conservation Officer, we are setting an agenda for National Aquarium’s future,” said John Racanelli, National Aquarium CEO. “We are dedicated to our mission of inspiring conservation of the world’s aquatic treasures. Eric’s wealth of experience and passion will help us expand and better promote conservation action to protect the ocean, our planet’s life support system.”
As CCO, Schwaab, who assumes responsibilities July 1, will provide strategic vision and leadership for the National Aquarium’s Conservation and Science Division, a team of 130 professionals, engaging in initiatives ranging from field conservation and biological programs to legislative advocacy and animal rescue.
Schwaab currently serves as Acting Assistant Secretary for Conservation and Management for the US Department of Commerce, National Oceanic and Atmospheric Administration (NOAA). In this role he works closely with Congress, other agency leaders, partner organizations and local communities to develop policies and take conservation action to ensure sustainable federal fisheries, promote coastal stewardship and enhance protection of ocean habitats. Previously, as Assistant Administrator for Fisheries at NOAA from 2010-2012, Schwaab directed the National Marine Fisheries Service. He was responsible for science, management and conservation of federal fisheries, marine mammals, sea turtles and other protected resources within the United States. Schwaab led the agency’s national requirement to end overfishing, the implementation of “catch share” management programs to better align the interests of commercial fishing businesses with conservation goals, and efforts to improve coastal and ocean habitat conservation.
The National Aquarium is changing the way the world views conservation by instilling a sense of urgency on issues that affect aquatic ecosystems worldwide, including the Chesapeake Bay. In the ocean policy arena, the National Aquarium has recently focused its efforts on a ban on the sale and trade of shark fins, offshore wind development, plastic and beverage container deposits and watershed conservation.
“Through its current work in conservation and science, National Aquarium is redefining the role of public aquaria as catalysts for tangible change in how people care for oceans and aquatic systems,” said Schwaab. “The Aquarium’s role as a trusted source of information and its ability to communicate with millions of people annually provide significant opportunities to influence public policy and personal behavior on behalf of sustainable ocean conservation. I look forward to leading this charge.”
Prior to his work with NOAA, Schwaab spent three years as Deputy Secretary of the Maryland Department of Natural Resources, where he worked extensively with legislative leaders and other agencies to support important state conservation initiatives, including Chesapeake Bay restoration, forest and park land conservation and fisheries rebuilding. Schwaab’s 20 plus years of conservation stewardship in Maryland also include service as Director of the Fisheries Service (1999-2003); Director of the Forest, Wildlife & Heritage Service (1995-1999); Director of the Forest Service (1992-1995); and Chief of Resource Management for Maryland Forest & Park Service (1989-1992). From 2003 into 2007, Schwaab served as Resource Director for the Association of Fish and Wildlife Agencies, coordinating conservation work on behalf of fish and wildlife agencies across North America.
Schwaab, who currently serves as the NOAA Administrator designee on the National Fish and Wildlife Foundation, holds a Bachelor of Arts degree in Biology from McDaniel College and a Master of Arts degree in Geography and Environmental Planning from Towson University. He also completed a leadership program for senior executives in state and local government at the Kennedy School of Government, Harvard University.
2016-40/3983/en_head.json.gz/3359 | Braille Monitor May 2008
(back) (contents) (next)
Employer Bias Thwarts Many Blind Workers
by David Crary
From the Editor: The following Associated Press story appeared March 16, 2008, in a number of newspapers around the country. Suddenly the topic of employment for blind people was of broad, if fleeting, interest. The reason was the swearing in as governor of David Paterson, who had been the legally blind lieutenant governor of New York. Here is the story:
Technology and training have improved to the point that blind people can adeptly perform a dazzling array of jobs--soon to include the governorship of New York. The biggest obstacle still in their way, advocates say, is the negative attitude of many employers.
The most recent available statistics suggest that only about 30 percent of working-age blind people have jobs. That figure was calculated more than ten years ago, but the major groups lobbying on behalf of blind Americans believe it remains accurate despite numerous technological advances. "Most people don't know a blind person, so they assume that blind people are not capable of doing most jobs when in fact that's not true," said Chris Danielsen, spokesman for the National Federation of the Blind.
Exhibit A, for the moment, is David Paterson, the legally blind lieutenant governor of New York from Harlem who will be sworn in Monday as governor, replacing scandal-tarnished Eliot Spitzer. However, blind people hold all sorts of jobs these days--judge, fitness trainer, TV show host, registered nurse, lawyer, and so on.
"Unfortunately we're still living in an age of misperceptions of what blind people can do," said Carl Augusto, president of the American Foundation for the Blind. "We're hoping that an employer considering hiring a blind person will say that, if David Paterson can be governor and be legally blind, maybe this applicant who is blind can be a good computer programmer."
There are an estimated ten million visually impaired people in the United States, including about 1.3 million who are legally blind, according to Augusto's foundation. The foundation says legal blindness is generally described as visual acuity of 20/200 or less in the better eye, with a corrective lens. Paterson has enough sight in his right eye to walk unaided, recognize people at conversational distance, and read if the text is close to his face.
In theory those people are covered by the Americans with Disabilities Act, which among its many provisions requires employers to give fair consideration and treatment to visually impaired employees and job applicants. But Augusto said employers routinely turn down blind applicants without incurring legal sanction. "The ADA is a wonderful law, but many employers find a way not to seriously consider blind people," he said. "They look at themselves and then say, `I can't imagine how a blind person can be a computer programmer. They can't possibly do it.'"
Advocacy groups work persistently to change such attitudes, with employer education programs and public appearances by successful blind people to discuss their capabilities. One component of such campaigns is to raise awareness of the ever-evolving technology that helps blind people handle more types of jobs--including software that reads aloud information on a computer screen and scanners that can convert printed material into Braille or an accessible electronic format.
"The assisted technology has made the playing field as level as it's ever been for blind people," said Kirk Adams, president of Seattle's Lighthouse for the Blind, a nonprofit agency that provides job help. "There are fewer and fewer jobs a blind person can't do." Adams, forty-six, said being blind seemed a hindrance when he first began postcollege job hunting, but he was hired as a securities broker and later served in various nonprofit fundraising jobs before moving to the Lighthouse, which has 190 blind people on its payroll.
One problem he notes is the difficulty many young blind people face in getting short-term or part-time work during high school and college. "There's a real divergence with sighted kids," Adams said. "It's very typical that a blind kid at sixteen or eighteen is not having success finding that first employment--we see a lot of frustration around that age because employers may not be thinking about making those short-term jobs accessible."
The American Foundation for the Blind says its latest research indicates that, once young blind people complete top-notch training and education programs, they attain an employment rate not much lower than sighted people. But Augusto said the overall portion of blind people with jobs remains low because many older workers who lose vision in middle age drop out of the workforce rather than undergo retraining.
"You get a bunch of people in their fifties who all of a sudden are visually impaired--they can't drive anymore, they'll get Social Security benefits and maybe disability insurance," Augusto said. "They say, `The heck with it, we're not going back to work. We don't want to go through the rehabilitation training--it's too hard.'"
Kevan Worley, a blind Coloradan, runs a company that provides thousands of meals a day to Army troops at Fort Carson in Colorado Springs. About 70 percent of his two hundred employees are blind or otherwise disabled. "There are still stereotypes of blind people," he said. "When employers, educators, even parents of blind kids have those stereotypes and low expectations, many are being kept down and out."
The Equal Employment Opportunity Commission, which tracks workplace discrimination cases covered by the Americans with Disabilities Act, says 455 such complaints were filed last year by visually impaired workers--the highest number since 1995. "If someone's blind, there's a huge stigma to overcome and all kinds of myths and fears in the employer community," EEOC spokesman David Grinberg said. "The fact is that in the twenty-first century workplace people who are blind are just as able to do a job as anyone else--they just need to be given a chance," he said. "They know the deck is stacked against them. They work harder than others, and they end up being more effective workers." | 科技 |
2016-40/3983/en_head.json.gz/3377 | Keep Data in Line, Most of the Time
For synchronizing data on PocketPCs and Smartphones, ActiveSync 3.8 is easy enough to use, but many users say OS and phone service issues can knock it off balance.
By Joanne Cummings10/01/2005
Sometimes, the best utilities are the least glamorous. They do what they do and they do it well. Microsoft's ActiveSync 3.8, the latest version of its data synchronization software for PocketPCs and Smartphones, fits that bill. It seamlessly syncs up files—such as e-mail, contacts and calendar items—between mobile devices and PCs. And it's usually pretty efficient, according to Redmond readers. "It has been seamless and the client devices are very easy to configure," says Vinit Kohli, director of MIS at Sibcy Cline, a residential real estate company in Cincinnati, Ohio. "Microsoft has done a good job in making it as easy to sync up as possible. Even when people are first learning how to use the software, we hardly spend more than 15 minutes synching up with these devices."
However, as the mobile devices that use ActiveSync have become more elaborate and more capable—sporting state-of-the-art Secure Digital (SD) cards and phone services—ActiveSync has been slow to keep pace. As a result, the synching process can sometimes be a bit sub-par. "It can be a bit buggy, especially where the phone service is concerned," Kohli admits. For example, Kohli does his synchronization primarily through a wireless connection. He uses ActiveSync to sync up his e-mail, contacts and calendars on his Sprint Audiovox 6600 PocketPC phone with the data on his corporate Exchange server.
"Even when people are first learning how to use the software, we hardly spend more than 15 minutes synching up with these devices."
Vinit Kohli, Director of MIS, Sibcy Cline Inc.
"I have a lot of issues with the Sprint phone service conflicting with Active-Sync," he says. If the phone service kicks in when ActiveSync is running, the entire device freezes up and locks him out. "The only way to get the device back up and running is to pull out the battery and reboot. It's crazy." Misery must love company, because Kohli isn't alone in facing this issue. "I was in a Microsoft seminar last month with five other PocketPC users, with service from different carriers, Cingular, Sprint and others," he says. "They all said the same thing. If the phone service kicks in, ActiveSync freezes up and locks up the device." That problem is most likely due to ActiveSync 3.8's inability to broker between services and its less-than-robust error handling capabilities. In fact, when Microsoft rolled out Windows Mobile 5.0 software in May, it also announced ActiveSync 4.0 that it billed as having "more robust error handling features." Until the new version is widely available, however, users will continue having problems. Wesley Bielinski, network administrator for the American Board of Medical Specialties in Evanston, Ill., uses it to synchronize data between multiple handheld and desktop devices. He syncs his e-mail, contacts and calendars on an HP 4150 and an HP 6315 with Outlook on his PCs at work and at home. Sometimes, he says, ActiveSync syncs up all his files in two minutes. Other times, it takes up to 15 minutes and hangs up. Seldom, if ever, can he pinpoint a reason for the difference. "There are a lot of quirks that are very annoying. Just yesterday, it was synching a favorite and two files. It just froze on them," Bielinski says. "It couldn't tell me which file was having a problem, so I ended up having to remove all the files, sync it, put the files back on and then sync it again. I should be able to just pop it in and forget it, but I can't. It requires some babysitting and I wish it were more reliable."
Microsoft ActiveSync 3.8 Free
ActiveSync's inability to gracefully handle errors is compounded by the inability of most PocketPC and Smartphone devices to do a soft restart. "There is no way to end the task or stop the process," Kohli says. "Basically, you just have a hard restart, and that [means] pulling out the battery."
Kurt Hudson, president of HudLogic, a Flagstaff, Ariz., consultancy, believes the problem lies more with the device and the PocketPC operating system. "When you're sold one of these devices, they tell you that it's just like a little laptop. The big difference is its ability to do error handling. Basically, every time you get an error, you have to restart."
Hudson says he can restart by pushing his pen stylus into the restart button on his PocketPC, but it certainly isn't foolproof. "You have to be careful," he says. "A quick hit will restart, whereas if you hold it for two seconds, it flushes the memory and everything, so you lose your contacts and all that. It's not very elegant." Still, Hudson is generally happy with ActiveSync. It has saved him a lot of time when synchronizing a Flash application between his PocketPC and desktop system. "It's actually easier for transferring files than moving my memory stick back and forth," he says. Wireless—More or Less
Version 3.8 of ActiveSync didn't provide anything all that different from the previous version, as far as Bielinski can tell, but now he has more trouble with wireless synchronization. "I can get it to sync up wirelessly once in a while, but now I tend to just stick it in the cradle and do it that way," he says. "It's easier and at least it also charges up the battery."
That could be due to the new security features Microsoft added to version 3.8. As a precaution, it turned off the ability to sync up via Wi-Fi or LAN by default. Users can reactivate that feature by checking off a menu item, but the change isn't very intuitive for users accustomed to the old way of synching. Even after Bielinski made sure he had the settings properly configured, he still couldn't get it to work with a wireless connection, he says. "It's just hit or miss."
Others agree that for the most part, the differences between version 3.71 and 3.8 seem negligible. "I upgraded when 3.8 came out," says Hudson, who has been using ActiveSync with his PocketPC for the past six months. "Usually, I upgrade because I'm looking to have certain bugs fixed, but that wasn't the case here. I don't see much of a difference between the two versions."
Hudson's PocketPC has wireless networking and unlimited Internet service from T-Mobile. It usually doesn't hang up during the synching process. "It doesn't even take five minutes, although I sync directly connected from my cradle of the PocketPC—not wireless—so that may be why I don't have problems."
Where Hudson does have problems is getting everything—his Bluetooth connection, GPS software and ActiveSync—to work together. "I don't think ActiveSync is the problem there," he says. "Once the software kicks in, it seems to work. It's when I'm trying to get the communication between the phone and the other mechanisms like my GPS. Sometimes it works and sometimes it doesn't, and I can't tell why."
Bielinski has the same issues. "I use ActiveSync with my Bluetooth headset and GPS unit simultaneously. It took me a long time to figure out how to get them all to work together," he says. "The trick is you have to start them up in a certain order, otherwise it doesn't work. You have to get the headset first, and then be in the Bluetooth manager. Then turn on the GPS unit and immediately click on the connection. Otherwise, it will automatically connect itself and it won't work. It's a whole process."
ActiveSync Wish List

For the most part, users are happy with the features and functions that ActiveSync 3.8 provides. If they had their way, though, here's what they would like to see in the next version of Microsoft's mobile synchronization software:
1. Support for more applications. The ability to synchronize e-mail, contacts and calendars within Outlook and Exchange is easy and seamless, but most users would like to see that extended to Excel, Word and other applications. "It shouldn't be something you have to hunt for," Kohli says.
2. Support for multiple devices. "It would be nice if there were a way to keep everything synched," says Bielinski. Currently, it's difficult for him to use ActiveSync to synchronize the data on his two PDAs and two PCs. "If you want to add something like synching with an Internet calendar like Yahoo or something [similar], you're out of luck. I'd like to see it be able to work with multiple devices."
3. SD card synchronization. Although ActiveSync lets users synchronize files by placing them within a certain synchronization folder, it currently doesn't sync up the whole SD card. "That would be a nice feature," says Kohli.
4. Better error handling. When ActiveSync encounters an error—from phone service interference or whatever—it should provide a less-intrusive way to stop and restart the session. "Removing the battery is not a good way," Kohli explains.
5. Remote info wiping. In the event that a mobile device is lost or stolen, Windows Mobile 5.0 can remotely wipe the data from that device the next time it connects to the network. "That would be a nice feature to have in ActiveSync as well," Bielinski says.
— J.C.

Other File Types
Most users say they're happy with ActiveSync's ability to sync up with Outlook and Exchange. As they use their devices for different purposes and put more items on SD cards, they would like to be able to sync up other file types, such as Excel, Word and even photos.
"Once we start adopting more CRM-type applications, I think ActiveSync is going to become more critical," Kohli says. "At that point, I'd like to see more support for synching Word and Excel documents. Right now, I have this 512MB SD card, so I have plenty of space to put my Word documents. It would be great if it could sync up everything on the SD card—if not the whole document, then at least just the changes."
ActiveSync does indeed let users synchronize Word and Excel documents, Hudson says. "You just have to place the document in the special folder ActiveSync creates for synchronization. Some devices don't support that capability, so it's more of a device limitation than an ActiveSync one." He says ActiveSync can synchronize Excel, Pocket PowerPoint, InkWriter, Pocket Excel, Word, Notes, PowerPoint, Pocket Word and Note Taker.
If your device doesn't support other file synchronizations, ActiveSync also lets you drag and drop files between the SD card and the desktop. This will help you transfer files, but it won't actually synchronize them.
Kohli would also like to see support for synchronizing photos. "I have a camera on this device," he says. "It isn't that great, but it's good when you're really in dire straits." He says he uses it to help clarify issues in the server room. He takes a picture of something questionable so he can discuss it with someone. That ensures they both have the same point of reference. "So synchronizing pictures would be on my wish list."
Another issue is ActiveSync's perceived inability to sync up more than one inbox. "I divide my inbox into a couple of sections or folders, and I wasn't seeing an option to sync multiple inboxes," Hudson says. "I really wanted it to do that." Later, he found that ActiveSync does provide for syncing up multiple inboxes, but that the information was buried in the help files. "Synchronizing subfolders is supported only on Windows Powered Pocket PC 2002 and later and Windows Powered Smartphone 2002 and later, but it does support the feature," Hudson says. "Seems to me a lot of the things people are wishing for are already in there, but not necessarily enabled by default." Consequently, he advises anyone to fully research the ActiveSync help files before concluding that it lacks support for certain features. There may be more to it than you think.
More Information

Learn more about ActiveSync 4.0 and download a developer's copy here.
2016-40/3983/en_head.json.gz/3396 | Privacy &Legal Notice
The Laboratoryin the News
Commentary byGeorge H. Miller
A Quantum Contribution to Technology
U.S. Weapons Plutonium Aging Gracefully
Imaging Complex Biomolecules in a Flash
Lipid Rafts Observed in Cell Membranes
Exchanging Insights on Quantum Behavior
Patents and Awards
S&TR Staff
George H. Miller, Director of Lawrence Livermore National Laboratory
Laboratory Science Entwined with Rise in Computing

LIVERMORE scientists have been using computer simulations to attain breakthroughs in science and technology since the Laboratory's founding. High-performance computing remains one of the Laboratory's great strengths and will continue to be an important part of future research efforts.
To meet our programmatic goals, we demand ever more powerful computers from industry and work to make them practical production machines. We develop system software, data management and visualization tools, and applications such as physics simulations to get the most out of these machines. High-performance computing, theoretical studies, and experiments have always been partners in Livermore’s remarkable accomplishments.
The Laboratory’s cofounders, Ernest O. Lawrence and Edward Teller, along with Herbert York, the first director, recognized the essential role of high-performance computing to meet the national security challenge of nuclear weapons design and development. Electronic computing topped their list of basic requirements in planning for the new Laboratory in the summer of 1952. The most modern machine of the day, the Univac, was ordered at Teller’s request before the Laboratory opened its doors in September. The first major construction project at the site was a new building with air conditioning to house Univac serial number 5, which arrived in January 1953.
Edward Teller, whose centennial we are celebrating this year, greatly appreciated the importance of electronic computing. His thinking was guided by his interactions with John von Neumann, an important pioneer of computer science, and his prior experiences using “human computers” for arduous calculations. Teller was attracted to and solved problems that posed computational challenges—the most famous being his collaborative work on the Metropolis algorithm, a technique that is essential for making statistical mechanics calculations computationally feasible. His work demonstrated his deeply held belief that the best science develops in concert with applications.
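For readers unfamiliar with the technique mentioned above, the Metropolis algorithm samples configurations with probability proportional to exp(-E/kT) using only energy differences between a current and a proposed state, which is what makes statistical-mechanics averages tractable on a computer. The sketch below is a generic textbook illustration on a toy one-dimensional potential, not Laboratory code.

    # Metropolis Monte Carlo on a toy harmonic potential (illustrative only).
    import math, random

    def energy(x):
        return 0.5 * x * x                      # toy potential E(x); real codes evaluate full physics

    def metropolis(n_steps, kT=1.0, step=0.5):
        x, samples = 0.0, []
        for _ in range(n_steps):
            x_trial = x + random.uniform(-step, step)          # propose a random move
            dE = energy(x_trial) - energy(x)
            # Accept downhill moves always; uphill moves with probability exp(-dE/kT).
            if dE <= 0 or random.random() < math.exp(-dE / kT):
                x = x_trial
            samples.append(x)
        return samples

    samples = metropolis(100000)
    print(sum(s * s for s in samples) / len(samples))   # ~kT for this potential (equipartition)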
This heritage of mission-directed high-performance computing is as strong as ever at Livermore. Through the National Nuclear Security Administration’s (NNSA’s) Advanced Simulation and Computing Program, two of the world’s four fastest supercomputers are located at Livermore, and they are being used by scientists and engineers at all three NNSA laboratories. The prestigious Gordon Bell Prize for Peak Performance was won in 2005 and 2006 by simulations run on BlueGene/L, a machine that has 131,072 processors and clocks an astonishing 280 trillion floating-point operations per second. Both prize-winning simulations modeled physics at the nanoscale to gain fundamental insights about material behavior that are important to stockpile stewardship and many other programs at the Laboratory.
The article entitled A Quantum Contribution to Technology features Livermore-designed computer simulations that focus on the nanoscale beginning with first principles: the laws of quantum mechanics. The use of large-scale simulations to solve quantum mechanics problems was pioneered in 1980 by Livermore scientist Bernie Alder in collaboration with David Ceperley from Lawrence Berkeley National Laboratory. To predict how materials will respond under different conditions, scientists need accurate descriptions of the interactions between individual atoms and electrons: how they move, how they form bonds, and how those bonds break. These quantum molecular dynamics calculations are extremely demanding. Even with the Laboratory's largest machines, computational scientists, such as those in Livermore's Quantum Simulations Group, must design clever modeling techniques to make the run times feasible (hours to days) for simulating perhaps only 1 picosecond of time (a trillionth of a second). Outstanding science and technological applications go hand-in-hand in this work. As described in the article, our scientists are using quantum simulations to evaluate nanomaterials to reduce the size of gamma-ray detectors for homeland security, provide improved cooling systems for military applications, and help design even smaller computer chips. Yet another quantum simulation project is examining materials to improve hydrogen storage for future transportation. These examples merely scratch the surface of the novel uses for nanotechnologies that scientists can explore through simulations. One can only imagine what possibilities might be uncovered in the future as computational power continues to increase and researchers become ever more proficient in nanoscale simulations and engineering. True to its heritage, Livermore will be at the forefront of this nascent revolution.
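To make that cost concrete: a molecular dynamics run is a loop that repeatedly computes interatomic forces and advances the atoms by a femtosecond-scale time step, so a picosecond of simulated time means on the order of a thousand force evaluations. The skeleton below is a generic classical illustration, not the Laboratory's software; in first-principles quantum molecular dynamics the placeholder force routine is replaced by a full electronic-structure solve at every step, which is where nearly all of the machine time goes.

    # Generic velocity-Verlet MD skeleton with a placeholder force routine.
    def compute_forces(positions):
        # Toy restoring force; a quantum MD code would solve for the electrons here.
        return [-x for x in positions]

    def velocity_verlet(positions, velocities, mass=1.0, dt=1.0e-3, n_steps=1000):
        forces = compute_forces(positions)
        for _ in range(n_steps):
            velocities = [v + 0.5 * dt * f / mass for v, f in zip(velocities, forces)]
            positions = [x + dt * v for x, v in zip(positions, velocities)]
            forces = compute_forces(positions)          # the dominant cost in practice
            velocities = [v + 0.5 * dt * f / mass for v, f in zip(velocities, forces)]
        return positions, velocities

    print(velocity_verlet([1.0, -0.5, 0.2], [0.0, 0.0, 0.0]))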
2016-40/3983/en_head.json.gz/3413 | response measures
Your location: Home Risk management approaches to address adverse effects of climate change
In pursuance of the conclusions reached by the SBI at its twenty-eighth session (FCCC/SBI/2008/8, para
38), the Secretariat has prepared an information note to review experiences, best practices and lessons
learned on risk management approaches and other appropriate responses to the adverse effects of climate
change, in accordance with Article 4, paragraphs 8 and 9, of the Convention. The production of
this information note has led to the creation of this introductory guide to risk-management approaches
that explores the most important risk-management options available and pilotted today, drawing on a
number of case studies.
Risk management approach to the adverse effects of climate change
According to the Fourth Assessment Report of the IPCC, there was a rapid increase in weather-related disasters worldwide between 1980 and 2003. This trend is expected to continue and intensify in the future, taking a particularly severe toll in those developing countries that are the poorest and most vulnerable, such as small island developing states and least developed countries. Insurance has become a key component of adaptation to climate change and disaster risk reduction because it can provide economic security and enable vulnerable populations to pool economic losses, thereby mitigating the impacts of adverse weather events and avoiding knock-on effects. Different types of insurance schemes are available and described further here.
The utilisation and development of environmentally sound technologies is seen as a means to combat the adverse effects of climate change in both developing and developed countries. Innovative technologies have been used as an adaptation option in different economic sectors.
Economic diversification may be defined as a process in which a growing range of economic outputs is produced. Sectors such as tourism, agriculture, fisheries, forestry and energy production are all sensitive to the adverse effects of climate change. The negative impacts of climate change on these sectors are of concern to all countries, especially those whose economies are primarily driven by climate-sensitive sectors. Examples of best practice and lessons learned are provided for three key sectors.
In order to contribute a case study, please contact rmanandhar@unfccc.int.
Related UNFCCC resources

Synthesis report on Views and information on the elements to be included in the work programme on loss and damage

Technical paper on Integrating practices, tools and systems for climate risk assessment and management and disaster risk reduction strategies into national policies and programmes

Technical paper on Mechanisms to manage financial risks from direct impacts of climate change in developing countries
2016-40/3983/en_head.json.gz/3431 | AstraZeneca expands technological investment in new Cambridge lead discovery centre
Friday, 20 November 2015
AstraZeneca today announced three new partnerships that will deliver industry-leading technologies to drive the AstraZeneca MRC UK Centre for Lead Discovery in the search for novel small molecule medicines. The partnerships will ensure state-of-the-art screening and compound management at the facility, which will be located within the company’s new global R&D centre at the Cambridge Biomedical Campus. It will screen around 2 million chemical structures per target to find new compounds for disease targets.
The partnerships, which will deliver a collaborative screening platform, space-age robotics, and sound waves that can move liquids, include:
1. A three-year agreement with HighRes Biosolutions, a leading global provider of automated robotic systems to the life sciences industry, to develop next generation, intelligent robots for the high throughput screening of compounds. The ultra-light weight, maximum strength robots use technology originally developed for the European Space Agency. A feature of the robots is that scientists can interact with them directly rather than being separated by a protective shield, allowing them to perform diverse research applications with a mobility that maximises efficiency. AstraZeneca will be the first adopter of this advanced robotics within the pharmaceutical industry.
2. A strategic partnership to use the energy of sound waves to dispense compounds directly from individual storage tubes into well plates for testing. This comprises fully-automated liquid handling by Labcyte, automated liquid storage systems by Brooks Automation, and a co-developed acoustic sample storage tube. Acoustic dispensing is contactless, eliminating the use of pipettes and reducing the compound volumes by 10-fold. The system also increases automated retrieval of compounds from storage, enabling any compound from AstraZeneca’s collection to be selected for any screen in the UK Centre for Lead Discovery. It will provide an unparalleled level of quality in the screening data produced from biological assays.
3. A five-year collaboration with Genedata, a leading provider of advanced software solutions for drug discovery and life science research, which will speed data sharing with partners. The partnership centres on a software platform to help expedite collaborations in drug discovery and reduce data analysis costs. AstraZeneca will become one of the largest users of the collaboration platform software, which it will share with its early stage partners.
Steve Rees, Vice President, Screening Sciences & Sample Management, Discovery Sciences, AstraZeneca, said: “Working together with leading technology providers, we are creating cutting-edge capabilities for compound management and screening. This will deliver new ways of working, help reduce costs and transform our ability to identify new molecules that could become medicines of the future.”
About Brooks Automation, Inc.

Brooks is a leading worldwide provider of automation and cryogenic solutions for multiple markets including semiconductor manufacturing and life sciences. Brooks' technologies, engineering competencies and global service capabilities provide customers speed to market and ensure high uptime and rapid response, which equate to superior value in their mission-critical controlled environments. Brooks is headquartered in Chelmsford, MA, with direct operations in North America, Europe and Asia.
About Genedata
Genedata transforms data into intelligence with a portfolio of advanced software solutions, which make research data accessible and understandable and research processes more efficient. These solutions are used worldwide by leading pharmaceutical, industrial, and agricultural biotechnology companies as well as academic research organizations. Genedata innovations enable scientific discovery that fights disease and improves health and quality of life worldwide. Founded in 1997, Genedata is headquartered in Switzerland, and has offices in Germany, Japan, and the US. For more information, visit www.genedata.com.
About HighRes Biosolutions
HighRes Biosolutions is a global leader for laboratory automation solutions. The company designs and builds innovative laboratory automation systems, dynamic scheduling software, and lab automation instruments that accelerate and streamline discovery. HighRes offers highly flexible, modular solutions that provide its clients with the ability to scale and reconfigure their automation equipment as their assays or technology changes. For more information, please visit www.highresbio.com.

About Labcyte Inc.
Labcyte, a global biotechnology tools company headquartered in Sunnyvale, California, is revolutionizing liquid handling. Echo® liquid handling systems use sound to precisely transfer liquids without contact, eliminating the use of pipettes. Labcyte instruments are used worldwide throughout the pharmaceutical industry, as well as by biotechnology companies, contract research organizations, and academic institutions. Our customers work across a wide spectrum of scientific research, including drug discovery, genomics, proteomics, diagnostics and personalized medicine. Labcyte has 55 U.S. patents and others internationally. For more information, visit www.labcyte.com.
2016-40/3983/en_head.json.gz/3475 | Could Sprint ditch Nextel? Makes sense
Could Sprint ditch Nextel? Makes sense

But a second rumor that has the company selling out to Deutsche Telekom doesn't sound as plausible.

by Marguerite Reardon
May 5, 2008 4:10 PM PDT

Is Sprint Nextel getting ready for a fire sale? It sure looks that way following speculation around Wall Street on Monday of a possible sale or breakup of the beleaguered wireless operator. First, The Wall Street Journal reported that German phone company Deutsche Telekom was considering buying the company. Later the same day, another Wall Street Journal article cited sources who said Sprint Nextel is considering unloading its Nextel assets, a move that might make the $22.3 billion wireless operator more attractive to potential buyers.

While a Deutsche Telekom sale seems like a long shot, it's not surprising that the company is considering spinning off the Nextel unit. If Sprint Nextel is able to unload the Nextel network, it could open it up for sale to another bidder--just not Deutsche Telekom. Why? There are three reasons.

Bigger isn't better

According to the WSJ article, Deutsche Telekom is looking to expand international wireless networks as its wireline business declines. In particular, the company is looking to bulk up its U.S. subsidiary, T-Mobile USA, which is among its fastest-growing properties. The company added some 3.6 million subscribers in 2007 for a total of about 28.7 million subscribers. If T-Mobile USA acquired Sprint Nextel, it would gain an additional 53.8 million subscribers and become the largest U.S. cell operator, surpassing both AT&T and Verizon Wireless, the No. 1 and No. 2 cell phone carriers, respectively, in the country today.

But making T-Mobile bigger wouldn't necessarily make it better. The main problem is that T-Mobile USA and its European counterpart, T-Mobile, use a different wireless standard than Sprint Nextel. T-Mobile is GSM-based, whereas Sprint uses CDMA and Nextel uses i-DEN.
If Deutsche Telekom tried to merge T-Mobile USA with Sprint Nextel, it would end up with a huge integration nightmare as it tried to accommodate not just two different technologies, but three. In fact, it would essentially be repeating the same mistake that doomed the 2005 merger between Sprint and Nextel. Sprint Nextel has been bleeding customers since that acquisition, as Nextel customers, in particular, have dumped the company due to poor service. In early 2006, Nextel had roughly 16.6 million subscribers. By the end of 2007, it had about 13.2 million, according to the WSJ article. And some experts think that number could shrink even further to about 5 million to 7 million in two years.

"When you're looking for market share and scale advantages, it's critical that the technology piece works," said Charles Golvin, an analyst with Forrester Research. "And (Deutsche Telekom and Sprint Nextel) don't have that. They'd have a whole bunch of integration issues, just like they did with the Sprint Nextel merger."

Regulatory headaches

Spinning off Nextel could help alleviate some of the integration headaches for a potential buyer. That said, I still don't think Deutsche Telekom would buy Sprint, mainly because I don't think U.S. regulators would accept the deal. Right now, the U.S. wireless market with four major competitors is viewed as a competitive-market success story. And it's conceivable that U.S. regulators could block the sale between the No. 3 and No. 4 wireless companies, which together would become the No. 1 wireless operator in the country. Additionally, it's likely that U.S. regulators wouldn't like the deal, because it would mean that a foreign company--Deutsche Telekom--would control the largest wireless communications network in the country, something security experts probably wouldn't like.

Better options elsewhere

That said, spinning off Nextel could help attract other bidders for Sprint. The WSJ reported that the company is examining several options. It has supposedly been in talks with Nextel co-founder Morgan O'Brien, who also founded a public service wireless company called Cyren Call. O'Brien is supposedly putting together a consortium of investors to acquire Nextel, which pioneered the push-to-talk, walkie-talkie phone service. The idea is that the Nextel network and service, which is already used by construction workers, airline workers, and public safety workers, would be combined with other spectrum assets to create a nationwide public safety network. Cyren Call has been working closely with the Federal Communications Commission to develop a plan for the sliver of spectrum in the 700MHz spectrum auction known as the D block, which was created to help build this nationwide public safety network.

The spinoff of Nextel would finally cement the $35 billion tie-up between Sprint and Nextel as a major failure. Sprint would likely only get a fraction of what it paid for the company if it sold it today. But dumping Nextel would make it easier for Sprint's management team to more nimbly direct its core PCS business. It might even give executives room to focus more attention on its next-generation WiMax network called Xohm.
The idea is that splitting up all of Sprint Nextel's assets could bring more value to the company than keeping everything together.

Right now, one thing is certain. The Sprint executive team, led by CEO Dan Hesse, seems to be examining all its options. With second-quarter earnings coming out next Monday, there's likely to be more speculation over the next couple of weeks about what the company will do next. So stay tuned.
Apps optimize the iPad experience. So how come more retailers aren’t buying in?
Yes, it's cool. But do you need an app?
When the iPad debuted earlier this year, a handful of retailers unveiled iPad apps. These apps are tiny programs custom-designed to run on the new tablet PC and make use of its inherent features—they’re like iPhone apps, only bigger. Then came virtual silence. A retail iPad app popped up here and there, but nothing like the massive growth of iPhone apps and apps for other smartphones.
I’ll tell you what gives: People shopping the Internet on their iPads are accessing retailers’ e-commerce sites, just as they would with any laptop or netbook. The only difference is they’re touching the screen instead of using a keyboard and a mouse. There is no need to optimize a web site experience via an app. Whereas with a smartphone, you need to modify the web experience because it plain and simple just doesn’t fit on the screen.
Because of its screen size and computing power, an iPad offers the same web browsing experience as a laptop—it’s just called a tablet PC for its lack of accessories. Have you ever heard of an app that optimizes web browsing for a laptop? No, because there are none. You just use the browser. And on an iPad, you just use the browser.
What’s more, HTML5, the latest version of the foundational Internet mark-up language, enables programmers to reach into devices and use their innate features in a way impossible with HTML4. So, for example, it used to be that only apps could tie in with a smartphone’s GPS system to pinpoint location. Now, programmers can accomplish the same thing on a web site that uses HTML5.
A case can be made for an iPad app—if you have the time and resources, and dedicated customers. EBay has a phenomenal iPad app that totally alters the way it presents products, from where they fall on a page to how big and high-def the pictures are to how a customer navigates the catalog. EBay also has a very loyal customer base that is more likely to download and use an app on a regular basis. So sure, roll out an iPad app that’s super-cool (and sells products).
But the bottom line is you don’t need an iPad app, as is shown by the deafening silence on the subject in retail.
By Andre Golsorkhi
CEO, Sidecar
The founder of automated e-commerce marketing technology platform Sidecar says marketers are over-reacting to the new organization of the Gmail inbox.
Last year, when Google announced that brands and vendors would have to pay for the privilege of participating in Google Shopping, via Product Listing Ads (PLAs), after years of free inclusion in Google Product Search, there was much uproar. And although the blowback continues (Amazon recently decided to remove its Kindle readers from the platform after having already decided not to play ball in the PLA space at all), all in all, Google Shopping looks to be a success, with retailers already allocating 6% of their marketing budgets to PLAs, and the majority stating that they will continue to spend more in that channel, according to Forrester.
This isn’t the first time Google made an unpopular decision (I still miss Google Reader), but the success of this particular initiative begs the question: can’t the unpopular decision—at least in Google’s case—also be the right one?
I’ve been following the rollout of the new multi-tab Gmail inbox, and the retail world’s reaction to it, ever since the company announced its impending arrival in May, and so far, here’s what I think: we’re making a mountain out of a molehill. Timing is everything
When I do my one final e-mail check in the evening before bed, all I want to know is whether there’s something that’s going to keep me from getting there, or something that’s going to get me out of it earlier than expected. E-mails from my team and my clients are the only things that matter. If it needs immediate attention, I’ll attend to it; if it can wait, so can I.
Everyone has their own e-mail habits: whether it’s checking first thing in the morning before getting out of bed or not looking at it until arriving at work or setting aside some strict “no-e-mail” time during the day to reduce distractions and amp up productivity. But one thing we all have in common is that when we’re in work mode—whether at 9:00 in the morning or 9:00 at night—we want to see work e-mails. All of those industry newsletters and discount offers and product recommendations get in the way of getting things done.
Which isn’t to say that they’re unwelcome. I need to be in a “break” mode or a “shopping” mode or a “reading” mode to get the most out them. If a promotional sale is stuck in between two notes from clients, chances are I’ll skip right over it. But if it’s tucked away somewhere where I can view it when I’m ready to, I’ll actually have the ability to give it a proper look. By choosing when I get to see the message, it’s more powerful and resonant with me. Therefore, the likelihood that I’ll click-through—and purchase—is higher. And that seems to be the case with other Gmail users, who are actually engaging with marketing e-mails more than they used to, according to a recent study by e-mail analytics firm Return Path. Just like with the success of Google’s PLAs, the tabbed inbox shows that perhaps Google understands consumer intent better than some marketers—and has created products that align with it.
The “flash” problem
True, by choosing to wait until I can review these e-mails at my own leisure I might occasionally miss out on a deal. Sites that open sales up at a very specific time, or that are promoting extremely limited inventories, are perhaps getting the short end of the stick with Gmail’s new tabs.
But Gmail’s classification system is malleable. A flash retailer’s e-mail needn’t be relegated to a secondary tab forever. Flash sale sites need to work now to build a strong brand presence so that their customers (and potential customers) see an imperative in moving these e-mails to the main tab. Some sellers are asking their customers to do this. And others, perhaps the ones that will be most successful here, don’t necessarily need to. If a consumer knows that she will get great deals from a flash retailer, she’ll make sure she doesn’t miss any, by flagging them to the main inbox.
For flash sellers who still struggle, there is perhaps another option: send your customers something they’ll want to open instead of something they feel they have to open. If open rates or click-throughs or engagements are consistently weak, even after marketing efforts to establish these deals as “can’t-miss,” perhaps it’s worth considering sending out more personalized and targeted e-mails as soon as customers registered, based on what they have viewed, to increase the chances of them moving these e-mails to the primary inbox. While many flash retailers personalize messages after the first purchase, they may find it helpful to leverage the initial browsing data, immediately, to help improve activation and e-mail engagement.
The runaway success earlier this year of the mobile e-mail organizer app Mailbox clearly showed that consumers are desperately in search of a way to manage the increasing volume of e-mail they receive. Google understood this as an imperative—and if it wanted its customers to continue to use Gmail, it needed to do that in a way that would make the experience less frustrating for them. After all, if consumers were to abandon Gmail, it's not just bad for Google—it's bad for the retailers who have made a significant investment into Google's whole suite of marketing products. The new Gmail helps to soothe some of the pains e-mail consumers were feeling—and in so doing, lets them get their e-mail content in a way that works best for them. The convenience for users can translate into convenience for advertisers, who will be reaching the right consumers at the right time. That open rates are being affected by less than 1% is telling: people are still getting and opening retailers' messages. But here's my hypothesis: even though the open rate might have declined slightly, the people who are opening these e-mails are spending more time perusing them (rather than just hitting the arrow to go to the next message), and are both clicking through and buying more. It's too early to tell—but I would encourage all retailers to do this analysis as the numbers will speak for themselves soon enough.
Andre Golsorkhi
Social Lunch Preps for Memphis Launch
By Andy Meek
A new social lunch event called Lunchbox is preparing to launch in Memphis next month, the product of business partners who left San Francisco this summer to come to Memphis and build a creative venture here.

Billy Bicket and Laney Strange relocated here to launch a creative studio, and the first product from the resulting Memphis Punch Studios was launched in September – the Memphis Punch food truck.

On Dec. 3, the team is scheduled to roll out its new initiative, with Bicket – the Memphis Punch CEO – describing Lunchbox as a way to strike up new friendships and forge connections over a healthy lunch.

The basic idea behind Lunchbox is that its events will introduce healthy, affordable food choices to small groups of Memphians. Those groups will gather at different locations to enjoy healthy items from different local restaurants.

"Recently, we started taking better care of our health after all those years stuck at a desk behind a computer screen," Bicket said. "So we started a healthy food truck here in the heart of Barbecue-landia. By offering the same healthy options on our food truck that we use in our own day-to-day diet, we provided an easy way for others to think about integrating smoothies and other healthy food options into their lives.

"As newcomers trying to plug in to Memphis, we've found that it's hard enough to meet people through traditional social gatherings in the evening, at a time when you're tired from the workday and have things to take care of at home. The Lunchbox is scheduled during break time at the peak of the day, when you yearn to get out of the office and see some fresh faces."

Bicket previously worked for an organization in San Francisco called TechSoup Global, as did Strange. She also teaches computer science at the University of Memphis.

The first Lunchbox event will be held Downtown at Envision Memphis. The Lunchbox gatherings will be invite-only, and the plan is for them to be hosted at different small businesses around the city.

Bicket said he and Strange relocated to Memphis because they decided the city had the right building blocks for launching a startup. Specifically, Memphis seemed to offer the kind of building blocks that appealed to self-professed contrarians who didn't want to remain in San Francisco just because it was expected for someone in their field.

Before relocating, they made a list of factors the new place they were looking for should have. They referred to one as the "bustle factor," referring to a city having enough of a population to create density but not too many people that it's difficult for a person to make a difference. They also wanted their new home to have an emerging startup scene, be an affordable place to live, be hospitable to runners like themselves, have a diverse population and have a college or university with a strong presence in the area.

"Our studio is set up to solve immediate real-world problems getting in the way of our own lives, with the assumption that other misfits like ourselves can benefit from our hacks," Bicket said. "As we move into 2014, we're looking forward to mashing up our expertise at the intersection of technology, business and community. In particular, we're introducing a workshop series positioned to help startups, small businesses and nonprofits move the dial in terms of user growth, community engagement, and launching global online services.
“Beyond building out our portfolio of consulting services, we’re busy scouting out retail locations, meeting local designers and other talented people interested in contributing to our vision, and figuring out how we can make a contribution to the arts and digital scene here in Memphis.” | 科技 |
Each year, North American monarch butterflies undergo a spectacular two-way long-distance migration from breeding locations in Canada and the United States to overwintering sites in Mexico. A substantial western population overwinters along the coast of California. Throughout their annual cycle of breeding, migrating, and overwintering, monarchs require different resources at different life stages. Their shifting spatial distribution poses challenges for identifying the key factors that affect monarch population dynamics and, thus, assessing their conservation status. Multiple long-term monarch monitoring programs span timescales of up to 30 years, and collect data on all stages of the monarch life cycle, including egg, larva, and adult density; migration patterns; and disease prevalence. This wealth of data provides an opportunity to understand how natural and anthropogenic factors affect the population dynamics and movement patterns of monarchs in particular, and migratory species in general. To maximize the value of existing data, it will be important to integrate data sets for analysis and interpretation of both within-season and longer-term population trends. This working group will:
1. explore data sets from throughout the monarchs’ annual life cycle to identify the major biological and environmental mechanisms that shape large-scale patterns of abundance and movement
2. predict the consequences of human activities, including shifting agricultural practices, deforestation and climate change, on long-term monarch population dynamics
3. create a web-based portal that allows public access to and use of monarch butterfly observational data, much of which has been collected by volunteer observers
Although our efforts focus primarily on a single species, our questions, approaches and findings will have great relevance in understanding the dynamics of other pollinator species and neotropical migrants across North America.
More information about this research project and participants.
This Week 13 June 2007 US set to track toxic algal blooms
By Aria Pearson in Santa Cruz, CaliforniaON A computer screen, the California coastline bleeds pools of vivid red. As I watch, some of the red swirls out into the open ocean and disappears, while some seeps ominously into bays and inlets along the shore. “You can see how the ocean is concentrating it and diffusing it,” says Raphael Kudela, as he clicks the replay button.
The red stain is in fact a simulation of the emergence and movement of massive amounts of chlorophyll in the ocean, a telltale sign of an algal bloom. Blooms of algae have long been a part of life in the waters off California, but in recent years they have become increasingly toxic, killing thousands of marine mammals and birds. Last month the region was hammered by a bloom twice as poisonous as anything it had seen before.
To understand why, Kudela, a phytoplankton ecologist at the University of California, Santa Cruz, has combined satellite images with information on ocean dynamics, such as winds and currents, to predict and track the toxic blooms. His models are now set to plug into a federally funded network currently in development that aims to provide a better understanding of what is happening in the waters off the west coast of the US. “The idea is to do for the ocean what the weather service has done for the land,” he says.
Algal blooms are made deadly when photosynthetic diatoms of the genus Pseudo-nitzschia produce a powerful neurotoxin called domoic acid. This accumulates in fish and shellfish without apparent harm, but can cause seizures and death in marine mammals, birds and humans that eat them.
Europlanet Media Centre
Scientists now have firm indications that the Martian satellite Phobos formed relatively near its current location via re-accretion of material blasted into Mars' orbit by some catastrophic event. Two independent approaches of compositional analyses of thermal infrared spectra, from ESA's Mars Express and NASA's Mars Global Surveyor missions, yield very similar conclusions. The re-accretion scenario is further strengthened by the measurements of Phobos' high porosity from the Mars Radio Science Experiment (MaRS) on board Mars Express.
Spatial locations of (a) PFS and (b) TES observations of Phobos used for the compositional analysis. Note that PFS orbit 0756 and TES yellow class observations correspond to almost the same area on the surface.
These results are being presented by Dr. Giuranna and Dr. Rosenblatt at the European Planetary Science Congress in Rome.
The origin of the Martian satellites Phobos and Deimos is a long standing puzzle. It has been proposed that both moons may be asteroids formed in the outer part of the main asteroid belt (between Mars and Jupiter) and were subsequently captured by Mars' gravity. Alternative scenarios suggested that both moons were formed in situ by the re-accretion of rocky-debris blasted into Mars's orbit after a large impact or by re-accretion of remnants of a former moon which was destroyed by Mars's tidal force. "Understanding the composition of the Martian moons is the key to constrain these formation theories," says Dr. Giuranna of the Istituto Nazionale di Astrofisica in Rome, Italy.
Previous observations of Phobos at visible and near-infrared wavelengths have been interpreted to suggest the possible presence of carbonaceous chondritic meteorites, carbon-rich "ultra primitive" materials, commonly associated with asteroids dominant in the middle part of the asteroid belt. This finding would support the early asteroid capture scenario. However recent thermal infrared observations from the Mars Express Planetary Fourier Spectrometer, show poor agreement with any class of chondritic meteorite. They instead argue in favor of the in-situ scenarios.
"We detected for the first time a type of mineral called phyllosilicates on the surface of Phobos, particularly in the areas northeast of Stickney, its largest impact crater," says Dr. Giuranna.
"This is very intriguing as it implies the interaction of silicate materials with liquid water on the parent body prior to incorporation into Phobos. Alternatively phyllosilicates may have formed in situ, but this would mean that Phobos required sufficient internal heating to enable liquid water to remain stable. More detailed mapping, in-situ measurements froma lander, or sample return would ideally help to settle this issue unambiguously," he added.
Other observations appear to match the types of minerals identified on the surface of Mars. Thus, the derived composition on Phobos appears more closely related to Mars than objects from other relatively locations in the solar system.
"The asteroid capture scenarios also have difficulties in explaining the current near-circular and near-equatorial orbit of both Martian moons," says Dr Rosenblatt of the Royal Observatory of Belgium.
The MaRS team, led by Dr. Martin Pätzold of the Rheinisches Institut für Umweltforschung an der Universität zu Köln, Germany, has used the frequency variations of the radio-link between the spacecraft and the Earth-based tracking stations in order to precisely reconstruct the motion of the spacecraft when it is perturbed by the gravitational attraction of Phobos. From this the team was able to deduce Phobos's mass. "We obtained the best measurement of its mass to date, with a precision of 0.3%," relates Dr. Rosenblatt. Past estimates of Phobos's volume were also improved thanks to the cameras onboard Mars Express. The MaRS team was thus able to derive the best-ever estimate of Phobos' density as 1.86±0.02 g/cm3. "This number is significantly lower than the density of meteoritic material associated with asteroids. It implies a sponge-like structure with voids making up 25-45% in Phobos' interior," says Dr. Rosenblatt. "High porosity is required in order to absorb the energy of the large impact that generated Stickney crater without destroying the body," confirms Dr. Giuranna. "In addition a highly porous interior of Phobos, as proposed by the MaRS team, supports the re-accretion formation scenarios."
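To see roughly where a void fraction of that size comes from, it helps to run the arithmetic: porosity is simply one minus the ratio of the measured bulk density to the grain density of the candidate rock. The short sketch below illustrates the calculation; the mass, volume and grain-density inputs are rounded, illustrative values rather than numbers taken from the MaRS analysis.

```typescript
// Sketch of the bulk-density and porosity arithmetic described above.
// All input values are illustrative assumptions, not figures from the study.
const massKg = 1.06e16;   // approximate mass of Phobos (assumed for the sketch)
const volumeKm3 = 5.74e3; // approximate volume of Phobos (assumed for the sketch)

// Bulk density in g/cm^3: convert kg -> g and km^3 -> cm^3.
const bulkDensity = (massKg * 1e3) / (volumeKm3 * 1e15);
console.log(`Bulk density ~ ${bulkDensity.toFixed(2)} g/cm^3`); // ~ 1.85

// Porosity = 1 - (bulk density / grain density).
// Grain densities of 2.5-3.3 g/cm^3 are typical illustrative values for chondritic rock.
for (const grainDensity of [2.5, 3.3]) {
  const porosity = 1 - bulkDensity / grainDensity;
  console.log(
    `Grain density ${grainDensity} g/cm^3 -> void fraction ~ ${(porosity * 100).toFixed(0)}%`
  );
}
// With these assumed grain densities the void fraction comes out around 26-44%,
// consistent with the 25-45% range quoted by the MaRS team.
```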
A highly porous asteroid would have probably not survived if captured by Mars. Alternatively, such a highly porous Phobos can result from the re-accretion of rocky-blocks in Mars' orbit. During re-accretion, the largest blocks re-accrete first because of their larger mass, forming a core with large boulders. Then, the smaller debris re-accrete but do not fill the gaps left between the large blocks because of the low self-gravity of the small body in formation. Finally, a relatively smooth surface masks the space of voids inside the body, which then can only be indirectly detected. Thus, a highly porous interior of Phobos, as proposed by the MaRS team, supports the re-accretion formation scenarios.
The origin of both Martian moons is not, however, definitively elucidated since the density alone cannot provide the true composition of their interior. The future Russian Phobos-Grunt mission (Phobos Sample Return), to be launched in 2011, will certainly contribute to our understanding regarding the origin of Phobos.
The full text has been submitted for publication to the Planetary and Space Science journal's Special Issue on Comparative Planetology: Venus-Earth-Mars.
Materials provided by Europlanet Media Centre. Note: Content may be edited for style and length.
Europlanet Media Centre. "Martian moon Phobos may have formed by catastrophic blast." ScienceDaily. ScienceDaily, 21 September 2010. <www.sciencedaily.com/releases/2010/09/100920094804.htm>.
Europlanet Media Centre. "Martian moon Phobos may have formed by catastrophic blast." ScienceDaily. www.sciencedaily.com/releases/2010/09/100920094804.htm (accessed October 1, 2016).
Japanese Collaboration to Launch Search Engine
A collaboration between the Japanese government, high-tech companies and universities have announced that it plans to develop and launch its own Internet search engine, having been inspired by the success of Google. A source close to the development said: "Information searches have become a source of wealth. We want to consider what Japanese companies can do in the current situation where overseas forces are dominant." Companies meeting this month with some universities include NEC, Fujitsue, Hitachi and Matsushita Electric Industrial, as well as telecommunication firms and one local, public broadcaster. Development will begin in April, 2007. Source | 科技 |
Researcher Jack Hopkins used barbed-wire snares to collect hair samples from bears in Yosemite. Analysis of isotope ratios in hair samples showed how much of the bears' diets came from human food.
(Courtesy of Jack Hopkins
One of the great joys of camping out in a national park is chowing down by the fire. But campers aren't the only ones drawn to burgers and s'mores roasting over an open flame, beneath a mass of twinkling stars.
Those rich aromas can also prove irresistible to the local critters. From bears to foxes to coyotes, biologists have documented wildlife getting irrevocably hooked on our food and food waste. And for good reason: Our food is way more calorie-rich — and thus, better for making babies — than the standard black bear fare of insects and leaves.
As the number of visitors to national parks has grown over time, the animal residents have gotten increasingly crafty about getting their paws and hooves on our food. And we're talking antics far more aggressive than Yogi Bear sneaking away with a pic-a-nic basket.
Wild ponies of Assateague Island National Seashore are known to use their teeth to unzip tents to pilfer for chips and hot dogs. The black bears of Yosemite have ripped the doors off of cars for a taste of hamburger meat within.
This kind of behavior is both a nuisance and a recipe for conflict. According to one 2013 study, black bears in Yosemite caused $3.7 million in property damage, injured 50 people and were involved with more than 12,000 reported food-related incidents in the last two decades. And biologists say that the greediest animals, who stop at nothing to get our food, are also the most likely to be killed by us.
So how can we keep animals out of our picnic baskets? The National Park Service has come up with all kinds of strategies to try – from installing animal-proof trash bins to educating visitors to lock up their food in special canisters. Visit Yosemite and you'll get a personal lecture from a ranger on how to store your food to keep from tempting the black bears who lurk near campsites.
According to a study in the March issue of the journal Frontiers in Ecology and the Environment, these kinds of measures seem to be paying off (in Yosemite, at least). Researchers at the University of California, Santa Cruz wanted to find out whether the proportion of human food in the diet of the black bears there had declined since the park received a big grant to manage the food issue in 1999.
To find out, they gathered samples of bear hair and bone dating back to 1915 and measured the chemical isotopes in them. If you've read The Omnivore's Dilemma or seen the documentary King Corn, you might remember that human hair samples reveal a lot about our diet today – namely, that we eat a lot of corn. Corn has a heavy carbon and nitrogen isotopic signature, and it goes into sodas, pastries, and even our meat, since it's a big part of what livestock are fed.
Here's the really fascinating thing we learned from Jack Hopkins, lead author of the paper. (He did the Yosemite bear research while he was an ecology Ph.D. student at Montana State and a research fellow at UC-Santa Cruz.)
"A bear that eats like us is going to look like us" in an isotopic analysis, he says. In other words, the corn shows up in the hair of bears who've learned to love and steal human food.
The good news, though, is that Hopkins and his co-author, Paul Koch, a professor of earth sciences at UC Santa Cruz, found that the human foods in Yosemite bears' diets had decreased by about 63 percent between 1999 and 2007. In other words, teaching park visitors to lock down their food has succeeded at keeping our snacks out of bears' bellies, they conclude.
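The logic behind such estimates can be illustrated with a toy two-endmember mixing model: if corn-heavy human food and wild forage carry distinct carbon isotope signatures, the share of human food in a bear's diet falls out of a simple linear interpolation between the two. The sketch below uses made-up δ13C values purely for illustration; the actual study relied on more careful isotopic modeling of both carbon and nitrogen.

```typescript
// Toy two-endmember mixing model for estimating the share of human (corn-heavy)
// food in a bear's diet from the carbon isotope ratio of its hair.
// The delta-13C endmember values below are illustrative, not from the study.
const d13cWildDiet = -22.0;  // assumed signature of a wild plant/insect diet
const d13cHumanFood = -14.0; // assumed signature of corn-based human food

function humanFoodFraction(d13cHair: number): number {
  const fraction = (d13cHair - d13cWildDiet) / (d13cHumanFood - d13cWildDiet);
  return Math.min(1, Math.max(0, fraction)); // clamp to [0, 1]
}

// Hypothetical hair samples from different eras:
const samples: Array<[string, number]> = [
  ["1915 museum specimen", -21.5],
  ["late-1990s problem bear", -17.5],
  ["2007 bear", -19.8],
];

for (const [era, d13c] of samples) {
  console.log(`${era}: ~${(humanFoodFraction(d13c) * 100).toFixed(0)}% human food`);
}
```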
Still, Hopkins says, even if Yosemite can get 99 percent of its visitors to comply with the policy, with 4 million visitors a year, there will always be several who forget or don't pay heed.
"The bears are still there, just waiting for someone to be careless," he says. "There are still a lot of problems." Copyright 2014 NPR. To see more, visit http://www.npr.org/. | 科技 |
The windowless building in Lower Manhattan may not indicate it, but AT&T Labs is trying to be more open.
Using an area normally home to its network security team, Ma Bell had a science fair of sorts on Thursday, showing off a number of the technologies that it has been cooking up in its labs. Many of the projects on display take advantage of different pieces of network data that AT&T now makes available to developers.
The various projects and booths paint an interesting future where doors can be opened by voice, a chip in the phone or even the electrical signals that travel through our hands, to name just a few of the gee-whiz technologies on display. But whether this future is bright or grim depends a bit on how one feels about being tracked.
Cellphones are indeed powerful devices these days — portable computers that know who we are, where we are and how we pay for things. Many of the projects on display Thursday aim to combine that knowledge in useful ways.
One application, for example would allow parents to keep tabs on their kids while they are driving — getting alerts if they text and drive or neglect to wear their seatbelts.
Another project nearby shows something akin to Caller ID on steroids. Today’s Caller ID shares only one’s phone number, but AT&T has the potential to share a lot more. One demo imagined what it would be like to share location and all manner of other information with a person you are dialing. Such uses could make it easy when, for instance, one is ordering a pizza.
Data combinations clearly have downsides, though. Imagine how hard it would be to cancel an outing with friends if they knew one was in Atlantic City, rather than sick in bed.
There are two questions that companies need to ask when releasing new services, says Edward Amoroso, senior VP and chief security officer for AT&T.
The first, Amoroso says, is about the art of the possible. “What sort of technology could you actually do?”
Then, he said, it is important to ask a second question. “What technology are people going to be comfortable with?”
Thursday’s science fair was more about the first question than the second.
Not all of the projects were as fraught with controversy. One of the more popular demos was one AT&T has been showing for a couple of years now called “Air Graffiti,” which allows users to tag physical locations with art, photos, sounds or other information — all without the risk of irking the property owner. AT&T has been working on the idea for a decade, but the technology needed to make it a reality has only recently become widely available.
Locations can be as specific as a single spot or as big as the earth and users can choose to share their graffiti publicly or with only a small collection of friends or family. Graffiti can also be timed to last for a short duration or set to live forever.
AT&T also used Thursday’s event to launch Watson, a new speech-recognition technology that it says is the result of a million hours of research and development and is the subject of 600 patents. The platform can recognize natural speech patterns and translate among six different languages.
Several of the technologies on display are also making their way into AT&T’s latest “Rethink Possible” campaign spots.
AT&T, like other carriers, have been increasingly opening up various features of their network — even core things like location and messaging and payment — so that developers can create more sophisticated programs.
Opening up their most valuable assets — the networks — is a clear risk for the carriers. At the same time, each is looking to avoid becoming just a “dumb pipe” for which they are paid a toll that barely covers the cost of each generation of network upgrades.
Things are indeed at a critical juncture, says Chief Technology Officer Krish Prabhu.
“There’s a cultural transformation and we are right in the middle of it,” Prabhu told AllThingsD. In a couple of years, the result will be clear, he said. “Either we changed the company for the good or we missed the boat.”
One of the capabilities that AT&T is studying is whether to allow, for example, the ability for applications to send text messages on behalf of users, much the way that the iPhone or Android sends notifications. Striking the right balance between usefulness and spam will be key.
Also front of mind for AT&T is making sure that nothing it does compromises the overall security of its network, something Amoroso said remains his top priority.
Figuring out how to make money will be another key. Prabhu said that AT&T has some goals in terms of getting a certain percentage of new revenue by opening up its network. However, he declined to reveal any of the specific numbers.
“I think the network has a lot of capability other than just connectivity,” he said. “It is a business objective and there is clearly an understanding that at some level a certain percentage of our revenue will come from this.”
Tagged with: AT&T, AT&T Labs, Edward Amoroso, location, speech recognition, tracking, Watson
Another gadget you don’t really need. Will not work once you get it home. New model out in 4 weeks. Battery life is too short to be of any use.— From the fact sheet for a fake product entitled Useless Plasticbox 1.2 (an actual empty plastic box) placed in L.A.-area Best Buy stores by an artist called Plastic Jesus AllThingsD by Writer | 科技 |
Although 2012 wasn't quite Apple's annus horribilis, it wasn't the best year for the tech giant. Here are the top 10 stories that made news during the year that was.
The story of the year
The iOS 6 Maps debacle
For Apple, 2012 won’t be remembered for the iPad mini, or the iPhone 5. Instead, Apple’s yearbook begins and ends with the iOS 6 Maps fiasco.
First demoed at WWDC in June, Maps represented a radical departure for Apple. It was the first version of Apple’s native mapping software that didn’t include Google Maps, and this didn’t sit well with consumers.
Soon after Maps debuted with the launch of iOS 6, Apple CEO Tim Cook apologized for it. Soon after that the executive behind it, Scott Forstall, was fired.
It wasn’t that iOS 6 Maps was bad; rather, it wasn’t nearly as good as the product it replaced. Meanwhile, Apple fans waited to see what iOS 7 Maps would look like.
Other stories that made news
Refresh, refresh, refresh
No one can accuse Cook of sitting around and doing nothing.
During September and October, Apple refreshed, or added to, the company's entire line of mobile devices. They also launched new versions of the iMac and MacBook Pro.
While users were thrilled to see Apple’s new lineup of products, some were initially unhappy with the early demise of the so-called new iPad.
Launched in March, Apple’s third generation tablet was officially retired just seven months later when the iPad with Retina display debuted.
This quick turnaround between iOS devices wasn't an anomaly. Rather, it represented Cook's realization that the days of the one-year product launch cycle were over.
Hey Apple, people do like Google products too
Apple users may continue to discount Google’s Android platform. And yet, they still love the company’s iOS apps, as Apple found out early and often in 2012.
Cupertino killed off native versions of Google Maps and YouTube in iOS 6. However, public outcry convinced Apple to let each of them return as third-party apps.
The result was a reenergized Google, which emerged as the top technology company of the year.
The world of Tim Cook
For better or worse, 2012 turned out to be the year Tim Cook firmly took the reins at Apple. Appointed CEO in August 2011, Cook largely followed Steve Jobs' script for his first year at the helm. However, as the leaves began to fall, it became clear that Cook was his own man.
Whether it was his quick apology for iOS 6 Maps, the launch of the iPad mini, or his executive shakeup that left many initially scratching their heads, Cook proved to be a much different leader than his predecessor.
Facebook buys Instagram, while everyone else joins in
The social network bought the company behind the camera app sensation for $1 billion. By the end of the year, a mini fury unfolded over something Facebook knows far too much about: privacy issues.
Meanwhile, as the likely result of the Instagram purchase, Google bought Snapseed, while Twitter also got into the photo-taking game.
While Facebook's long-term plans for Instagram remain largely unclear, one thing is certain: 2013 will see the arrival of Instagram ads.
The iPad mini was real, and popular
Rumored for what seemed like entire decades, the iPad mini finally arrived in October. Apple’s first compact tablet came without a Retina display and with a price higher than what competitors were charging for their own products.
In the end, however, it didn’t matter. The iPad mini flew off the shelves and became one of the most popular gifts of the holiday season.
The iPhone 5
The phone Apple was supposed to deliver in 2011 finally arrived soon after Labor Day.
The iPhone 5 was Apple’s first handset to include a 4-inch screen. It was also the company’s first device to use the new Lightning connector, although others would soon follow.
Production issues may have kept the iPhone 5 out of the hands of many, at least initially. However, it soon became the top-selling smartphone in the U.S.
Jony Ive's time to shine
Apple’s hardware guru received a promotion when Forstall was shown the door. Now, it will be up to Ive and Craig Federighi to reshape iOS in 2013 and beyond.
I want my iTV
The year was supposed to end with the arrival of an actual Apple television. However, despite the rumors, Cupertino held off releasing the so-called iTV. Alas, a new year is about to begin.
The new iTunes 11
Delayed by nearly two months, iTunes 11 finally arrived in November. Less cumbersome and quicker than previous versions, Apple’s digital hub now looks a lot like its iOS counterpart.
There were certainly other Apple stories that made news this year, including the company's stock surge and swoon, the arrival of Passbook, and more. However, we hope that we covered the most important ones on our list.
Are there stories you would have added here?
By Sam Oliver Friday, December 17, 2010, 05:00 am PT (08:00 am ET)
Casting even more doubt on the prospect of an active-matrix organic LED display in Apple's anticipated second-generation iPad, a new report claims that limited component supplies have been a "major reason" for Apple to overlook the technology.
In an editorial posted Friday by Taiwanese industry publication DigiTimes, Rebecca Kuo said that component makers in Taiwan and China are attempting to catch up with Korea, which is the leader in AMOLED displays. Though suppliers are increasing their production, she said it still isn't enough for Apple.
"With backlight unit (BLU) makers set to be suppliers for the second generation of iPad, AMOLED will still be unable to enter Apple's supply chain," she said. "Panel makers have noted that a major reason for Apple to overlook AMOLED for iPads is insufficient supplies."
Korea-based Samsung Mobile Display currently creates AMOLED panels between 2 inches and 4.2 inches, but its capacity is allegedly not enough to meet demand for the Samsung Galaxy S, Google Nexus S, or other phones from Nokia and HTC.
"Moreover, the current AMOLED technology is not suitable for volume production of 7- to 11-inch tablet PC panels, and the mass production cost will not be able to compete with TFT-LCD panels," Kuo wrote.
The details come just after the same publication claimed that an LCD backlight supplier has been selected for Apple's second-generation iPad expected to debut in the first quarter of 2011. DigiTimes has claimed that Apple experimented with an AMOLED display for the iPhone, but ultimately rejected the hardware because it was less suitable for displaying text, and because of display issues.
Rumors of an iPad with an AMOLED display, mostly perpetuated by DigiTimes itself, were repeated for months. The site also incorrectly reported in Nov. 2009 that Apple's not-yet-announced iPad would have an OLED display that would cost about $2,000 at retail.
iPad,iPad 2 (29) Comments
NPD: Apple's iTunes Music Store climbs to 66% of digital music market
Apple bulking up iOS development team with navigation software experts | 科技 |
From the department of "who didn't see this coming?", AMD kept the red ink …
Jon Stokes
- Jan 23, 2009 2:20 am UTC
Over the past nine quarters, there has been a lot of talk from AMD's Hector Ruiz about a "return to profitability" that's always just around the bend, but talk is all it is. In spite of AMD's efforts, the profitability never materializes, and this past quarter was no different. In an earnings call Thursday, AMD announced its ninth consecutive quarterly loss, this time to the tune of $1.424 billion. This is actually a smaller net loss than the $1.772 billion loss that the company posted in the fourth quarter of 2007, and that's an achievement, given the deteriorating conditions in its core PC market. Like so many others in the PC or PC components business, AMD saw its net revenues plunge by 35 percent from the third quarter to the fourth quarter. The declines hit all of the company's segments, including its core processors and graphics businesses. Apart from any revenue plunge, though, what was most striking was the drop in gross margins: from 57 percent in the third quarter to 23 percent in the fourth. The drop in demand is putting pressure on everyone's ASPs, but a 23 percent gross margin is quite low even for AMD. The ATI acquisition also continues to blow holes in AMD's balance sheet, and the company has written off a majority of the cost at this point as a loss. This quarter saw another goodwill writeoff of $684 million for the graphics business, bringing the total writeoffs to $3.17 billion (on a $5.4 billion purchase). As we reported previously, AMD has decided to get rid of its handheld unit as part of its plans to pare down and focus entirely on its core PC business. It's too bad, though, that the PC market is the one taking the worst beating right now, with netbooks being the only bright spot. In other words, AMD is consolidating all of its bets onto the areas of the market that are tanking the hardest, and at a time when netbooks are not only red-hot, but the buzz around NVIDIA's Ion clearly demonstrates that there's a ton of interest out there in precisely the kind of ultramobile-x86-plus-mobile-GPU combination that AMD is particularly well-suited to provide. Think about it: the integrated graphics processors that Intel bundles with its Atom parts stink, which is why we all cut backflips on finding out about NVIDIA's plans to pair Atom with its own 9400M GPU. It's beyond me why AMD is doubling down on desktop and server CPUs when the market is crying out for a netbook platform that combines an Atom-style processor with a mobile GPU. Listing image by Flickr user Joriel | 科技 |
philosophical notes on climate change
green vs. ecological economics
So the current economic model is growth-based, hence unsustainable, and consequently in need of revision. But why push the revisions to the opposite extreme? Why not stop at a halfway point such as the green economy? Why go to all the way to an ecological economics model? Wouldn't that throw out the baby with the bathwater? Can we not compromise? R. Costanza, in the 2013 State of the World Report, p. 127, explains the different roles of government: in the current economic system,
government intervention [is] to be minimized and replaced with private and market institutions; in a green economy, there is
recognition of the need for government intervention to internalize natural capital; and in ecological economics,
government plays a central role, including new functions as referee, facilitator, and broker in a new suite of common-asset institutions.
So a green economy is still capitalism, but with less freedom and more regulation than before, whereas an ecological economy, by the Costanza et al. definition, is a post-capitalistic model. It's a type of planned economy, a centralized system with a tightly regulated market and a state whose priorities are those of sustainable democratic socialism. In Naomi Klein's words (the title of chapter 4 of her 2014 This Changes Everything: Capitalism vs. the Climate): "planning and banning--slapping the invisible hand." That sounds like old-style communism, but there's a simple difference. The planned economies of real existing socialism in the past century were growth-based economies, just like their capitalistic counterparts. In both capitalism and communism, growth was a structural feature; in capitalism, it's a byproduct of market dynamics, and in communism, it was built into the five-year plans geared towards productivity enhancements. By contrast, the ecological economics model is something new: it's a steady-state economy, geared to maintain stability without the need for growth.
By the same token, this distinguishes ecological economics from the 21st century version of communist economics, China's current economic system, which Americans call 'state capitalism,' but which is technically speaking Leninist economics. Deng Xiao-Ping, China's leader and the creator of this economic system after Mao's death, was schooled in Moscow during Lenin's New Economic Policy era. The New Economic Policy (NEP) from 1918 to 1924 was a blend of government dictates and market opportunities. This was quite unlike the later Stalinist collectivization that became the standard of a planned economy in the USSR and in the PRChina under Mao. However, the Leninist design of China's present-day state capitalism remains a growth-based economy. Ecological economics is in a different category.
Here's a visual primer on ecological economics vis-a-vis the green economy and the current capitalistic economy, from Constanza's Report to the United Nations for the 2012 Rio+20 Conference on Building a Sustainable and Desirable Economy-in-Society-in-Nature.
This is the theoretical background for Naomi Klein's work, to which I'll turn in the next post.
Mad Hun
Here are philosophical notes on climate change. The bottom line is existence. At issue is the unsustainability of civilization at present. The methods for inquiry are critique and synthesis. The goal is to help mapping out ways for readapting to planetary boundaries--to clear a path for civil evolution. Notes on the cultural and political context of climate change are at blistercomment. An archive of facts, findings, and events is at blisterdata. | 科技 |
Nintendo’s Wii U™ console – the first new home video game system in six years – arrives on Nov. 18, aiming to change the landscape of games and entertainment with its new Wii U GamePad controller. With more than 30 launch-day games for all types of players, Wii U arrives just in time for the holidays and is poised to be the must-have gift of the season.
The Wii U Basic Set will be available at a suggested retail price of $299.99 and includes a white Wii U console with 8 GB* of internal storage, one white touch-screen GamePad controller, AC adapters for both the console and controller, a sensor bar and an HDMI(TM) cable. (Photo: Business Wire)
“Wii U is an ‘everyday’ connected device – offering a combination of games, entertainment, online connectivity and social activity that will make people want to interact with it daily,” said Nintendo of America President and COO Reggie Fils-Aime. “Never before have so many features been packed into one game console, at any price. Our substantial lineup of games offered at launch has something for everyone on your shopping list.”
Wii U offers completely new and unexpected game play, entertainment experiences and unique online interactions:
Wii U GamePad: The controller features a 6.2-inch touch screen that redefines how people interact with their games, their entertainment and one another. It comes with dual analog sticks and traditional buttons for gaming. The GamePad is wirelessly connected to the console, providing a perfectly integrated second-screen experience with the TV, no matter how it’s being used. Different players can enjoy different experiences in the same game, depending on which controller they opt to use. Players can even move select games from the TV to be played on the GamePad or use the GamePad as a TV remote control.
Games: Wii U is launching with 29 packaged games, plus a handful of digital games, marking the largest launch lineup in Nintendo history. Nintendo-published games available on launch day include Nintendo Land™, New Super Mario Bros. ™ U, SiNG PARTY™ and NINJA GAIDEN™ 3: Razor’s Edge. Third-party publishers also will have an amazing array of games available on launch day, such as Just Dance® 4, Assassin’s Creed® III and ZombiU™ from Ubisoft, Skylanders Giants™ and Call of Duty®: Black Ops II from Activision Publishing, Inc., Disney Epic Mickey 2: The Power of Two from Disney Interactive, EA SPORTS™ FIFA Soccer 13 and Madden NFL 13 from Electronic Arts, NBA 2K13 from 2K Sports and Batman: Arkham City™ Armored Edition and Scribblenauts™ Unlimited from Warner Bros. Interactive Entertainment. A number of these titles will also be available for purchase digitally in the Nintendo eShop alongside a strong launch-day lineup of downloadable digital-only games, including Little Inferno from Tomorrow Corporation, Trine 2™: Director’s Cut from Frozenbyte, Chasing Aurora from Broken Rules, Mighty Switch Force: Hyper Drive Edition from WayForward and Nano Assault NEO from Shin’en. Wii U owners can expand the storage capacity of their systems by adding their own USB external hard disc drives. Downloadable content for select games, such as additional levels or maps, will roll out in December following a system update. Wii U is backward compatible with almost all Wii™ games and Wii accessories.
Miiverse: Miiverse™ is a global community in which gamers from all over can share experiences, discuss games and learn more about the video games they love. Using their personalized Mii™ characters, players with broadband Internet access can enter Miiverse and see games and entertainment content they have interacted with recently, expressed interest in learning more about or that their friends are playing or discussing. They can also challenge their friends to play together, ask for help from the Miiverse community about a difficult level or discover elements of their favorite games they never knew existed. Miiverse is integrated seamlessly into the Wii U experience.
Video on Demand: In the coming weeks, an array of favorite movies and TV shows will become available to Wii U owners on Amazon Instant Video and to Hulu Plus and Netflix subscribers. These services will also be accessible from Nintendo TVii. Wii U owners will also be able to watch YouTube videos and channels through the YouTube application, as well as sign in to their account and control the application with the GamePad. Wii U owners who have connected their systems to the Internet will receive a notification as each video application becomes active.
Nintendo TVii: In December, Nintendo will activate this unique application that will transform how people find, watch and engage with TV shows, movies and sports. Nintendo TVii makes watching TV simple and fun by bringing together a program guide, remote control and social interaction into one, seamless second-screen experience on the GamePad. Nintendo TVii comes with Wii U at no additional charge and requires no additional equipment. It works with existing cable and satellite channels. Viewers can engage with others in a variety of ways, such as commenting on moments as they happen on live TV, and then sharing those thoughts via Miiverse, Facebook and Twitter. Users can also discover more about what they’re watching, as information from a variety of sources is automatically linked to the program they are watching, including sports data. Nintendo TVii is customizable for every member of the family.
Wii U Chat: Wii U owners can video chat with one another on a broadband Internet connection using the built-in camera and microphone of the Wii U GamePad. Friends and family members far away can see and talk with one another in real time using this service at no additional charge.
Internet Browser: Wii U comes with a browser that lets users surf the Web from the comfort of their couches. Users can browse purely on the GamePad while watching a program on TV. Additionally, they can pause a game, launch the browser and search for something, and then re-enter the game right where they left off.
“The value of Wii U goes well beyond day one,” Fils-Aime said. “Nintendo will be enhancing the Wii U experience with continuous updates and new services for Wii U owners.”
Remember that Wii U features parental controls that let adults manage the content their children can access. For more information about the many Wii U features and details about all the current and upcoming games, visit http://www.nintendo.com/wiiu. A network update is required for some features.
* 1 GB = 1 billion bytes. Usable internal memory limited due to system software.
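The footnote above reflects the usual gap between advertised and reported storage: drive makers count a gigabyte as 10^9 bytes, while most operating systems report capacity in binary units of 2^30 bytes, so the same drive appears smaller on screen even before system software reserves its share. Below is a minimal sketch of that conversion; the capacities and reserved amounts in it are hypothetical examples, not official Wii U specifications.

```python
# Hypothetical illustration of decimal (advertised) vs. binary (OS-reported) capacity.
# The sizes used here are examples only, not official Wii U figures.

DECIMAL_GB = 10**9   # 1 GB as marketed: one billion bytes
BINARY_GIB = 2**30   # 1 GiB as most operating systems report it

def usable_gib(advertised_gb: float, system_reserved_gib: float = 0.0) -> float:
    """Convert an advertised decimal-GB capacity to binary GiB, minus an assumed
    system-software reservation (both inputs are illustrative)."""
    return advertised_gb * DECIMAL_GB / BINARY_GIB - system_reserved_gib

if __name__ == "__main__":
    for advertised in (8, 32):  # example capacities
        print(f"{advertised} GB advertised ≈ {usable_gib(advertised):.2f} GiB before system software")
```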
About Nintendo: The worldwide pioneer in the creation of interactive entertainment, Nintendo Co., Ltd., of Kyoto, Japan, manufactures and markets hardware and software for its Wii U™ and Wii™ home consoles, and Nintendo 3DS™ and Nintendo DS™ families of portable systems. Since 1983, when it launched the Nintendo Entertainment System™, Nintendo has sold more than 4 billion video games and more than 637 million hardware units globally, including the current-generation Wii U, Nintendo 3DS and Nintendo 3DS XL, as well as the Game Boy™, Game Boy Advance, Nintendo DS, Nintendo DSi™ and Nintendo DSi XL™, Super NES™, Nintendo 64™, Nintendo GameCube™ and Wii systems. It has also created industry icons that have become well-known, household names such as Mario™, Donkey Kong™, Metroid™, Zelda™ and Pokémon™. A wholly owned subsidiary, Nintendo of America Inc., based in Redmond, Wash., serves as headquarters for Nintendo’s operations in the Western Hemisphere. For more information about Nintendo, please visit the company’s website at http://www.nintendo.com.
Expect more impacts as drought rolls into second year
Written by Gothenburg Times, Wednesday, 26 December 2012 19:52
Nebraska has been at the epicenter of the drought of 2012, and its impacts will intensify if it lasts through the winter, as is forecast, say climatologists at the University of Nebraska-Lincoln’s National Drought Mitigation Center.
“The previous five years all had above-normal precipitation, the wettest period in recorded history,” said Michael J. Hayes, the center’s director. “For Nebraska, it was unprecedented. We came into 2012 with a full hydrological system—rivers, streams, reservoirs and groundwater.
“When you’re talking major droughts, this is not a multi-year drought. As we look ahead to 2013, we don’t have that margin built into our hydrological system, so we’re in pretty dire straits.”
Nebraska, Colorado and Wyoming are all on track to record their driest year on record in 2012, Hayes said, and the country as a whole is having its hottest year on record.
The Climate Prediction Center says the drought in the Plains is likely to continue at least through February, and recovery will take time.
“In Nebraska and the central Plains, we’ve started seeing the drought feeding off itself, with the dry soils and dry air not allowing precipitation events to develop as usual,” said Brian Fuchs, drought center climatologist. “With the lack of moisture, we’re more like a desert environment. It warms up fairly quickly during the day, but drops quickly at night.”
State climatologist Al Dutcher recently said that the chances of getting a wet enough winter to bring moisture levels back to normal are only 10 to 20 percent.
“When we do have precipitation, very little will go to runoff,” Fuchs said. “Those soils are going to act as a big sponge. They’re just going to take in a lot of the moisture. We’ll continue to see problems of stock ponds, smaller lakes and streams dropping. The hydrologic drought hasn’t reared its head, but it’s there, as we are seeing more water systems under stress.”
“Typically when farmers are done irrigating, you will see the water in the Platte percolate back through the basin,” Fuchs said. “We did see that response but it was very minimal and that was even with the irrigation season ending sooner than usual. The channels are tiny, with these very small threads of water in eastern Nebraska.”
Anecdotal evidence suggests that in some areas, groundwater levels are declining, which could affect well owners. “I would see that exponentially increasing if we stay dry in 2013,” Hayes said. “There’s a public health issue when homes don’t have water.”
Although rural residents may be accustomed to hauling water occasionally, Hayes noted that it could be a real hardship for some, such as older people living alone.
Organizations that work with well owners recommend having wells checked now, especially if they were constructed before 1993, to ensure reliability of water supply.
Agricultural producers have been hit hard by the drought. The U.S. Department of Agriculture’s Risk Management Agency said that as of Dec. 10, indemnity payments nationwide had reached $8 billion for 2012.
In 2012, 80 percent of the eligible acres nationwide and 90 to 95 percent in Nebraska enrolled in crop insurance, said Rebecca Davis, regional director for the RMA in Topeka, speaking at Nebraska’s Climate Assessment and Response Committee meeting Nov. 29.
She said that Nebraska is currently the fourth largest consumer of crop insurance, and the fifth largest recipient of indemnity payments, with nearly $483 million paid out as of Nov. 19, and corn alone accounting for $363.2 million of the covered loss.
By Nov. 26, total Nebraska indemnities were at $544 million, with $502 million due to drought, heat and dry wind that affected more than 2 million acres of cropland.
This year’s drought is forcing producers to make hard choices.
“The problem with drought and lack of forage is that many producers are using corn stalks as forage, actually baling them and selling them like hay, which is a double-edged sword,” Fuchs said. “While they are using the stalks for one purpose, it could be hurting them as far as tillage conditions.”
Leaving stalks on the field provides a cover that can prevent erosion and help hold moisture in the soil.
Nebraska had its worst fire season since 1919, with central and western Nebraska hardest-hit. Don Westover, fire program leader of the Nebraska Forest Service, reported that as of Dec. 14, the state had recorded 1,426 wildfires, which burned more than 400,752 acres, destroyed 65 structures and cost $12 million so far. He added that a few large fires still missing from the official tally would add another 94,000 acres.
For up-to-date information from Nebraska Extension about how to prepare for another drought year, see droughtresources.unl.edu.
About RBC (Radiation Biology Center)
The Radiation Biology Center was founded in 1976 in response to a recommendation to the government by the Science Council of Japan, which noted that research on the effects of radiation should be promoted and urged the establishment of an institute for basic research in radiation biology. In 1970, Kyoto University was asked to be the host institution for the new research institute, and the University Council voted in favor of the plan.
The initial plan for the institute included 13 laboratories with a staff of over 60 scientists for basic research. Although the original plan had to be reduced to 3 permanent and 2 visiting laboratories, the proposal was finally approved by the government and the Radiation Biology Center was established in May 1976. Dr. Tsutomu Sugahara, Professor of the Department of Experimental Radiology, Faculty of Medicine, and Dean of the Faculty of Medicine, was elected as the first Director of the Center. Dr. Sugahara had been chairman of the organizing committee for the proposal and was one of the members of the committee that had first urged the establishment of the Center. The successive directors thereafter have been: Dr. Kanji Torizuka from April 1980 to March 1986, Dr. Hiraku Takebe from April 1986 to March 1988, Dr. Shigefumi Okada from April 1988 to March 1989, Dr. Hiraku Takebe from April 1989 to March 1993, Dr. Masao S. Sasaki from April 1993 to March 1997, Dr. Mituo Ikenaga from April 1997 to March 1999, Dr. Ohtsura Niwa from April 1999 to March 2003, and Dr. Kenshi Komatsu from April 2003 to March 2009. Dr. Tomohiro Matsumoto has been the Director since April 2009.
The first laboratory to be established was the Department of Radiation System Biology. The second laboratory, the Department of Nucleic Acid Repair Studies, was established with visiting positions in 1977. Two additional permanent laboratories, the Department of Mutagenesis and the Department of Late Effect Studies, were opened in 1978 and 1983, respectively. The second visiting laboratory, the Department of Radiomimetic Chemical Studies, was added in 1987. The Department of Genomic Dynamics Studies was established in 2001. Radiation biology is a multi-disciplinary science that covers physics, chemistry, medicine and many branches of biology.
The Center should continue to function as a nucleus for this multi-disciplinary field of radiation research by facilitating scientific exchange among radiation biologists in different research areas. In 1984, a laboratory building with a floor space of approximately 1,500 m² was built on the medical campus, and it was further extended to 2,400 m² in 1993.
Research at RBC
Department of Radiation System Biology
Department of Mutagenesis
Department of Late Effects Studies
Department of Genomic Dynamics
Copyright © 2009 Radiation Biology Center, Kyoto University. All rights reserved.
Frank Feschino and the Flatwoods Monster
Let’s talk about the Flatwoods Monster one more time. This is a case, because of its high strangeness, that is easy to dismiss. A flying saucer landing on a hill with some kind of floating creature coming from it. One that might have had glowing eyes or one that might have "shot" rays from its eyes. Something that might have left landing traces that no one bothered to document at the time. And one in which the craft and creature disappeared before the corroborating witnesses could get there to take a look around.

And one that I found difficult to believe because one of the researchers, Frank Feschino, had posted on a web site a quote attributed to me about his book claiming that the Air Force had engaged in combat with the flying saucers after an order had been given to "Shoot Them Down."

What have we learned in the last week?

The webmaster, Alfred Lehmberg, came forward and said that Feschino had nothing to do with that mistake. Lehmberg said that he had taken the quote from a closed discussion group and attributed it to me because I had started the specific thread. Someone else, posting a comment to that thread, had made the comment and Lehmberg didn’t pay close attention to the attribution. An honest mistake, he said.

I can live with that. People make mistakes in attribution.

(Richard Dolan, in his massive UFOs and the National Security State, makes something of an error in attribution. In writing about the testimony of Barbara Dugger, he credits, in a footnote, Stan Friedman and his book. If you follow that to Friedman’s book, you find the interview attributed to me. Don Schmitt and I had conducted it and provided a copy of the video tape to the Fund for UFO Research. Friedman and Don Berliner used the tape as a source in their book. So, the attribution was essentially correct, but the original source of the interview had not been made clear. But I digress...)

What this also means is that Feschino had nothing to do with the misidentified quote.

What it also means is that one of the reasons to reject the information in his books has been eliminated. You might find his work sloppy or that he leapt to conclusions, but you can’t say he was responsible for the misidentified quote. That belongs to Lehmberg and he has taken complete ownership of it.

Where does that leave me on the question of the Flatwoods Monster? Well, in this world, in which it would be nice to be invited to speak at all these conventions, where it would be nice to have others buying the books that I write, I find it difficult to say something or embrace something in which I have doubts. If I embraced everything... abductions, cattle mutilations, crop circles, every strange and bizarre UFO story... then I would find myself on lots of convention programs. But I just can’t do that.

I have to believe what I say about the topic and with the Flatwoods Monster, I have difficulty embracing the tale as told. It seems that the bolide explanation for the sighting is reasonable. It seems that the hysteria of the witnesses contributed to their sighting. It seems that, by September 1952, with the newspapers filled with stories of flying saucers, it is reasonable to believe that the idea of alien visitation wasn’t all that far from the minds of the witnesses. It all seems a reasonable explanation to me.

But Frank Feschino has found some interesting information. I do not know why an officer in the National Guard would talk about deploying soldiers into the area.
Deployment of National Guard forces is strictly controlled, if for no other reason than purposes of pay. An officer can’t order his soldiers into the field unless there is imminent loss of life, and then he better have some very persuasive evidence.

So, I suppose, as some say, this would be in my gray basket (though I dislike that term) but it would be a very light gray. I don’t believe there is much of substance here, but it never hurts to take a closer look. Sometimes I find that I have been mistaken... but only sometimes.
Alfred Lehmberg,
Flatwoods Monster,
Frank Feschino,
Since I was asked about this case, I thought that I would review it and see where we are on it today. I will tell you that the first thing I found was a web site promoting a book about the case (or the follow on book that is an outgrowth of this case as seen here) that I hadn’t read, though I knew about it. I wouldn’t have mentioned this, but the first thing I saw was a quote, from me, endorsing the book, using language that I wouldn’t use and failing to identify me as a retired Army officer.Given this, I’m more than a little concerned about the validity of anything that appears in the book, or the research that produced it. Using me to endorse the book when I did no such thing suggests someone who is less than candid in other areas of research. The only question is who is responsible for the quote and why was it even put up. I’ll have more on this in a later post.That said, I did make a new survey of various sources about the case. The Air Force file on the Flatwoods, West Virginia sighting of September 12, 1952, contains a project card, that form created at ATIC that holds a brief summary of the sighting, what the solution is if one has been offered, and other such easily condensed data. According to the project card for the Flatwoods sighting there is the notation that the case was solved by the meteor that had been reported over the east coast of the United States on September 12. In fact, the only reference to anything suggesting a creature was on the ATIC Project Card where there is the note about the "West Virginia monster, so called."All this presents a curious problem. Clearly the Air Force had heard of the case, and just as clearly they had written it off as a very bright meteor that had been reported over the eastern sea board on September 12. There is also a note that the meteor (or meteoroid for those of a precise and technical nature) landed somewhere in West Virginia (becoming a meteorite). Apparently the Air Force believed that the "landing" of the meteorite was enough to inspire local residents to imagine a creature on the ground. And, apparently, they believed that the meteorite would account for the reports of physical evidence.Ufologist and biologist Ivan T. Sanderson, writing in his UFO book, Uninvited Visitors was aware of both the Air Force explanation and the meteorite that had been reported. Sanderson wrote, "...we met two people who had seen a slow-moving reddish object pass over from the east to west. This was later described and ‘explained’ by a Mr. P.M. Reese of the Maryland Academy of Sciences staff, as a ‘fireball meteor.’ He concluded - incorrectly we believe - that it was ‘traveling at a height of from 60 to 70 miles’ and was about the ‘size of your fist.’... However, a similar, if not the same object was seen over both Frederick and Hagerstown. Also, something comparable was reported about the same time from Kingsport, Tennessee, and from Wheeling and Parkersburg, West Virginia."The whole story, as it is usually told, begins with several boys playing on a football field in Flatwoods. About 7:15 p.m., a bright red light,"rounded the corner of a hill" crossed the valley, seemed to hover above a hilltop and then fell behind the hill. One of the boys, Neil (or sometimes Neal) Nunley, said that he thought the glowing object might have been a meteorite. 
He knew that fragments of meteorites were collected by scientists, so he suggested they all go look for it.As they watched, there was a bright orange flare that faded to a dull cherry glow near where the object had disappeared. As three of the boys started up the hill, toward the lights, they saw them cycle through the sequence a couple of times. The lights provided a beacon for them, showing them where the object was.They ran up the main street, crossed a set of railroad tracks and came to a point where there were three houses, one of them belonging to the May family. Kathleen May came out of the house to learn what was happening and where the boys were going. Told about the lights on the hill, and that "A flying saucer has landed," she said that she wanted to go with them. Before they left, May suggested that Eugene Lemon, a seventeen-year old member of the National Guard (which has no real relevance to the story, but is a fact that is always carefully reported) went to look for a flashlight.They found the path that lead up the hill, opened and then closed a gate, and continued along the winding path. Lemon and Nunley were in the lead with May, her son Eddie, following, and they were trailed by others including Ronald Shaver and Ted Neal. Tommy Hyer was in the rear, not far behind the others.As they approached the final bend in the path, Lemon’s large dog, which had been running ahead, began barking and howling, and then reappeared, running down the hill, obviously frightened. Lemon noticed, as the dog passed him, that a mist was spreading around them. As they got closer to the top of the hill, they all smelled a foul odor. Their eyes began to water.Some of them reported that they saw, on the ground in front of them, a big ball of fire, described as the size of an outhouse, or about twenty feet across. It was pulsating orange to red. Interestingly, although it was big and bright, not everyone in the tiny party saw it.Kathleen May spotted something in a nearby tree. She thought they were the eyes of an owl or other animal. Nunley, who was carrying the flashlight, turned it toward the eyes. What they saw was not an animal, but some sort of creature, at least in their perception. The being was large, described as about the size of a full grown man to the waist. They could see no arms or legs, but did see a head that was shaped like an ace of spades. That was a description that would reoccur with all these witnesses. No one was sure if there were eyes on the creature, or if there was a clear space on the head, resembling a window, and that the eyes were somehow behind the that window and behind the face.Lemon reacted most violently of the small party when he saw the object. He passed out. There was confusion, they were all scared, and no one sure what to do. The boys grabbed the unconscious Lemon and then ran. They finally reached May’s house. Inside, they managed to bring Lemon back to full consciousness. They called others, and a number of adults arrived at the May house. The group, armed with rifles and flashlights, headed back up the hill, to search for the strange creature. None of the men seemed to be too excited about going up the hill, and in less than a half an hour, they were back, claiming they had found nothing at all.Still others, including the sheriff, eventually arrived. Most of them didn’t bother to mount any sort of search, and the sheriff, who was clearly skeptical, refused to investigate further than talking to May and the boys. 
I think it is important to note here that the sheriff had been searching for a downed small aircraft reported earlier. He found no evidence of an aircraft accident and no one reported any airplanes missing. The relevance of this will become clear later. Two newspaper reporters, apparently from rival newspapers, did, at least, walk up the hill, but they saw nothing. They did, however, note the heavy, metallic odor that had been described by May and her group.A. Lee Stewart, Jr., one of the editors of The Braxton Democrat convinced Lemon to lead them back to the spot of the sighting. Given Lemon’s initial reaction, it says something about the kid that he agreed to do so. They found nothing and saw nothing but did smell that strange odor.The next day, there were some follow-up investigations. Some people reported that they had found an area where the grass had been crushed in a circular pattern. Sanderson, who visited the scene a week later, said that he and his fellow investigators were able to see the crushed grass and a slight depression in the ground. No one bothered to photograph this reported physical evidence which is one of the problems that seem to flow through UFO research. No one thinks to gather the evidence when the opportunity is there even if that evidence is a photograph of the ground.Sanderson pointed out that the other physical evidence that had been reported, skid marks on the ground, an oily substance on the grass, and the foul odor, might have been part of the environment. The type of grass growing wild in that area gave off a similar odor and the grass seemed to be the source of the oil. Sanderson said that he couldn’t find the skid marks, and knew of no one who had photographed them.Gray Barker, a UFO researcher, also arrived a week later and coincidently on the same day as Sanderson, found others to interview. He talked with A. M. Jordan, Neil Nunley’s grandfather who said that he had seen an elongated object flash overhead. It was shooting red balls of fire from the rear and it seemed to hover before it fell toward the hilltop.Barker also interviewed Nunley, whose description of the craft disagreed with that of his grandfather though he did say the object seemed to stop and hover before falling to the hill. I wonder if the disparity came from the different perspectives of the witnesses. Sometimes the angles from which something is viewed seems to change the shape and the direction.When this story is reported, it always seems to end here, with the one group, lead by May and Lemon, seeing the strange creature or entity. The investigations, carried out by various civilian agencies always fails to find any proof. Many believe that if there was some corroboration, if someone else, not associated with May and her group, had seen the creature, it would strengthen the report. Several years later, a men’s magazine carried another story of the Flatwoods monster. Paul Lieb wrote that George Snitowski, was driving in the Flatwoods area with his wife, Edith, when he saw the thing on the ground.Snitowski didn’t tell his tale until two or three years after the fact. He then told it to an officer of the Flying Saucer Research Institute who published the account in the magazine. Looking at it from that point of view, that is, a tale told long after the national publicity that was provided for May and the others, there certainly is the hint that Snitowski was influenced by those articles. 
There is no proof he was, only the very real possibility.Snitowski was, according to his story, returning home with his wife and their baby when, near Sutton, West Virginia, which is not far from Flatwoods, his car engine stalled. He tried, but couldn’t get it to start and because it was getting dark, he didn’t want to leave his wife and baby alone on the semi-deserted highway. He thought they would wait for morning, and then he would walk the ten or twelve miles to the closest town, if someone didn’t come along to give them a hand before then.Snitowski said that a foul odor began to seep into the car making the baby cry. Snitowski didn’t know what this odor was but suspected it might be a sulfur plant nearby burning waste. It was then that a bright light flashed overhead and both Snitowski and his wife were confused by it. He said later, that looking down, into the woods, he could see what he thought of as some kind of dimly lighted sphere.Snitowski finally got out of the car and started walking toward some nearby woods where he believed the earlier light flash had originated. Inside the tree line sat the sphere. As he moved deeper into the woods, closer to the sphere, he said that his legs began to tingle, almost as if they had gone to sleep. Slightly sickened by a foul odor, barely able to walk, he began to retreat, heading toward the car.His wife screamed, and Snitowski yelled, "Edith, for God’s sake. What’s the matter."She was unable to speak and Snitowski saw, leaning against the hood of the car, a strange creature. He couldn’t see it well because of the lack of lighting around the area, but he thought it was eight or nine feet tall, was generally shaped like a human, with arms and a head attached to a bloated body.Snitowski reached the car, climbed in, and grabbed a kitchen knife that he had in the glove box. He forced his wife down, to the floor and begged her to silence the crying baby. He didn’t know what to do and said that the odor was now overpowering. But then, out of the corner of his eye, he saw the object, the sphere, beginning to climb erratically into the sky. It stopped to hover several times, and eventually disappeared. Suddenly it swooped, climbed upward in a bright, dazzling light, and vanished. When he looked outside the car, the creature was gone.Not knowing why, Snitowski tried to start the car now that the object was gone. Without trouble the engine started. They drove away, found a motel and checked in. The next morning they heard about the sighting from Flatwoods, but neither wanted to tell authorities what they had seen. Snitowski said that he didn’t want his friends and neighbors to think that he was crazy. Besides, he didn’t have any evidence about the creature or the UFO. There was only his story, corroborated by his wife.If his report is true, and there is no way, today, to learn if it is, then it makes a nice corroboration for the Flatwoods case. The problem, however, and as outlined earlier, is that the Flatwoods case was national news the day after it happened. At that point, the story was contaminated because an investigator could never be sure that Snitowski, or anyone else who came forward with a report, hadn’t been primed by the story as published in the newspapers or even seen on television.These two reports, by Snitowski and those in Flatwoods, were not the only ones made about that strange, tall, smelly, creature. 
About a week earlier, according to an investigation conducted by two Californians, William and Donna Smith, a twenty-one-year-old woman, who lived about eleven miles from Flatwoods, said that she had seen the creature that gave off the horrible odor. She was so upset by the encounter, that she was hospitalized for three weeks. Like Snitowski, she wasn’t interested in publicity at the time, so when the report from Flatwoods made the news, she elected to remain silent. There was no corroborating tales to support her.Years later, in the mid-1990s, Kathleen May Horner, was interviewed about the sighting. She told investigators that the two men that everyone thought were newspaper reporters were, in fact, government agents. She also remembered that a local reporter received a letter from some unidentified government agency that revealed the creature was some sort of rocket experiment that had gone wrong that day. There had been four such "rockets" and all of them fell back to earth.The government agents were able to recover all but one and that one had been seen in Flatwoods. It must be noted here that there is no corroboration for this story of government intervention and that it did not surface until forty years later.There are few points of corroboration for this tale, even among those who were together that night. The descriptions of the craft in flight sound more like a bolide, that is, a very bright meteor. Newspapers from other communities in the region report on just such a meteor. P. M. Reese from the Maryland Academy of Sciences suggested the red fireball was relatively slow moving and 60 to 70 miles high.We know, from our studies and from the various compilations of meteor falls on YouTube, that meteors can look just like aircraft crashing. We have seen how, as they break up, it seems there is a fuselage with lighted windows on it. They look remarkably like the descriptions of some famous UFO sightings including the 1948 Chiles-Whitted UFO case.And we know that meteors can seem to climb, though that is an optical illusion, that they can seem to hover briefly, and that they can seem to maneuver, again an optical illusion. The witness testimony here is not sufficient to reject meteor, especially when it is remembered that the object was seen over a large region, suggesting something that was very bright and very high. People looking up into the night sky are simply unable to judge height and speed with any degree of accuracy. A meteor of sufficient size and brightness was seen that night.Even if we reject, for whatever reason, the theory that any of the Flatwoods witnesses saw a meteor, we can look at the descriptions and how they vary. Even those who trekked up the hill report things differently, from the color and shape of the craft to even whether anything was sitting on the ground up there. Sanderson reported that the object was black but glowing red and shaped like the ace of spades, but Barker said it was spherical and some of those he interviewed said they hadn’t seen it at all.Jerry Clark reported that the witnesses stuck to their stories but that doesn’t mean what they saw was grounded in our shared reality. That they were truly frightened only suggests they were telling the truth, but not that they saw an extraterrestrial being.As I review the literature on this, I am struck by the disparity of the witness descriptions and how these sorts of things can be overlooked. I am surprised that there are descriptions of physical remains but there is little to document evidence. 
I am struck by a number of witnesses who said they saw the bolide and that the bolide was what everyone saw... and yes, many believe that a bolide has landed close by when it has either burned out and not touched down or it landed hundreds of miles away. In fact, several bolides have been reported to authorities as aircraft accidents... just like the one the sheriff investigated that night.This case seems to be the result of the bolide and the hysteria brought on UFO sightings that were headline news around the country including the impressive sightings from Washington, D.C. It seems that those who climbed the hill, believing they were going to find a landed flying saucer, talked themselves into the hysteria and when they saw something in a tree with eyes that glowed in the light of their flashlights, convinced themselves they had seen an alien creature.And, no, I’m not happy with this resolution. It seems that it makes too many assumptions. But the evidence for a UFO sighting and a landing is very weak at best. Given the timing of the sighting, given the lack of physical evidence, given the conflicting witness statements and given the well-known bias of the original investigators, and there isn’t much left here.But there is that quote attributed to me on one of the sites that promotes this case as extraterrestrial and fair or not, that influences my perspective. If they would manufacture this quote, what else has been manufactured and who is responsible. I have tried to communicate with them but I have heard nothing from them.In the end, I’m afraid that the terrestrial explanation is more likely the correct one here. I’m not completely sold on it but it seems that the preponderance of the evidence suggests that. Until something changes, that’s probably where it is going to stay.
ATIC,
bolide,
Eugene Lemon,
Flying Saucer Research Institute,
George Snitowski,
Ivan T. Sanderson,
Kathleen May
The Kelly-Hopkinville UFO Occupant Sighting
The Project Blue Book files contain very few cases in which alien creatures were reported. The most famous example is the Lonnie Zamora sighting from Socorro, New Mexico in 1964, which, in a twist for the Air Force was labeled as "Unidentified." For this discussion, we’ll let that go and look at another of the Blue Book cases, although, according to a letter in the files, "the incident has never been officially reported to the Air Force, [and] it has not taken official cognizance of the matter."What that letter, dated 29 Aug 1957 referred to was a landing and an attack by aliens near the small town of Hopkinsville, Kentucky a few days earlier. A group of people there were confronted by very strange, alien creatures (Illustration based on witness statements seen here). The Air Force would create an "information" only file about the case and ask a couple of officers to "look into it." Two years later, as questions began to be asked, the Air Force would initiate a short investigation but apparently only so they would be able to answer questions about an investigation, rather than actually attempt to learn anything about the sighting.The story officially began early on the evening of August 21, 1955, when Billy Ray Taylor, a young friend of Elmer "Lucky" Sutton, had gone to the well behind the farmhouse, and came running back telling all that he had seen a flying saucer. The object, described as bright with an exhaust that contained all the colors of the rainbow, passed above the house. It continued over of the fields, finally came to a hover, and then descended, disappearing into a gully.No one in the Sutton house, including Glennie Lanford, Lucky Sutton, Vera Sutton, John Charley (J.C) Sutton, Alene Sutton, three Sutton children, June Taylor and O.P. Baker, believed the story of the flying saucer. None of them considered walking out to the gully to see if something might be down there. The whole idea was preposterous.Not long after Taylor told his tale, the dog began to bark. Taylor and Lucky Sutton went to investigate that, but the dog ran under the house, not to reappear that night.Out in the fields, away from the house, was a strange, hovering glow. As it approached, they could see a "small man" inside it. He was about three and a half feet tall, with a large head that looked to be round, and long, thin arms that extended almost to the ground (Seen here). The creature's hands were large and out of proportion with the body, and were shaped more like a bird's talons than a human hand. The two eyes were large and seemed to glow with a yellow fire.As the creature continued to move toward the house, the two men retreated, found a rifle and a shotgun inside, and then waited. When the creature was within twenty feet of the back door, both men fired at it. The creature flipped back, regained its feet and fled into the darkness.The two men watched for a few minutes, searching for the creature and then walked into the living room where the others waited. The creature, or one just like it, appeared in front of one of the windows and the men shot at it, hitting it. This one also did a back flip and disappeared.Now the men decided it was time to go out to learn if they had injured or killed the creature, or animal, or whatever it was. Taylor was the first out, but stopped on the porch under a small overhang. A claw-like hand reached down and touched his hair. Alene Taylor grabbed him to pull him back into the house. Lucky, pushed past him, turned and fired up, at the creature on the roof. 
It was knocked from its perch.Someone, probably Taylor, shouted, "There's one up in the tree."Both Taylor and Lucky shot at it, knocking if from the limb. But it didn't fall to the ground. Instead, it seemed to float. They shot again, and it ran off, into the weeds.At the same moment, another of the creatures appeared around the corner of the house. It might have been the one that had been on the roof or one of those seen in the backyard. Lucky whirled and fired. The buckshot sounded as if it hit something metallic like an empty bucket. Just as had the others, the little creature flipped over, scrambled to its feet and fled, moving rapidly into the darkness.Having failed to stop the creatures with either the shotguns or the .22 caliber rifle, Lucky decided to leave them alone. Someone noticed that the creatures only approached from darkened areas. It seemed that they were repelled by the light.At some point they heard noises on the roof and went out the back door to investigate. One of the creatures was back on the roof. They shot at it, knocked it off the roof, but it floated to a fence some forty feet away rather than falling to the ground. Hit by another shot, it fell from the fence and ran away, seeming to use its arms to aid its locomotion.Some of the others in the house were still unconvinced that there were real creatures outside, believing instead, that the boys were playing some sort of a prank on them. With the lights in the house turned out, they had taken up a position close to one of the windows. Taylor told Lankford to wait and she would see for herself.After twenty minutes or so, one of the creatures approached the front of the house. According to Lankford, it looked like a five gallon gasoline can with a head on top of two thin, spindly legs. It shimmered as if made of bright metal.Lankford, who had been crouching quietly near the window for a long time, tried to stand, but fell with a thud. She shrieked in surprise and the creature jumped to the rear. Taylor fired at it through the screen door.About three hours after the first creature had been seen, about 11:00 that night, they decided it was time to get out. Everybody ran to the cars. One of the kids was screaming and had to be carried. They all raced to the Hopkinsville police station for help.At the police station, there was no doubt that the people have been frightened by something. Police officers, and the chief, interviewed after the events, made it clear they believed the people had been scared by something. That doesn't mean they were "attacked" by strange little metallic men, but does mean they were relating what they believed to be the truth to the assembled police officials.Within minutes, the police were on their way back to the house, with some of the Sutton men in the cars. The police also called the Madisonville headquarters of the Kentucky State Police. A call was even made to the chief, Russell Greenwell at home. He was told that a spaceship had landed at Kelly. Greenwell then told the desk sergeant that it had better not be a joke.There were now Kentucky State Police, local police, the Chief, and a sheriff's deputy either heading out to the Sutton house, or already there. One of the state troopers, who was only a few miles from Hopkinsville, on the road to Kelly, said that he saw what he called several meteors flash over his car. They moved with a sound like artillery, and he looked up in time to see two of them. 
They were traveling in a slightly descending arc, heading toward the Sutton house.The yard around the Sutton house was suddenly filled with cars, and more importantly light. The men tried to point out where the various events had taken place. The chief searched for signs that anyone or everyone had been drinking but found nothing to indicate that anyone had even a beer. Glennie Lankford later said that she didn't allow alcohol in the house.Once the police arrived, the situation changed radically. Although the atmosphere was tension charged, and some of the police were nervous, they began to search for signs of the invasion from outer space. There were apparent bullet and shotgun blast holes in the screens over the windows, and there was evidence that weapons had been fired, but there were no traces of the alien creatures. The hard packed ground did not take footprints. The search of the yard and fields around the house revealed little, except a luminous patch where one of the creatures had fallen earlier and was only visible from one angle. The chief said that he saw it himself and there was definitely some kind of stain on the grass. There is no evidence that anyone took samples for analysis later.But with no real evidence to be found, with no alien creatures running around, and with no spacecraft hidden in the gully, the police began to return to their regular, mundane duties. By two in the morning, only the Suttons were left at the house.A half an hour or so after the last of the police left, and with the lights in the house down, Glennie Lankford saw one of the creatures looking in the window. She alerted her son, Lucky, who wanted to shoot at it, but she told him not to. She didn't want a repeat of the situation earlier in the night. Besides, the creatures had done nothing to harm anyone during the first episode.But Lucky didn't listen to her. He shot at the creature but the shot was no more effective than those fired earlier in the night. Other shots were fired with no apparent affect. The little creatures bounced up each time they were hit and then ran away.The little beings kept reappearing throughout the night, the last sighting occurring just a half an hour before sunrise. That was the last time that any of the beings were seen by any of the Suttons or their friends.Although it seems that military personnel, from Fort Campbell, Kentucky, did visit the Sutton house, and interviews with the witnesses were conducted in 1955, an investigation by the Air Force didn't take place until two years later. According to Project Blue Book files, apparently, in August, 1957, prior to the publication of a magazine article that would review the case, someone in the Air Force decided they should "investigate."In a letter from the ATIC at Wright-Patterson, to the commander of Campbell Air Force Base, Wallace W. Elwood wrote, "1. This Center requests any factual data, together with pertinent comments regarding an unusual incident reported to have taken place six miles north of Hopkinsville, Kentucky on subject date [21 August 1955]. Briefly, the incident involved an all night attack on a family named Sutton by goblin-like creatures reported to have emerged from a so-called 'flying saucer.'"Later in the letter, Elwood wrote, "3. Lacking factual, confirming data, no credence can be given this almost fantastic report. As the incident has never been officially reported to the Air Force, it has not taken official cognizance of the matter." 
Here, once again is the Air Force attitude that if the case has not been reported to them, then it simply doesn’t exist.The matter was apparently assigned to First Lieutenant Charles N. Kirk, an Air Force officer at Campbell Air Force Base. He apparently spent about six weeks investigating the case before sending the material on to ATIC on October 1, 1957. He researched the story using the Hopkinsville newspaper from August 22, 1955 and September 11, 1955. He also had a letter from Captain Robert J. Hertell, a statement from Glennie Lankford, one of the witnesses, and a statement given to Kirk by Major John E. Albert about his involvement in the case, and a copy of an article written by Glennie Lankford.Albert's statement provides some interesting information. Remember, the Air Force was claiming that the case had not been officially reported and therefore the Air Force had not investigated. It seems that here we get lost in the semantics of the situation and the question that begs to asked is, "What the hell does all that mean?"It sounds like a police officer who, seeing a robbery in progress, then ignores it because it hadn't been reported to the station and he wasn't dispatched by headquarters. A police officer can't ignore the crime and it seems reasonable to assume that the Air Force shouldn't have ignored the story. The sighting was outlined in the media including the radio broadcasts, and newspapers from various locations around the country were reporting what had happened. Although the Air Force officers at Blue Book or ATIC must have known that the sighting had been made, they chose to ignore it. If the sighting wasn't reported through official channels, directly to them, then it simply didn't exist. Since no one reported this case through official channels, the sighting could be ignored.Or was it? Lieutenant Kirk, in his report in 1957, sent a copy of the statement made by Major John E. Albert on September 26, 1957, to ATIC. The very first paragraph seems to suggest that notification was made to Campbell Air Force Base which should have, according to regulations in effect at that time (1955), reported it in official channels, to ATIC and therefore Blue Book. The regulation is quite clear on the point and it doesn't matter if everyone in the military believed the sighting to be a hoax, or if they thought the sighting too outrageous, it should have been investigated because the regulations required it.That investigation would not have been conducted by ATIC and Project Blue Book but by the 4602d Air Intelligence Service Squadron. AFR 200-2 tells us exactly what should have happened to the report. It went on to the 4602d and apparently disappeared into some bureaucratic limbo there.In the statement found by Kirk, Albert said, "On about August 22, 1955, about 8 A.M., I heard a news broadcast concerning an incident at Kelly Station, approximately six miles North of Hopkinsville. At the time I heard this news broadcast, I was at Gracey, Kentucky, on my way to Campbell Air Force Base, where I am assigned for reserve training. I called the Air Base and asked them if they had heard anything about an alleged flying saucer report. They stated that they had not and it was suggested that as long as I was close to the area, that I should determine if there was anything to this report. I immediately drove to the scene at Kelly [for some reason the word was blacked out, but it seems reasonable to assume the word is Kelly] Station and located the home belonging to a Mrs. 
Glennie Lankford [again the name is blacked out], who is the one who first reported the incident. (A copy of Mrs. Lankford's statement is attached to this report)."Albert's statement continued:Deputy Sheriff Batts was at the scene where this supposedly flying saucer had landed and he could not show any evidence that any object had landed in the vicinity. There was nothing to show that there was anything to prove this incident.Mrs. Lankford was an impoverished widow woman who had grown up in this small community just outside of Hopkinsville, with very little education. She belonged to the Holly Roller Church and the night and evening of this occurrence, had gone to a religious meeting and she indicated that the members of the congregation and her two sons and their wives and some friends of her sons', were also at this religious meeting and were worked up into a frenzy, becoming emotionally unbalanced and that after the religious meeting, they had discussed this article which she had heard about over the radio and had sent for them from the Kingdom Publishers, Fort Worth 1, Texas and they had sent her this article with a picture which appeared to be a little man when it actually was a monkey, painted silver. This article had to be returned to Mrs. Lankford as she stated it was her property. However, a copy of the writing is attack to this statement and if it is necessary, a photograph can be obtained from the above mentioned publishers.There are a number of problems with the first couple of paragraphs of Albert's statement, but those are trivial. As an example, it wasn't Glennie Lankford who first reported the incident, but the whole family who had traveled into town to alert the policeIt is the third paragraph, however, that is filled with things that bear no resemblance to reality. Lankford was not a member of the Holly Rollers, but was, in fact a member of the Trinity Pentecostal. Neither she, nor any of the family had been to any religious services the night of the "attack." This unsubstantiated allegation was made in a recent book, suggesting, once again, that the religious tone of the family had somehow contributed to the attack on their house. Or rather, that they were "hysterical" people who would see things that simply were not there.And, Lankford couldn't have heard about any "article" from the newspapers or magazines as it was read on the radio because there was no radio in the farm house. And there was no evidence that Lankford ever sent anywhere for any kind of article about flying saucers and little creatures, those painted silver or any other color. In other words, Albert had written the case off as a hoax, almost before he began his "investigation" because of his false impressions. Apparently he was only interested in facts that would allow him to debunk the case and not learning what had actually happened.I have to say, at this point, I would have been quite skeptical of this tale. It is outrageous and beyond belief, but then, things we do today would have been thought of as outrageous and beyond belief fifty years ago. So, we approach with a skeptical attitude, but we listen to what the witnesses have to say and look for ways to corroborate their statements.Further evidence of the investigators attitude is provided in the next paragraph of his statement. "It is my opinion that the report Mrs. Lankford or her son, Elmer Sutton [name deleted but it is reasonable to assume it was Elmer Sutton], was caused by one of two reasons. 
Either they actually did see what they thought was a little man and at the time, there was a circus in the area and a monkey might have escaped, giving the appearance of a small man. Two, being emotionally upset, and discussing the article and showing pictures of this little monkey, that appeared like a man, their imaginations ran away with them and they really did believe what they saw, which they thought was a little man."It is interesting to note that Albert is not suggesting that Lankford, the Suttons, and the Taylors (other members of the family present that night) were engaged in inventing a hoax. Instead, with absolutely no evidence, Albert invented the tale of an escaped monkey that fooled the people. That does not explain how the monkey was able to survive the shots fired at it, especially if it was as close to the house as the witnesses suggested. In other words, with shotguns and rifles being fired, someone should have hit it and there should have been broken bits of monkey all over the farm land. And, remember, the various witnesses talked of a number of little men, not a single individual.But Albert wasn't through with the little monkey theory. "The home that Mrs. Lankford lived in was in a very run down condition and there were about eight people sleeping in two rooms. The window that was pointed out to be the one that she saw the small silver shining object about two and a half feet tall, that had its hands on the screen looking in, was a very low window and a small monkey could put his hands on the top of it while standing on the ground."The final sentence said, "It is felt that the report cannot be substantiated as far as any actual object appearing in the vicinity at that time." It was then signed by Kirk.What is interesting is that Albert, and then Kirk, were willing to ignore the report of the object because there was nothing to substantiate it, other than the witness testimony that there had been an object. Both Albert and Kirk were willing to buy the monkey theory, though there was nothing to substantiate it either. They needed a little man, of at the very least, a little humanoid creature for the family to see and they created one because a "monkey might have escaped."Glennie Lankford might have inspired the little monkey story with her own words. In a handwritten statement signed on August 22, 1955, said, "My name is Glennie Lankford age 50 and I live at Kelly Station, Hopkinsville Route 6, Kentucky.She continued:On Sunday night Aug 21, 55 about 10:30 P.M. I was walking through the hallway which is located in the middle of my house and I looked our the back door (south) and saw a bright silver object about two and a half feet tall appearing round. I became excited and did not look at it long enough to see if it had any eyes or move. I was about 15 or 20 feet from it. I fell backward, and then was carried into the bedroom.My two sons, Elmer Sutton aged 25 and his wife Vera age 29, J.C. Sutton age 21 and his wife Aline age 27 and their friends Billy Taylor age 21 and his wife June, 18 were all in the house and saw this little man that looked like a monkey.So the Air Force, which, of course, didn’t investigate sightings of creatures at the time, seized on her description and turned it into a possible solution, suggesting, with no justification that the Suttons had been attacked by a horde of monkeys which were immune to shotguns. 
They overlooked the evidence of the case, or ignored the testimony, dispatched someone to look into it unofficially at the time, and then denied that they had ever investigated.From my point of view, which is sympathetic to extraterrestrial visitation, this case is extremely weird. Like those Air Force officers, I would be inclined to ignore it, just as most of us ignore the tales of the contactees. But then again, I would want to follow up on this case, talking to the witnesses and looking for evidence, unless there was something else going on...And there might have been. Back in 1955 there were only a few nuclear stockpiles in the world and one of them was at Campbell Air Force Base, not all that far from Hopkinsville. Could it have been that some high-ranking Air Force general, who knew what was stored so close to this alien invasion site, wanted the story buried, not because of the alleged little creatures but because he didn’t want a lot of reporters in the area asking a lot of questions that might compromise an atom secret.No, the storage of the weapons there has little to do with the story but the story focuses attention on the area. The Air Force, not one at the time to favorably entertain any UFO stories and especially those with creatures, didn’t bury the case because of the UFO connection, but because of the secret facility close by.This is one case that I’m quite ambivalent about. It seems quite improbable, based on what we know, but then, it seems improbably that the whole family would get caught up in this story without some sort of precipitating event. I would think there is a core of truth in the tale, but I don’t know what it might be...Maybe radioactive monkeys that had been exposed to the atomic weapons after they escaped from that mythical circus that was in town... Hey, it would make a good movie and we could have a lot of fun with it. But that doesn’t answer the question about what happened that night. That’s just something we might never know.
4602d Air Intelligence Service Squadron,
AFR 200-2,
Billy Ray Taylor,
Elmer "Lucky" Sutton,
Glennie Lankford,
Disclosure Poll, Part 2
As I was putting the last poll together, I came at it from a believer’s perspective. The questions all suggested there was something to be Disclosed, meaning, of course, that there was alien visitation. A number of people have pointed out that those assumptions were flawed. There should have been a "Never" answer, and more importantly, a "Nothing to Disclose" answer.

Because of that, I thought I would throw up another "Disclosure Poll," with those answers on it, and see if I received significantly different results.

Over at the Magonia Blog: http://pelicanist.blogspot.com they are running a similar poll. It will also be interesting to see how their results compare to mine. Their philosophy is somewhat different than mine, so their readers will have a different perspective on the universe than we do over here.

Vote away... and thanks.
Disclosure Poll
The results are in and not too surprising. More than half of the voters in this unscientific poll believed that Disclosure wouldn’t happen for years (54 of the respondents, or 56%).

The next large bloc of votes (24 votes, or 25%) thought there would be Disclosure, but it wasn’t coming soon. That means that more than three-quarters of the respondents don’t believe in Disclosure anytime soon.

Fewer than ten voters (9 votes, or 9%) thought that Disclosure was at hand. I don’t know why they’d believe this, given the history of this phenomenon. It’s been around, in the public consciousness, for more than sixty years and at times it seems we are no closer to a resolution than we were at the very beginning.

And the smallest group (8 votes, or 8%) think Disclosure is coming soon. That’s a fairly vague term and is, of course, my fault. I should have defined what I meant by soon... which is sometime in about a year or so.

All this means is that the majority of us have no faith in the government coming clean about UFOs. The majority of us realize that the status quo is going to remain, well, the same...

Unless some outside force acts. By this I mean that the aliens land to tell the world they are here. In the world of secrecy you do not say that which you know to be classified to those not cleared to hear it, even if the secret is out and everyone knows it. You are still required to keep it a secret. This explains why sometimes you hear a government employee denying something that we all know is true. We are not cleared and he is going to obey the law about the disclosure of classified information.

But once the aliens land, further confirmation of their presence will not be required. We’ll all know the truth. Until then, we will continue to argue alien visitation. We will cite proof which the debunkers will reject based not so much on evidence as on their belief that aliens couldn’t reach here from there. The distances in space are too vast... and they will reject the proof available because they know there is no alien visitation, and if there is no alien visitation there can be no proof.

And yes, this is a little bit more supportive of the ETH than I wanted, but sometimes I just get tired of the arguments. Donald Menzel says that the Lubbock Lights photographs are a hoax because there is no other answer except ETH... Phil Klass rejected McMinnville because it was either alien or a hoax and since there are no aliens, it must be a hoax.

Disclosure is a different matter. There were those, in the 1950s, who believed the government was preparing us for Disclosure with the release of The Day the Earth Stood Still. Others believed, in 1977, Disclosure was at hand with the release of Close Encounters of the Third Kind. In the world today, there isn’t that sort of movie tradition, though science fiction films that deal with alien invasion, alien visitation and alien reality do very well at the box office. (It’s amazing to me that the Oscars have not recognized how well science fiction does with everyone but the pretentious voters who claim they don’t watch science fiction, but I digress.)

This survey says that most don’t believe Disclosure is near and that is my attitude as well. Back in the mid-1990s we came close, with the massing of evidence about the Roswell case and the general interest in UFOs, but the opportunity was lost. The big propaganda machine was able to divert attention and provide explanations for Roswell...
It made no difference that those explanations didn't fit the facts, as long as there was an explanation, and the news media, who are too sophisticated to believe in alien visitation, were right there to promote the message. (Don Schmitt and I, at one point in the mid-1990s, were at a major Chicago newspaper (no, not the Sun-Times) for a scheduled interview. We were met by an intern who told us that she had the assignment because her editors didn't believe in UFOs... let's not check the facts, let's just make sure that our belief structure is kept intact... but I digress again.)

So, barring some extraordinary event outside the capability of the government to control, there will be no Disclosure. Those on the outside think that throwing rocks at the glass house of UFO secrecy will eventually work. Unfortunately, the glass is bulletproof and the rocks bounce off.
Close Encounters of the Third Kind,
Don Schmitt,
Donald Menzel,
Lubbock Lights,
McMinnville,
Climate change report shows major impacts for Northwest By Bellamy Pailthorp
Jan 14, 2013. Huge areas of Seattle's Port could be inundated when higher tides combine with more extreme storm surges, according to the Draft National Assessment on Climate Change.
Bjørn Giesenbauer photo
Imagine a future in which major areas of Seattle's waterfront are flooded because of rising tides. Businesses that front on Elliot Bay, including the famous Edgewater Hotel, or parks such as Myrtle Edwards or Golden Gardens, would have to adjust to storm surges more than six feet higher than we're used to. According to a new federal report on climate change, that future is just a few decades away.

Warming oceans mean sea levels are rising. And that has all kinds of consequences. A 1,200-page report from scientists at 13 federal agencies lays out the latest findings. And this 3rd Draft Assessment on National Climate Change has a 37-page chapter on the situation in the Northwest.

Susanne Moser is a coastal climate change expert with the Woods Institute at Stanford University. She says though it's a preliminary report, there's no denying anymore that we are seeing evidence of climate change. "Changes in extreme weather, changes in temperature, changes in rainfall and snowfall, changes in species distribution. You name it," Moser says.

The chapter on the Northwest looks at everything from forest health to fire danger to ocean acidification. It covers Washington, Oregon, Idaho and Montana. And there's lots of detail on sea level rise along the coast, including a detailed study of Seattle's Elliot Bay, with color codes predicting six-foot storm surges that cover most of the Port of Seattle. "So, it goes very far inland. And a lot of people, a lot of infrastructure, a lot of business establishments would be affected by such a sea-level rise, plus storm surge."

And the consequences aren't just about physical infrastructure; they affect the economy as well. Because, as the timing of runoff from snowpack into rivers and streams changes, that affects water available for agriculture and energy. The report predicts that sixty years from now, it could reduce hydropower production by as much as 20%. "Which is a really important economic issue, not just for Seattle and not just for the state of Washington, but Washington does export some of its hydropower, for example to California. So, it's a good thing we're working together," Moser says.

Seattle officials are holding a news conference later this morning to talk about the projections for the next 40 years and what people can do about it. Councilmember Mike O'Brien chairs the city's Energy and Environment Committee. He says he was shocked to see projections showing up to 80 percent of the Port of Seattle's Harbor Island inundated. Other areas around Elliot Bay that could be flooded during high tides include big parts of West Seattle, Georgetown, South Park, Interbay and Golden Gardens.

O'Brien says scientists have been collecting data and tracking weather patterns for the past several years. And their projections are shifting. "Based on what we're seeing on the ground today, we think that even the kind of high scenarios in here may be more conservative than we thought," O'Brien says.

The draft report from the federal government is open for public comment through April 12th. Then scientists will make changes and issue the final assessment about a year from now.

Tags: climate change, land use, Northwest, City of Seattle, global warming
Verizon will buy AOL for $4.4 billion
AOL Chairman and CEO Tim Armstrong, center, applauds during opening bell ceremonies of the New York Stock Exchange in 2009. Image: Richard Drew/Associated Press
By Heidi Moore and Jason Abbruzzese, 2015-05-12 11:12:00 UTC
Verizon Communications has agreed to buy former Internet darling AOL for $4.4 billion "with the strategy of building the biggest media platform in the world," AOL CEO Tim Armstrong announced Tuesday morning.
The acquisition is the culmination of partnership talks that began in January. The deal will make AOL, which Armstrong will continue to lead, a division of Verizon. AOL will keep oversight of its own businesses and "additional assets from Verizon that are targeted at the mobile and video media space," Armstrong told staff in an email.
The deal values AOL at $50 a share, which is 23% more than the company's average stock price over the past three months. AOL's stock jumped 18% in premarket trading after the news, while Verizon stock fell 1.3%, indicating that investors are more excited that AOL is being bought than that Verizon is doing the buying. Once a household name that provided many Americans with their first access to the Internet, AOL is now primarily a media and technology company with a focus on advertising, thanks to a turnaround orchestrated by Armstrong. The acquisition makes Verizon an immediate player in the online advertising market, and brings it into closer competition with Comcast, which made its own entry into the space with its $360 million acquisition of Freewheel. The deal is Verizon's attempt to grow in mobile video and advertising. AOL shifted aggressively in 2013 toward entering the business of online advertising technology, most notably through its $405 million acquisition of Adap.tv, which had been one of the leading online video ad platforms. The move helped AOL to quickly become competitive in the online video advertising market, which has been a lucrative area as display ads have fallen out of favor. "The combination of Verizon and AOL creates a scaled, mobile-first platform offering, directly targeted at what eMarketer estimates is a nearly $600 billion global advertising industry," Verizon said in a statement. On mobile devices, however, the market is a more modest one. Juniper Research has estimated that advertising on mobile devices is expected to reach $105 billion by 2019, up from an estimated $51 billion this year. AOL is a big player in ad technology, though it trails behind the giant in the industry, Google. "If there is one key to our journey to building the largest digital media platform in the world, it is mobile," Armstrong said in the email to staff. "Mobile will represent 80% of consumers’ media consumption in the coming years and if we are going to lead, we need to lead in mobile."
In January, when acquisition rumors about AOL and Verizon started brewing, Verizon CEO Lowell McAdam shot down the speculation and said, “AOL, along with lots of other media companies, are potential for us to do partnerships."
Since its disastrous merger with Time Warner in 2000 — a sign of the height of the Internet frenzy of the time — AOL has added significant holdings, particularly in media. It owns The Huffington Post, TechCrunch and Engadget, in addition to its portal, AOL.com, which is still kicking.
Armstrong promised employees that their salaries in the new company "will be equal or better to your AOL compensation." "The simple answer to the question of 'what does this mean for you?' should be, 'I just got more resources, more support and more growth opportunity,'" Armstrong wrote. Verizon is paying for the acquisition in cash and commercial paper, which is a kind of short-term debt. LionTree Advisors and Guggenheim Partners advised Verizon, while media boutique investment bank Allen & Company advised AOL. Topics:
AOL, Business, Verizon
There was one small hitch in Steve Jobs' polished presentation on Wednesday as he demoed the new iPad. When he used the device to load the New York Times website, he (apparently) inadvertently navigated to a page with Flash content. The large blue Lego icon projected on the big screen at the Yerba Buena Center for the Arts was like an announcement of what had long been expected: the iPad will not support Flash.
Adrian Ludwig, Flash Marketing Manager for Adobe expressed his dismay in a frustrated post - called "Apple's iPad -- a broken link?"
- on the Flash Platform blog. It looks like Apple is continuing to impose restrictions on their devices that limit both content publishers and consumers. Unlike many other ebook readers using the ePub file format, consumers will not be able to access ePub content with Apple's DRM technology on devices made by other manufacturers. And without Flash support, iPad users will not be able to access the full range of web content, including over 70% of games and 75% of video on the web.
If I want to use the iPad to connect to Disney, Hulu, Miniclip, Farmville, ESPN, Kongregate, or JibJab -- not to mention the millions of other sites on the web -- I'll be out of luck.
Adobe has moved to get Flash content onto the iPhone despite Apple's lack of support by developing the forthcoming Packager for iPhone tool which will convert Flash content into apps that can run on the iPhone, and the company said that a future version of the tool will also work for the iPad's 1024 x 768 screen.
"It is our intent to make it possible for Flash developers to build applications that can take advantage of the increased screen size and resolution of the iPad," Adobe AIR for Mobile Manager Michael Chou said on the blog. "We are looking for developers and designers who have a specific app in mind to be submitted to the iTunes App Store within the next two months."
According to Adobe, 98 percent of desktop computers currently support Flash, and version 10.1, presently in beta, will run on Windows Mobile, webOS, Android, Symbian and BlackBerry. Apple has consistently blocked the software from running on the iPhone and iPod, most analysts believe, because it would take away business from its App Store, allowing consumers to play games and run other content from a browser. Clause 3.3.2 of the iPhone SDK agreement prohibits Flash or any other "self-executing" code from the iPhone:
An Application may not itself install or launch other executable code by any means, including without limitation through the use of a plug-in architecture, calling other frameworks, other APIs or otherwise. No interpreted code may be downloaded and used in an Application except for code that is interpreted and run by Apple�s Published APIs and built-in interpreter(s).
Packager for iPhone will be included with the upcoming release of Adobe Flash Professional CS5, with Packager for iPad to follow later in the year.
image via Engadget | 科技 |
What's the difference between archaeology and grave robbing?
by Charles W.
An archaeologist's main goal is to help piece together the past.
David Silverman/Getty Images
Relatives of the victims of the Titanic have complained that the mining of valuables and relics from the sea floor amounts to nothing more than grave robbing. After all, the resting place of the Titanic is also a mass grave of sorts, the sea a home to more than 1,500 casualties. Yet we've seen thousands of personal items on display at numerous Titanic exhibitions since the wreck was discovered in 1987. Undersea explorers claim that these items are displayed as a historical collection of antiquities, just like the contents of King Tut's tomb. The UNESCO Convention of 1970 helped to protect cultural property by outlining guidelines that prevent the plundering of archaeological sites. Those who abide by the convention are not considered grave robbers, but archaeologists trying to piece together the puzzle of human history.
In 2001, the UNESCO Convention on the Protection of Underwater Cultural Heritage was adopted and ratified by 23 countries, which doesn't include the United States, England, France, Germany, Italy, China and Russia. This convention allows the recovery of artifacts as long as the people involved in the recovery make a "significant contribution" to the protection and knowledge of underwater heritage sites. It also prohibits the trading, buying and selling of underwater cultural property. But a convention is only as strong as the countries that acknowledge it, and with major countries like the United States, England, France, Germany, Italy, China and Russia steering clear, it remains a convention in limbo.
This state of limbo has allowed major undersea exploration outfits like Odyssey Marine Exploration (OME) to find and recover hundreds of millions of dollars of booty from shipwrecks on the sea floor. The COO of OME, Dr. Mark Gordon, believes that these sites are too far down and too hard to find for teams funded by universities and museums. His rationale is that by operating a for-profit endeavor, his team can recover many more items than would ever be possible with a not-for-profit model. He maintains that the money OME makes selling items helps fund the operation, and that individual unique pieces are not sold, but kept for research purposes. His critics charge that the operation is nothing more than a well-funded and sophisticated looting business, staffed by educated grave robbers.
As the undersea debate continues to rage, it's hard to tell what lies ahead for companies like Odyssey Marine Exploration. As more countries ratify and observe the 2001 UNESCO Convention, the difference between land and sea excavation, and archaeology and grave robbing may become more clearly defined.
NASA’s Swift Discovers Three Unusually Long-Lasting Stellar Explosions
In three new studies, astronomers discuss the three unusually long-lasting stellar explosions that were discovered by NASA’s Swift satellite and represent a previously unrecognized class of gamma-ray bursts.
GRB 111209A exploded on Dec. 9, 2011. The blast produced high-energy emission for an astonishing seven hours, earning a record as the longest-duration GRB ever observed. This false-color image shows the event as captured by the X-ray Telescope aboard NASA’s Swift satellite. Credit: NASA/Swift/B. Gendre (ASDC/INAF-OAR/ARTEMIS)
Three unusually long-lasting stellar explosions discovered by NASA’s Swift satellite represent a previously unrecognized class of gamma-ray bursts (GRBs). Two international teams of astronomers studying these events conclude that they likely arose from the catastrophic death of supergiant stars hundreds of times larger than the sun.
The astronomers discussed their findings Tuesday at the 2013 Huntsville Gamma-ray Burst Symposium in Nashville, Tennessee, a meeting sponsored in part by the University of Alabama at Huntsville and NASA’s Swift and Fermi Gamma-ray Space Telescope missions.
GRBs are the most luminous and mysterious explosions in the universe. The blasts emit surges of gamma rays — the most powerful form of light — as well as X-rays, and they produce afterglows that can be observed at optical and radio energies. Swift, Fermi and other spacecraft detect an average of about one GRB each day.
Three recent GRBs (blue dots) emitted high-energy gamma-ray and X-ray light over time spans up to 100 times greater than typical long bursts and constitute a new ultra-long class. This plot compares the energy received and the event duration among different classes of transient high-energy events: long GRBs (green); the disruption of an asteroid or comet by a neutron star or stellar-mass black hole in our own galaxy, or the break-out of a supernova shock wave in another galaxy (orange); and the tidal disruption of a star by a supermassive black hole in another galaxy (purple). Credit: NASA’s Goddard Space Flight Center, after B. Gendre (ASDC/INAF-OAR/ARTEMIS)
“We have seen thousands of gamma-ray bursts over the past four decades, but only now are we seeing a clear picture of just how extreme these extraordinary events can be,” said Bruce Gendre, a researcher now associated with the French National Center for Scientific Research who led this study while at the Italian Space Agency’s Science Data Center in Frascati, Italy.
Prior to Swift’s launch in 2004, satellite instruments were much less sensitive to gamma-ray bursts that unfolded over comparatively long time scales.
Traditionally, astronomers have recognized two GRB types, short and long, based on the duration of the gamma-ray signal. Short bursts last two seconds or less and are thought to represent a merger of compact objects in a binary system, with the most likely suspects being neutron stars and black holes. Long GRBs may last anywhere from several seconds to several minutes, with typical durations falling between 20 and 50 seconds. These events are thought to be associated with the collapse of a star many times the sun’s mass and the resulting birth of a new black hole.
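To make the duration-based classification concrete, here is a minimal Python sketch that bins a burst by its gamma-ray duration. The two-second short/long boundary comes from the description above; the 10,000-second cutoff used for the proposed "ultra-long" class is an illustrative assumption, not a value taken from the studies.

```python
def classify_grb(duration_s):
    """Roughly bin a gamma-ray burst by its high-energy duration in seconds.

    The 2-second short/long split is the conventional one described above;
    the 10,000-second cut for "ultra-long" events is only an illustrative
    placeholder (GRB 111209A stayed active for about 7 hours, i.e. ~25,000 s).
    """
    if duration_s <= 2:
        return "short"          # compact-object mergers
    elif duration_s < 10_000:   # assumed illustrative boundary
        return "long"           # collapse of a stripped massive star
    else:
        return "ultra-long"     # proposed class: supergiant collapse

# Example: the record-setting seven-hour burst GRB 111209A
print(classify_grb(7 * 3600))  # -> "ultra-long"
```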
Both scenarios give rise to powerful jets that propel matter at nearly the speed of light in opposite directions. As they interact with matter in and around the star, the jets produce a spike of high-energy light.
Gendre and his colleagues made a detailed study of GRB 111209A, which erupted on Dec. 9, 2011, using gamma-ray data from the Konus instrument on NASA’s Wind spacecraft, X-ray observations from Swift and the European Space Agency’s XMM-Newton satellite, and optical data from the TAROT robotic observatory in La Silla, Chile. The burst continued to produce high-energy emission for an astonishing seven hours, making it by far the longest-duration GRB ever recorded. The team’s findings appear in the March 20 edition of The Astrophysical Journal.
Another event, GRB 101225A, exploded on Christmas Day in 2010 and produced high-energy emission for at least two hours. Subsequently nicknamed the “Christmas burst,” the event’s distance was unknown, which led two teams to arrive at radically different physical interpretations. One group concluded the blast was caused by an asteroid or comet falling onto a neutron star within our own galaxy. Another team determined that the burst was the outcome of a merger event in an exotic binary system located some 3.5 billion light-years away.
GRB 101225A, better known as the “Christmas burst,” was an unusually long-lasting gamma-ray burst. Because its distance was not measured, astronomers came up with two radically different interpretations. In the first, a solitary neutron star in our own galaxy shredded and accreted an approaching comet-like body. In the second, a neutron star is engulfed by, spirals into and merges with an evolved giant star in a distant galaxy. Now, thanks to a measurement of the Christmas burst’s host galaxy, astronomers have determined that it represented the collapse and explosion of a supergiant star hundreds of times larger than the sun. Credit: NASA’s Goddard Space Flight Center Scientific Visualization Studio
“We now know that the Christmas burst occurred much farther off, more than halfway across the observable universe, and was consequently far more powerful than these researchers imagined,” said Andrew Levan, an astronomer at the University of Warwick in Coventry, England.
Using the Gemini North Telescope in Hawaii, Levan and his team obtained a spectrum of the faint galaxy that hosted the Christmas burst. This enabled the scientists to identify emission lines of oxygen and hydrogen and determine how much these lines were displaced to lower energies compared to their appearance in a laboratory. This difference, known to astronomers as a redshift, places the burst some 7 billion light-years away.
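As a rough illustration of the measurement described above, the sketch below computes a redshift from the displacement of a single emission line. The wavelength values are invented for the example and are not the actual Gemini measurements; converting the resulting redshift into the quoted distance of roughly 7 billion light-years additionally requires a full cosmological model.

```python
# Illustrative only: example wavelengths, not the actual Gemini/GRB 101225A data.
rest_wavelength_nm = 656.3          # H-alpha as measured in the laboratory
observed_wavelength_nm = 1050.1     # hypothetical value shifted toward lower energies

z = (observed_wavelength_nm - rest_wavelength_nm) / rest_wavelength_nm
print(f"redshift z = {z:.2f}")      # ~0.60 for these made-up numbers

# Turning z into a distance (e.g., the ~7 billion light-years quoted above)
# means integrating a cosmological model (H0, Omega_m, ...), for example with
# astropy.cosmology, rather than applying a single formula.
```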
As a part of this study, which is described in a paper submitted to The Astrophysical Journal, Levan’s team also examined 111209A and the more recent burst 121027A, which exploded on Oct. 27, 2012. All show similar X-ray, ultraviolet and optical emission and all arose from the central regions of compact galaxies that were actively forming stars. The astronomers conclude that all three GRBs constitute a hitherto unrecognized group of “ultra-long” bursts.
The number, duration and burst class for GRBs observed by Swift are shown in this plot. Colors link each GRB class to illustrations above the plot, which show the estimated sizes of the source stars. For comparison, the width of the yellow circle represents a star about 20 percent larger than the sun. Credit: Andrew Levan, Univ. of Warwick
To account for the normal class of long GRBs, astronomers envision a star similar to the sun’s size but with many times its mass. The mass must be high enough for the star to undergo an energy crisis, with its core ultimately running out of fuel and collapsing under its own weight to form a black hole. Some of the matter falling onto the nascent black hole becomes redirected into powerful jets that drill through the star, creating the gamma-ray spike, but because this burst is short-lived, the star must be comparatively small.
“Wolf-Rayet stars fit these requirements,” explained Levan. “They are born with more than 25 times the sun’s mass, but they burn so hot that they drive away their deep, outermost layer of hydrogen as an outflow we call a stellar wind.” Stripping away the star’s atmosphere leaves an object massive enough to form a black hole but small enough for the particle jets to drill all the way through in times typical of long GRBs.
Because ultra-long GRBs persist for periods up to 100 times greater than long GRBs, they require a stellar source of correspondingly greater physical size. Both groups suggest that the likely candidate is a supergiant, a star with about 20 times the sun’s mass that still retains its deep hydrogen atmosphere, making it hundreds of times the sun’s diameter.
Astronomers suggest that blue supergiant stars may be the most likely sources of ultra-long GRBs. These stars hold about 20 times the sun’s mass and may reach sizes 1,000 times larger than the sun, making them nearly wide enough to span Jupiter’s orbit. Credit: NASA’s Goddard Space Flight Center/S. Wiessinger
Gendre’s team goes further, suggesting that GRB 111209A marked the death of a blue supergiant containing relatively modest amounts of elements heavier than helium, which astronomers call metals.
“The metal content of a massive star controls the strength of its stellar wind, which determines how much of the hydrogen atmosphere it retains as it grows older,” Gendre notes. The star’s deep hydrogen envelope would take hours to complete its fall into the black hole, which would provide a long-lived fuel source to power an ultra-long GRB jet.
Metal content also plays a strong role in the development of long GRBs, according to a detailed study presented by John Graham and Andrew Fruchter, both astronomers at the Space Telescope Science Institute in Baltimore.
Stars make heavy elements throughout their energy-producing lives and during supernova explosions, and each generation of stars enriches interstellar gas with a greater proportion of them. While astronomers have noted that long GRBs occur much more frequently in metal-poor galaxies, a few of them have suggested that this pattern is not intrinsic to the stars and their environments.
To examine this possibility, Graham and Fruchter developed a novel method that allowed them to compare galaxies by their underlying rates of star formation. They then examined galaxies that served as hosts for long GRBs and various types of supernovae as well as a control sample of 20,000 typical galaxies in the Sloan Digital Sky Survey.
The astronomers found that 75 percent of long GRBs occurred among the 10 percent of star formation with the lowest metal content. While the study found a few long GRBs in environments with high-metal content, like our own galaxy, these occur at only about 4 percent the rate seen in low-metal environments per unit of underlying star formation.
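A quick consistency check (my own arithmetic, not a calculation from the paper) shows that the two quoted figures hang together: if 75 percent of long GRBs arise in the 10 percent of star formation with the lowest metal content, the remaining 25 percent come from the other 90 percent, and the implied rate per unit of star formation in metal-rich environments is only a few percent of the metal-poor rate.

```python
# Rough consistency check of the quoted statistics (not taken from the paper itself).
frac_grbs_metal_poor = 0.75   # share of long GRBs in metal-poor environments
frac_sf_metal_poor = 0.10     # share of star formation that is metal-poor

rate_metal_poor = frac_grbs_metal_poor / frac_sf_metal_poor              # 7.5
rate_metal_rich = (1 - frac_grbs_metal_poor) / (1 - frac_sf_metal_poor)  # ~0.28

print(f"metal-rich rate / metal-poor rate = {rate_metal_rich / rate_metal_poor:.1%}")
# -> roughly 4%, matching the figure quoted in the article
```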
“Most stars form in metal-rich environments, and this has a side effect of decreasing the prevalence of long GRBs as the universe grows older,” Graham explained. “And while a nearby long GRB would be catastrophic to life on Earth, our study shows that galaxies like our own are much less likely to produce them.”
The astronomers suspect this pattern reflects a difference in how well a massive star manages to retain its rotation speed. Rising metal content means stronger stellar winds. As these winds push material off the star’s surface, the star’s rotation gradually decreases in much the same way as a spinning ice skater slows when she extends her arms. Stars with more rapid rotation may be more likely to produce a long GRB.
Graham and Fruchter hypothesize that the few long GRBs found in high-metal environments received an assist from the presence of a nearby companion star. By feeding mass — and with it, rotational energy — onto the star that explodes, a companion serves as the physical equivalent of someone pushing a slowly spinning ice skater back up to a higher rotational speed.
› Paper: “The Ultra-long Gamma-Ray Burst 111209A: The Collapse of a Blue Supergiant?” B. Genre et al.
› Paper: “A new population of ultra-long duration gamma-ray bursts.” A.J. Levan et al.
› Paper: “The Metal Aversion of LGRBs.” J. F. Graham and A. S. Fruchter.
Source: Francis Reddy, NASA’s Goddard Space Flight Center; NASA
Images: NASA/Swift/B. Gendre (ASDC/INAF-OAR/ARTEMIS); NASA’s Goddard Space Flight Center, after B. Gendre (ASDC/INAF-OAR/ARTEMIS); Andrew Levan, Univ. of Warwick; NASA’s Goddard Space Flight Center/S. Wiessinger
astronomy, astrophysics, Fermi Gamma-Ray Telescope, NASA, NASA Swift Satellite
Tokyo Electron Withdraws From Solar Panel Manufacturing Equipment Business
Kenji Kaneko, Nikkei BP CleanTech Institute
Manufacturing equipment of Switzerland-based TEL Solar AG (source: TEL Solar AG)
Tokyo Electron Ltd announced Jan 30, 2014, that its board of directors passed a resolution to withdraw from the thin-film photovoltaic panel production equipment (PVE) business on the day.
In 2009, Tokyo Electron became the Asia-Oceania region sales representative for Oerlikon Solar and started sales and marketing of an end-to-end manufacturing line for thin-film silicon (Si) solar panels. In 2012, Tokyo Electron acquired Oerlikon Solar and made a full-scale entry into the thin-film Si solar panel market.
However, due to an oversupply of solar panels and loose supply and demand balance of crystalline Si materials, the demand for amorphous (thin-film) Si solar panels, which had been expected to grow because amorphous Si solar panels do not consume crystalline Si much, did not take off. In addition, the conversion efficiency of thin-film Si solar panels did not increase as much as expected, lowering their competitiveness against crystalline Si solar panels.
Tokyo Electron made efforts to strengthen its development activities and lower costs. But it decided to stop the production, development and marketing of its solar panel production equipment because uncertainties in market recovery made it difficult to recoup its investment. The company will scale down the business unit, which will provide only after-sales services.
The bases of the PVE business are Switzerland-based TEL Solar AG and Technology Center Tsukuba (Tsukuba City, Ibaraki Prefecture, Japan). The business unit posted sales of ¥83 million (approx US$809,914) for the fiscal year ending March 31, 2013. It accounted for only 0.01% of the Tokyo Electron group's consolidated sales.
Tokyo Electron will consider transferring the employees in the business unit to other units of the group. But, for its Swiss subsidiary, the company will consider reducing personnel to streamline the organization in accordance with the downsizing of the business.
Tokyo Electron announced Dec 18, 2013, that it would post an extraordinary loss due to the revision of its business plan including the PVE business. Of the extraordinary loss of ¥46.7 billion posted for the third quarter of the fiscal year ending March 31, 2014, ¥32.6 billion was posted as an impairment loss to goodwill and fixed assets related to the PVE business.
URL: http://techon.nikkeibp.co.jp/english/NEWS_EN/20140131/331301/?ST=msbe
The Ovum Factor
A Novel by Marvin L. Zimmerman
Destruction of Earth's ecology threatens the survival of humanity. With time ticking away, a clandestine think tank of scientists and world leaders has identified our last hope - the controversial research of a Nobel Prize - winning professor aimed at unleashing the power of a unique molecule that can alter the course of human history.
When David Rose, a young investment banker from New York, is assigned to evaluate the professor's research, he soon becomes swept up in a whirlwind of international espionage, assassination, and sabotage. David finds himself on a journey that takes him to the unexplored depths of the Amazon in order to fulfill two ancient prophecies for saving mankind and at the same time to realize his own destiny.
From New York to California, from China to the slums of Rio de Janeiro, and into the Amazon, the search for the mysterious source of this rare molecule will take you into the heart of the unknown and unseen forces of nature.
Sencha launches HTML 5 cloud service, receives $15M
Meghan Kelly October 25, 2011 12:07 PM
Tags: html 5, Jafco Ventures, Michael Mullany, mobile applications, Radar Partners, Sencha, Sequoia Capital

Sencha, provider of HTML 5 tools for web and mobile developers, received $15 million in a second round of funding led by Jafco Ventures.
HTML 5 applications can make beautiful, interactive experiences on PCs and mobile devices. But there are some drawbacks to using HTML 5 that have held the marketplace back from rushing into the field. These apps are unable to connect to a device's own native apps and data. For example, an HTML5 app running on an iPhone wouldn't be able to access the camera or address book, because those live within the mobile operating system, which HTML 5 exists apart from. But, aside from these drawbacks, there are many pros to developing with it.
“The future is a multi-device world,” said Michael Mullany, chief executive of Sencha, in an interview with VentureBeat. “The web is really the only platform that can span [many different devices] … you’re not going to have a native platform that is going to talk to all of those.”
Mullany feels it would be an arduous task to create native applications with the same user experience across multiple operating systems and devices. Because HTML 5 apps are web-based, they can provide a cohesive experience across those devices, including your tablet, phone, computer and even television.
“People are going to expect that user experiences across these devices will be shared experiences,” he said.
In addition to delivering on this expectation, Sencha plans to use the funding for more product announcements, small technology acquisitions and expanding developers' abilities. In fact, the company announced a new cloud product yesterday, Sencha.io, which allows developers to create HTML 5 web apps without having to deal with server-side coding and hosting their web app.
Sencha is located in Redwood City, Calif. and was founded in 2010. The company received a $14 million first round of funding led by Sequoia Capital in June 2010. Both Sequoia Capital and Radar Partners participated in this round.
Rules for Flying Your Drone By Robbie Harris
Jul 1, 2014. Stock photo credit: Vince LoPresti/Flickr via NPR.

With all the talk about drones, you'd think the industry was ready to take off. But not until the Federal Aviation Administration determines safety rules for commercial U.S. airspace. And that could be more than two years away. Meantime, that means no flying commercial drones without a special permit.

Virginia Tech and its research partner, the Center for Naval Systems at Dahlgren, are one of six sites in the country tasked with testing drones for use in commercial airspace. The Federal Aviation Administration will use their results to inform its rules and regulations. But until those rules come out, if you're flying one, without a permit, for anything other than private use, you are violating FAA rules.

"I think it's fair to say that the technology is a little bit ahead of the regulatory scheme right now," says Jon Green of the Institute of Critical Technologies and Applied Science at Virginia Tech. He has been working on autonomous aviation systems for decades. He's overseeing the testing. "The test sites are intended to start collecting that data and providing them to the FAA so they can make risk-based, data-informed decisions about what should and should not fly."

Green says drone testing will include not only questions of safety, but also privacy and ecological concerns. The FAA has asked each test site to come up with a plan. "Basically it says that we're going to be good citizens and make sure that we're not unintentionally collecting information on individuals, and that if we do unintentionally collect it we handle it appropriately."

Green says it is how drones are ultimately used that will determine their impact on society. "One of the things that I always say is that a hammer is a wonderful tool but it also can make a pretty nasty weapon. And I think you could say the same about unmanned aircraft systems. They could be used as weapons, they could be used to intrude on people's privacy, and they can also be used for search and rescue and bridge inspections, pipeline inspections, and agricultural efficiency that really have a redeeming social value. And that's what these test sites are about, to help us figure out how we use these things well."

Testing is expected to get underway next month, and the hope is that a set of rules will be ready for public comment by late autumn. After that, it could take more than two years for rules and regulations to be adopted.
After a "year of records," Airbus sets its sights on continued industry leadership in 2012
17 January 2012. Airbus will build on last year's highly successful commercial performance as the company pursues its industry leadership in 2012, with the coming 12 months to include further production ramp-up for the jetliner families to meet a growing order backlog, supported by additional growth in its employee count.
Speaking to international journalists today at the New Year's press conference organised with its EADS parent company, Airbus President and CEO Tom Enders said the planned production acceleration will raise the output of Airbus’ best-selling A320 Family in 2012 to an all-time high of approximately 42 aircraft monthly, with the A380 cadence moving toward three per month, and production of the A330 rising to meet a goal of 9.5 jetliners monthly.
To meet these output rates – while also securing major programme milestones for the new A350, A320neo and A400M aircraft – Airbus will maintain a focus on expanding its workforce’s size, skills and talent – adding at least 4,000 more employees in 2012, which follows the 4,500 new recruits hired in 2011. With a focus on providing “supreme value” through the quality of its staff, Enders said Airbus spends some 100 million euros annually in employee training.
During today’s press conference, held at the company’s Hamburg, Germany industrial facility, Enders outlined Airbus’ most successful year ever in 2011, during which it logged a total 1,608 commercial gross orders (representing 56 per cent of the global market share by value), resulting in 1,419 net orders (54 per cent of market share by value) when cancellations were taken into account.
Leading the new business volume in 2011 were the total 1,226 bookings received for the A320neo new engine option – confirming its position as the fastest-selling commercial jetliner ever, and bringing overall sales for the entire A320 Family to 8,292 aircraft as of 31 December 2011. The A380 order book also increased last year, with 29 total gross orders received from both new and repeat customers. Airbus delivered 26 of these very large jetliners in 2011– including a new high of four aircraft provided in the month of December. To date, the 21st century flagship aircraft has carried an estimated 18 million passengers in 50,000 commercial flights as the A380’s global route network continues to grow.
Enders said Airbus’ A330 continues to be “the right aircraft, right now,” with a total of 99 new gross orders booked in 2011. Its long-term popularity was underscored by the 10 customers that made repeat A330 acquisitions last year, including Cathay Pacific – which placed its ninth order for the twin-engine jetliner. Airbus’ latest version, the A330-200F freighter, received seven new orders in 2011, while four of these aircraft were among the year’s deliveries.
With its exceptional sales performance, and the record 534 commercial jetliners delivered in 2011, the company ended last year with a backlog of 4,437 aircraft – representing a combined list price value estimated at $588 billion.
John Leahy, the Airbus Chief Operating Officer – Customers, said 2011 marked the 10th consecutive year of production increases at the company, positioning Airbus as the world’s leading aircraft manufacturer in nine of these past 10 years.
“In such a cyclical industry as ours, Airbus is proud of how it has handled the smooth, continual ramp-up in deliveries over the past decade, which is much different than the ups-and-downs of our competitor’s output,” Leahy explained. “We work very hard to actively manage the way we build aircraft – and the airlines agree this is a much better way to do it.” Looking to the future, Leahy said air travel will continue to be a growth market, with the continued trend of air traffic doubling every 15 years. Airbus’ outlook foresees commercial orders for 600-650 of its commercial jetliners in 2012, with a target of 570 deliveries to be made during the year. Airbus Chief Operating Officer Fabrice Bregier said major programme milestones during the coming 12 months will include the start-up of final assembly for the first two A350s – with one airframe serving for ground tests and the other to be the initial flight test aircraft – along with certification of the Sharklets fuel-saving wingtip devices for the A320, and the production ramp-up for Airbus Military’s A400M airlifter.
Deep Impact spacecraft eyes Comet ISON
The Comet ISON imaging campaign is expected to yield infrared data and light curves, which are used in defining the comet's rotation rate, in addition to visible-light images.
By Jet Propulsion Laboratory, Pasadena, California, NASA Headquarters, Washington, D.C. | Published: Thursday, February 07, 2013
This image of Comet ISON (C/2012 S1) was taken by the Medium-Resolution Imager of NASA's Deep Impact spacecraft.

NASA's Deep Impact spacecraft has acquired its first images of Comet ISON (C/2012 S1). The spacecraft's Medium-Resolution Imager took the images over a 36-hour period January 17–18, 2013, from a distance of 493 million miles (793 million kilometers). Many scientists anticipate a bright future for Comet ISON; the spaceborne conglomeration of dust and ice may put on quite a show as it passes through the inner solar system this fall.

"This is the fourth comet on which we have performed science observations and the farthest point from Earth from which we've tried to transmit data on a comet," said Tim Larson of NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California. "The distance limits our bandwidth, so it's a little like communicating through a modem after being used to DSL. But we're going to coordinate our science collection and playback so we maximize our return on this potentially spectacular comet."

Deep Impact has executed close flybys of two comets — Tempel 1 and Hartley 2 — and performed scientific observations on two more — Comet Garradd and now ISON. The Comet ISON imaging campaign is expected to yield infrared data and light curves, which are used in defining the comet's rotation rate, in addition to visible-light images. A movie of Comet ISON was generated from initial data acquired during this campaign. Preliminary results indicate that, although the comet is still in the outer solar system, more than 474 million miles (763 million km) from the Sun, it is already active. As of January 18, the tail extending from ISON's nucleus was already more than 40,000 miles (64,400 km) long.

Long-period comets like ISON are thought to arrive from the solar system's Oort Cloud, a giant spherical cloud of icy bodies surrounding our solar system so far away that its outer edge is about a third of the way to the nearest star other than our Sun. Every once in a while, one of these loose conglomerations of ice, rock, dust, and organic compounds is disturbed out of its established orbit in the Oort Cloud by a passing star or the combined gravitational effects of the stars in the Milky Way Galaxy. With these gravitational nudges, so begins a comet's eons-long, arching plunge toward the inner solar system.

Two Russian astronomers using the International Scientific Optical Network's 16-inch (40 centimeters) telescope near Kislovodsk discovered Comet ISON on September 21, 2012. NASA's Near-Earth Object Program Office, based at JPL, has plotted its orbit and determined that the comet is more than likely making its first-ever sweep through the inner solar system. Having not come this way before means the comet's pristine surface has a higher probability of being laden with volatile material just waiting for some of the Sun's energy to heat it up and help it escape. With the exodus of these clean ices could come a boatload of dust, held in check since the beginnings of our solar system. This released gas and dust is what is seen on Earth as comprising a comet's atmosphere — coma — and tail.

Comet ISON will not be a threat to Earth, getting no closer to Earth than about 40 million miles (64 million km) December 26, 2013. But stargazers will have an opportunity to view the comet's head and tail before and after its closest approach to the Sun if the comet doesn't fade early or break up before reaching our star.
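The article notes that light curves from the imaging campaign are used to define the comet's rotation rate. As a hedged sketch of how a rotation period can be extracted from unevenly sampled brightness measurements, the Python below runs a Lomb-Scargle periodogram on synthetic data; the 11-hour period, noise level, and sampling are invented for illustration and are unrelated to Deep Impact's actual ISON measurements.

```python
import numpy as np
from astropy.timeseries import LombScargle

# Synthetic light curve: brightness sampled unevenly over ~36 hours.
# The 11-hour rotation period below is an arbitrary illustrative choice.
rng = np.random.default_rng(0)
t_hours = np.sort(rng.uniform(0, 36, 200))
true_period = 11.0
flux = 1.0 + 0.05 * np.sin(2 * np.pi * t_hours / true_period) + rng.normal(0, 0.01, t_hours.size)

# Lomb-Scargle handles the uneven sampling typical of real observing campaigns.
frequency, power = LombScargle(t_hours, flux).autopower()
best_period = 1 / frequency[np.argmax(power)]
print(f"recovered rotation period of about {best_period:.1f} hours")
```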
Vodafone to buy Spanish telecoms company Ono for $10 billion, expanding further in Europe. By The Canadian Press
Mar 17, 2014 / 1:38 pm
LONDON - British telecommunications company Vodafone agreed on Monday to buy Spain's Ono for 7.2 billion euros ($10 billion) as it seeks to expand operations across Europe.

The deal is part of a broader trend of mergers and takeovers in Europe, where the mobile industry is split among some 150 major operators crisscrossing national lines — compared to just four in the United States.

Grupo Corporativo Ono S.A. provides phone, mobile and television services to 1.9 million customers and has the largest "next-generation network" in Spain, reaching 7.2 million homes, or 41 per cent of the country. Vodafone says Ono has abundant spare capacity, giving it space to expand.

"Demand for unified communications products and services has increased significantly over the last few years in Spain, and this transaction - together with our fiber-to-the-home build program - will accelerate our ability to offer best-in-class propositions in the Spanish market," Vodafone Chief Executive Vittorio Colao said in a statement.

Vodafone is flush with cash after agreeing last year to sell its stake in a U.S. venture to Verizon for $130 billion in cash and stock — one of the biggest deals in corporate history.

Monday's deal marks Vodafone's second major acquisition in Europe, after its purchase of Kabel Deutschland last year. Vodafone was attracted by Kabel Deutschland's extensive cable network, which it could use to expand its fixed-line, broadband and television business.

The deal allows Vodafone to accelerate its expansion in Europe, save costs in its Spanish operations and take advantage of the rapid increase in the adoption of unified communications products and services in the Spanish market. Ono has invested approximately 7 billion euros in its network since 1998.

Analysts were divided over the potential benefits of the deal.

"Spain has been the source of losses for several quarters, and bundling Vodafone's mobile services with popular broadband and cable TV offerings is the only realistic choice to drive up customer retention, new subscribers and per-customer revenues," said Jason Sumner, technology analyst at The Economist Intelligence Unit.

But James Barford, an analyst at the London-based firm Enders Analysis, said the price was high and was skeptical that the focus on "quad play" — the industry term for bundling phone, broadband, mobile and TV services — will pay off.

"It's a little bit the tail wagging the dog in terms of justifying such a high cost," he said. "There's an assumption that 'quad play' is essential, but there really isn't evidence that consumers have a strong desire to buy the fixed line and the mobile services together."

Associated Press writer Sylvia Hui contributed to this story.
Appeared on: Wednesday, February 17, 2010
Internet Explorer 9 To Be Announced at MIX 10
Microsoft is expected to talk about the upcoming Internet Explorer 9 (IE9) at MIX 10 next month.
Dean Hachamovitch, the person responsible for the design, development, and release of Internet Explorer at Microsoft, will be on stage as one of the MIX10 keynoters in Las Vegas next month. "Dean will talk about changes and improvements that have been made to Internet Explorer 9 since PDC09, and his talk is sure to include a couple of surprises," reads a post at Microsoft's MIX10 blog.
Microsoft has already provided some information about IE9 at its Professional Developers Conference 2009 last November. The company had shared an early look at some of the work under way on Internet Explorer 9 (IE9), highlighting advancements in performance and interoperable standards, as well as advancements such as using DirectX to bring the full power of the PC to the Web experience. Microsoft said that Internet Explorer 9 has nearly closed the JavaScript performance gap between itself and rivals made by Mozilla and Google, even though the browser is still at an early stage of development. Microsoft added that IE9 had an improved score in the Acid3 benchmark, which checks how closely a browser follows certain standards, particularly specifications for Web 2.0 applications, as well as standards related to DOM (Document Object Model), CSS2 (Cascading Style Sheets) and SVG (Scalable Vector Graphics). The question is whether Microsoft will also offer a downloadable preview of Internet Explorer 9 at MIX 10.
MIX 10 is a three-day conference for web designers and developers. Organized by Microsoft, it will be held in Las Vegas March 15-17, 2010.
Here’s How Facebook Fixed One Privacy Issue But Created Another
Mark Knapp
Source: newsroom.fb.com
Facebook’s (NASDAQ:FB) rolling out some changes, and as with every change that comes with the social network, privacy concerns are at the forefront. This time around, Zuckerberg’s company may be taking one big step in the right direction. But, at the same time, it may also be taking another step in the wrong direction.
It was recently announced that Facebook was going to change its system a little bit so that new users’ published content on Facebook would automatically opt to go only to their friends, instead of going public to be seen by anyone, reports The New York Times. Previously, a new post would default to the public option if a user didn’t select a specific group of people to show it to, and that was a bit of an upset to users concerned about their privacy. Sure, it’s easy enough to switch a post from public to private, but it’s also easy enough to forget, and Facebook not taking that into consideration was a point of upset for some.
Thankfully, Facebook has remedied that by switching gears, as mentioned, but also by making the audience selection tool a bit more noticeable. In addition, the social network is adding a “Privacy Checkup” feature that allows users to be sure of what they are and aren’t broadcasting to the whole world so they can make the adjustments they need to protect their privacy.
Now concerned users just have to hope that Facebook doesn’t switch gears all over again with an overhaul of the privacy policy in the future.
LittleBits: On a mission to make electrical engineering fun
What is LittleBits? LittleBits is a system of electronic modules that snap together with magnets. We built LittleBits to break the boundaries between the products we consume and the things we make, and to make everyone into an inventor. (Christine Lagorio-Chafkin, Inc.com)
'I want to make electronics accessible to everyone.'
It takes a truly outstanding opportunity to make a natural jack-of-all-trades focus on it and it alone.Ayah Bdeir was an accomplished electronic artist armed with a degree in computer engineering and a master's degree from the MIT Media Lab when, in 2011, she packed up and moved to New York City from Beirut, where she founded and ran a maker space. The moment of truth came in the form of a set of click-together electrical components that Bdeir had designed as her future company's first prototype.
She opened the box housing the prototype, assembled the pieces, and showed her boyfriend. "He said, 'You have to quit everything else you are doing and just work on this!'" Bdeir explains. "That's when it became real."

The now 31-year-old incorporated LittleBits later that year and has since grown her venture to 41 employees in an office near New York City's Union Square. Collectively, they dream up, design, and sell dozens of roughly domino-size electrical modules that snap together magnetically.
Popular among artists, designers, and hobbyists — or anyone interested in experimentation and rapid prototyping of electrical inventions — LittleBits kits can be purchased from the company's website. The company also sells individual components, which include power sources, connectors, and tools such as a dimmer switch, a motion sensor, and a synth speaker, ranging from $8 to $36 apiece.

And though these tiny snap-together modules are brightly colored, they're anything but playthings. Teachers are using LittleBits to aid instruction of circuits, design, and creativity in more than 2,000 schools. Once assembled, devices could easily make a home security system, a musical instrument, or even a technicolor dreamcoat.

Most recently, a partnership with NASA allowed the company to release a package with everything needed to build a mini Mars rover. Unlike with Legos, once assembled, the device, which costs $189, actually roves. "What we've done is taken really complex technology and turned it on its head," Bdeir says.
The transformation from independent artist to entrepreneur and CEO has been a significant one for Bdeir, who frequently consults with mentors, investors, and coaches. Brad Feld, an investor in LittleBits who is managing director of the Foundry Group in Boulder, Colorado, says she's a natural leader suited for the job due to her passion, as well as her skills. "One of the things we look for early is an entrepreneur who is totally obsessed with their product. I've seen that," Feld says. "On top of that, she's been a great strategic thinker."

The company does not disclose sales figures or profitability measures. But it did raise $15.6 million in venture capital funding to fuel its growth. Despite this largess, the company had growing pains in its early months, selling out its first product in two weeks and repeatedly struggling to meet demand for its modules.

"The first eight months of the company, I was really heads-down, because we were getting attacked by demand, and it was just continuous working to build up the team and the products," says Bdeir, who says she wasn't comparing herself or LittleBits with any other company or thinking of the issues as "founder problems." "I'm a problem solver," she says. "I'm a maker, and if there's a problem, I will get my hands dirty and fix it."

Bdeir's long-term vision for the company goes far beyond selling teaching and tinkering tools that let people play with creating circuits to emit light, sound, and motion. LittleBits is out to democratize electrical engineering. "Printing has been democratized — anyone can do it," Bdeir says. "Software has been democratized. But hardware is still only reserved for experts and is not a creative tool. I want to make electronics accessible to everyone."
The Evolution of the Tablet PC
By Shane O'Neill, CIO | Oct 12, 2012 12:56 PM
We live in explosively innovative times for tablet computing and mobile apps. But it didn't all happen overnight. Most attempts to build a tablet-like computer, going back to the '70s, were not successful. Yet every failure was a lesson learned that led us to the iPad. Here's a look back at how the modern tablet came to be.
The Tablet Revolution
Since the wildly popular iPad debuted in April 2010, slender touch-friendly tablets have increasingly become part of our work and personal lives. Twenty-two percent of all U.S. adults now own tablets, according to a report from the Pew Research Center. New tablets will continue to evolve before our eyes, but the "tablet" has been in development since the 1960s and available to consumers and businesses in some fashion since the '80s. However, the technology was never quite right for widespread use … until now.
The Dynabook (1968)
The Dynabook, created in 1968 by computer scientist Alan Kay, was a prototype and never an actual product, but it was the blueprint for the modern laptop and tablet PC. The Dynabook was about two pounds, with an integrated physical or touchscreen keyboard and all early elements of a GUI. The target audience was children, who could use the device to connect to remote servers to access text and graphics for schoolwork. The Dynabook was never built because the technologies needed did not exist. But it was a major inspiration for the PCs, graphics and multimedia that would follow.
Apple Graphics Tablet (1979)
Back in the late '70s the iPad was starry-eyed science fiction. But Apple had some ideas and commissioned a device called the Apple Graphics Tablet. It wasn't a tablet in the modern sense because it had no mobility. It was an accessory that connected to an Apple II computer so a user could draw images on the tablets with a wired stylus pen that would appear on the Apple II screen. Priced at $650 the Apple Graphics Tablet was a novelty and not a great success. It was soon replaced by a more efficient device for screen navigation: the mouse.
GRiDpad (1989)
While desktop computing flourished in the 1980s, tablet innovation crawled along. But the GRiDpad, appearing in 1989, was an early breakthrough. It was a rugged MS-DOS computer with a 10-inch stylus-ready screen with advanced handwriting-recognition capability in place of a physical keyboard. The GRiDpad was over-priced at $2,370, putting it in the luxury device category for late-'80s technophiles. But look closely and you will see that it had a big influence on the PalmPilot and Apple Newton that would appear in the '90s.
AT&T EO Personal Communicator (1993)
The EO PC was AT&T's venture into the fledgling personal digital assistant (PDA) market. The device was created by GO Corp., which was acquired by AT&T mainly for its PenPoint operating system. The $2,500 pen-based device was jam-packed with ports and features, which may have overwhelmed business customers. It contained I/O ports for a modem, parallel, serial, VGA out and SCSI, and came with a wireless cellular modem, a built-in microphone and a free subscription to AT&T EasyLink Mail for both faxing and e-mail. The EO PC never reached mainstream use and was shut down in July of 1994.
Apple Newton MessagePad (1993)
The Apple Newton launched in 1993 with great fanfare. It was the device that helped coin the acronym "PDA", and aimed to redefine personal computing. It didn't go so well. Its initial launch was plagued with complaints about price ($700 at launch), weak battery life (four AAA batteries!) and handwriting-recognition software that didn't always work. The negative perceptions were hard to shake. Steve Jobs, not around during the Newton's development, was famously not fond of the device. He killed it when he returned in 1997 and applied the Newton lessons to the methodical development of the iPhone and iPad.
PalmPilot (1996)
The Newton faced stiff competition in 1996 with the PalmPilot from Palm, Inc. The PalmPilot 1000 was smaller and cheaper than the Newton and released when early cellphones only made calls. The PalmPilot and its subsequent versions had the PDA competition beat on price ($300), battery life, calendar features, glass touch-screen, the ability to sync with a desktop PC, space for 500 names and addresses, and expandable memory. The PalmPilot enjoyed household name status until all-in-one smartphones like RIM's BlackBerry changed the game in 2003. But the PalmPilot is remembered as the first tablet-like device with major mainstream appeal.
Fujitsu Stylistic 2300 (1998)
Fujitsu released its first and only production tablet in 1998, and is notable as one of the first tablets with a color touchscreen. It was a pen-based tablet that ran Windows 95 and 98. It also bucked the problem of outdoor light reflection by using a transreflective display to reduce glare. Despite these influential features, the manufacturing costs of building the Stylistic 2300 kept the selling price (a mind-blowing $4,485) too high for most businesses and the production of the device stopped not long after its release.
Microsoft Smart Display (2002)
Smart Display (codenamed Mira) was one of Microsoft's bigger flops. It was a wireless monitor available in 10- or 15-inch sizes that had some mobility but ultimately had to be tethered to a Windows PC. It relied on the slower Wi-Fi networks of the era and had other limitations such as only allowing one Smart Display to connect to a PC at a time and locking the PC it was tethered to while in use. The lack of standalone capability and high price tag ($1000 - $1500) were the Smart Display's undoing. It was discontinued in late 2003.
Windows XP Tablet PC (2002)
At the turn of the century, Microsoft had been trying to release a Windows tablet for years (it has been trying ever since and will try again with Windows 8). In 2002, it launched the stylus-based Windows XP Tablet PC. Its strength was in its handwriting recognition software, but the tablet tried to jam PC-level RAM, storage and CPU into a tablet. As a result, it was too heavy and expensive and was hampered by poor performance and battery life. Despite the best efforts of hardware-makers like Compaq and ViewSonic, the customers never arrived for the Windows XP Tablet.
Axiotron Modbook (2007)
The Modbook was the iPad's clumsy predecessor that essentially converted a MacBook computer into a tablet device. Because it replicated the MacBook, the power-hungry Modbook had mostly the same technical specs as a 2007 MacBook. The device was not produced by Apple and the customer needed to provide his or her own MacBook for the Modbook to work. Axiotron charged $800 for the Modbook, but with the addition of the MacBook, the price was closer to $2,000. Needless to say, it was not the most cost effective arrangement.
Amazon Kindle E-Reader (2007)
The Amazon Kindle is a series of e-book readers now in its fifth generation. The Kindle let users shop for, download, browse, and read e-books, newspapers, magazines, blogs, and other digital media via Wi-Fi. The Kindle was not the first e-reader but it was the one that brought the medium to the mainstream, and e-reading remains an important feature of all touch tablets. The Kindle also showed Amazon the value of using a mobile device to give users direct access to the company's massive catalog of content.
HP Windows 7 Slate (2010)
This Hewlett-Packard Windows 7 touch tablet announced by Microsoft CEO Steve Ballmer at CES 2010 generated some looks but never went anywhere. The slates were basically color e-readers with an 8.9-inch display running Windows 7. Apple announced the iPad a few weeks later and everyone, realizing that Windows 7 just wasn't designed for tablets, quickly forgot about the HP Slate. Microsoft looked asleep at the wheel. If anything, the iPad's sudden popularity gave Microsoft the motivation to design Windows 8 from the ground up to work on touch tablets.
Apple iPad (2010)
Finally! Years of failed tablet releases culminated in a touch-screen tablet and media consumption device that was fun and easy to use. Apple got it right with the iPad, and the public pounced. Offering a beautifully designed 9.7-inch device containing thousands of apps, the iPad was reasonably priced (starting at $500) with different storage, Wi-Fi and 3G options. Though some skeptics saw it as merely a bigger iPhone, the iPad quickly became a sensation. Copy cats soon followed. Now in its third-generation, the iPad has added an ultra-sharp Retina display and 4G connectivity.
Samsung Galaxy Tab (2010)
Introduced in Sept. 2010, the Samsung Galaxy Tab was one of the first Android devices released post-iPad. With its 7-inch screen, the Galaxy Tab was smaller than the iPad and could be held comfortably in one hand. But this iPad counter-programming feature wasn't enough to give the Galaxy Tab much headway. Its size, however, was an influence on the more successful 7-inch tablets to come such as the Amazon Kindle Fire and Google Nexus 7.
Motorola Xoom (early 2011)
The Motorola Xoom, introduced with high expectations at CES 2011, is one of the first Android-based tablets with similar specs to the iPad and the first tablet to run Android 3.1 (Honeycomb). But the iPad had seduced the public by the time the Xoom came out, and the Xoom's bigger display (10.1 inches) was a non-factor. Some instances of buggy software and a high price ($799 unsubsidized on Verizon, $600 for Wi-Fi-only) didn't help.
RIM BlackBerry PlayBook (2011)
Sensing a tablet groundswell, BlackBerry-maker RIM got in the game with the 7-inch PlayBook in April of 2011, the first device to run BlackBerry Tablet OS. The reaction was mixed at best and RIM has had to cut prices to boost sales. The BlackBerry smartphone has been falling behind sleeker touch-screen phones like the iPhone and Android-based smartphones, and there's a BlackBerry stigma that has hurt the PlayBook. In addition, early PlayBook criticisms such as low pixel density and requiring a connection to a BlackBerry smartphone for native email and calendar apps created lasting negative perceptions.
Cisco Cius (2010)
The Cisco Cius was a short-lived, business-oriented, Android tablet that served as a conduit to Cisco's unified communications software. The Cius was announced in June of 2010 and discontinued on May 24, 2012. The 7-inch tablet was a victim of the iPad's accidental popularity at enterprises and the BYOD movement. Cius' demise proved that the iPad had killed the idea of an "enterprise only" tablet. Cisco learned the important lesson that it is better off providing software that works on different mobile OSes than trying to create an endpoint device of its own.
Amazon Kindle Fire (2011)
Amazon realized it couldn't survive the media consumption magic of the iPad with just a Kindle e-reader. Hence the Kindle Fire, a 7-inch touchscreen tablet with Web connectivity and access to all the content in the Amazon Appstore. The Kindle Fire first debuted in Sept. 2011 and Amazon released a new version in Sept. 2012, the Kindle Fire HD, available in 7- and 8.9-inch models. The device runs its own version of Google Android and has been making money on the digital content in the Appstore more than the hardware. It's a risk that has so far paid off.
Barnes & Noble Nook Tablet (2011)
Nook is a touchscreen tablet sold by Barnes & Noble that expands on the Nook e-reader to compete with other 7-inch tablets such as the Kindle Fire and the Google Nexus 7. Like the Kindle Fire, the Nook runs on its own customized version of Android and its own customized Nook Store for apps and content. It is priced about the same as the Kindle Fire and it's worth noting that Barnes & Noble is scheduled to release new Nooks next month in the same vein as the new Kindle Fire HD devices (7-inch and 9-inch models).
Google Nexus 7 (2012)
Google joined the 7-inch tablet fray in June of 2012 with the Nexus 7, a touchscreen tablet that takes full advantage of the Android OS and the Google Play app store for apps, movies, TV shows and music. The tablet, designed by Google in conjunction with Asus, has been well-received by critics and poses a major threat to the Kindle Fire in the smaller tablet category. However, after a flashy launch in June the Nexus 7 has been surprisingly quiet.
Windows 8 Tablets (2012)
Windows 8 is not available yet (it will launch on Oct. 26), but the buzz has reached a fever pitch because THIS is the Windows that will finally work on touch-screen tablets (both x86 and ARM processors). Hardware makers such as Lenovo, Samsung and Asus have Windows 8 tablets in the works and Microsoft has also taken matters into its own hands with Surface, a Microsoft-branded Windows 8 tablet. The tablets will feature the new Windows 8 tile-based user interface, run full versions of Office and will be the size of iPads or even bigger. No 7-inch weenies here.
iPad Mini (2012?)
Will Apple enter the 7-inch tablet race? There is no official release date yet, but it is assumed Apple will come out with a smaller, cheaper version of the iPad to compete with the uprising of the Kindle Fire, Google Nexus 7 and Barnes & Noble Nook. Rumor has it an announcement could happen this month, a move that will no doubt change the dynamics of the constantly-evolving tablet market.
Nikola Tesla Thread - All About Tesla - Get Ya Tesla Here.
Had to happen sooner or later.
http://www.pbs.org/tesla/ll/ll_mispapers.html
One of the more controversial topics involving Nikola Tesla is what became of many of his technical and scientific papers after he died in 1943. Just before his death at the height of World War II, he claimed that he had perfected his so-called "death beam." So it was natural that the FBI and other U.S. Government agencies would be interested in any scientific ideas involving weaponry. Some were concerned that Tesla's papers might fall into the hands of the Axis powers or the Soviets.
The morning after the inventor's death, his nephew Sava Kosanović hurried to his uncle's room at the Hotel New Yorker. He was an up-and-coming Yugoslav official with suspected connections to the communist party in his country. By the time he arrived, Tesla's body had already been removed, and Kosanović suspected that someone had already gone through his uncle's effects. Technical papers were missing as well as a black notebook he knew Tesla kept – a notebook with several hundred pages, some of which were marked "Government."
P. E. Foxworth, assistant director of the New York FBI office, was called in to investigate. According to Foxworth, the government was "vitally interested" in preserving Tesla's papers. Two days after Tesla's death, representatives of the Office of Alien Property went to his room at the New Yorker Hotel and seized all his possessions.
Dr. John G. Trump, an electrical engineer with the National Defense Research Committee of the Office of Scientific Research and Development, was called in to analyze the Tesla papers in OAP custody. Following a three-day investigation, Dr. Trump concluded:
His [Tesla's] thoughts and efforts during at least the past 15 years were primarily of a speculative, philosophical, and somewhat promotional character often concerned with the production and wireless transmission of power; but did not include new, sound, workable principles or methods for realizing such results.

Just after World War II, there was a renewed interest in beam weapons. Copies of Tesla's papers on particle beam weaponry were sent to Patterson Air Force Base in Dayton, Ohio. An operation code-named "Project Nick" was heavily funded and placed under the command of Brigadier General L. C. Craigie to test the feasibility of Tesla's concept. Details of the experiments were never published, and the project was apparently discontinued. But something peculiar happened. The copies of Tesla's papers disappeared and nobody knows what happened to them.
In 1952, Tesla's remaining papers and possessions were released to Sava Kosanović and returned to Belgrade, Yugoslavia where a museum was created in the inventor's honor. For many years, under Tito's communist regime, it was extremely difficult for Western journalists and scholars to gain access to the Tesla archive in Yugoslavia; even then they were allowed to see only selected papers. This was not the case for Soviet scientists who came in delegations during the 1950s. Concerns increased in 1960 when Soviet Premier Khrushchev announced to the Supreme Soviet that "a new and fantastic weapon was in the hatching stage."
Work on beam weapons also continued in the United States. In 1958 the Defense Advanced Research Projects Agency (DARPA) initiated a top-secret project code-named "Seesaw" at Lawrence Livermore Laboratory to develop a charged-particle beam weapon. More than ten years and twenty-seven million dollars later, the project was abandoned "because of the projected high costs associated with implementation as well as the formidable technical problems associated with propagating a beam through very long ranges in the atmosphere." Scientists associated with the project had no knowledge of Tesla's papers.
In the late 1970s, there was fear that the Soviets may have achieved a technological breakthrough. Some U.S. defense analysts concluded that a large beam weapon facility was under construction near the Sino-Soviet border in Southern Russia.
The American response to this "technological surprise" was the Strategic Defense Initiative announced by President Ronald Reagan in 1983. Teams of government scientists were urged to "turn their great talents now to the cause of mankind and world peace, to give us the means of rendering these nuclear weapons impotent and obsolete."
Today, after a half-century of research and billions of dollars of investment, the SDI program is generally considered a failure, and there is still no realistic means of defense against a nuclear missile attack.
For many years scientists and researchers have sought for Tesla's missing papers with no apparent success. It is conceivable that if Nikola Tesla knew a means for accurately projecting lethal beams of energy through the atmosphere, he may have taken it to the grave with him. Good Site!
There are some REALLY interesting FBI files on Tesla on this page... Take a look.
Topic: Cape Canaveral's Launch Complex 36
Topic Review

Robert Pearlman: Launch Complex 36, Pads A and B, at Cape Canaveral Air Force Station, Florida, is scheduled to be demolished on Saturday, June 16 at 10:00 a.m. EDT. Complex 36 was used to launch 145 Atlas rockets, including those that sent Surveyors to the Moon, Mariners to Mars and Pioneers to Jupiter, Saturn and beyond. The last Atlas to lift off from Complex 36 departed on February 3, 2005 on a mission to deploy an NRO classified payload.

Ben: 45th Space Wing release
Launch Towers at Complex 36 to be Toppled June 16

More than 3,600 tons of steel will crash to the surface at Space Launch Complex 36 on Saturday, June 16 when the old mobile service towers there are toppled as part of the ongoing project to demolish the historic site. Approximately 550 pounds of dynamite strapped to the base of each tower will be detonated about two minutes apart between 10 and 11 a.m. to knock the 209-foot-tall towers down. "A majority of the steel will be recycled," said project officer Jonathan Vanho of the 45th Civil Engineer Squadron. The steel that can't be recycled will be taken to the landfill at Cape Canaveral AFS.

Complex 36 was built for the Atlas/Centaur development program and it was operated under NASA's sponsorship from the program's inception until the late 1980s. The site was built and occupied as a single launch pad complex in February 1961 and a second pad was built between February 1963 and July 1964. Complex 36 hosted many historic missions over the years including Surveyor, Mariner, Pioneer and Intelsat. In 1989, NASA transferred Complex 36 to the Air Force and General Dynamics for military and commercial space operations. The site was modified to handle Atlas II/Centaur missions. The first commercial Atlas II/Centaur was launched from Pad 36B Dec. 7, 1991. The first military Atlas II/Centaur was launched from Pad 36A Feb. 11, 1992. The last launch from the complex was an Atlas IIIB carrying a National Reconnaissance Office payload from Pad 36B on Feb. 3, 2005. In all, there were 68 major launches from Pad 36A and 77 from Pad 36B.

"Over nearly five decades, Complex 36 was one of the world's most important and versatile space launch sites. Its credits include a whole catalog of NASA missions to the moon, Venus, Mars and Jupiter -- not to mention vital military and commercial communications satellite missions," said Mark Cleary, 45th Space Wing historian.

SpaceCat: Ben, I'm just curious - have you taken any historical photos of this or other LC's prior to their being demolished?

Ben: Well, I'm not sure how you define historical. I have photographed several launches off 36 and I also have several photos of the pad I got before launch. There are some here and here.

art540: It is a real shame this country could not create a museum of LC-36 with a mockup or dummy Atlas-Centaur in the gantry. Safety and maintenance would be key issues if a non-profit tried to save the site, not to mention EPA issues. I assume there are no future commercial interests since Space-X went north?

Ben: I should also point out that, having driven past it and noticed, the Titan 3/4 Vertical Integration Building is in the process of being torn down too. Passing over the causeway yesterday, an entire wall was missing and you could see inside it.

Robert Pearlman: Photos courtesy Ben Cooper, SpaceflightNow.com / LaunchPhotography.com. Pad B at 9:59 a.m. EDT; Pad A at 10:11 a.m. EDT.

art540: Thanks for the images, Ben. I now have photos of LC-36B under construction and now being brought down.
The historical skyline is surely disappearing... almost like bombed out cities in WW II.

1202 Alarm: I don't think it's very clever to compare destroyed cities during WWII and this LC-36B. Thousands died, not only in Germany, but also in France, England and many allied countries, during these raids. I don't think a lot of people died in the Cape today.

art540: My apologies for any offense taken... I was trying to compare an aerial view of the Cape skyline circa 1965 to today's appearance (Japan also suffered much like Europe).

hlbjr: I watched the video of the 2 MST's coming down on Spaceflight Now. I was surprised people were clapping after they came down. I think it's a sad day to lose that heritage.

Ben: There were several Atlas guys there who worked on those pads, so I think the clapping was a tribute to their legacy. Thanks for the comments, by the way.

Rob Joyner: Maybe I missed it in other earlier posts but just WHY was the complex destroyed? Surely just not to be "recycled", which would be insane.

Robert Pearlman: According to Florida Today: The towers were taken down to prevent corrosion from becoming a safety concern, said Kevin Hooper, project manager with the 45th Space Wing's civil engineering. "We have to take the structures down before nature takes them down," he said.

Joe Holloway: Just heard mention of LC-36 razing on the car radio on return from an overnight trip. Glad I got to see the pads on my last two trips down Florida-way. There will be nothing left of the old Cape pretty soon, which is a sad commentary, indeed.

Ben: quote: Originally posted by Rob Joyner: Maybe I missed it in other earlier posts but just WHY was the complex destroyed? Surely just not to be "recycled", which would be insane.
They are also going to refurbish the pad to put it 'on the market' for future buyers.

Robert Pearlman: Space Florida release
U.S. Air Force to license Launch Complex 36 to Space Florida
State to provide multi-use vertical launch complex for commercial users at Cape Canaveral

A landmark announcement by the U.S. Air Force and the state of Florida today will fundamentally expand the state's position and prominence in aerospace and the space industry in all three key sectors: civil, military and commercial, broadening participation in space-related activities.

Launch Complex 36 at Cape Canaveral Air Force Station, subject to completion of the environmental impact analysis process, will be re-built as a multi-use vertical launch complex capable of supporting several launch vehicle configurations ranging from light to medium-lift into low-Earth orbit and beyond.

"Florida has a great legacy in aerospace, a great foundation to build on, and that is one more reason we are so committed to expanding our capability to launch from Florida and from the United States," Lt. Governor Jeff Kottkamp said. "We have worked closely with the Air Force over the past several months and look forward to a strong, long-term relationship as we build these safe commercial launch processes together."

"The Air Force assignment of Launch Complex 36 is an important next step to extending access to space, making it available to defense and security initiatives and multiple commercial payload and launch activities for both civil and private space businesses that wish to launch from Florida," said Steve Kohler, president of Space Florida.

"The Florida legislature demonstrated tremendous foresight when it appropriated initial funding of $14.5 million for FY 2009.
This funding will help Space Florida begin the launch complex infrastructure design and construction necessary to develop a true commercial multi-use launch complex. This direction by the Air Force, together with the tremendous support by the state, opens the door to attracting, supporting and sustaining national and international aerospace business here in Florida," Kohler said.

"One of Space Florida's next objectives is to establish a Commercial Launch Zone (CLZ) for commercial customers wishing to operate from the Eastern Range. The creation of a CLZ expands our ability to support commercial payload launch services, re-supply missions to the International Space Station, and aggressively diversify aerospace business development rapidly and efficiently," Kohler said.

In addition to supporting a greater number of launch customers, the CLZ may attract other segments of the aerospace industry necessary to support flight operations that will benefit the entire state of Florida.

"The assignment of Launch Complex 36 will be an important milestone and part of a broader strategy to establish a CLZ at the Cape," added Kohler. "The Commercial Launch Zone is intended to enable space industry located at the Cape and in Florida to be more competitive in the global economy."

Launch Complex 36 was opened for business by NASA in 1961 and was most recently used as a military and commercial Atlas launch site. U.S. Surveyor, Mariner and Pioneer missions to the Moon, Mars, Jupiter and Saturn and other destinations were launched from this complex, followed in later years by weather satellites, military space assets and commercial satellite missions. The final rockets launched from Launch Complex 36 in 2004, and the Air Force shut down the complex.

hlbjr: Florida Today is reporting Spaceport Florida has acquired a 5-year license to develop the launch system at Complex 36. Here's the link to the story. I'm glad the Complex won't be completely destroyed but I'm also a little sad at the loss of some historical structures.

Robert Pearlman: Governor of Florida release
Governor Crist, Space Florida and 45th Space Wing Host Launch Complex 36 Dedication

Today, Governor Charlie Crist, Lt. Governor Jeff Kottkamp and Space Florida President Steve Kohler were joined by Lt. General William Shelton, commander of the 14th Air Force, and Brigadier General Susan Helms, commander of Cape Canaveral's 45th Space Wing, to host an official dedication ceremony for Launch Complex 36 at Cape Canaveral Air Force Station.

This event marks the U.S. Air Force's official "intent to lease" the site to Space Florida for build out of a launch pad that can accommodate light-to-medium lift vertical launches that support commercial, civil and military capabilities.

Addressing federal, state and local legislators, along with key aerospace leaders, Governor Crist stated, "Florida remains committed to the future legacy of space exploration and technology development. The aerospace economic cluster will create new jobs and benefit related sectors such as biotechnology and environmentally-friendly energy solutions vital to Florida's future."

More than 200 guests were in attendance at today's ceremony, including federal, state and local legislators.

"We are witnessing the dawn of a new era in commercial space exploration here in Florida," noted Lt. Governor Jeff Kottkamp. "Under the leadership of General Helms, the Air Force's commitment to commercial space allows the State of Florida to maximize research capabilities and commercialization possibilities of the International Space Station National Laboratory."

Steve Kohler, Space Florida President, noted, "We are working with a number of commercial launch and payload customers right now that have expressed great interest in leveraging the 50 years of experience Florida has invested in space. From a strategic perspective, the build out of LC-36 is one activity in a broad array of actions to create and develop a Commercial Launch Zone, which is fundamental in establishing an effective, globally competitive economic environment for Florida and for the United States."

Environmental assessments will be initiated in Fall 2008, followed by the build out of Pad A, which is scheduled for completion by Fall 2010. Infrastructure development will be partially funded by a $14.5 million appropriation by the 2008 Florida Legislature.

About Space Florida:
Space Florida was created to strengthen Florida's position as the global leader in aerospace research, investment, exploration and commerce. As Florida's aerospace development organization, we are committed to attracting and expanding the next generation of space industry businesses. With its highly trained workforce, proven infrastructure and unparalleled record of achievement, Florida is the ideal location for aerospace businesses to thrive - and Space Florida is the perfect partner to help them succeed.

Robert Pearlman: U.S. Air Force release
Launch complex now available for civil, commercial launches

Officials with the Air Force and Space Florida made history during a dedication ceremony held here Oct. 22 when Space Launch Complex 36 officially was made available for operational use by the State of Florida, subject to completion of the environmental impact analysis.

Attending the historic ceremony were Florida Governor Charlie Crist; Florida Lt. Governor Jeff Kottkamp; Space Florida President Steve Kohler; Lt. Gen. William Shelton, 14th Air Force commander; and Brig. Gen. Susan Helms, 45th Space Wing commander. General Shelton said Air Force leaders supported the initiative because it will make it easier for commercial providers to launch from the U.S. Having domestic launch options provides the U.S. with solid foundation for national security.

"This is a great partnership that is mutually beneficial to both the Air Force and the state," he said. "We take great pride in helping foster the success of the commercial space sector; I'm confident the spirit of innovation and the cooperation that made this a reality will continue in the years ahead."

Governor Crist also had positive things to say about the agreement. "Florida has always been home to big ideas. The entrepreneurial spirit is woven into the DNA of Florida's economy," the governor said. "And thanks to the Air Force's decision, the door is now open to innovation and space opportunities never seen before. In tough economic times, it is important we do not sit idly by, but that we invest in economic opportunities for the future.

"What a tremendous opportunity to ensure that space exploration is a top priority and that the U.S.
remains a leader right here from Florida," he said.

According to Space Florida officials, the reconfiguration of Launch Complex 36 will strengthen not only the state's aerospace industry but other growing economic sectors such as biotechnology and environmentally friendly energy technology vital to Florida's future. The launch complex will support light- to medium-lift vehicles that go into low-Earth orbit and beyond.

Space Florida's president sees this ground-breaking ceremony as a great beginning, both literally and figuratively. "The Air Force assignment of Launch Complex 36 is an important next step to extending access to space," said Steve Kohler, Space Florida president. "We are now making that available to both defense and security initiatives," he said, "with multiple commercial payloads and launch activities for both civil and private space businesses that want to launch from Florida. This direction by the Air Force, together with the tremendous support by the state, opens the door to attracting, supporting and sustaining national and international aerospace business here in Florida."

This effort also is in line with the mission of the 45th SW, according to General Helms. "Our primary mission here is to assure access to the high frontier," she said. "This proposal better enables us to execute that mission. It's the ultimate 'win-win' situation for both the Air Force and the State of Florida."

NASA opened Launch Complex 36 in 1961, and most recently it was used as a military and commercial Atlas launch site. Missions to the moon, Mars, Jupiter and Saturn launched from the site, as well as weather satellites and commercial satellites. The Air Force shut down the complex in 2004.

Robert Pearlman: U.S. Air Force release
Air Force licenses two launch complexes for commercial use

The 45th Space Wing has issued Real Property Licenses to Space Florida for Space Launch Complexes 46 and 36 here to attract the nation's next generation of spacelift customers. As a result, the 45th Space Wing now grants Space Florida full rights to proceed with construction and refurbishment work at either launch location.

"These licenses are in line with the 45th Space Wing's mission, assuring access to the high frontier," said Col. Ed Wilson, 45th Space Wing commander. "This will help us to better execute that mission. It's a win-win-win for the Air Force, the state of Florida, and the nation."

"Within the past year, we have been working diligently with the 45th Space Wing and the Navy to secure full rights to these complexes," said Mark Bontrager, vice president of Space Florida's Spaceport Operations. "Through a mutual trust with both entities, we can now fully pursue build-out of these sites for commercial use. This will help to open up additional opportunities for Florida to take full advantage of the rapidly emerging commercial space marketplace. Having additional domestic launch sites ready for commercial utilization will also ensure the U.S. remains competitive from a global perspective."

Now that Real Property Licenses are secured for both sites, Space Florida said it will work with the Federal Aviation Administration to conduct all necessary due diligence required to secure a Launch Site Operators License for each complex. This process is anticipated to take approximately 180 days.

During this timeframe, the Air Force Safety Center and the Department of Defense Explosive Safety Board will also review an Explosive Site Plan for SLC-36.

Space Florida has conducted corrosion control and maintenance efforts on the Mobile Service Tower at SLC-46, and preliminary engineering studies show that further refurbishment of SLC-46 is needed to ready it for full commercial launch operations. With the Real Property License fully secured, Space Florida can now utilize its resources - including a special $1.1 million appropriation by Senator Bill Nelson through NASA - to prepare the site for an interested launch customer, whose identity is currently protected under Non-Disclosure Agreement.

In December 2009, a Joint Use Agreement was signed between the Navy and Space Florida to grant full utilization of the site by either party, as needed. There is no anticipated naval use for the site at this time.

The last active use of SLC-46 was in 1999 when RocSat 1 was launched on an Athena I for the Republic of China. The last recorded launch at SLC-36 was in February 2005, when an Atlas IIIB launched a classified NRO payload from pad 36B.
New Akamai service optimizes content delivery for device, network conditions
Carolyn Duffy Marsan (Network World)
Akamai Technologies will announce on Tuesday a new service for improving website performance that determines the type of device and network a user has -- and whether the device has an IPv4 or IPv6 address -- and then improves the delivery of Web content accordingly.

The new service, dubbed Aqua Ion, uses technology that Akamai acquired in February from Blaze Software, a Canadian maker of a front-end optimization service. Blaze's cloud-based service automatically optimizes the code on a Web page during the delivery process to ensure faster delivery to a PC, tablet or smartphone.

Akamai is offering a range of what it calls "situational performance" capabilities in Aqua Ion, including the ability to compress images based on real-time network conditions and to respond to requests based on the screen size of the device.

"If you look at 2007 or 2008, most people were on a PC with good Wi-Fi connectivity or they were connected to some sort of LAN and most of them had Internet Explorer as their browser. ... But today that's not how we interact with the Web. There are a number of different devices -- smartphones, tablets, laptops, PCs, set-top boxes -- and the type of connectivity is getting much more varied. It could be a congested wireless network, 3G or LTE. It could be an IPv4 or IPv6 address. When you optimize performance for all of those situations, you really have to be specific about what situation you're trying to optimize for," explained Ravi Maira, vice president of Web performance products at Akamai.

While IPv6 adaptation is just one feature of Aqua Ion, it will be increasingly important as the Internet runs out of IPv4 addresses and some carriers and Web content providers use translation mechanisms such as carrier-grade NATs and IPv4 address sharing, which could slow performance.

"Typically, you will find some sort of conversion between IPv4 and IPv6 that is happening in the middle of the Internet, but those conversion points can be very congested because such a small percentage of the Internet is IPv6 enabled and they may not be in the best path," Maira said. "Our platform enables us to accelerate an IPv6 request to edge services by turning it back to IPv4 to go through our network using the best path."
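The article doesn't spell out how such "situational" decisions are made, but the underlying pattern (inspect what a request reveals about the client and its network, then pick an appropriately sized response) is straightforward to sketch. The Node/TypeScript fragment below is a simplified, hypothetical illustration rather than Akamai's code or API; the "viewport-width" and "save-data" request headers are standard Client Hints, while the variant paths, sizes and default values are invented for the example.

import * as http from "http";
import * as net from "net";

interface ImageVariant { path: string; maxWidth: number; }

// Hypothetical pre-rendered image variants (paths and widths are made up).
const variants: ImageVariant[] = [
  { path: "/img/hero-480.jpg", maxWidth: 480 },
  { path: "/img/hero-1080.jpg", maxWidth: 1080 },
  { path: "/img/hero-2160.jpg", maxWidth: 2160 },
];

// Pick the smallest variant that still covers the viewport; on Save-Data
// connections, fall back to the smallest variant regardless of screen size.
function pickVariant(viewportWidth: number, saveData: boolean): ImageVariant {
  if (saveData) return variants[0];
  return variants.find(v => v.maxWidth >= viewportWidth) ?? variants[variants.length - 1];
}

const server = http.createServer((req, res) => {
  const rawWidth = req.headers["viewport-width"];            // Client Hint, if the browser sends it
  const parsedWidth = Number(Array.isArray(rawWidth) ? rawWidth[0] : rawWidth ?? 1080);
  const viewportWidth = Number.isFinite(parsedWidth) ? parsedWidth : 1080;
  const saveData = req.headers["save-data"] === "on";         // Client Hint for constrained networks
  const clientIsIPv6 = net.isIPv6(req.socket.remoteAddress ?? "");

  const chosen = pickVariant(viewportWidth, saveData);

  // A real CDN would stream the image bytes; this sketch just reports the decision.
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify({ clientIsIPv6, servedVariant: chosen.path }));
});

server.listen(8080);

In a production CDN the same decision would typically also weigh measured round-trip time and negotiate image formats, but the request-inspection step looks much the same.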
Maira emphasized that Aqua Ion is not an alternative to Web content providers supporting IPv6 directly. "We still want our customers to support IPv6 on their own infrastructure," he added. "This is not a replacement for IPv6, but it does help them in terms of managing the upgrade process and making sure they have performance during the transition from IPv4 to IPv6."

Akamai has measured the performance of Aqua Ion for a handful of beta customers, including an e-commerce company that more than doubled the number of customers receiving website response times of under two seconds.

Aqua Ion is available immediately as an add-on to Akamai's Dynamic Site Accelerator service.

Maira concedes that a small share of Akamai's customers care about IPv6 performance right now, particularly U.S. federal government agencies under a mandate to support IPv6 and high-tech companies.

"Right now, IPv6 represents less than half of 1% of the traffic across our entire network," Maira said. "But we're huge IPv6 proponents. We need this to scale the Internet, and at Akamai we are a company that needs a lot of IP space. So running out of IPv4 space or IPv4 space being dear is not good for Akamai. We want to help our customers get the majority of the Internet IPv6-enabled because it unlocks a ton of possibilities for us."

IPv6 is an upgrade to the Internet's addressing scheme, which was created 40 years ago using a protocol known as IPv4. IPv4 uses 32-bit addresses and can support 4.3 billion devices connected directly to the Internet. IPv6, on the other hand, uses 128-bit addresses and can support a virtually limitless number of devices: 2 to the 128th power. IPv6 is necessary because the Internet is running out of IPv4 addresses. However, IPv6 is not backward compatible with IPv4, requiring network operators to support both protocols at an added cost.
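To put those address-space figures in concrete terms: 2^32 is 4,294,967,296 (the roughly 4.3 billion IPv4 addresses cited above), while 2^128 works out to about 3.4 x 10^38 possible IPv6 addresses.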
Brian Beckley | NASA's flight simulator rides into infinity | Our Corner
The FFT at the Seattle Museum of Flight (image credit: Brian Beckley)
by BRIAN BECKLEY, Bonney Lake-Sumner Courier-Herald Reporter Nov 19, 2012 at 3:09PM
I have always been fascinated by space. As a kid growing up, and still now, I was a big fan of science fiction, especially anything in space: Star Trek, Star Wars, Battlestar Galactica, Buck Rogers.
You name it, I probably watched it.
But along with those, I was a big fan of NASA and the space program and like many kids, I too at some point wanted to be an astronaut when I grew up.
When I was 6, on a trip to visit family in Florida, I was lucky enough to be able to see the fourth-ever launch of the Space Shuttle Columbia, live from Cape Canaveral.
It was STS-4, the final research and development launch of the shuttle program, back even before they painted the external fuel tank the familiar orange that it wore through most of the Shuttle program.
Because I was so young, my memories of the event are not as clear as I'd like them to be, but I certainly remember the excitement of the countdown and then the flash of orange and huge billowing clouds of smoke before the oribiter began to rise, slowly at first and then rapidly gaining speed as it turned into a flickering speck in the Florida sky, trailed by long vapor trail.
We all watched through our binoculars as the ship disappeared and the solid rocket boosters fell back to Earth. (according to Wikipedia, the boosters on that flight were lost because their parachutes failed to open and they hit the ocean hard and sank. I must admit, I don't remember that, though I also don't remember seeing the chutes open, now that I think back…)
The launch was the highlight of the trip and one of the highlights of my young life. I still have the (tiny) T-shirt I got as a souvenir. Now I look at it as a collector's item, but through my childhood, it was easily my favorite shirt and I wore the heck out of that thing.
I, of course, did not become an astronaut when I grew up - though some would argue that I am still something of a space case. But my fascination with the Shuttle program and with space and NASA in general never left. As a kid I followed as many of the missions as I could through the newspapers and television, and as an adult I watched as many of them as I could on the internet, never ceasing to be amazed at the science and math and sheer force of will it takes to put a man into orbit.
The Shuttle was mothballed in 2011, after nearly 30 years of mostly successful flights. I could not have disagreed more with the decision by the Bush and Obama Administrations to end the program, essentially taking the United States out of the space race for the first time since, well, it began.
But on a high note, the surviving Space Shuttles, after they were decommissioned, were set to be donated to museums around the country to allow us, the citizens, to see these engineering marvels in person.
There was much ballyhoo when it was announced Los Angeles and New York, along with the Johnson Space Center in Houston and the Smithsonian Institute in Washington D.C., would get orbiters, but not the Museum of Flight in south Seattle.
It made so much more sense logistically to bring a Shuttle to Seattle because of the museum's proximity to Boeing Field (anyone who watched as they cut down hundreds of trees while they drove the Endeavour through the streets of Los Angeles will probably agree), but in the long run, it does make the most sense to have these historical artifacts (man, typing that made me feel old) in the nation's two most populous cities.
But as a consolation prize, the Museum of Flight was awarded the Full Fuselage Trainer, a full-size mock-up of the Shuttle that was used by every single astronaut as a training ground for the their missions to the stars.
Though the FFT never went into space, the 120-foot, wingless wooden mockup does bring something to Seattle that no other location can boast.
"This one you get to go inside," Museum of Flight CEO Douglas King said during a media day that I was fortunate enough to attend.
I must admit, when the press release arrived announcing the media day prior to the official grand opening, I jumped at the chance to see the orbiter. I volunteered to write a piece about it for our sister papers located closer to the museum, packed my camera and notebook into the Kia of Justice and never looked back.
And I can tell you unequivocally that it was everything I hoped it would be.
The FFT sits in the new Charles Simonyi Space Gallery, named for a former Microsoft developer who donated not only money, but an actual Russian Soyuz Capsule he rode into space and back as a "space tourist." It sits in the back of the gallery, part of the "After the Space Shuttle" exhibit that accompanies the FFT.
The public is welcome to walk up to the orbiter, as well as through the payload section. There is even a landing simulator to give you an idea of how fast the Shuttle comes in and how precise you have to be to hit a runway from space. (For the record, I nailed it on the "novice" setting, which was admittedly very, very simple as the simulation handled some of the controls for me…)
But for me, the most interesting part was the chance to climb through the small, round airlock underneath the windows and see inside the space shuttle firsthand.
Here's what I learned: it's tiny inside. Like, really, really small.
The majority of the Orbiter is the 61-foot payload bay. The crew compartment, which routinely housed up to seven astronauts for more than a week at a time, is actually only about 165 square feet of space.
There is 100 square feet in the mid-deck, which contains the storage areas, sleeping bags and the galley (a small rectangular box built into the side of the orbiter).
Up a short ladder is the flight deck, which contains an additional 65-square-feet of space.
Museum officials led four reporters into the mid-deck at a time and only two-at-a-time could climb up to the flight deck.
And it was cramped at best. I can't imagine what it must have been like to share that amount of space with six other people for two weeks. There must have been a huge sigh of relief from the astronauts when the International Space Station opened and they had a chance to stretch out a bit in its comparatively roomy environs.
Even more surprising though was that the Soyuz capsule - which the Russians have been using for 40 years and is still in use today as the only way to get astronauts and cosmonauts to and from the ISS - is even smaller. It is about the size of a compact sedan and even more cramped inside, with three small "beds" that require the cosmonauts to lay with their knees on their chests for both liftoff and landing.
Though let's be honest, I would spend the entire week in the fetal position if it meant I could go into space.
After all, to paraphrase John Young, the greatest American astronaut you've likely never heard of (he was a Gemini pilot who not only walked on the moon, but flew the very first Shuttle mission into Space), when you are going uphill you are not looking for comfort, you just want to get there.
Along with the artifacts, the exhibit contains a nice section on the future of space flight, which for now is left entirely to private industries in the United States.
I must admit, when the plan to turn supply runs and launches over to private industry was first announced by the Bush Administration - and then reiterated by the Obama Administration - I was opposed.
No way, I said. No way will private industries be able to do that. Space flight is too complicated and expensive. It needs to be a national program.
Well, turns out I was wrong. This year, Space X completed the first and second successful supply runs to the ISS. And Space X is only one of several private industries - all of which are highlighted at the exhibit - currently building spaceships, most with the idea of bringing civilians and tourists into orbit.
If the cost of such trips ever becomes reasonable (it cost Mr. Simonyi more than $20 million per trip to ride a Soyuz rocket into space), you can bet I will be in line with bags packed and eyes wide.
But until then, if you are like me, the Museum of Flight's FFT exhibit might be the next best thing, or at least a taste of space to hold you over for a few more years...
BRIAN BECKLEY, Bonney Lake-Sumner Courier-Herald Reporter: bbeckley@courierherald.com or 360-825-2555 ext. 5058
Virtual Desktop Vendor Pano Logic Shuts Down
By Natasha Chilingerian
October 30, 2012
The shutdown comes shortly after the company announced a PC replacement project with the $3.4 billion Redstone Federal Credit Union in Huntsville, Ala. Redstone FCU had been in the process of replacing 75% of its PCs with Pano Logic’s virtual desktop computing solutions and planned to continue the project into 2013.
Redstone FCU said that its legal and IT teams were reviewing the situation but had no other comment so far.
Calls to Pano Logic’s corporate headquarters in Redwood City, Calif. led only to staff voicemail recordings, and the company did not respond to an email request for comment.
A comment posted on the company's Facebook page by someone who appears to be a Pano Logic customer asks about its plans for post-bankruptcy support, to which Pano Logic has not responded.
"I'm looking for some contacts that can provide support for the next few years. Anyone have some recommendations?"
Schematic diagram of a Daniell cell with zinc cathode (blue spheres), copper anode (green spheres), dilute sulfuric acid (gray circles – H3O+ – and yellow spheres – SO42-) in the cathode compartment, copper sulfate (green circles – Cu2+ – and yellow spheres) solution in the anode compartment, and a porous pot separating the two. The fundamental reactions of the cell are demonstrated by the "atoms" picked out in bolder colors. At the cathode, a zinc atom (Zn – blue ring with red center) gives up two electrons (e- – red dot) and, as a zinc ion (Zn2+ – blue circle with white center), enters the cathode solution. The cathode compartment now has a charge excess of +2 and so two hydrogen ions (H3O+ – black circle) pass through the porous pot barrier to the anode compartment. This allows a copper ion (Cu2+ – green circle with white center) to accept two electrons from the anode and deposit itself as a copper atom (Cu – green circle with red center) on the anode. The electrons are made available at the anode because electrons from the cathode are allowed to flow through an external circuit to the anode. The energy for the process comes from the overall reaction
Zn + Cu2+ → Zn2+ + Cu
and by means of the cell is made available as electricity to do work against a resistance in the external circuit. The conventional current flows in the opposite direction from the actual flow of electrons.

A battery is a device for converting internally-stored chemical energy into direct-current electricity. The term is also applied to various other electricity sources, including the solar cell and the nuclear cell, but is usually taken to exclude the fuel cell, which requires the continuous input of a chemical fuel for its operation. Chemical batteries consist of one or more electrochemical (voltaic) cells (comprising two electrodes immersed in a conducting electrolyte) in which a chemical reaction occurs when an external circuit is completed between the electrodes. Most of the energy liberated in this reaction can be tapped if a suitable load is placed in the external circuit, impeding the flow of electrons from the cathode to anode. (The conventional current, of course, flows in the opposite sense.)

Batteries are classified in two main divisions. In primary cells, the chemical reaction is ordinarily irreversible and the battery can yield only a finite quantity of electricity. Single primary-cell batteries are used in flashlights, shavers, light meters, etc. The most common type is the dry Leclanché cell, which has a zinc cathode, a carbon anode, and uses ammonium chloride paste as electrolyte. Manganese dioxide "depolarizer" is distributed around the anode (mixed with powdered graphite) to prevent the accumulation of the hydrogen gas which would otherwise stop the operation of the cell. The dry Leclanché cell gives a nominal 1.54V. For the higher voltage necessary to power radios, etc., batteries containing several thin laminar cells are used.

Secondary cells, known also as storage cells or accumulators, can be recharged and reused at will provided too much electricity has not been abstracted from them. The most common type, as used in automobiles, is the lead-acid type, in which both electrodes are made of lead (the positive covered with lead (IV) oxide when charged) and the electrolyte is dilute sulfuric acid. Its voltage is about 2V, depending on the state of the charge. The robust yet light nickel-iron battery (having a potassium hydroxide solution electrolyte) was widely used in telephone exchanges and other heavy-duty situations but has been displaced by the nickel-cadmium type. They give about 1.3V.

The first battery was the voltaic pile invented in about 1800 by Alessandro Volta. This comprised a stack of pairs of silver and zinc disks, each pair separated by a brine-soaked board. For many years from 1836 the standard form of battery was the Daniell cell, with a zinc anode, a copper cathode, and a porous-pot barrier separating the anode electrolyte (copper (II) sulfate) from the cathode electrolyte (sulfuric acid). The lead-acid storage battery was invented by Gaston Planté in 1859 and the wet Leclanché cell, the prototype for the modern dry cell, by Georges Leclanché in 1865.
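For reference, the roughly 1.1-volt output of a Daniell cell follows directly from the standard electrode potentials of the copper and zinc half-reactions (textbook values at 25°C):

E = E°(Cu2+/Cu) - E°(Zn2+/Zn) = 0.34 V - (-0.76 V) = 1.10 V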
How do you say no, politely? Dish Networks has outbid Japanese carrier Softbank for Sprint, offering more than $25 billion for the struggling wireless carrier, which currently has a 16 percent market share in the United States. Dish says that it’s a better choice because it’s already located in the U.S. and can offer bundles, wondrous bundles, to customers everywhere. You know what I say to that? No, thank you. Sorry, Dish. You lose. Good day, sir. Now, I’m not Willy Wonka and I don’t have the power to end this by giving Dish blueberry chewing gum and rolling it out the door, but Dish buying Sprint seems like a raw deal. The snozberries don’t taste like snozberries to me. The deal may help Dish and it may help Sprint, but it doesn’t sound great for you or me. The only thing that has come from consolidation between phone service and TV service is higher prices for everyone. Have you tried to buy high-speed Internet service lately? If you’re unlucky enough to be in areas served by Verizon, Time Warner, Comcast, or many other cable or DSL companies, it’s often impossible to get Internet service without being strong-armed into paying for landline phone service, whether you need it or not. Want Internet or cable access? They’re both ridiculously expensive unless you’re a bundled subscriber and buy TV, Internet, and Phone service from one company.
When TV or Internet providers begin offering consolidated services, the bundles are often marketed as a money-saving option, but give it a few years and prices always seem to rise even higher, and those "extra" options become mandatory.
Dish’s plan, if successful, would help it transcend its satellite TV confines and become an unstoppable mega service.
Dish Network’s prices don’t offend me. It’s not an egregiously priced service, nor does it have a reputation as bad as Comcast, but it’s not a completely clean player either. To sign up for Dish, you have to sign a two-year contract (much like a wireless company). And it pulls the fun trick of getting you to sign up at a low price (say $25 a month), only to double it after 12 months.
Softbank may not be a U.S. company, but it has a history of undercutting competitors and offering good wireless prices. In Japan, it offers smartphone plans with 7GB of data for the equivalent of $55 a month. That's three to four times more data than you'd get for the price on any U.S. carrier. It's also a strong proponent of LTE.
The U.S. mobile phone industry does not need another giant media company oozing its way in. The near duopoly of Verizon and AT&T, which together control two-thirds of all mobile phone subscribers, has already made wireless data service incredibly expensive. Dish’s plan, if successful, would help it transcend its satellite TV confines and become an unstoppable mega service.
Like DirecTV, Dish currently has one big weakness compared to its rivals in cable (think: AT&T, Comcast, Verizon). Though it offers a competitive TV service with a lot of good channels, it has no home Internet offering, nor can it offer phone service. Comcast and the rest of the gang make a killing by bundling broadband Internet, cable, and landline phone service together.
So here’s the plan: Dish already owns some LTE spectrum. Together with Sprint’s coverage and LTE network, it could offer wireless LTE broadband to a good portion of its subscribers within a year or two. Down the line, as its network improves and speeds up, it could have a leg up on many of its cable competitors like Comcast, which don’t own an LTE network, turning it into a nation-wide power player. In this possible future, our TV, phone, and Internet lives would be owned by Dish, Verizon, or AT&T. Dish may even have the advantage. Though you have to install a Dish to use it (for now), satellite service is more widely available than cable or broadband – assuming your house doesn’t have too many trees around it. This may sound stellar if your dream is to buy everything from one company. And while we may see a faster Sprint come out of a Dish merger, it could be a more expensive Sprint.
I cannot speak for all of you, but the last thing I want to do when buying a phone or tablet is be pushed into subscribing to some Dish video offering. I don’t want to have to buy three dish receivers to get $100 off my Galaxy S4, or sign up for Dish satellite TV to get a good price on a family wireless plan. And once Dish starts bundling, Verizon and AT&T will show much less restraint than they do now. They’ll all go bundle crazy.
Of course, I could be giving Dish too much credit. For $25 billion, I would hope it knows what it's doing, but its acquisition of Blockbuster in 2011 hasn't led to any revival of that brand. Instead, Blockbuster stores continue to be shut down every week and the Blockbuster online service doesn't appear to be gaining ground against the likes of Amazon and Netflix.
And, just to throw a fizzy lifting drink into the mix (and confuse all of you who forgot that I made a Willy Wonka reference earlier), why should we be excited about Dish buying Sprint if Dish's own employees hate working for it? Glassdoor ranks it as one of the worst companies to work for in America. I have nothing against Dish Network, and the earth won't be devoured by locusts if it buys Sprint (hopefully). Hell, your service will likely improve if you're on the carrier. Maybe some crazy innovative offerings pop up out of it, and it turns out to be just what the industry needs, and dozens (literally dozens) of you will come back to this editorial in a few years to point and laugh at my lack of accurate foresight. I'm willing to take that chance.
A Softbank/Sprint merger would help Sprint and subscribers more in the near future. If there’s one thing we need more of, it’s better data plans and cheaper wireless prices. Softbank has a history of offering a lot more than competitors, for less; Dish, on the other hand, seems more concerned with turning Sprint into the last triangle in its Triforce of power (see: Ganon). Dish wants to become a media monster. Softbank just wants a foot in the door.
But hey, if Dish wins, you'll finally have a reason to enter a Blockbuster Video store again: to buy your new Sprint phone.
Calling all ham-radio operators: Special event planned for June 22 and 23
KENNEBUNK — As part of an annual countrywide event, amateur radio operators from the region will work around the clock on the weekend of Saturday and Sunday, June 22 and 23 at a solar-powered emergency radio station in the field at Sea Road School in Kennebunk. This event is in its second consecutive year. Ham-radio operators from Sanford are among those expected to attend.
Sponsored by a Kennebunk-based educational technical club called the New England Radio Discussion Society, licensed ham radio operators will convene at 2 p.m. on the 22nd to set up an emergency short-wave radio station. Once again, the radio station will attempt to contact hundreds of other amateur radio operators around the world.
Radio hams, individually licensed by the Federal Communications Commission, have unique call signs assigned by the FCC. For 2013 Field Day, the station will use the specially assigned FCC call sign of K1R.
Radio station K1R will participate in the American Radio Relay League's annual Field Day, an event that takes place in the United States and Canada. ARRL Field Day is the largest single emergency communications preparedness exercise in the country, with more than 35,000 American and Canadian ham radio operators participating each year.
Kicked off by the ARRL back in 1933, Field Day has a contest flavor to it. The primary purpose is to demonstrate the ability to plan operations that can be effective for an entire 24-hour period, including operator endurance and adequate numbers of operators for a shift operation. A secondary goal is to demonstrate the technical proficiency of the station that has been hastily constructed. In theory, a better station will be capable of emergency operations in dire conditions. Such a station will also be capable of making more contacts during the contest portion of Field Day. The ARRL, the world's oldest and largest ham radio organization, scores Field Day participants based on effectiveness.
"Field Day is a chance to test and improve emergency communication skills," explains NERDS member Alex Mendelsohn. "We'll use solar panels and battery power to simulate operation in a remote area under emergency conditions.
"Over the years, hams have been effective in establishing communications nets during floods, hurricanes, fires, earthquakes, terrorist attacks, and other disasters, whether they be local, regional, or national in scope. When all other means of communications fail, ham radio gets through."
Using Morse code and voice, K1R will go on the air starting at 2 p.m. that Saturday and will continue operating until Sunday afternoon. The public is invited to the Field Day site, where ham operators will be available to discuss amateur radio and answer questions.
In a landmark speech to the Royal Society, given at Fishmongers Hall in the City of London on September 27 1988, Margaret Thatcher said:
“For generations, we have assumed that the efforts of mankind would leave the fundamental equilibrium of the world's systems and atmosphere stable. But it is possible that with all these enormous changes (population, agricultural, use of fossil fuels) concentrated into such a short period of time, we have unwittingly begun a massive experiment with the system of this planet itself."
"Recently three changes in atmospheric chemistry have become familiar subjects of concern. The first is the increase in the greenhouse gases—carbon dioxide, methane, and chlorofluorocarbons—which has led some to fear that we are creating a global heat trap which could lead to climatic instability. We are told that a warming effect of 1°C per decade would greatly exceed the capacity of our natural habitat to cope. Such warming could cause accelerated melting of glacial ice and a consequent increase in the sea level of several feet over the next century. This was brought home to me at the Commonwealth Conference in Vancouver last year when the President of the Maldive Islands reminded us that the highest part of the Maldives is only six feet above sea level. The population is 177,000. It is noteworthy that the five warmest years in a century of records have all been in the 1980s—though we may not have seen much evidence in Britain!"
Again 1989, Thatcher – the possessor of a chemistry degree - warned in a speech to the UN that "We are seeing a vast increase in the amount of carbon dioxide reaching the atmosphere... The result is that change in future is likely to be more fundamental and more widespread than anything we have known hitherto." She called for a global treaty on climate change.
To my delight I got the main quote from a Telegraph scream piece hammered out by the rabid climate change denier madman James Delingpole. Some relish must be devoured at the pain this must cause such a clench-fisted loon as Mr Shouty. Yes, she later recanted once the nutbags in the ‘it’s all lefty nonsense’ were able to poison her ear, but she wasn’t Prime Minister then so it didn’t matter. What matters was, a British Prime Minister and a leading world politician who was also a scientist saw the peer reviewed scientific papers that were the result of millions of hours of research and understood how the conclusion was reached. She wasn’t at that stage influenced by the siren voices of the oil industry and the ultra-short-sighted bully-boy tactics of the extreme right that Mr (tragic) Delingpole and Lord (scary) Monkton represent with such fury.
Monday 04.08.13
Posted by Fully Charged | 科技 |
Nintendo DSi Screenshots
Nintendo DSi launches April 5 in the United States
Nintendo pioneered hand-held entertainment in the ’80s and made it fully mobile with the Game Boy video game system. Now, Nintendo is transforming the way people access, experience, create and share content with the new Nintendo DSi system, the third iteration of the world’s best-selling portable video game system. Nintendo DSi launches in the United States on April 5, 2009, at an MSRP of $169.99. The colors available at launch will be Blue and Black.
"Ever since the arrival of the first Game Boy, consumers worldwide have turned to Nintendo for their portable gaming," said Cammie Dunaway, Nintendo of America’s executive vice president of Sales & Marketing. “Nintendo DSi builds on Nintendo’s commitment to bringing fun and creative entertainment to everyone, and will allow consumers to personalize and share their very own experiences.”
Some features that will be built into the system and ready to enjoy upon purchase include the Nintendo DSi Camera, Nintendo DSi Sound and Nintendo DSi Shop. The most noticeable feature of the slim Nintendo DSi system is its two cameras – one camera is on the external body, and the second one points at the user when the device is flipped open. As the first truly interactive digital camera in a video game system with 10 different interactive “lenses” that can manipulate your photos, the Nintendo DSi Camera offers an easy way to take and share your photos with family and friends. The cameras also present people with unprecedented ways to interact with their games while giving developers a new tool to devise creative games and experiences. If the touch screen gave Nintendo DS a sense of feel and the microphone allowed it to hear, the two cameras give Nintendo DSi the sense of sight.
Another enhanced feature is the Nintendo DSi Sound application, which serves as both an interactive voice recorder and music player that allows users to play with their music while they listen to it. Users can access different audio filters or control the pitch and speed of recorded voice or music files to alter voices or change the tempo of a song. The mic is located between the two screens when the device is flipped open, and there is also a stereo headphone output that lets users listen to music saved on an SD card, even with the screen shut.
In the world of software, Nintendo DSi will be the platform for the most relevant and fun on-the-go games and applications. The Nintendo DSiWare application will populate Nintendo DSi with software that can be downloaded using Nintendo DSi Points directly to the portable system, just as WiiWare has with Nintendo’s Wii console. Developers big and small are invited to create software that makes use of the properties and functions of the hardware. Nintendo DSiWare games and applications will be available at a range of values, starting at 200 points.
In addition to downloadable games, Nintendo DSi is able to play games made specifically for the system and sold at retail. The system can also play most Nintendo DS™ games, and will have access to a library of more than 850 titles originally made for that system.
Also on April 5, a new Nintendo DS game, Rhythm Heaven, will join this roster of games available for both Nintendo DSi and Nintendo DS owners in the United States. Having sold more than 1.6 million copies since its July 2008 launch in Japan and still increasing sales today, this infectious game challenges players to tap and slide the stylus on the touch screen in time to original music created by legendary Japanese pop-star producer TSUNKU♂. Simple gestures with the stylus combined with fun music and quirky visuals make Rhythm Heaven a completely unique musical experience for players of all ages.
Additional features and news of Nintendo DSi will be revealed as April 5 approaches. For more information about Nintendo DSi, visit www.nintendodsi.com.
More information about Nintendo DSi | 科技 |
Garbage Pail Kids series 3 trading card paintings
Artist and fan Brent Engstrom has shared his latest painting renditions of Garbage Pail Kids series 3 trading cards. I'm a huge garbage Pail Kids fan, having spent my weekly $1 allowance on fourpacks each week way back in 1985 as a 5-year-old for months on end. You can check out the full collection of paintings over at Brent's blog. Definitely worth a look.
Read More | Brent Engstrom
Bleeding Edge TV 489: Fiesta Movement Pro Wrestling 101
Okay, if you don't know, I used to be a pro wrestler. That's right, I'd put my tights on every week, step through the curtain, and enter a 20 x 20 wrestling ring to entertain a crowd of people, and I loved it. That was a long time ago, and now I bring you the hottest news in the consumer electronics world, as you know.
However, now that I am a Ford #FiestaMovement agent, you should expect to see something a bit more unique here on my channel each month. I'll be doing some pretty unique missions as part of the Fiesta Movement, using the Fiesta ST that was given to me for the duration of the Movement to bring you some cool stuff. This month? I'm bringing you into the world of professional wrestling as part of the June #Fitness theme.
Why pro wrestling? Well, I didn't want another agent to grab this mission and turn it into a comedy act! I saw this as an opportunity to educate the masses about this truly unique artform. I head to the Buddy Wayne pro wrestling training facility to chat about what it takes to become a professional wrestler. Buddy should know. After all, at just 5' 6", he has made a career in the land of the giants, having wrestling for both WWE (formerly WWF or the World Wrestling Federation) and WCW (World Championship Wrestling.) In wrestling, you only look as good as your opponent allows you to look, and Buddy was able to make his opponents look great for 12 years.
Don't forget to subscribe to Gear Live's YouTube channel!
Watch this: Draw My Life - Princess Zelda
Princess Zelda is getting in on all that YouTube "Draw My Life" action, putting what possibly may be the most unique life yet to pen and paper. In all seriousness, though, we think that watching fans of iconic game characters making these types of videos is a fun idea. Here's to hoping we see more of this infused into gaming culture. For now, enjoy Zelda's life in the video after the break.
Museum of Modern Art begins collecting video games for new exhibit
Posted by John Kilhefner
Museums must have something against Roger Ebert. First, the Smithsonian American Art Museum holds an exclusive video game event earlier this year, and now the New York City Museum of Modern Art is following suit.
MoMA is officially bringing in 14 videogame classics to begin an ongoing gaming collection that will go on display in March 2013 in the Philip Johnson Architecture and Design Galleries. Currently, the included games feature obvious choices such as Pac-Man, modern games like Portal, and obscure games like vib-ribbon. The collection MoMA is aiming for consists of about 40 titles, which will fall in as part of a "new category of artworks."
Read More | MoMA
NYC taxi driver returns over $13,000 in lost gadgets to forgetful owner
Misc. Tech,
Have you ever left anything in a cab? We know plenty who've lost iPhone and Android devices, tablets, laptops, and other expensive gadgetry, but Casey Neistat accidentally left over $13,000 in expensive technology in his taxi. After going through the frustrating process of filing reports with the taxi company, making calls, and getting a police report filed, he wondered if he'd get his stuff back at all. Seriously, how likely is it that you leave something expensive in a taxi and expect to get it back? Normally the item is found by another passenger, or the taxi driver himself, and then disappears for good. Not this time.
Click to continue reading NYC taxi driver returns over $13,000 in lost gadgets to forgetful owner
Super Modern Mario Bros. makes Mario unrealistically realistic (video!)
Platformers,
Check out the video below for Super Modern Mario Bros. It's a novel concept dreamed up by a gamer, removing a lot of the cutesy Mario-ness, dialing up the level of seriousness by removing the music and adding realistic sound effects. It’s a bit more violent, too, with Goombas exploding and Mario crashing to his untimely demise when he leaps towards the end-of-level flag and fails to successfully grab hold and slide down. Look, just watch it--it's way more fun than reading our description!
Fortune’s “Inside Apple” article is a must-read
Fortune magazine recently published an in-depth piece on the highly secretive culture and inner workings of Apple. It's in the latest Fortune 500 issue, and isn't yet available freely online, however, you can download it from the Kindle Store for 99 cents to read it on a Kindle, PC, Mac, or any of the smartphone platforms they support (iOS, Android, etc.) For a buck, we'd consider this one a must-read. You get a lot of juicy tidbits about the company, including just how disappointed Steve Jobs was about the horribly botched launch of MobileMe:
According to a participant in the meeting, Jobs walked in, clad in his trademark black mock turtleneck and blue jeans, clasped his hands together and asked a simple question: "Can anyone tell me what MobileMe is supposed to do?" Having received a satisfactory answer, he continued, "So why the **** doesn't it do that?"
For the next half-hour Jobs berated the group. "You've tarnished Apple's reputation," he told them. "You should hate each other for having let each other down."
Harsh, but those are the actions of a man who seemingly doesn't tolerate failure, and aims to exceed expectations. In fact, he doesn't want to ever hear excuses from any of the Apple VP-level employees:
The janitor gets to explain why something went wrong. Senior people do not. "When you're the janitor," Jobs has repeatedly told incoming VPs, "reasons matter." He continues: "Somewhere between the janitor and the CEO, reasons stop mattering." That "Rubicon," he has said, "is crossed when you become a VP."
This is some good stuff, and Fortune has a lot more in the full-length article. If you're at all interested in Apple, either from a consumer standpoint, or just interest in the management style that makes them so unique, give this one a look.
Read More | Inside Apple
Is a Farmville for Dummies book really necessary?
How do you justify the existence of a book that lays out the concept and strategy to succeed in Farmville? Especially when there are sites that can help you free of charge. Co-author Kyle Orland of Farmville for Dummies sat down with Ars Technica to make an argument for putting the casual Internet game into layman's terms.
"Sure, Farmville isn't a particularly difficult game—it practically holds your hand in telling you how to play, and there's a limited amount of strategy to playing," Orland told Ars. "Some of the sections of the game—especially the farmer's market and some aspects of animal tending—require multiple, time-lapsed steps to complete successfully, and the game only gives one screen of text-heavy explanation for how to go about them."
Click to continue reading Is a Farmville for Dummies book really necessary?
Google and Twitter add tweet by phone for Egypt
Posted by Patrick Lambert
In light of the severe Internet disruptions that are happening in Egypt, where the government cut off all Internet access and SMS messages in and out of the country in an attempt to silence the 80 million people living there, we've seen a number of technologies come up to break the blockade. People have been using ham radios, satellite phones, and fax machines to make their voices heard. Now, Google and Twitter have partnered up over the weekend to create a phone-to-tweet service, where people from Egypt can make calls to an international number, and their messages will be tweeted automatically. Yet another example that proves you can't silence an entire people, and the communication always finds a way. Very cool.
Read More | Google Blog
Real-life Fallout 3 Helmet
I am known to dabble in the arts - I paint, draw, and like to make a mess with glue. However, don't ever confuse this with being in the least crafty. I cannot make anything myself, which is a great loss when Halloween comes around. Unlike me, Josh Jay probably got all the candy on the block.
Josh is the creator of this amazing Fallout 3 helmet. Why he crafted this stunningly detailed piece is unknown, but those reasons don't matter. It is awesome.
To see pictures of the step-by-step process, you can view his Facebook galleries: | 科技 |
Only 44 percent of computers entering the secondary market end up in the hands of a new owner, despite the fact that worldwide demand is greater than supply.
Mikael Ricknäs (IDG News Service) on 27 November, 2008 08:05
More used computers could be reused; only 44 percent of computers entering the secondary market end up in the hands of a new owner, despite the fact that worldwide demand for such computers is greater than supply, according to a Gartner report.Export tariffs and high transportation costs are restricting exports from mature markets to emerging markets. Environmental legislation is also making it harder for low-volume players to compete, according to Gartner.Demand is growing fastest in the Middle East, Africa and emerging markets in the Asia and Pacific region, in particular China. The largest exporters of secondary PCs are North America, Western Europe, Japan and Australia.As pressure increases on developing countries to accept used PCs as a viable technology solution for more basic computing tasks such as Internet surfing and Web e-mail, demand is likely to grow, said Meike Escherich, principal analyst at Gartner.But competition for second-hand PCs is increasing as the average selling price of new PCs falls, and as buyers increasingly prefer notebooks with the most-recent specifications, or ultralow-cost mini-notebooks.Nevertheless, "We expect that most buyers of used PCs will prefer a higher-specification A-branded PC over a basic mini-notebook," said Escherich.
In the end, business is generally good for the commercial resale of secondary PCs, and it is not uncommon for refurbished PCs to offer equal or even better margin opportunities than new PCs, according to Escherich.Resellers' success depends on their ability to get their hands on multiple PCs of the same configuration, mainly provided by large and midsize businesses and government agencies, rather than dealing with individual systems.Meanwhile, the United Nations this week released a publication titled ‘The Entrepreneur's Guide to Computer Recycling’.The UN says the purpose of the guide is to help develop the skills required to handle the “growing flux of waste generated by the market for new and used computers so as to protect the environment and public health”.Although the guide promotes responsible disposal of PCs, it also supports Gartner belief in a greater reuse of PCs.“Reuse of obsolete or unwanted IT and computer equipment is the preferred option, rather than destruction,” according to a UN statement. “It allows other users to benefit from the equipment at low cost, extends the return on energy and resources used in manufacturing the products and reduces the quantity of devices entering the waste stream.”
Tags recyclinggreen IT | 科技 |
Foveon's technology wins prestigious "Grand Award" in the Photo CategorySANTA CLARA, Calif.--(BUSINESS WIRE)--Nov. 8, 2002-- Foveon, Inc., a technology leader in high quality digital photography, announces that the new Foveon® X3(TM) image sensor has been chosen by Popular Science magazine to receive the "Best of What's New" Grand Award in the Photo Category. Popular Science editors reviewed thousands of new products and innovations for 2002 and chose just 100 winners in 10 categories for inclusion in the December "Best of What's New" issue. And out of that just one Grand Award winner is selected for each category. To win a "Best of What's New" Award, a product or technology must represent a significant step forward in its category. "It is an honor to be selected by the editors of Popular Science magazine to receive the coveted 'Best of What's New' Grand Award," said Jim Lau, Foveon's president and CEO. "This award supports our belief that the Foveon X3 image sensor technology is a fundamental breakthrough in digital imaging that will have an impact on the future of photography." In addition to the "Best of What's New" Award, the Foveon X3 image sensor has also received worldwide recognition this year along with numerous prestigious awards. These awards include: Europe's Technical Image Press Association's (TIPA) "Best Innovative Technology" in 2002 - 2003; the European Imaging and Sound Association's (EISA) "Best Photo Innovation" for 2002 - 2003; the 2002 Innovative Digital Products Award from The Digital Imaging Marketing Association (DIMA); and the German CHIP computer magazine's prestigious CeBIT Highlights 2002 Innovation Award. The Foveon X3 sensor was chosen by CHIP magazine because of its revolutionary technology that will have far reaching impact on the digital camera and video markets. About Foveon X3 Technology The new line of Foveon image sensors utilize the company's new Foveon X3 technology and are the world's first full-color image sensors that capture red, green and blue light at each and every pixel. This technology innovation results in sharper images, better color, and freedom from color artifacts common in present digital cameras. Available now in the Sigma SD9 digital camera, the Foveon X3 image sensors are the first to detect color by embedding three photodetectors in silicon at every pixel location. Within a year it is expected the Foveon X3 technology will be available for all classes of digital cameras, ranging from high-end professional systems to low-cost point-and-shoot cameras. The name X3 comes from a unique capability that the Foveon X3 technology brings - the ability to capture three colors at each single pixel location. Foveon X3 image sensors capture the full color of an image without using a color mosaic filter and without the expense, complexity, and limitations of multi-chip systems such as a 3-CCD camera or a multi-shot system. About Foveon Since its founding in 1997, the company is focused on the development of image capture products for digital cameras and image capture applications that enable higher levels of image quality and new system capabilities not possible with today's CCD technology. On February 11, 2002, Foveon introduced the world's first full-color image sensor - the Foveon X3 Pro 10M designed for digital SLR applications. The company has expanded its product line to include image sensors for smaller optical formats suitable for cell phones, point and shoot cameras and other small format camera applications. 
Foveon is a privately held company. Investors include: National Semiconductor Inc., Synaptics Inc. and New Enterprise Associates.
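The contrast with mosaic-filter sensors can be made concrete with a small Python sketch. This is not Foveon's actual processing (the array sizes and the Bayer-style layout below are illustrative assumptions only), but it shows why a mosaic sensor must later interpolate color while a layered, X3-style sensor records all three colors at every pixel.

import numpy as np

# Hypothetical 4x4 full-color scene (height x width x RGB), values in [0, 1].
scene = np.random.rand(4, 4, 3)

# Mosaic-filter sensor: each photosite keeps only one of the three colors
# (a Bayer-style pattern), so two thirds of the color data must later be
# estimated from neighboring pixels ("demosaicing").
mosaic = np.zeros((4, 4))
mosaic[0::2, 0::2] = scene[0::2, 0::2, 1]   # green photosites
mosaic[1::2, 1::2] = scene[1::2, 1::2, 1]   # green photosites
mosaic[0::2, 1::2] = scene[0::2, 1::2, 0]   # red photosites
mosaic[1::2, 0::2] = scene[1::2, 0::2, 2]   # blue photosites

# Layered sensor: every pixel location records all three colors,
# so no color interpolation step is required.
layered = scene.copy()

print(mosaic.shape)   # (4, 4)    -> one sample per pixel
print(layered.shape)  # (4, 4, 3) -> three samples per pixel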
(First posted on Friday, November 8, 2002 at 15:17 EST)