Rebecca Jeschke
Media Relations Director and Digital Rights Analyst
rebecca@eff.org | @effraj | PGP Key | +1 415 436 9333 x125

Rebecca Jeschke is EFF's Media Relations Director and a Digital Rights Analyst, fielding press requests on a broad range of issues including privacy, free speech, and intellectual property matters. Her media appearances include Fox News, CNN, NPR, USA Today, the New York Times, the Washington Post, the Associated Press, and Harper's Magazine, and she has been a presenter at South by Southwest. Before joining EFF in 2005, Rebecca worked in television and Internet news for more than ten years, including stints as an Internet producer for CBS 5 in San Francisco and as a senior supervising producer for TechTV. She has also been a travel guide editor, an English teacher in the Dominican Republic, and a worker on a "slime line" gutting fish in Alaska. Rebecca has a Bachelor of Arts in English and American Literature and Language from Harvard University.
Press Releases by Rebecca Jeschke
September 27, 2007
Justice Department Withholds Records of Stealth Campaign to Block Surveillance Suits
EFF Lawsuit Demands Information About Telecom Industry Lobbying

September 17, 2007
Two Leading Technologists Join EFF Board of Directors
Free Culture Leader John Buckman and Privacy and Security Expert Lorrie Faith Cranor Sign on to Distinguished Team
San Francisco - The Board of Directors of the Electronic Frontier Foundation (EFF) has elected two leading technologists to join its executive board: free culture leader John Buckman and privacy and security expert Lorrie Faith Cranor.

September 11, 2007
EFF Wins Protection for Security Researchers
Court Blocks DirecTV's Heavy-Handed Legal Tactics
San Francisco - In an important ruling today, the 9th U.S. Circuit Court of Appeals blocked satellite television provider DirecTV's heavy-handed legal tactics and protected security and computer science research into satellite and smart card technology after hearing argument from the Electronic Frontier Foundation (EFF).

September 10, 2007
Noted Computer Crime Attorney Comes to EFF
Jennifer Stisa Granick Named Civil Liberties Director
San Francisco - Noted computer crime attorney Jennifer Stisa Granick has joined the Electronic Frontier Foundation (EFF) as its new Civil Liberties Director, working on government surveillance, Fourth Amendment, computer security, and computer crime law.

August 29, 2007
Back to School for Reading, Writing, and RIAA Lawsuits?
EFF Releases Comprehensive Report on Recording Industry's Litigation Campaign

August 22, 2007
EFF Challenges Bogus Patent on Internet Subdomains
Illegitimate Patent Used to Threaten Website Hosting Companies
San Francisco - The Electronic Frontier Foundation (EFF) is challenging a bogus patent on Internet subdomains that has been used to threaten small businesses and innovators.

August 15, 2007
AT&T Must Face Justice for Illegal Spying
NSA Surveillance Comes Under Fire Today in Appeals Court Battle
San Francisco - In a packed San Francisco courtroom today, the Electronic Frontier Foundation (EFF) urged the 9th U.S. Circuit Court of Appeals to allow AT&T customers to continue to fight against illegal spying on their telephone and Internet communications.

August 9, 2007
Appeals Court Battle Over NSA Surveillance on August 15
Government Aims to Block Accountability for Illegal Spying on Americans
San Francisco - In the wake of Congress approving a dramatic expansion of U.S. warrantless wiretapping powers, the 9th U.S. Circuit Court of Appeals will hear arguments on the future of two critical lawsuits over illegal surveillance of Americans. The hearing is set for August 15, at 2 p.m. in San Francisco.

August 6, 2007
Online CD Seller Fights Universal's Bogus Infringement Allegations
Record Industry Takes Aim at Right of 'First Sale'
San Francisco - An eBay seller is taking on Universal Music Group (UMG) in court after the record industry giant targeted his online music sales with false claims of copyright infringement.

July 23, 2007
Thursday Hearing on Secret Orders for Domestic Spying
Justice Department Withholds Records on Electronic Surveillance
Washington, D.C. - On Thursday, July 26, at 11 a.m., the Electronic Frontier Foundation (EFF) will argue for the release of court orders that supposedly authorize the government's highly controversial electronic domestic surveillance program that intercepts and analyzes millions of Americans' communications.
Apple Finds $85M Error in Samsung Damages Ruling
By Jeff Gamet (@jgamet) | Mar 27th, 2013 11:57 AM EDT | News
A new Apple filing in its patent infringement case against Samsung claims that Judge Lucy Koh made a US$85 million error when she tossed out $450.5 million of the $1 billion in damages the iPhone and iPad maker was awarded in a ruling last year. Judge Koh ruled that the $450.5 million was improperly awarded by the jury for several Samsung Android OS-based devices and ordered a new trial to determine what the damages for those products should ultimately be.
Apple says Judge Koh tossed out $85 million too much in patent damages
Florian Mueller of FOSS Patents stated:
Interestingly, Apple claims to have found an error on Judge Koh's part to the tune of $85 million (approximately 19 percent of the total vacated amount): she thought the jury had granted, on the basis of an impermissible legal theory presented by Apple at its own peril, $40,494,356 for the Galaxy S II AT&T and $44,792,974 for the Infuse 4G, but Apple points out that Samsung's own admissions concerning the dates of first sale of these products as well as certain exhibits consistent with those admissions prove that the relevant theory -- disgorgement of profits for design patent infringement -- was permissible.
If Judge Koh agrees, the number of devices in the new damages trial will drop from 14 to 12, and the portion of the award left standing would climb to roughly $685 million. The damages for the remaining 12 devices could ultimately be higher or lower than the roughly $365.5 million left in dispute after what Apple is calling an $85 million error.
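The arithmetic behind those figures is straightforward to check against the jury's precise August 2012 award, which was about $1.049 billion before rounding. In the sketch below, the overall award and vacated amounts are the widely reported figures (not stated exactly in this article), while the two per-device amounts are the ones quoted from Apple's filing:

```python
# Back-of-the-envelope check of the article's figures (US dollars).
jury_award = 1_049_343_540          # August 2012 verdict, as widely reported
vacated    = 450_500_000            # struck by Judge Koh pending a new trial
galaxy_s2_att = 40_494_356          # Galaxy S II AT&T, per Apple's filing
infuse_4g     = 44_792_974          # Infuse 4G, per Apple's filing

claimed_error    = galaxy_s2_att + infuse_4g      # ~$85.3 million
standing_now     = jury_award - vacated           # ~$598.8 million
if_apple_wins    = standing_now + claimed_error   # ~$684.1 million ("$685 million")
left_for_retrial = vacated - claimed_error        # ~$365.2 million ("$365.5 million")

for label, amount in [("claimed error", claimed_error),
                      ("award standing if Apple prevails", if_apple_wins),
                      ("still at stake in the new trial", left_for_retrial)]:
    print(f"{label}: ${amount:,}")
```

The small gaps between these sums and the article's numbers come from rounding in the reported award and vacated amounts.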
Apple and Samsung have been fighting in courts around the world over mutual patent infringement claims related to their mobile device lineups for years. Apple scored its highest-profile win last August when a U.S. federal court jury ruled that Samsung infringed on a long list of its patents and awarded it $1 billion in damages.
Apple has been pushing to keep the legal process moving forward on the damages discrepancy, while Samsung has been hoping to stall the case until the appeals process has been exhausted.
Judge Koh hasn't responded to Apple's new filing yet.
Assuming Judge Koh agrees with Apple, the big change will be that the base damages total for Samsung's patent infringement will go up by $85 million. The final amount for the damages that were tossed out is still up in the air, so Apple's push to move forward and Samsung's efforts to stall are still business as usual.
University celebrates Princeton inventions

The annual Celebrate Princeton Invention event allows Princeton inventors to share their projects with a broader audience.
Art courtesy of Warren Powell and Belgacem Bouzaiene-Ayari
Posted December 4, 2012, 12:58 p.m., by Catherine Zandonella, Office of the Dean for Research
Princeton-led discoveries show up not only on the pages of scientific journals — they are also on the shelves of major retailers. At the Celebrate Princeton Invention reception Nov. 29, the University honored its faculty, staff and student inventors for their role in technology transfer — which includes patenting and licensing of inventions — to bring research innovations to the marketplace.
The featured inventions included Princeton technologies in various stages of development, from prototypes to finished products such as those presented by Vorbeck Materials, a company that manufactures graphene-based printable electronics using intellectual property licensed from the University.
Branko Glišić, an assistant professor of civil and environmental engineering, presents his work at Celebrate Princeton Invention on a low-cost yet accurate system for detecting defects in bridges. Glišić collaborated with Naveen Verma, an assistant professor of electrical engineering, on their invention that features sensors embedded in customizable plastic sheets that feed to integrated circuits, allowing communication with other sensors in the bridge and remote monitoring of the structure. (Photo by Frank Wojciechowski)

Celebrate Princeton Invention, now in its fourth year, attracted representatives from investment capital firms, the entrepreneurial community and technology companies to displays and a reception at the Chancellor Green Rotunda.
Early-stage innovations on display included: a sensor system for detecting cracks in bridges that could prevent collapses; an inexpensive replacement for platinum in manufacturing; a system that researchers can use to turn genes on and off; an efficient method for computer processing units to talk to each other; new materials for harnessing energy from sunlight; and a sensor that can detect poisonous gases from a distance.
The Princeton-led discovery by Paul Chirik, the Edwards S. Sanford Professor of Chemistry, and researchers including graduate student Crisita Atienza, was finding that they could replace platinum with a much more common substance — iron. (Photo by Denise Applewhite)

"These discoveries are the future of innovation in the country, and at Princeton we are committed to technology transfer and working with our partners in industry to develop University inventions so they can benefit society," said Dean for Research A.J. Stewart Smith, who is also the Class of 1909 Professor of Physics.
"At Princeton we pride ourselves on streamlining the disclosure and licensing process for our faculty researchers as well as our industrial and investment partners," said John Ritter, director of Princeton's Office of Technology Licensing.
Keynote speaker John Lettow, a member of the Class of 1995 and president of Vorbeck Materials, spoke favorably of working with Princeton to license and develop technologies invented in the laboratory of two Princeton professors of chemical and biological engineering, Ilhan Aksay and Robert Prud'homme.
"There is a unique quality to the technology licensing department here, a flexibility that was beneficial both for Princeton and for us as a small company," said Lettow. He also praised the high caliber of Princeton research and the importance that the University places on having world-renowned faculty work directly with undergraduates and graduate students.
Another factor in Vorbeck's success, said Lettow, was the continued involvement of Aksay in an advisory role as the company moved Aksay's technologies for the production of graphene, a novel carbon-based material with excellent electrical conductivity, toward commercialization.
Vorbeck's technology can be found in tamper-proof packaging used by major retailers to discourage product theft. The company creates very thin electronic circuits by printing graphene liquid (ink) onto a surface. Vorbeck has teamed with packaging manufacturer MeadWestvaco to create an electronic sensor web that is sandwiched between layers of cardboard in retail product packaging. If the item is broken into or the product removed from the package, the electronic circuit activates an on-package alarm.
The ink is one of several applications — including longer lasting batteries — that Vorbeck is developing for graphene, which has unique electrical and thermal properties. According to Lettow, Vorbeck is the first company in the world to have a product based on graphene, a novel material that is exciting interest in the electronics industry and was the subject of the 2010 Nobel Prize in physics.
For the first time this year students involved in the Princeton Entrepreneurship Club attended the event, including senior Ashley Eberhart. (Photo by Frank Wojciechowski)

Novel research ideas are in great demand, especially in a changing world, said guest speaker Ralph Izzo, chairman and chief executive officer of PSEG, the parent company of the utility group Public Service Gas and Electric (PSE&G). Referring to Hurricane Sandy, which hit the Eastern United States on Oct. 29, Izzo said that PSE&G coped with unprecedented challenges to its electrical supply and distribution system. "If this is the 'new normal,' then we need to completely reinvent the way we manage the system," he said. "I picked up a couple of new ideas while I was here tonight."

Izzo, who prefaced his comments by asking if anyone in the room was still without power a month after the storm, said that needed technologies include new methods for distributed generation of electricity, improvements in solar energy efficiency, cost-effective ways to store electricity and sensors that can detect customer outages. Prior to becoming an executive, Izzo worked as a postdoctoral researcher at the Princeton Plasma Physics Laboratory.
PSEG is a charter member of Princeton's Energy and Environment Corporate Affiliates Program, which supports research that addresses global energy needs and environmental concerns and is organized by the Andlinger Center for Energy and the Environment, the Princeton Environmental Institute, the School of Architecture, and the Woodrow Wilson School of Public and International Affairs.
At the fourth annual Celebrate Princeton Invention event held at the Chancellor Green Rotunda Nov. 29, Princeton inventors shared their work with representatives from investment capital firms, the entrepreneurial community and technology companies. (Photo by Frank Wojciechowski)

Among this year's featured technologies:

A sensor system for detecting cracks in large structures such as bridges. Methods for detecting deterioration of large structures could prevent disasters such as the 2007 collapse of the Interstate 35W bridge in Minneapolis. Princeton engineers have developed a detection system embedded in customizable plastic sheets that could be applied to large structures. The inventors are Naveen Verma, assistant professor of electrical engineering, and Branko Glišić, assistant professor of civil and environmental engineering, along with Sigurd Wagner, professor of electrical engineering, and James Sturm, the William and Edna Macaleer Professor of Engineering and Applied Science.

A new method for using inexpensive iron metal in place of expensive platinum. Iron can be used in industrial chemical reactions that are needed for making silicone, which is used in making consumer products including waterproofing for blue jeans and anti-foaming agents for beer. The technology developed by Paul Chirik, the Edwards S. Sanford Professor of Chemistry, with the assistance of Aaron Tondreau, who received his Ph.D. from Cornell University in 2011, and graduate student Crisita Atienza, could make manufacturing less expensive and reduce the environmental impact of mining for precious metals.
A system for detecting gases from a distance. Accurate measurements of greenhouse gases and smokestack exhaust are vital to our understanding of how pollution influences climate change. Developed by assistant professor of electrical engineering Gerard Wysocki, this gas-detection system could also be used to detect poisonous gases after a chemical plant accident or as part of routine monitoring of smokestack exhaust.
Novel methods for improving solar energy devices. Solar energy has the potential to reduce reliance on nonrenewable fuels, but efficiency and durability remain issues of concern. Emily Carter, the Gerhard R. Andlinger Professor in Energy and the Environment, has developed novel materials that can significantly boost the ability of solar cell materials to convert sunlight to electricity.
The Princeton Entrepreneurship Club. A new addition to the event this year was the presence of the Princeton Entrepreneurship Club, a student organization that supports entrepreneurial activities such as visits to start-ups and networking with established entrepreneurs.
A more power-efficient way for computer processing units (CPUs) to communicate with another piece of hardware known as the graphics processing unit. The technology, which could be used to improve scientific computing, data analytics and medical imaging, was presented by Daniel Lustig, a postdoctoral researcher working with Margaret Martonosi, the Hugh Trumbull Adams '35 Professor of Computer Science.
A new system to turn on single genes and turn off single proteins. This technique, which can be used by researchers to explore gene functions, was presented by David Botstein, the Anthony B. Evnin '62 Professor of Genomics, and his colleagues in the Lewis-Sigler Institute for Integrative Genomics.
The event featured a variety of notable guests from major companies and firms engaged in intellectual property law, venture capital and technology, including Lockheed Martin, Dupont, SAP and Banco Santander, parent company of Sovereign Bank. Also in attendance was New Jersey Assemblyman Jack Ciattarelli, who represents Princeton Borough and Township.
"Intellectual property is the future of innovation," said Smith. "We have to nourish it, support it, encourage it and protect it, because if it is not protected, it is no use to anybody." Patents encourage innovation, Smith added, because they provide companies with the ability to develop technologies without fear that competitors will develop a similar device or application.
The event, organized by the Office of the Dean for Research, was sponsored by a donation from the law firm of McCarter & English. Support for the research at Princeton comes primarily from federal sponsors and also foundations and corporations.
Qualcomm Signs ASIC License with NEC for WCDMA and TD-SCDMA Standards
July 15, 2002

SAN DIEGO -- July 15, 2002 -- Qualcomm Incorporated (Nasdaq: QCOM), pioneer and world leader of Code Division Multiple Access (CDMA) digital wireless technology, today announced that it has signed a license agreement for Application Specific Integrated Circuits (ASICs) with NEC Corporation. Qualcomm has granted NEC a worldwide license under Qualcomm's CDMA patent portfolio to develop, manufacture and sell third-generation (3G) WCDMA and TD-SCDMA ASIC products. Under the terms of the agreement, NEC will pay Qualcomm an up-front license fee and ongoing royalties as NEC begins selling WCDMA or TD-SCDMA ASICs. The agreement does not cover ASICs for other standards, including cdmaOne™ and CDMA2000 1X.
"This agreement with NEC represents further evidence of the strength of our patent portfolio and its applicability to WCDMA as well as TD-SCDMA," said Steve Altman, executive vice president and president of Qualcomm Technology Licensing. "We welcome NEC as a new ASIC licensee and hope that NEC's participation will help expand the global third-generation CDMA market."
"NEC has great respect for Qualcomm's CDMA patent portfolio and recognizes that it is essential for the development of third-generation CDMA products," said Hidetoshi Kosaka, general manager, Network System LSI Development Division, NEC Electron Devices. "NEC looks forward to expanding its product offering for the global 3G market with ASIC solutions, and continuing to work with Qualcomm on accelerating the growth of 3G CDMA."
NEC Corporation (Nasdaq: NIPNY) is a leading provider of Internet solutions, dedicated to meeting the specialized needs of its customers in the key computer network and electron device fields through its three market-focused in-house companies: NEC Solutions, NEC Networks and NEC Electron Devices. NEC Corporation, with its in-house companies, employs more than 140,000 people worldwide and saw net sales of approximately $39 billion in fiscal year 2001-2002. For further information, visit the NEC home page at http://www.nec.com/.
Qualcomm Incorporated (www.qualcomm.com) is a leader in developing and delivering innovative digital wireless communications products and services based on the Company's CDMA digital technology. Headquartered in San Diego, Calif., Qualcomm is included in the S&P 500 Index and traded on The Nasdaq Stock Market® under the ticker symbol QCOM.
Except for the historical information contained herein, this news release contains forward-looking statements that are subject to risks and uncertainties, including the Company's ability to successfully design and have manufactured significant quantities of CDMA components on a timely and profitable basis, the extent and speed to which CDMA is deployed, change in economic conditions of the various markets the Company serves, as well as the other risks detailed from time to time in the Company's SEC reports, including the report on Form 10-K for the year ended September 30, 2001, and most recent Form 10-Q.
Qualcomm is a registered trademark of Qualcomm Incorporated. cdmaOne is a trademark of the CDMA Development Group. All other trademarks are the property of their respective owners.
What is the density of crude oil?
Crude oil is not a uniform liquid but rather a range of liquids with differing amounts of hydrocarbons. It is divided into grades ranging from light, which has a density of around 790 kg/m³, to extra heavy, which has a density of around 970 kg/m³.
While the density of crude oil can be expressed in common scientific units, it is more often expressed in a measurement called API gravity, which describes how heavy a crude oil is compared to water. If its API gravity is greater than 10, the oil is lighter than water and will float; if it is less than 10, the oil is heavier and sinks. On this scale, light crude oil has an API gravity greater than 31.1° API, while extra heavy crude oil is below 10° API.
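API gravity follows from specific gravity (the oil's density relative to water) by the standard formula API = 141.5/SG - 131.5. A minimal sketch of the conversion; taking water as a round 1,000 kg/m³ is an approximation, since the formal definition measures both fluids at 60 °F:

```python
def api_gravity(density_kg_m3: float, water_kg_m3: float = 1000.0) -> float:
    """Convert an oil density in kg/m^3 to degrees API.

    Standard formula: API = 141.5 / SG - 131.5, where SG is the oil's
    specific gravity relative to water. Water as 1000 kg/m^3 is a
    round-number approximation of its density at 60 degrees Fahrenheit.
    """
    sg = density_kg_m3 / water_kg_m3
    return 141.5 / sg - 131.5

# The two grades quoted in the answer:
print(f"light crude (~790 kg/m^3): {api_gravity(790.0):5.1f} deg API")  # about 47.6
print(f"extra heavy (~970 kg/m^3): {api_gravity(970.0):5.1f} deg API")  # about 14.4
```

Note that 970 kg/m³ works out to roughly 14° API, comfortably above the 10° API cutoff quoted for extra heavy crude, so the density and API figures in this answer should be read as loose rules of thumb rather than an exact correspondence.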
Sources: engineeringtoolbox.com, petroleum.co.uk
Tropical and subtropical moist broadleaf forests
South America: Eastern extreme of the Amazon basin in Brazil

This region hosts a rich and varied biota with some species restricted to this area at the eastern extreme of the Amazon province. The area may well be a center of diversification for many tree taxa. In this dense evergreen rain forest there is noted heterogeneity in the biota due to the presence of many rivers and to the transitional nature of the region. It hosts elements of the moist evergreen forests of the Amazon Basin in the east and north as well as the drier vegetation of Brazil's central plateau to the south. A refugium is believed to have existed in the west of this region. As one of the most developed areas of Amazonia, the region is crossed by cities and interconnecting highways that threaten most of the habitat.
Scientific Code: NT0170
Ecoregion Category: Neotropical
Size: 74,700 square miles
Status: Critical/Endangered
Description

Location and General Description
The Tocantins-Araguaia-Maranhão moist forest is an area of dense rain forest in the eastern extreme of the Amazon Basin flanked by the mouth of the Amazon River and the Atlantic Ocean. In the state of Pará, the region extends west to the Tocantins River and south to the Mearim River in Maranhão State. A number of rivers run through the region including the Gurupi, Capim, and Guamá, which feed into the mouth of the Amazon and are influenced by the daily tides that push Amazon water upstream. The Pindaré and Mearim drain into the Atlantic Ocean. Most of the region lies on a flat alluvial plain which has been heavily influenced by the dynamics of the Amazon River over geologic history. The low hills (less than 200 m) of the Serra do Tiracambu and Serra do Gurupi rise in the southwest. Temperature averages 26° C over the year, and annual rainfall ranges from 1,500 mm in the southern area to 2,500 mm in the north near the city of Belém. A pronounced dry season brings less than 100 mm of rain falling per month for about five months. This is particularly true in northwest Maranhão. The soils are mostly poor, deeply weathered clay.
Two main forest types occur here: terra firme (upland non-flooded forest) and flooded forest, of which there are two types: igapó, flooded daily by blackwater rivers (whose waters contain no suspended solids), and várzea, flooded daily by whitewater rivers (such as the Guamá) which contain suspended organic and mineral materials. The soils of the igapó forest are white sand, acidic, and poor in nutrients. The vegetation here is especially tolerant of these conditions. The forests are less diverse than terra firme forests and shorter.
Common species in flooded forests are Caraipa grandiflora, Virola surinamensis, Euterpe oleraceae, Ficus pulchella, Mauritia martiana, Symphonia globulifera, and species of Tovomita and Clusia. Dominant families of the terra firme include Lecythidaceae, Chrysobalanaceae, Burseraceae, Fabaceae, Lauraceae, and Sapotaceae. Prominent tree species include Lecythis odora, Lecythis turbinata, Brazil nut (Bertholletia excelsa, rare in this area), Cenostigma tocantina, Bombax tocantinum, and the large liana Bauhinia bombaciflora. Mahogany (Swietenia macrophylla) is found in the Upper Capim and Guamá Rivers. Orchids are noticeably absent from this region. The important leguminous timber tree acapú, Vouacapoua americana, is restricted to eastern Amazonia north and south of the river.
Biodiversity Features
Rare or threatened trees in the area include jaborandi (Pilocarpus microphyllus), mahogany (Swietenia macrophylla), and Dicypellium caryophyllatum, a timber tree whose bark contains a pleasant-smelling essential oil. The avifauna hosts 517 bird species. Two herons are uncommon elsewhere in Amazonia (Egretta tricolor and Nyctanassa violacea). Also present are toucans (Ramphastos spp.), two guans (Pipile cujubi, Penelope pileata), many Nearctic migrants, parrots, and parakeets. Out of the 149 mammals recorded in this region, more than 80 are bats. The bearded saki (Chiropotes satanas), howler monkey (Alouatta belzebul), and red-handed tamarin (Saguinus midas) are all eastern Amazonian primates. Two species of sloths (Bradypus variegatus and Choloepus didactylus) and the nine-banded armadillo (Dasypus novemcinctus) are common here. The rivers host a great number of fish and aquatic reptiles. More than 76 species of snakes are found here.
Current Status
This is one of the most developed areas of Amazonia with several highways connecting the important cities of Belém, Paragominas, and Bragança. These big roads attract large-scale industry and development projects. At least one-third of the forests are cleared and much of this land is degraded. The region has become a mosaic of forest remnants, cattle pastures, agricultural fields, secondary forests, degraded (logged) forests, and sprawling urban areas. A great many species are rare or threatened by deforestation. The Tucuruí Dam on the Rio Tocantins below the city of Marabá has flooded 2,430 km² of low-lying forest, drowning the flora and fauna and displacing human residents. Several small protected areas are established here, and the 2,000 km² Caxiuanã National Forest protects some forest habitat. There is a small amount of frontier forest remaining here and that is very threatened by continued human-induced deforestation and land degradation.
Types and Severity of Threats
Colonization along roads and rivers, expansion of agriculture, cattle ranching, and urban sprawl threaten the natural habitats of this region. Fire is used as a tool to facilitate these activities, and it poses the most serious threat to the integrity of the forest ecosystems. Drought and erosion in pastures create an environment inhospitable to rain forest tree seedlings, therefore hindering the regeneration of forest.
Justification of Ecoregion Delineation
This interfluvial ecoregion is bound on the west and across the southwest by the Tocantins River (the Tucurui Reservoir is not considered for historic coverage) and by the Pindare River and São Marcos Gulf in the east. Linework for this ecoregion follows these rivers, and the IBGE (1993) map classification of "lowland ombrophilous dense forest", "submontane ombrophilous dense forest", and all subsequent "secondary forests and agricultural activities" within this broad classification. Northern limits follow the coastal mangroves, and southern delineation follows the transition to "seasonal deciduous forest", which we classify as cerrado. The ecoregion is isolated from similar forest by the surrounding rivers, and is justified by its endemic species.
References

Ducke, A., and G. A. Black. 1953. Phytogeographical notes on the Brazilian Amazon. Anais da Academia Brasileira de Ciências 25: 1-46.
Fundação Instituto Brasileiro de Geografia e Estatística (IBGE). 1993. Mapa de vegetação do Brasil. Map 1:5,000,000. Rio de Janeiro, Brazil.
Nepstad, D. C., P. R. Moutinho, C. Uhl, I. C. Vieira, and J. M. da Silva. 1996. The ecological importance of forest remnants in an eastern Amazonian frontier landscape. Pages 133-150 in J. Schelhas and R. Greenberg, editors, Forest Patches in Tropical Landscapes. Washington, DC: Island Press.
Pires, J. 1974. Tipos de vegetação da Amazônia. Brasil Forest 5: 48-58.
Prance, G. T. 1977. The phytogeographic subdivisions of Amazonia and their influence on the selection of biological reservers. Pages 195-212 in G.T. Prance and T.S. Elias, editors, Extinction is Forever. New York: New York Botanical Garden.
Silva, J.M. C. 1998. Um método para o estabelecimento de áreas prioritárias para a conservação na Amazônia Legal. Report prepared for WWF-Brazil. 17 pp.
Prepared by: Robin Sears
Reviewed by: In process
Published on March 13, 1997.

The European division of Softbank Interactive Marketing signed on as rep firm of CricInfo, a site for cricket fans worldwide. BackWeb Technologies announced a relationship with Microsoft Corp. to offer its broadcast channel and delivery options for Microsoft's Internet Explorer. Intermind Corp. formed alliances with online merchants and software vendors including Virtual Vineyards, DealerNet and TravelWeb, to use the company's technology as a tool for electronic commerce.

Copyright March 1997, Crain Communications Inc.
Apple's Arizona sapphire plant spurs construction of new renewable energy projects
By Shane Cole Monday, February 10, 2014, 07:57 am PT (10:57 am ET)
Apple's desire to have its Arizona sapphire plant run entirely on renewable energy from day one has spurred the construction of new solar and geothermal power projects in the region, a new report says.

The Hudson Ranch geothermal plant | Source: EnergySource
Cupertino, Calif.-based Apple is said to have negotiated directly with Arizona utility SRP to ensure the new facility could be operated without the use of fossil fuel-burning power plants, according to Bloomberg. The additional green energy capacity was one of several concessions — alongside new on-site power infrastructure, expedited permit processing, and financial incentives — made by commercial and governmental entities in the state in order to land Apple's business. "It's not like getting an extension cord and plugging it in," Mesa, Ariz. mayor Scott Smith told the publication. "These deals live and die many times before they come together."
SRP has signed several new agreements to purchase as much as 75 megawatts of energy from renewable sources in recent months, though it is unclear how much of that is related to the Apple deal. The utility has added power from a 25-megawatt geothermal facility in Beaver County, Utah as well as a 50-megawatt geothermal plant in the Imperial Valley of southern California.

Before its unveiling, the Arizona plant was codenamed Project Cascade
Those agreements are in addition to the more than 700 megawatts of green power SRP already purchases. SRP has not said exactly how much it expects the Apple facility to draw, only that it believes the factory will "add significant electrical load."
The Arizona plant will not be the first Apple facility to depend entirely on renewable energy, as the company moves toward dropping power from fossil fuels entirely. Apple's Maiden, NC datacenter is powered by two Apple-owned 20-megawatt solar power arrays located adjacent to the datacenter as well as a 10-megawatt fuel cell installation that the company says is the largest non-utility fuel cell installation anywhere in the world. Apple's existing datacenter in Newark, Calif. is served primarily by wind power, while new sites in Reno, Nev. and Prineville, Ore. will take advantage of existing locally-sourced renewable energy.
Altogether, Apple says that more than 75 percent of the power in use at corporate facilities around the world is from renewable sources.
Issue: Dec, 1928
Posted in: Aviation
Learning to Use Our Wings (Dec, 1928)
Learning to Use Our Wings
This Department Will Keep Our Readers Informed of the Latest Facts About Airplanes and Airships
CONDUCTED BY ALEXANDER KLEMIN
In charge, Daniel Guggenheim School of Aeronautics, New York City

Aviation Safety Congress.
THE Daniel Guggenheim Fund for the Promotion of Aeronautics has, since its inception, made safety in aviation the object of its main efforts. Recently the Fund organized a Congress on Safety in Aviation, arranged in co-operation with the National Safety Council, and held at the Hotel Pennsylvania in New York City. The program of the Congress is a proof of the fact that safety in flying is not a matter of one invention or method, but lies in the development of many divers aids. Thus the sessions of the Congress dealt with Structures and Materials in Relation to Aviation Safety; Airports and Airways; Aids to Navigation; Medical Aspects; Aerodynamics; Power Plants; Operation of Aircraft in Air Transport; Weather Service; The Public and Flying; and last but not least, Fire Prevention.
The sessions were attended largely by technical men, but the lay public was also well represented. In several of the sessions, members of the public would rise to ask how safe aviation really is.
Perhaps the best answer made to this question was that of Senator Hiram Bingham, the new president of the National Aeronautic Association. The Senator, who occupied an important position with the Army Air Service during the World War, and has since been a potent factor in aeronautical legislation, said that we could have any kind of safety that we wanted, just as we could have any kind of safety in going by water around Cape Cod. In a row boat, there would be considerable danger. A fishing sloop would be far from safe. An ocean liner would be perfectly safe. So in aviation, we could range from great hazard in the flying of unlicensed craft, manned by inexperienced personnel, to the safety approaching that of a railroad on the regular passenger lines operated by the great air transport companies.
The Congress achieved a great success in bringing the exact elements of aviation safety before the public and in placing squarely before the technicians the present status and the future needs of safe flying.
Experiments in Fog Flying.
FOG flying attracted due consideration at the Aviation Safety Congress. The Fund has made the conquest of fog one of its major projects. Lieutenant James H. Doolittle, of the Army Air Corps, one of the most experienced test and research pilots in the country, and winner of the Schneider Seaplane Cup in 1925, has been secured by the Fund for a series of systematic experiments in fog flying. A special plane with dual control will be placed at his disposal as well as every possible instrument or device likely to be of service in blind flying. The action of such devices will be tried out in fair weather, with one pilot closed in his cockpit. The “blind” pilot will attempt first to fly by instruments alone, then to land “blind” by the aid of special height indicators, based on electrical and acoustic principles. The experiments will then be repeated in an actual fog, under flying conditions. This work will certainly test even Doolittle’s iron nerves.
Altitude Flights.
OUR photograph shows E. T. Allen, test pilot for the Boeing Company of Seattle, Washington, clad in leather and fur, with face mask, goggles, and oxygen tube glued to his lips, just as he would appear at an altitude of 30,000 feet or so above the ground.
In military or naval aviation, altitude gives the flier a great advantage. At very great heights, he is safe from observation and from anti-aircraft guns. Swooping down from altitude on an enemy pilot flying in a lower zone is a formidable maneuver.
The single seater pursuit built by the Boeing Company, not only attains a height of 30,000 feet, but it can fly at 180 miles per hour with full military equipment. The new pursuit ship has excellent lines and is very “clean.” After tests it is to be turned over to the Navy Department.
Accident Causes.
IT is interesting to read the classification of aviation hazards presented by Mr. Ted Wright, Chief Engineer of the Curtiss Company. Twenty percent of all accidents are caused by forced landings due to engine failure. This classification includes engine failures caused by breakage or non-functioning of the engine proper, or of the auxiliary systems such as fuel, ignition, cooling, or carburation. Approximately the same percentage of failures occurs in each of the above sub-items.
Mr. Wright advocated as remedies the use of multi-engined airplanes, capable of flying with one or more power units disabled; provision of a more dependable power plant; and provision of more landing fields throughout the country.
Errors of judgment by the pilot are responsible for 53 percent of accidents. We believe that this estimate is far too high. It is sometimes very difficult to determine the causes of an accident. There is a temptation for investigating boards to blame the pilot, particularly when the pilot is dead and not able to defend his reputation. The accident may really have been due to instability, insufficient control, too high a landing speed or other deficiency in design. It may be argued that the pilot should realize these deficiencies and fly his ship so as to take account of them. That is asking perhaps too much from even skilled flyers. The remedy lies partly in better training, partly in better design of our airplanes.
Weather conditions were estimated by Mr. Wright as being responsible for 19 percent of all accidents. These include chiefly accidents due to severe storms, lightning, fog, ice formation, et cetera. More metereological stations and better weather service, particularly by the use of radio communication, will do much to remove these hazards.
Finally, 8 percent of accidents are due to structural failure. The structural design of the airplane is very advanced, which accounts for the low percentage. The worst of structural failures is that they are apt to involve fatal results, and also that they cause the greatest blow to morale. Accidents due to weather are taken somewhat for granted in all human activities. There is, on the other hand, something particularly disquieting in the report of a fatal accident due to the loosening of a wing in the air. The remedy here is in still more careful structural analysis of the airplane, in rigid inspection during manufacture, and in better maintenance.
While Mr. Wright’s paper dealt especially with structural analysis and design, it really gave a bird’s eye view of the entire problem of aircraft safety, and was all the more valuable on that account.
World’s Smallest Airship.
THE Meadowcraft Balloon and Airship Company have recently built what is the smallest airship of to-day. It is 65 feet long and 30 feet in diameter, and has a gas capacity of only 22,000 cubic feet. With a 22 horsepower Henderson motorcycle engine it attained a speed of 20 miles per hour. It has a weight empty of 800 pounds and can carry 500 pounds of useful load.
The airship is non-rigid of course, and consists of two lobes. The front lobe is the main lifting unit; the rear lobe is jointed and serves to give both directional and horizontal control. This is an entirely novel idea, and the control obtained is very powerful; according to Air Travel News, the ship can be turned completely around in a space no wider than a city street, or held almost stationary in the air.
The airship has always been regarded as a naval or military weapon and as a medium of long distance transportation. It has never been considered as of possible usefulness in aerial service, while the airplane has been put to work in dozens of industrial applications. The Meadowcraft airship with its small size and extreme maneuver- ability has possibilities for such work as aerial advertising, aerial photography and insect control. With its ability to hover over one spot, it has, for certain phases of such work, an obvious superiority over the airplane. Its employment for such purposes may mark a new phase of airship utility.
A New Type of Silencer.
THE new Loening cabin amphibian, Wasp engine powered, is a remarkable plane, whose many excellent features we hope to describe in detail at a later date. The muffler is particularly interesting. It consists of three concentric cylinders, the inner two perforated. The exhaust gases enter on one side, and therefore whirl as they expand. The innermost cylinder consists of two truncated cones, producing a Venturi effect.
The gases pass from the chamber between the outer two cylinders to the one between the inner two cylinders and then inside the inner cylinder. In the process the flame and noise of the exhaust are reduced. At the same time the Venturi effect gives suction instead of the back pressure which is customary with mufflers. As a result the Wasp engine actually turns up more revolutions with the muffler than without it.

Runways for Airports.
THERE is now a special publication devoted to the construction and operation of airports and appropriately termed Airports. A most interesting series of articles starting in a recent number is that dealing with runways.
Where the soil is such that dry weather does not make it dusty, if there is sufficient drainage to keep it from getting muddy or wet, and until the traffic gets heavy, the natural soil scraped clear of vegetation serves all purposes. But where these desirable conditions are not available and air traffic is heavy, artificial runways become necessary. Asphalt, brick and concrete each have their advocates. For each of these, firmness and uniformity, smoothness, good visibility, and durability are claimed. Experience alone will decide their relative merits. In the meantime, civil and highway engineers and manufacturers of these three materials will have many arguments about their respective merits.
A Rubber-Hydraulic Landing Gear

THE design of an airplane landing gear is quite a difficult matter. An axle between the two landing wheels may get caught by an obstacle on rough ground and tend to nose the airplane over. Therefore, the transverse axle has disappeared in most modern designs. It has evolved to a short, stub-like affair to the end of which are connected three struts, as shown in our photograph of the Stearman landing gear.
The landing loads may come on the chassis from any direction. There is always a vertical load as the wings lose lift and the plane settles on the ground. There may be a side load, if the landing is made with one wheel lower than the other, or if there is a side wind. There may be a force in a backward direction if the plane meets an obstacle of any kind on the ground. The struts, therefore, are so arranged that they can take loads in any direction.
In the Stearman landing gear, the two inclined struts are hinged at the center of the fuselage, and take up the side and backward loads. The outer strut is a shock absorbing member which compresses through several inches so as to lessen the shock. A study of the photograph will show that the wheels will swing up and out as the compression strut is shortened. The design of these compression struts has passed through many stages of evolution, and they are now generally a combination of a hydraulic system and a spring or rubber system.
For a given weight, rubber is the material which will take up the greatest energy of shock. It is so elastic that it has far better shock absorbing properties than the best of steel springs. Up to two or three years ago, rubber was used as the sole shock absorbing element. The rubber, in the form of cord or disks, took up the initial shock admirably. Unfortunately, the energy stored in the rubber is only partially destroyed during its absorption; therefore after a severe landing the airplane tends to rebound. Hydraulic shock absorption does away with this rebounding tendency.
In the Stearman “rubber-draulic” gear, the hydraulic unit consists of a piston with a small orifice operating in a cylinder full of oil. As the strut compresses and the piston travels upwards in the cylinder, the oil is forced through the narrow orifice, energy is dissipated thereby, and converted into heat without the possibility of reconversion into energy of rebound.
After the landing shock has been absorbed by the hydraulic system, taxiing loads are taken up by rubber shock absorber cord in tension. This cord is a continuous piece wound around a series of pins welded to the frame of the gear, and extended as one part of the compression strut. The part fixed to the wheel moves upward relatively to the part which is fixed to the rest of the airplane. The greater the travel in compression, the less the load actually transmitted to the fuselage of the airplane, and in this particular gear the total travel is eight inches, of which six inches is taken by the hydraulic portion. An average grade of lubricating oil is used in the cylinder, but in winter a zero oil is employed.
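Why the combination beats rubber alone can be seen in a toy model: rubber behaves as a spring that stores energy and gives it back, while oil squeezed through a small orifice resists motion with a force that grows roughly as the square of the piston speed and turns that energy into heat. Below is a minimal simulation sketch of one landing impact; every constant in it (mass, stiffness, damping, sink speed) is an invented round number for illustration, not Stearman data.

```python
# Toy model: one landing-gear strut as a mass on a rubber "spring",
# with and without a hydraulic orifice damper. All constants invented.
def simulate(damping: float, dt: float = 1e-4, t_end: float = 3.0):
    """Return (peak compression in m, whether the gear rebounded)."""
    m = 500.0        # kg of airplane weight carried by this strut
    k = 40_000.0     # N/m effective stiffness of the rubber cord
    g = 9.81         # m/s^2
    x, v = 0.0, 3.0  # compression (m) and sink speed (m/s) at touchdown
    peak, t = 0.0, 0.0
    while t < t_end:
        # Rubber pushes back in proportion to compression; the orifice
        # opposes motion with a force ~ v^2, dissipating energy as heat.
        a = g - (k * x + damping * v * abs(v)) / m
        v += a * dt
        x += v * dt
        if x < 0.0:              # strut fully re-extended: a bounce
            return peak, True
        peak = max(peak, x)
        t += dt
    return peak, False           # settled onto the gear, no rebound

for c in (0.0, 2000.0):          # rubber alone vs. rubber plus oil
    peak, bounced = simulate(c)
    print(f"damping {c:6.0f}: peak travel {peak:.2f} m, rebound: {bounced}")
```

With the damper term switched off, the stored energy comes straight back and the model bounces, which is exactly the rebound the article says pure rubber-cord gears suffered from; with the orifice term switched on, the strut settles.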
Aerology at the Airport.
NOTHING is more important for air transport operation than efficient weather service for the pilot. We are glad to see therefore how seriously the matter is treated at the Brook Park Airport, of Cleveland, Ohio. One of our photographs shows L. E. Pierce, in charge of the aerological office at this airport, on top of one of the hangars, about to launch a meteorological balloon. This is similar in appearance to a toy balloon, but larger, inflated with hydrogen, and far more buoyant. Such a balloon when properly inflated will rise at a predetermined rate, which is 550 feet a minute as a rule.
With a telescope and sextant, Pierce observes the course of the balloon in flight and by plotting its course in all three dimensions, can determine both the velocity and direction of the wind at various altitudes. Temperature, barometric pressure, condition of weather, ceiling, and visibility are other meteorological data recorded at frequent intervals at this and other stations. The data from all stations along a given route are recorded on a large board in the hangar and serve to give a pilot invaluable information before his flight.
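The geometry behind this single-station method is simple enough to sketch. If the assumed ascent rate is trusted, the balloon's height at each reading is known; the measured elevation angle then fixes its horizontal distance, and the azimuth fixes the direction, so successive fixes give the wind in each layer. A minimal sketch follows; the two angle readings are invented sample values, not Pierce's data.

```python
import math

ASCENT_FT_PER_MIN = 550.0   # the predetermined rise rate quoted above

def balloon_fix(minutes: float, elevation_deg: float, azimuth_deg: float):
    """(east, north) position in feet from one elevation/azimuth reading.

    Height is taken from the assumed ascent rate; horizontal distance
    follows from the elevation angle, direction from the azimuth.
    """
    height = ASCENT_FT_PER_MIN * minutes
    horizontal = height / math.tan(math.radians(elevation_deg))
    east = horizontal * math.sin(math.radians(azimuth_deg))
    north = horizontal * math.cos(math.radians(azimuth_deg))
    return east, north

# Two invented readings taken one minute apart:
e1, n1 = balloon_fix(1.0, elevation_deg=40.0, azimuth_deg=90.0)
e2, n2 = balloon_fix(2.0, elevation_deg=35.0, azimuth_deg=95.0)

drift_e, drift_n = e2 - e1, n2 - n1            # feet moved in one minute
speed_mph = math.hypot(drift_e, drift_n) * 60.0 / 5280.0
toward = math.degrees(math.atan2(drift_e, drift_n)) % 360.0
print(f"wind in that layer: ~{speed_mph:.0f} mph toward {toward:.0f} degrees")
```

The same bookkeeping, repeated reading after reading, is what turns one observer with a sextant and a stopwatch into the wind profile posted on the route board.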
The Goodyear “Puritan”.
IT is possible that small airships will really be the first air-borne craft to land consistently and in safety on the roofs of our cities, a feat recently accomplished by the Goodyear Pilgrim. A sister ship of the Pilgrim, the Goodyear Puritan, has a gas capacity of 86,000 cubic feet of helium, and is powered with two Ryan-Siemens five-cylinder radial engines, which can drive the ship at a maximum speed of 55 miles per hour. The engines are mounted on outriggers, so that the cabin is insulated as much as possible from their noise and vibration. The framework of the car and keel are built of duralumin girders, almost exactly like those to be incorporated in the 6,000,000 cubic foot airships now on order. The passenger cabin is built of sheet duralumin over the girder framework, and is shaped very much like a flat iron. The keel is contained inside the envelope, making the cabin an integral part of the bag.
Non-rigid airships used to have their cars suspended by a net work of cables which increased head resistance. This is avoided in semi-rigid airships, and we believe that the Puritan really belongs to the class of semi-rigid airships. One of the innovations in design is a single landing wheel projecting below the center of the cabin. This wheel, mounted on a duralumin frame, acts as a support for the ship when it is on the ground, and permits easy rolling from one place to another. Because the airship is lighter-than-air, the wheel receives very little load on the ground.
It is quite possible that airships of this type may have a future as a new kind of air-yachts. The record of the Puritan from August 6 of this year to September 15, was a series of flights on 33 days out of 45, covering 5635 miles and carrying 450 passengers, which is highly creditable. An objection to airships has been so far in the difficulties of handling. An interesting and new development in this regard is the use of a motor car specially equipped for airship service. A feature of this motorized machine shop is a portable mooring mast to assist in landing.
Elements of Aviation.
COLONEL V. E. CLARK is a rare combination of pilot, airplane designer and aerodynamicist, and his recently published book “Elements of Aviation” (Roland Press, New York City) gives proof of his breadth of view. The author knows aerodynamics, but presents the subject not as an abstract science but as a real introduction to the study of airplane design. The treatment is remarkably clear, and mathematical expressions are reduced to a minimum.
The author modestly disclaims original thought in his treatise, but his methods of presentation are both original and striking. Even an experienced designer will find much to interest him in this elementary book.
Throughout the book we find apt definitions and illustrations. We shall look forward to the companion volume on design which is to follow.
Aerial Advertising.
THE industrial applications of the airplane constantly are being increased. While sky writing has apparently suffered a temporary eclipse, aerial advertising at night is to be undertaken vigorously by Aerial Advertising, Inc., a New York City company.
A huge biplane, built by the Keystone Aircraft Corporation, with the lower wing of 15 feet greater span than the upper one, so that its total spread is 90 feet, is being equipped with an illuminated sign, 90 :feet in length and six feet, six inches in width. Small strips of bass wood, measuring two and a half inches in width are mounted on the underside of the lower wing. The illuminated letters of the advertising sign are secured to these strips.
Six wind-driven electric generators mounted on the wing provide the current for the sign. For extra reliability there are two independent lighting circuits. The tests made on Long Island have shown that these signs can easily be read from an altitude of 3000 feet.
Inventions Needed for Safety.
WHILE aviation safety is not solely a matter of invention, depending as much on training of personnel, ground organization, et cetera, still many inventions are needed. We can only list a few typical suggestions.
In the structure of the airplane, there is inevitable vibration due to the engine, which cannot be avoided, particularly when only a few cylinders are employed. Perhaps it may be possible to avoid the transmission of vibration from the engine to the rest of the plane, by the employment of rubber as a shock absorber between the engine mount and the rest of the plane.
Radio, lighting, and weather service as aids to navigation, are progressing rapidly. We have still to see a dependable altimeter which will warn the pilot of dangerous proximity to the ground in case of fog. A field localizer which will guide the pilot to the center of a field still remains to be made practical. The danger of collision will increase as the number of planes aloft increases, and some form of collision signal is needed. Television through the fog has been often suggested.
In the aerodynamic design of the plane, as we have often discussed, there is need of devices which will decrease its landing speed without affecting other desirable characteristics.
In the prevention of fire, the production of a light weight engine on the Diesel or semi-Diesel principle, burning heavy, non-inflammable fuel, will be a great boon.
There are many opportunities for engineers and inventors.
Popularizing the Science of Flight.
THE properties of the wheel, the pulley, the lever are instinctive with us through many generations of use. While the air is all pervading, it is invisible and very much of a mystery. The science of aerodynamics or air flow as applied to the airplane therefore needs popularization, and a number of authors have undertaken to write simple accounts of the theory and construction of the airplane. Such works will do a great deal of good to the cause of aviation.
They are very much harder to write than technical treatises in which mathematical language can be freely used. We have recently received an excellent book of this type, “The ABC of Flight” by W. Laurence Le Page, published by John Wiley and Sons. In simple language with clear diagrams Mr. Le Page explains the fundamental principles of flight, how stability and control are achieved, how an airplane is built, and the elements of flying instruction.
Tag archives: asteroids
Killer asteroid bust-up, exposing academic plagiarism, #IAmAPhysicist and more
Posted on May 27, 2016 4:19 pm
Time-lapse image of the asteroid Euphrosyne as seen by NASA's WISE space telescope, which is used by NEOWISE to measure asteroid sizes. (Courtesy: NASA)
By Hamish Johnston
First up in this week's Red Folder is a tale of killer asteroids, hubris and peer review from the Washington Post. The science writer Rachel Feltman has written a nice article about a claim by physicist-turned-entrepreneur Nathan Myhrvold that NASA's research on asteroids that could potentially collide with Earth is deeply flawed. On Monday, Myhrvold posted a 111-page preprint on arXiv that argues that asteroid radii measured by NASA's NEOWISE project are far less accurate than stated by NASA scientists. What's more, Myhrvold seems to suggest that NEOWISE scientists have "copied" some results from previous asteroid studies.
Myhrvold began his career as a theoretical physicist and, after a stint as Microsoft’s chief technology officer, founded an intellectual-property firm. He has never worked in the field of asteroids, yet he has taken great exception to some of the physics and statistical analysis underlying the NEOWISE results. His paper has been submitted to the journal Icarus, but has not yet passed peer review – unlike the NEOWISE results. In her article, Feltman ponders why Myhrvold is actively promoting his controversial work – he was featured in a New York Times article on Monday – before it has passed peer review. She also speaks to several NEOWISE scientists, who are not amused.
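As an aside for readers wondering how asteroid sizes are estimated at all: the standard first-order relation in the field converts an asteroid's absolute magnitude H and an assumed reflectivity (albedo) into a diameter, D(km) = (1329/sqrt(p_V)) x 10^(-H/5). NEOWISE's thermal modelling is far more sophisticated than this, but the textbook formula already shows why assumptions matter; here is a minimal Python sketch (the H value and albedos below are illustrative, not taken from the NEOWISE dataset):

    import math

    def diameter_km(h_mag, albedo):
        # Standard first-order asteroid sizing: diameter in km from
        # absolute magnitude H and an assumed geometric albedo p_V.
        return (1329.0 / math.sqrt(albedo)) * 10 ** (-h_mag / 5.0)

    # Same measured brightness (H = 15), two plausible albedo guesses:
    for p_v in (0.10, 0.25):
        print(f"p_V = {p_v}: D = {diameter_km(15.0, p_v):.2f} km")

The factor-of-1.6 spread between those two answers is exactly the kind of size uncertainty at stake in the dispute.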
Creating craters, Mexican style
Posted on May 22, 2015 1:31 pm
By Matin Durrani in Puebla, Mexico
So it's day five of the Physics World Mexican adventure and today we've been to the Benemérita Universidad Autónoma de Puebla (BUAP), which is one of the oldest universities in the country. After taking a peek at a new facility containing one of the most advanced supercomputers in Latin America, we headed over to the Institute of Physics, where we bumped into Felipe Pacheco Vázquez.
Chelyabinsk exposed: the anatomy of an asteroid impact
Posted on Nov 6, 2013 6:00 pm
Exhibit A: a 4 cm-wide meteorite created by the Chelyabinsk asteroid explosion with "shock veins" in it. (Courtesy: Science/AAAS)
By Matin Durrani
If there is one thing that will be remembered about Friday 15 February 2013, it’s that it was the day when a massive asteroid blew up above the city of Chelyabinsk in Russia – creating the largest explosion on the planet since the one that occurred over the Tunguska river in Siberia in 1908.
But whereas hardly anyone saw or recorded information about the Tunguska explosion, the Chelyabinsk asteroid blew up over a relatively densely populated region and – perhaps more importantly – its journey through the air was recorded by the numerous cameras and dashcams that nervous Russian drivers love to install on their cars. Video footage of the event was soon seen by people all over the world.
Now, based on data from those videos and visits to some 50 local villages, researchers from the Czech Republic and Canada have published a paper in the journal Science detailing the trajectory, structure and origin of what they call the “Chelyabinsk asteroidal impactor”. The paper goes live on Thursday 7 November.
To save you the trouble of reading the full article, I’ve picked out a couple of factoids that might intrigue and interest you.
Astronomers discover complex organic matter abounds in the universe
Posted on Oct 28, 2011 5:20 pm
NASA image of the star field in the constellation Ophiuchus; at the centre is the recurrent nova RS Ophiuchi (Credit: John Chumack)
By Tushna Commissariat
Complex organic compounds – one of the main markers of carbon-based life forms – have always been thought to arise from living organisms. But new research by physicists in Hong Kong, published yesterday in the journal Nature, suggests that these compounds can be synthesized in space even when no life forms are present.
Sun Kwok and Yong Zhang at the University of Hong Kong claim that a type of organic matter found throughout the universe contains complex compounds that resemble coal and petroleum – substances long thought to come only from carbonaceous living matter.
The researchers say that the organic substance contains a mixture of aromatic (ring-like) and aliphatic (chain-like) complex components. They have come to this conclusion after looking at strange infrared emissions detected in stars, interstellar space and galaxies that are commonly known as unidentified infrared emissions (UIEs). These UIE signatures are thought to arise from simple organic molecules made of carbon and hydrogen atoms – polycyclic aromatic hydrocarbon (PAH) molecules – being "pumped" by far-ultraviolet photons. But Kwok and Zhang both felt that hypothesis did not fit the bill when they considered the observational data.
As a solution, they have suggested an alternative – that the substances generating these infrared emissions have chemical structures that are much more complex. After analysing the spectra of star dust formed when stars explode, they found that stars are capable of making these complex organic compounds on extremely short timescales of weeks, and that they then eject them into interstellar space – the region between stars.
Kwok had earlier suggested that old stars could be "molecular factories" capable of producing organic compounds. "Our work has shown that stars have no problem making complex organic compounds under near-vacuum conditions," says Kwok. "Theoretically, this is impossible, but observationally we can see it happening."
Another interesting fact is that the organic star dust that Kwok and Zhang studied has a remarkable structural similarity to complex organic compounds found in meteorites. As meteorites are remnants of the early solar system, the findings raise the possibility that stars enriched our protoplanetary disc with organic compounds. The early Earth was known to have been bombarded by many comets and asteroids carrying organic star dust. Whether these organic compounds played any role in the development of life on Earth remains a mystery.
It will also be interesting to see if this finding has an impact on research groups that look for life in the universe, such as SETI, considering that complex organic molecules have always been thought to be markers of carbon-based life forms.
Earth's silent hitchhiker seen at last
Posted on Jul 28, 2011 3:31 pm
This artist's concept illustrates the first known Earth Trojan asteroid (Credit: Paul Wiegert, University of Western Ontario, Canada)
By Tushna Commissariat
Looks as if the Earth has a companion – one that has been hitching a ride along our planet's orbit for a while now. Astronomers sifting through data from NASA's Wide-field Infrared Survey Explorer (WISE) mission have discovered the first known "Trojan" asteroid orbiting the Sun along with the Earth. It has been known since 1772 that stable small bodies can share the same orbit as a planet or a moon – as long as they remain at stable points in front of or behind the main body. Such Trojan asteroids have been found orbiting Jupiter, Mars, two of Saturn's moons and Neptune, but had not been seen for the Earth until now. This is because they are difficult to detect, being relatively small and appearing near the Sun from the Earth's point of view.
Trojans circle around "Lagrange points" – gravity wells where small objects can remain relatively stable with respect to two larger bodies, in this case the Sun and the Earth. The points that the Earth's Trojan – called 2010 TK7 – orbits around are known as the L4 and L5 points, which lie 60° in front of and behind the Earth, respectively. As they constantly lead or follow in the same orbit as the planet, they can never collide with it; so you can breathe a sigh of relief if you were worried about a possible armageddon.
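The geometry behind those two points is simple enough to sketch: treating Earth's orbit as a circle, L4 and L5 each form an equilateral triangle with the Sun and the Earth, one leading and one trailing the planet by 60 degrees. A toy Python calculation (purely illustrative; the real dynamics of a Trojan such as 2010 TK7 are considerably messier):

    import math

    AU_KM = 149.6e6  # mean Sun-Earth distance in kilometres

    def l4_l5_positions(earth_angle_deg):
        # L4 leads Earth by 60 degrees along the orbit; L5 trails by 60.
        points = {}
        for name, offset in (("L4", +60.0), ("L5", -60.0)):
            theta = math.radians(earth_angle_deg + offset)
            points[name] = (AU_KM * math.cos(theta), AU_KM * math.sin(theta))
        return points

    # With Earth on the +x axis, each point sits 1 AU from the Sun
    # and 1 AU from Earth: the equilateral-triangle property.
    print(l4_l5_positions(0.0))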
“These asteroids dwell mostly in the daylight, making them very hard to see,” says Martin Connors of Athabasca University, Canada, lead author of a paper about the discovery published in Nature. “But we finally found one, because the object has an unusual orbit that takes it farther away from the Sun than is typical for Trojans. WISE was a game-changer, giving us a point of view difficult to have at the Earth’s surface.”
The WISE telescope scanned the entire sky in the infrared from January 2010 to February this year. The researchers began looking for an Earth Trojan using data from NEOWISE – a WISE mission extension that focused in part on near-Earth objects (NEOs), such as asteroids and comets. NEOs are bodies that pass within 45 million kilometres of Earth's path around the Sun. The NEOWISE project observed more than 155,000 asteroids in the main belt between Mars and Jupiter, and more than 500 NEOs, discovering 132 that were previously unknown. The team found two Trojan candidates – of these, 2010 TK7 was confirmed as an Earth Trojan after follow-up observations were made using the Canada–France–Hawaii Telescope in Hawaii.
2010 TK7 is roughly 300 metres in diameter, at a distance of about 80 million kilometres from Earth. It has an unusual orbit that traces a complex motion near the Lagrange points in the plane of the Earth's orbit, although it also moves above and below the plane. The asteroid's orbit is well defined and remains stable for at least 10,000 years. For the next 100 years, it will not come closer to the Earth than 24 million kilometres. An animation, with a Star Wars-worthy soundtrack, showing the orbit can be found below. (Image and video credit: Paul Wiegert, University of Western Ontario, Canada.)
A handful of other asteroids also have orbits similar to Earth's. Such objects could make excellent candidates for future robotic or human exploration. Unfortunately, asteroid 2010 TK7 has not been deemed worthy of exploration because it travels too far above and below the plane of Earth's orbit, and so would require a large amount of fuel to reach it.
Atlantis lifts off into history
Posted on Jul 8, 2011 5:08 pm
By Tushna Commissariat
Despite gloomy weather conditions that threatened to cancel the launch altogether, NASA's shuttle Atlantis has launched from the Kennedy Space Center. Marking the final flight of the Space Shuttle Programme – STS-135 – Atlantis and a four-person crew are on a 12-day mission to deliver more than 3.5 tonnes of supplies to the International Space Station (ISS). This final stock should keep the station running for a year. Although the countdown stopped briefly at 31 seconds before launch, the shuttle had a "flawless" lift-off, according to NASA. It has now settled down into its preliminary orbit ahead of its rendezvous with the ISS this Sunday morning.
The image above is of the shuttle, taken shortly after the rotating service structure was rolled back yesterday at Launch Pad 39A at the Kennedy Space Center in Florida (Credit: NASA/Bill Ingalls). Below is an image of the mission patch for this final iconic flight (Credit: NASA).
“The shuttle’s always going to be a reflection of what a great nation can do when it commits to be bold and follow through,” said astronaut Chris Ferguson, commander of the mission, from the cockpit of Atlantis minutes before the launch. “We’re completing a chapter of a journey that will never end. Let’s light this fire one more time, and witness this great nation at its best.”
Atlantis was the fourth orbiter built and had its maiden voyage on 3 October 1985. Atlantis had a number of firsts to its name – it was the first shuttle to deploy a probe to another planet, the first to dock to the Mir space station, and the first with a glass cockpit! It conducted a final servicing mission to the Hubble Space Telescope in May 2009.
NASA has decided to retire its shuttle programme with this last flight because the vehicles are too costly to maintain. It now intends to contract out space transport to private companies. The hope is that this will free NASA resources to invest in other programmes that will potentially send humans beyond the space station to the Moon, Mars and maybe even asteroids.
Atlantis is also carrying some rather unusual passengers – some simple yeast cells. The aim is to study the yeast cells because their genetic makeup is remarkably similar to that of a human cell. This makes them an ideal system for studying genetic defects and understanding how these defects may manifest in human disease. In two separate experiments – conducted at the ISS – researchers will study the effect of microgravity on cell growth.
The video below has the crew of Atlantis talking about the “vibrancy of the ISS as a stepping stone for NASA’s plans for future human exploration beyond low Earth orbit”.
by Timothy B. Hurst
Construction of Florida's Largest Solar Plant Begins
December 2nd, 2008 by Timothy B. Hurst [social_buttons]Florida Power & Light, the state’s biggest utility, broke ground today on what it says will be the first utility-scale solar investment in the state — and the first hybrid solar facility in the world to combine a solar-thermal field with a combined-cycle natural gas power plant.
Consisting of 180,000 mirrors spread out over 500 acres, FPL’s 75-megawatt Martin Next Generation Solar Energy Center is situated on the Atlantic coast just north of Palm Beach County.FPL’s new facility is The Martin Next Generation Solar Energy Center will use less fossil fuel when heat from the sun is available to help produce the steam needed to generate electricity. It also matches solar power with an existing combined-cycle natural gas plant, so that when the sun is not shining, the natural gas can take over the work of powering the turbines.
Florida’s Governor Charlie Crist, a big advocate of renewable energy and environmental protection, lauded the ground-breaking.
“Florida’s future growth and economic strength depends on how we address climate change, and we know we can reduce greenhouse gases by using fewer fossil fuels and more natural energy sources like solar,” said Gov. Crist. “This solar facility is a significant step in that direction.”
The plant is the first of three FPL solar facilities state regulators have approved in Florida, which, according to the utility, will make the state the second-largest solar energy producer in the country. In addition to the Martin facility, FPL will also build two other solar projects in Florida – one at NASA’s Kennedy Space Center and the other in Desoto County. These facilities will add 35 megawatts of solar photovoltaic capacity to the state. Combined, these projects help strengthen FPL Group’s position as the nation’s clean energy leader.
The Martin Next Generation Solar Energy Center will provide enough power to serve about 11,000 homes. Over 30 years, the solar facility will prevent the emissions of more than 2.75 million tons of greenhouse gases.
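Those figures invite a quick back-of-envelope check (a sketch only; the capacity factor below is our assumption, since FPL does not quote one):

    nameplate_kw = 75_000      # 75 MW solar-thermal field
    homes_served = 11_000      # FPL's figure
    capacity_factor = 0.25     # assumed average output vs. nameplate

    avg_kw_per_home = nameplate_kw * capacity_factor / homes_served
    print(f"{avg_kw_per_home:.1f} kW average per home")  # ~1.7 kW

    # The emissions claim, spread evenly over the 30-year life:
    print(f"{2.75e6 / 30:,.0f} tons of greenhouse gases avoided per year")

At roughly 1.7 kW of average output per home, the claim is in the same ballpark as a typical US household's average electricity draw, so the numbers hang together.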
Image: © Florida Power and Light
Timothy B. Hurst is the founder of ecopolitology and the executive editor at LiveOAK Media, a media network about the politics of energy and the environment, green business, cleantech, and green living. When not reading, writing, thinking or talking about environmental politics with anyone who will listen, Tim spends his time skiing in Colorado's high country, hiking with his dog, and getting dirty in his vegetable garden.
The reason that there are mandates and tax credits for wind and solar energy is that they are more expensive than natural gas and much more expensive than coal. Speculative high energy prices have ruined the economy of the world. The higher prices for solar and wind, including the credits and taxes, will also be detrimental to industry, which will be driven entirely to China, where about fifty new coal power plants are built every year. They are also building a limited number of nuclear power plants, but they take too long. Anyone who wishes to use solar energy can do it on their own, but I do not wish to pay for it through taxes. When I calculated a return on investment for the wind turbine that I bought, it was far negative, but so was every car.
Mandated Capstone cogeneration turbines for every new commercial building will save much CO2 and money for their owners. Honda cogeneration for homes is not yet ripe for Florida. ..HG..
Global Patriot
Bit by bit, the movement toward renewable energy continues, and while the progress is still moving at a slow pace, those who have the vision continue to provide momentum forward.
Peter Fairley
Hybrid power plants that combine a solar thermal field and natural gas heat to feed the same steam turbine are a great idea. But Florida isn’t where it all began. Two such hybrid plants are already under construction in Hassi R’mel, Algeria, and Beni Mathar, Morocco.
That said, given the sometimes glacial pace of developments in North Africa, Florida’s plant could be the first to fire up.
See my Carbon-Nation webjournal (http://carbonnation.info) for more on the technologies available to clean up our energy systems.
Water moccasin
From CreationWiki, the encyclopedia of creation science
Class: Reptilia
Order: Squamata
Family: Viperidae
Genus: Agkistrodon
Species: A. piscivorus
Water moccasin showing its venomous fangs
The water moccasin is a species of venomous water snake known by the scientific name Agkistrodon piscivorus. Its body is very thick and dark colored. It is usually found near swamps, rivers, or other bodies of water, and it is best known as one of the most venomous snakes in North America. The snake is very aware of its surroundings, and when it feels threatened it may simply curl up in a coil and open its mouth, calmly disappear into the water for its own protection, or attack fast and hard. It is highly advised to seek medical help if you are bitten by one of these snakes; if you are not treated properly or quickly, there is the possibility of death.[1]
The snake can be found during either night or day, though in hot climates it is most active at night. It has also been found out in winter, usually when it is sunny. The water moccasin eats many different kinds of aquatic and land prey, including amphibians, lizards, snakes, small turtles, baby alligators, mammals, birds, and, most importantly, fish. Mating season occurs in early summer, when the males become aggressive as they compete with one another to mate with females.[2]
Full body view of the water moccasin
There are several ways to tell whether what you are looking at is a water moccasin. Water moccasins are very light colored when young and darken as they mature. The head of the snake is roughly triangular. The eyes resemble those of a cat: they have a yellowish tint, and the pupils are vertical. The inside of the water moccasin's mouth is very white, and in it are its large fangs. When swimming, the water moccasin doesn't submerge its whole body like most water snakes do; it usually keeps its back out of the water.
In general, all snakes have a tiny opening behind their tongue called the glottis, which opens into the trachea. The glottis normally remains closed, forming a tall, narrow slit, and the hissing noise a snake makes is created when the glottis vibrates as the snake breathes out. The trachea itself is a very thin structure supported by cartilaginous rings; these rings are not complete, each one being closed by a slender membrane.[3]
The water moccasin is ovoviviparous. The female moccasin can have a litter of up to twenty, though that is very rare; the usual litter is around six to eight. Neonates are usually 22-35 cm long. If the weather conditions are good and food can be easily found, a water moccasin grows quickly, and females may reproduce at less than three years of age and a total length as small as 60 cm. Water moccasin babies are born in August or September, and mating can happen during any of the warmer months of the year.
Water moccasin slithering in the water
The water moccasin can be found in the southeastern part of the United States of America, and is most common in Florida. It can be found in the following states: Alabama, Arkansas, Florida, Georgia, Illinois, Indiana, Kentucky, Louisiana, Mississippi, Missouri, North Carolina, Oklahoma, South Carolina, Texas, and Virginia.
The water moccasin lives in swampy areas rather than large, deep waters. It can be found in shallow waters such as lakes, swamps, rivers, canals, and small, clear, rocky mountain streams. It has occasionally been found swimming in saltwater, though this is less likely. The water moccasin can adapt better to its environment in a drier climate; the less moist the environment, the better.[4]
The water moccasin is one of the most deadly venomous snakes in North America. These snakes are very aggressive when they think they are about to be attacked, and they inject their venom in much the same way a rattlesnake does. The venom works in stages: its proteolytic enzymes break down and destroy tissue, and the venom can also send the victim into sudden shock.
About 5,000 snake bites are reported each year in the United States, and roughly 42% of those reports are due to water moccasin bites. Studies show that more bites come from male snakes than from females, probably due to the testosterone the males possess.
If you are bitten, there are some signs indicating whether venom was injected: pain around the bite, as well as swelling, nausea, vomiting, or diarrhea. Fang marks might not show on a victim of a water moccasin bite. The treatment for a wound depends on how much venom has entered your system, and there is not a lot of medication that you can take. Recovery can take up to six weeks.[5]
Cottonmouth / Water Moccasin (Agkistrodon piscivorus) – Venomous, by Kimberly Andrews, University of Georgia; edited by J.D. Willson, University of Georgia's Savannah River Ecology Laboratory. Accessed March 1, 2011.
Poisonous Water Moccasins, by Alexandra Hubbard; edited by Chitika. Accessed March 5, 2011.
Envenomation, by Sean P. Bush, MD, FACEP; edited by Eric J. Lavonas. Updated July 24, 2008.
IBM Releases "A Boy and His Atom," Guinness World Record World's Smallest Movie
Bored nerds create some of the most entertaining and amazing stuff out there, but when those nerds are nanophysicists, they screw around with things at the atomic level. And when those nanophysicists work for IBM, they get paid to make things like the world’s smallest movie.
If you don’t catch what’s happening in this one-minute film, it’s a stop-motion animated feature about a boy playing with an atom. It’s rather 8-bit looking, and the boy is kind of a sloppy stick figure, but you can forgive all of that when you realize that they made the film by manipulating individual atoms and magnifying them 100 million times so we can see them.
Thus, the “atom” the boy is playing with is an actual atom. The folks in the lab used a scanning tunneling microscope to manipulate the atoms for the video.
The whole thing is a stunt to get people reading about IBM's work in atomic-scale memory. (Congratulations IBM, you have our attention.) The company says it has developed the ability to reduce the number of atoms it takes to store one bit of data on a computer from 100 million atoms to 12 atoms. If that makes your brain hurt, that's because it should.
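The arithmetic behind that claim is worth a moment (a quick sketch; the 100-million-atom figure for conventional storage is the round number IBM quotes):

    conventional_atoms_per_bit = 100e6   # today's magnetic storage, roughly
    ibm_atoms_per_bit = 12               # IBM's experimental bit

    print(f"~{conventional_atoms_per_bit / ibm_atoms_per_bit:,.0f}x fewer atoms per bit")

    # Atoms needed for one terabyte (8 x 10^12 bits) at 12 atoms per bit:
    print(f"{8e12 * ibm_atoms_per_bit:.1e} atoms")  # ~9.6e13

That works out to a density gain of more than eight million times, and a terabyte's worth of 12-atom bits would amount to only nanograms of material, assuming iron atoms like those used in IBM's experiment.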
Also, the soundtrack is really cool.
Via: IBM
Ernest Moniz, Secretary of the U.S. Department of Energy, speaks at the November 2013 Tribal Nations Conference. (Photo: Tami Heilemann/DOI/Flickr)
Nine Tribes to Receive $7 Million From Department of Energy for Wind, Biomass, Solar Projects
ICTMN Staff | 11/14/13
Nine tribes will receive a total of more than $7 million from the U.S. Department of Energy (DOE) for clean-energy projects, the agency announced on November 14.
The Coeur d'Alene Tribe in Idaho, the Gwichyaa Zhee Gwich’in Tribal Government in Fort Yukon, Alaska, the Forest County Potawatomi Community in Milwaukee, Menominee Tribal Enterprises in Wisconsin, the Seneca Nation of Indians in Irving, New York, the Southern Ute Indian Tribe Growth Fund in Ignacio, Colorado, the Tonto Apache Tribe of Payson, Arizona, the White Earth Reservation Tribal Council in Minnesota and the Winnebago Tribe of Nebraska will use their respective funds to develop a variety of alternative energy sources involving wind, biomass and solar power.
The DOE highlighted the awards during the 2013 White House Tribal Nations Conference as a way to help American Indian and Alaska Native tribes use clean energy to save money, increase energy security and promote economic development.
RELATED: Native Leaders Air Concerns at White House Tribal Nations Conference
"Today, we are very pleased to announce that nine tribes have been selected to receive over $7 million to further deploy clean energy projects," Energy Secretary Ernest Moniz said in his remarks before the conference. "A couple of examples in those awards: wind power for tribal government buildings at Seneca Nation in New York, energy efficiency upgrades to reduce energy use by 40 percent in Alaska. There are nine tribes that will have these efficiencies. And that addresses this question of mitigation, reducing carbon pollution."
“American Indian and Alaska Native tribes host a wide range of untapped energy resources that can help build a sustainable energy future for their local communities,” said Energy Secretary Ernest Moniz in a statement announcing the awards. “Responsible development of these clean energy resources will help cut energy waste and fight the harmful effects of carbon pollution—strengthening energy security of tribal nations throughout the country.”
In remarks at the Tribal Nations Conference, Moniz said the government planned to work more closely with American Indians on developing energy sources.
“We are looking forward to establishing and advancing a subgroup of the White House Council on Native American Affairs, to really focus on energy development, energy deployment in Indian country,” he said. “I think, working together, with us and agriculture, EPA and other cabinet colleagues, we really want to harness the energy potential in Indian country—conventional energy, renewable energy—to expedite clean energy deployment and electrification. That is something that we will get together on and try to advance promptly.”
While Indian country officially takes up just two percent of the land known as the United States, that territory holds a good five percent of all U.S. renewable energy resources, the DOE noted.
The grants are part of an ongoing push to invest in tribal clean energy projects that began in 2002. The DOE’s Tribal Energy Program has put about $42 million into 175 such projects, providing financial and technical assistance as well along with its Office of Indian Energy Policy and Programs. Other grants were announced earlier this year to other tribes.
RELATED: Energy Department To Pump $7 Million Into Tribal Clean Energy Projects
The initiative also includes technical assistance.
RELATED: Ten Tribes Receive Department of Energy Clean-Energy Technical Assistance
Moniz said the DOE intends to continue and expand on these efforts.
“From community solar projects in New Mexico and Colorado, to the commercial scale wind projects in Maine, small biomass projects in Wisconsin, DOE is working with 20 tribes and Alaskan Native villages to empower leaders with tools and resources needed to lead energy development that can foster self-sufficiency, sustainability, and economic growth,” he told the tribal leaders at the conference. “At the Department of Energy I have certainly made it a priority to raise our game with state, local governments, tribes. We believe, in the end, a national policy needs to build from tribal, state, local, and regional policies and activity.”
MERI announces lecture series
The Marine Environmental Research Institute announced its 2014 environmental lecture series: "The Healing Edge in a Changing World," a program that will explore the critical environmental issues impacting each of us, as well as potential solutions for healing our environment and ourselves.
The series will open with Toxic Hot Seat, an award-winning documentary about a 40-year campaign of deception that has left a toxic legacy of flame retardants in our homes and bodies, according to MERI’s news release. From changes in our weather patterns and in the bay around us to the truth about chemical exposure and cancer, the series is an exploration of some critical health issues of our times.
Lectures are open to the public and will begin June 27 and conclude September 24. With the exception of the final talk, all lectures are held at the Center For Marine Studies at 55 Main Street in Blue Hill and start at 6 p.m., preceded by a speaker’s reception at 5:30 p.m. There is no charge for admission, but seating is limited to 100 people.
After the June 27 screening of Toxic Hot Seat, Dr. Susan Shaw, director/founder of the Marine Environmental Research Institute, will moderate an expert panel discussion with Mike Belliveau, executive director of the Environmental Health Strategy Center, and Ronnie Green, a Portland firefighter.
For more information, visit MERI online at meriresearch.org, email info@meriresearch.org, or call 374-2135. | 科技 |
Failure To Launch: SpaceX Delays Mission
By editor
Originally published on May 19, 2012 10:19 am
Transcript
SCOTT SIMON, HOST: This is WEEKEND EDITION from NPR News. I'm Scott Simon. A tall white rocket is still standing on a launch pad at Cape Canaveral in Florida. The rocket belongs to a company called SpaceX, and it was supposed to blast off this morning, send an unmanned capsule on a mission to the International Space Station - the first time a private spacecraft will try to visit the station. But the launch attempt fizzled out this morning in the last seconds of the countdown.
UNIDENTIFIED MAN: Five, four, three, two, one, zero and lift-off. We've had a cutoff. Lift-off did not occur.
SIMON: NPR's Nell Greenfieldboyce has been following this mission. Nell, thanks for being with us.
NELL GREENFIELDBOYCE, BYLINE: Thank you.
SIMON: They know what happened?
GREENFIELDBOYCE: Well, yes and no. SpaceX says there was, quote, "slightly high combustion chamber pressure on Engine 5." So, basically that means the computer sensed that something wasn't quite right and shut the whole thing down with just half a second to go in the countdown. So, it was dramatic because here you had this white rocket standing in the predawn darkness, and then fire appeared beneath and we were all ready. And then - nothing. There was just this cloud of smoke and the rocket was still standing there. So, SpaceX says it's looking at the data and will send technicians out to the launch pad to inspect the engine. They could swap out that engine if it was necessary, and they have additional launch opportunities on Tuesday and Wednesday.
SIMON: To the past couple of generations who've grown up on NASA launches - the space shuttle and before that - it always seemed that they had wiggle room, that if a mission had to be scrubbed, there were still a few minutes or a few hours in which, with certain repairs, they might get it off anyway or try the next day. Why this kind of delay?
GREENFIELDBOYCE: Well, for this launch, they had what was called a near instantaneous launch window. So, that means they basically had one instant when they could take off and if they missed it then they were out of luck for today. And the reason for that is that they have to go when things are lining up with the space station so that they can take a very direct route to the station, and that's to save fuel. Because on this particular mission, they're going to need a lot of fuel. The idea is that this unmanned capsule is going to carry food and other supplies. It's loaded with all sorts of stuff for the space station. But before it can actually deliver this stuff, it has to fly around a lot in space and do a bunch of maneuvers and execute them all perfectly so that NASA can be assured that everything's working and that it's safe for this spacecraft to get close to the station. You know, you don't just fly up casually to the International Space Station. It's a $100 billion asset, plus there's people living onboard. So, NASA wants to make sure there is no chance of any collision.
SIMON: Nell, if they're able to get this off the ground and to dock it aboard the International Space Station, help us understand the significance that would represent.
GREENFIELDBOYCE: It would be historic. So far, only government space agencies from the U.S., Japan, Russia and Europe have ever sent vehicles to the station. So, if they could do this, it would be a huge milestone for commercial space flight.
And it would be a big part of NASA's push to get private space taxis going back and forth to the station so that NASA can focus on deep space exploration now that the space shuttles are going to museums.
SIMON: There's only cargo in this mission, but could this craft be modified to carry people?
GREENFIELDBOYCE: That's the idea. That's what SpaceX wants to do. In fact, this capsule was designed from the beginning with people in mind. And the founder of SpaceX, Elon Musk, has said that if someone snuck onboard with an oxygen tank, they'd be OK. Basically, they need to modify it. And the main thing is building in a launch abort system, so if there was any problem during the launch with the rocket, the crew in the capsule could get away. And SpaceX says that they want to start carrying people in just a few years, starting in 2015.
SIMON: NPR's Nell Greenfieldboyce. Thanks very much.
GREENFIELDBOYCE: Thank you.
Transcript provided by NPR, Copyright NPR.
WATCH: NASA Spots Brightest Lunar Explosion Ever Recorded
By editor
May 18, 2013
NASA's lunar monitoring program has detected hundreds of meteoroid impacts. The brightest, detected on March 17, 2013, in Mare Imbrium, is marked by the red square. (Credit: NASA)
NASA scientists say they witnessed an extremely bright lunar explosion this past March. In fact, it is the biggest explosion they've seen since they started keeping track of such events in 2005.
"On March 17, 2013, an object about the size of a small boulder hit the lunar surface in Mare Imbrium," Bill Cooke, of NASA's Meteoroid Environment Office, said in a press release. "It exploded in a flash nearly 10 times as bright as anything we've ever seen before."
What's cool is if you had been looking at the moon at just the right time, you would have seen a one-second flash caused by the impact of a nearly 90-pound meteoroid that was traveling at 56,000 mph. The impact was picked up by one of the Meteoroid Environment Office's 14-inch telescopes.
One intriguing question is how a meteoroid can cause an explosion on the Moon, which has no oxygen atmosphere. NASA explains: "Lunar meteors don't require oxygen or combustion to make themselves visible. They hit the ground with so much kinetic energy that even a pebble can make a crater several feet wide. The flash of light comes not from combustion but rather from the thermal glow of molten rock and hot vapors at the impact site."
Since NASA started keeping tabs on lunar strikes, it has counted more than 300 of them. They hope keeping track of these events will help them make decisions during long-term lunar missions. "Is it safe to go on a moonwalk, or not?" NASA asks. "The middle of March might be a good time to stay inside."
We'll leave you with a graphic that shows all of the strikes the NASA program has recorded. The red square marks the spot of the March 17 impact.
Copyright 2014 NPR. To see more, visit http://www.npr.org/.
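NASA's kinetic-energy point is easy to check against the figures quoted above; a back-of-envelope sketch (unit conversions only, not NASA's actual analysis):

    mass_kg = 90 * 0.4536           # the "nearly 90 pound" meteoroid
    speed_m_s = 56_000 * 0.44704    # 56,000 mph in metres per second

    energy_joules = 0.5 * mass_kg * speed_m_s ** 2
    print(f"{energy_joules:.2e} J, ~{energy_joules / 4.184e9:.1f} tons of TNT")

Call it on the order of three tons of TNT from a boulder a person could almost lift, which is why no oxygen or combustion is needed to produce a visible flash.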
This Week’s Night Sky: See Moon Pose with Mars and Saturn
Catch the lineup and look for the glow of the zodiacal lights.
The gas giant planet Saturn with its iconic rings is seen here in a view from the orbiting Cassini spacecraft.
Photograph by NASA, JPL, Space Science Institute
Zodiacal Lights. For two weeks beginning Monday, March 28, sky-watchers who can make it out of the city and its bright lights will get a chance to see a glow in the night sky called the Zodiacal lights.
With the waning moon dropping out of the sky by early evenings, this is the best chance for Northern Hemisphere observers to see the zodiacal lights, which are some of the most challenging nighttime lights to catch.
This ethereal glow is caused by sunlight reflecting off countless dust particles scattered along the plane of the solar system, between the planets. If you can get into the dark countryside, look for a pyramid-shaped glow—fainter than the Milky Way—rising above the western horizon after sunset.
Before dawn on Tuesday, look for the moon to be teamed up with Saturn, Mars, and the bright orange star Antares in the constellation Scorpius. Skychart by A.Fazekas, SkySafari
Moon March. Before dawn on Tuesday and Wednesday, March 29 and 30, the moon will be gliding past the super-bright star-like trio of Mars, Saturn, and Antares in the constellation Scorpius in the southern sky.
First up on Tuesday, the waning gibbous moon will be parked to the upper right of golden Saturn. The two will be only four degrees apart—less than the width of your fist held at arm's length. Train even the smallest of backyard telescopes on Saturn and you can get an eyeful of its famous rings and a few of the brightest of its 62 moons. But it turns out that these iconic jewels may have been relatively recently acquired.
Just last week a new theory from the SETI Institute and the Southwest Research Institute (SwRI) suggests that while the planet is about 3.5-4 billion years of age, many of its moons and majestic rings may be no older than 100 million years, around the age of the dinosaurs. Supercomputer models show that the current arrangement of large inner moons and rings was possibly formed sometime during Earth's Cretaceous Period, 66 to 145 million years ago, by some cataclysmic event that pulverized a primordial set of moons that circled the gas giant. And it's from those remnants that many of today's moons and the entire set of rings formed.
By Wednesday morning, the moon will form a dramatic lineup with both Saturn and Mars.
Skychart by A.Fazekas, SkySafari
Lunar Lineup. By Wednesday morning, you will notice that the moon has slid down to the lower left of Saturn, forming a straight line-up that includes ruddy Mars to the ringed planet’s right.
Just below the lineup is the 600 light-year distant orange star Antares, which takes its name from the Greek for "rival of Mars" due to its similar appearance.
After darkness falls this week, look to the western sky for the distinct and bright Orion constellation and its belt of three stars.
Sinking Orion. As springtime begins, traditional winter constellations are beginning to set in the west and fade in the bright evening twilight. So with the moon out of the night sky, Saturday, April 2, will make for a great opportunity to catch sight of the brightest winter constellation Orion, the hunter. Orion’s star-studded figure is probably the most recognizable pattern in the heavens and is easy to find even in light-polluted cities simply by identifying his belt: a straight line of three bright stars. Surrounding the stellar belt are four stars marking the shoulders and knees of the giant. This constellation is found about due west around mid-evening at this time of year.
Clear skies!
Follow Andrew Fazekas, the Night Sky Guy, on Twitter, Facebook, and his website.
AgTech Startup Centaur Raises $1.3M In Round Led By OurCrowd First
September 28, 2016 | Centaur Analytics, a Greece-based AgTech IoT startup that provides real-time stored agriproducts monitoring and protection solutions, announced today it has raised $1.3 million during a recent funding round, which was led by Israeli seed-stage venture fund OurCrowd First. The company stated the funds will be used to accelerate the commercialization of its proprietary wireless sensors and cognitive services platform for monitoring and protecting post-harvest product quality.

Israeli Venture Capital Firm Aleph Raises $180M
September 27, 2016 | Israeli venture capital firm Aleph has raised $180 million for its second fund, Aleph II. Founded in 2013 by Michael Eisenberg and Eden Shochat, Aleph has invested in startups such as WeWork, Meerkat and JoyTunes.

Samsung Global Innovation Center Opens Tel Aviv Branch
September 26, 2016 | Samsung GIC, the Global Innovation Center, part of Samsung Electronics, announced the opening of Samsung NEXT Tel Aviv, its first branch outside of the US. The announcement was made by Eyal Miller, Managing Director and CEO of Samsung NEXT Tel Aviv, during a press event at the organization's new offices in Sarona, Tel Aviv. Samsung GIC empowers entrepreneurs and startups to build, grow and scale advanced software and services. Headquartered in California's Silicon Valley with additional locations in New York, South Korea and now Tel Aviv, Samsung GIC works with startups at every stage of development through incubation, investment from Seed to Series B, acquisition and partnership. The organization is spearheaded by David Eun, formerly President of AOL's media division and Global Head of Content Partnerships at Google.

Israel Wins World Baseball Classic Qualifier
September 26, 2016 | Team Israel has won the World Baseball Classic qualifying tournament in Brooklyn, crushing Great Britain 9-1 on Sunday night. Five Israel pitchers combined to limit Britain to four hits, and three members of the blue and white hit the first home runs of the tournament over the wall to lead Israel to victory, and a place in the WBC main tournament for the first time. Israel was a perfect 3-0 in the tournament, having beaten the British side 5-2 last Thursday and Brazil 1-0 on Friday in a pitcher's duel to move into the final. The team now moves on to the 16-team tournament, which will begin in South Korea in March.

Sequoia Capital Discontinues Israel Fund
September 26, 2016 | According to a report in Israel's Calcalist, after leading five funds in Israel and exits of hundreds of millions of dollars, Sequoia Israel will make new investments only through its American fund, Sequoia Capital. A similar step was taken by Benchmark and Greylock Partners, who are no longer active in Israel. According to estimates, the reason for the decision to close the Israeli fund is primarily the partners' desire to give Israeli companies better access to the American fund.

Israeli Startup Anodot Raises $8M
September 25, 2016 | Israeli anomaly detection and real-time business analytics company Anodot announced an $8 million financing round led by Aleph Venture Capital and with participation by Disruptive Technologies L.P., bringing Anodot's total funding to $12.5 million.
In the nine months since Anodot's launch, dozens of customers, including several Fortune 500 companies, have already implemented the product to prevent crises and drive revenues. Ra'anana-based Anodot was founded in 2014 by CEO David Drai, VP R&D Shay Lang, and Chief Data Scientist Dr. Ira Cohen.
Group Messaging Manager Mobilize Raises $6.5M
September 22, 2016 | Israeli network relationship management platform developer Mobilize announced it has closed a $6.5 million Series A financing round and launched the self-service version of its product. The investment was led by Trinity Ventures with the participation of Floodgate Ventures, Hillsven Capital, Array Ventures, and Upwest Labs, in addition to SaaS angels such as Eoghan McCabe, CEO of Intercom.io, and Sanjay Subhedar. Mobilize was founded by Sharon Savariego and Arthur Vainer and has offices in San Francisco and Tel Aviv.

Used Car Marketplace Vroom Raises $50M
September 21, 2016 | American-Israeli online used-car marketplace Vroom has raised $50 million in a funding round from existing and new investors, bringing its total funding to $218 million. Founded in 2013 and headed by CEO Allon Bloch, Vroom will use the funding to help expand its operations in the United States and increase inventory. The company currently has an inventory of more than 2,300 vehicles, according to its website. Vroom expects to generate more than $1 billion of revenue in 2016, up from $900 million last year. The latest round was led by existing investor T Rowe Price Associates Inc, and added technology-focused investment firm Altimeter Capital and Foxhaven Asset Management as investors.
BlazeMeter Bought By CA Technologies For A Reported $100M
September 21, 2016 | Israeli company BlazeMeter, which produces performance testing tools for DevOps, has been acquired by CA Technologies. Terms of the agreement were not disclosed, although media reports suggest the acquisition was for about $100 million. The transaction is expected to close by the end of the year. CA Technologies said that the acquisition of BlazeMeter, which has offices in Tel Aviv and Palo Alto, would enable it to extend its DevOps portfolio. BlazeMeter will integrate with CA's continuous delivery solutions to improve testing efficiency and accelerate the deployment of applications. BlazeMeter, which was founded in 2011 by CEO Alon Girmonsky, raised $6.5 million in a series A financing round in 2014 from Glilot Capital Partners and YL Ventures GP Ltd.

Crowd-Funding Firm OurCrowd Raises $72M
September 21, 2016 | Israeli crowd-funding firm OurCrowd raised $72 million from financial institutions and private investors from around the globe. So far, the crowdfunding platform has invested $300 million in 100 portfolio companies and funds. Some of its portfolio startups were sold or went public, including: ReWalk, which is traded on Nasdaq; Crosswise (bought by Oracle); Replay Technologies (bought by Intel); and NextPeer (bought by Viber). Founded in 2013 by Jon Medved, OurCrowd's community includes nearly 15,000 investors from 110 countries.

Teva Buys Rights To Chronic Pain Drug For $250M
September 21, 2016 | Israeli pharmaceutical giant Teva Pharmaceutical and Regeneron Pharmaceuticals, Inc. announced an agreement to develop and commercialize pain drug fasinumab. This is Regeneron's investigational NGF antibody in Phase III clinical development for osteoarthritis pain and in Phase II development for chronic lower back pain. Under the terms of the agreement, Teva will pay Regeneron $250 million upfront and share equally in the global commercial value (excluding Japan, Korea and other Asian countries covered by a previous collaboration), as well as ongoing R&D costs of about $1 billion.

Curve Raises $3M To Create All-In-One Credit Card
September 20, 2016 | FinTech startup Curve has raised $3 million, bringing its total funds to $5 million. Founded in 2015 by Israeli Shachar Bialick, along with Tom Foster-Carter and Anna Mostyn-Williams, the startup develops a credit card that aggregates all your bank cards and will be synced with a mobile app.

Israeli FinTech Startup Tipalti Raises $14M
September 18, 2016 | Israeli FinTech startup Tipalti has raised $14 million in a funding round led by venture capital firm SGVC. A provider of payments automation solutions, Tipalti was founded in 2011 by Chen Amit and Oren Zeev. It has so far raised $27 million.

Video Analytics Company 'Agent Vi' Raises $6M
September 15, 2016 | Chinese corporation Kuang-Chi Group has announced it will invest $4.3 million in Agent Video Intelligence (Agent Vi), an Israeli video analytics company, as part of the startup's $6 million financing round. Founded in 2008 by Gadi Talmon and Zvi Ashani, the company has so far raised $20 million.

Digital Testing Company Applause Raises $35M
September 14, 2016 | Digital testing company Applause has raised $35 million in a Series F financing round, led by Direct Equity Partners, an investment program managed by Credit Suisse, with the participation of Accenture.
This round brings Applause's total funding-to-date to more than $115 million. The round had full participation from all of Applause's previous investors, including Goldman Sachs's Merchant Banking Division, QuestMark Partners, Scale Venture Partners, Longworth Venture Partners, Mesco Ltd and MassVentures. The company's most recent funding was a $43 million Series E round in January 2014. Applause is an Israeli company, originally launched in 2008 as uTest, with global headquarters near Boston, Massachusetts, and a development center in Herzliya.
VW, Ex-Shin Bet Head Form Cyber-Security Startup
September 14, 2016 | Car maker Volkswagen is partnering with former Shin Bet (Israel Security Agency) head Yuval Diskin to form a new cyber-security company called CyMotive Technologies. Other former Shin Bet executives involved in the joint venture, which will be based in Herzliya, Israel, are Tsafrir Kats and Dr. Tamir Bechor. The new company will develop products and solutions for the next generation of cars. "We see CyMotive as a long-term investment, with the purpose of making the vehicle and its surroundings safer," Dr. Volkmar Tanneberger of Volkswagen said in a statement.

Israeli Online Marketing Startup Optimove Raises $20M
September 14, 2016 | Israeli startup Optimove has raised $20 million from Israeli venture capital firm IGP – Israeli Growth Partners. Optimove provides online marketing technologies to some 200 companies. The company was founded by Pini Yakuel in 2009; this is its first financing round.

Network Security Startup Cato Networks Raises $30M
September 13, 2016 | Israeli network security startup Cato Networks has raised $30 million in a series B funding round. The financing was led by Greylock Partners with participation from Singtel Innov8 and existing investors U.S. Venture Partners (USVP), Aspect Ventures and the company's founders, Shlomo Kramer and Gur Shatz. The funding will allow Cato Networks to offer its cloud-based network security as a service solution, the Cato Cloud, to the global market, bringing the cloud's transformative power to networking and security. Today's financing is the Tel Aviv-based company's second after closing a $20 million series A round in June 2015, shortly after it was founded.

Cyber Security Startup Claroty Raises $32M
September 13, 2016 | Israeli cyber security company Claroty has come out of stealth mode and announced it has completed a financing round of $32 million. The startup also announced that it is launching its protection platform for industrial networks. Established in 2014 as part of Israeli cyber security foundry Team8, Claroty's backers include Bessemer Venture Partners, Eric Schmidt's Innovation Endeavors, Marker, ICV, Red Dot Capital Partners, and Mitsui & Co., Ltd. The company enters the market as the most substantially funded cyber security startup focused on protecting industrial control systems, and with one of the deepest teams in OT security from organizations including Siemens, IBM, Waterfall Security, Palo Alto, iSIGHT Partners (FireEye), ICS2, and Industrial Defender. With offices in Tel Aviv and New York, Claroty was founded by CEO Amir Zilberstein, chief business development officer Galina Antova, and CTO Benny Porat.

Cloud Company CTERA Networks Raises $25M
September 13, 2016 | Israeli cloud company CTERA Networks announced a $25 million investment round led by Bessemer Venture Partners with participation from Cisco, and new investor Vintage Investment Partners. Based in Petach Tikva and New York, CTERA, which was founded in 2008 by CEO Liran Eshel and VP R&D Zohar Kaufman, has raised $70 million to date including the latest financing round. The funds will be used to fuel sales and marketing initiatives and accelerate global customer acquisition as the CTERA Enterprise File Services Platform becomes a gold standard for file storage, collaboration and data protection among secure and distributed enterprise organizations.
Israeli Startup Datorama Raises $32M
September 13, 2016 | Israeli marketing analytics startup Datorama has raised $32 million in a funding round led by Lightspeed Venture Partners. Marker LLC and Eric Schmidt's Innovation Endeavors also participated in the round, which brings Datorama's total funding to $50 million. Datorama was founded in 2012 by Ran Sarig, Efi Cohen and Katrin Ribant.

Report: Startup Creditplace Raises $6M
September 12, 2016 | Israeli investment platform Creditplace has raised $6 million, according to a Globes report. The startup enables investors to buy invoices of vendors awaiting payment. Headed by CEO Yaron Saban, the Tel Aviv-based company was founded by Dekel Golan and Serge Aziza.

Columbia University Honors Israeli Epigenetics Pioneers
September 11, 2016 | Columbia University's top honor for achievement in biological and biochemical research has been awarded to two researchers from the Hebrew University for work that led to the new field of epigenetics and yielded insights into how cells and embryos develop. The 2016 Louisa Gross Horwitz Prize was presented to Prof. Howard Cedar and Prof. Aharon Razin of the Hebrew University, and to their American colleague Dr. Gary Felsenfeld.

Israeli Startup Proov Raises $7M
September 11, 2016 | Israeli startup Proov has raised $7 million from Mangrove Capital Partners and Israeli crowd-funding firm OurCrowd. Proov, a Pilot-as-a-Service (PaaS) platform, simplifies the way new technology is evaluated and adopted. Founded in 2015 by Alexey Sapozhnikov and Toby Olshanetsky, the company is raising its first round of funding.

WiFi Chipmaker Celeno Raises $38M
September 8, 2016 | Israeli startup Celeno Communications has completed a $38 million financing round. Founded in 2005 by president and CEO Gilad Rozen, the company supplies chips and software technology for managed WiFi apps. Red Dot Capital Partners, which focuses on investments in growth companies, led the round. New investors participated, including Poalim Capital Markets and the OurCrowd investment fund, alongside previous investors Liberty Global, Cisco, Pitango Venture Capital, 83North (formerly Greylock Israel), Vintage Investment Partners, and Miven. According to figures from the IVC database, Celeno has raised $107 million since it was founded, including the current round. Celeno has 120 employees, including 85 at its Ra'anana headquarters, and also has representatives in the US, Europe, and Asia. Among other things, Celeno plans to use the funding to recruit 20 new employees in a variety of areas.

Israeli Startup WSC Sports Raises $12M
September 8, 2016 | Israeli startup WSC Sports Technologies has closed a $12 million Series B financing round led by Intel Capital, with participation from existing and new investors including the Wilf family (owners of the Minnesota Vikings), the ownership of the LA Dodgers, and Dan Gilbert (majority owner of the 2016 NBA Champion Cleveland Cavaliers). The company also announced that Intel Capital director Uri Arazy will join the WSC Sports board. The investment brings WSC Sports' total funding to $16 million and will help the company significantly accelerate growth and international expansion.
Based in Ramat Gan, the company has developed an automatic, real-time, customized video creation platform for sports broadcasts, designed to amplify sports content and value. WSC Sports was founded by CEO Daniel Shichman, VP business development Aviv Arnon, CTO Shmuel Yoffe, and COO Hy Gal. Its customers include the NBA, Turner Sports, the Big East Conference, British Telecom, the ELeague and others.

Israeli Airfare Monitoring App FairFly To Serve Businesses Only
September 6, 2016 | Israeli startup FairFly, which notifies travelers when the price of their airline ticket drops, has announced it will no longer serve individual clients and is shifting its focus to the enterprise sector. In a letter to clients, CEO Aviel Siman-Tov writes: "We've decided to put all of our focus on our enterprise product and discontinue our mobile app." Starting Sept. 15, the company will no longer provide services to individual clients through its mobile and web platforms. FairFly was founded in 2013 by Uri Levine (the co-founder of popular navigation app Waze), along with Siman-Tov, Gili Lichtman, and Ami Goldenberg.

QS Rankings: Hebrew U. Still #1 Israeli University
September 6, 2016 | The Hebrew University of Jerusalem retained its leading position in a list of the best universities published today by British higher education data provider Quacquarelli Symonds (QS). The 2016-2017 rankings place Hebrew University first in Israel and 148th in the world. The other Israeli institutions on the list include Tel Aviv University, at No. 212; The Technion-Israel Institute of Technology, at No. 213; and Ben-Gurion University, at No. 320. Topping the list are MIT, Stanford, and Harvard. First compiled in 2004, the QS World University Rankings rate the world's best-performing higher education institutions, considering over 4,000 for inclusion and evaluating over 900. The ranking considers universities' performance across six indicators, selected to reflect research impact, commitment to high-quality teaching, internationalization, and global reputation amongst both academics and employers.

Cyber Security Startup Cronus Raises $3.5M
September 5, 2016 | Israeli cyber security startup Cronus Cyber Technology announced that it has raised $3.5 million. According to figures from the IVC database, the company has raised $6.2 million since it was founded, including the current round. CEO Doron Sivan and CTO Matan Azugi founded Cronus in 2014. Cronus's product uses an algorithm that tries to imitate a human hacker's way of thinking. Former Israeli Air Force chief Eitan Ben Eliyahu is the company's chairman. Cronus's financing round included US fund Janvest Capital Partners, a European investor, and a strategic investor from Hong Kong. The company previously received funding from the Ministry of the Economy and Industry Chief Scientist. The company's headquarters are in Haifa, and it also does business in the UK, Germany and Hong Kong. The money raised is expected to enable Cronus to expand its business to the US.
Hargol FoodTech Wins Agro Innovation Lab Competition
September 5, 2016 | Hargol FoodTech, the Israeli company developing Steak TzarTzar — high-protein grasshoppers — emerged victorious among 30 finalists chosen to present their budding food technologies at the recent Agro Innovation Lab startup competition held in Vienna, Austria. The competition attracted 160 applications from 49 countries. Agro Innovation Lab, a new accelerator for startups and entrepreneurs, is a subsidiary of Austrian agriculture multinational corporation RWA. As the winner, Hargol FoodTech will receive an initial investment from RWA, which will become a shareholder and will join Israel's Trendlines Group, an Israeli innovation commercialization company, as an investor. (Earlier this year, Bayer and Trendlines announced their $10 million, five-year Bayer Trendlines Ag Innovation Fund.)

WeWork To Open Two New Tel Aviv Workspaces
September 4, 2016 | Co-working company WeWork has announced that it will double its presence in Tel Aviv with the opening of two new locations – Ibn Gabirol and Hazerem – and the expansion of its Herzliya Pituach location. WeWork Ibn Gabirol and WeWork Hazerem are expected to open by the end of 2016, marking the US-based firm's 5th and 6th locations in Israel since it entered the market two years ago. Herzliya Pituach, the 2nd location to open in Israel, is also expanding due to high demand. The new Israeli locations come at a time of global expansion for WeWork. In 2016, the company has opened its first locations in Montréal, Mexico City, Berlin, Shanghai and Seoul, with plans to come to Paris in 2017. Founded in 2010 by Adam Neumann and Miguel McKelvey, WeWork runs 77 sites in 23 cities around the world, including Berlin, London, Boston, New York and San Francisco, providing professionally designed workspace as well as other services (insurance, consultancy, etc.) that startups need in order to scale up. The company has achieved a valuation of $5-10 billion, after raising over $1 billion from institutional investors, including Goldman Sachs, Fidelity, J.P. Morgan and the Harvard Management Company.
Israeli Student's Film Nominated For Oscar
September 4, 2016 | Maya Sarfaty, a Tel Aviv University graduate, has won a top Student Academy Award, and her documentary will be automatically entered in the Academy Awards competition. Sarfaty will also receive a gold medal for her film, "The Most Beautiful Woman," the Academy of Motion Picture Arts and Sciences announced. Beneath the appealing title lies the true story of a love affair between an SS guard and a young Jewish woman at the Auschwitz death camp. Among the list of nominees, Sarfaty's film was the only one to make the cut in the foreign documentary category and is therefore guaranteed a gold medal when the final winners are announced on Sept. 22. A Netanya native, Sarfaty earned undergraduate and master's degrees at the Steve Tisch School of Film and Television at Tel Aviv University and also graduated from the Nissan Nativ Acting School, which conducts classes in Jerusalem and Tel Aviv. She now lists herself as an independent filmmaker who has earned credits as director, writer and/or editor for such films as "Still Waters" and "Overtime."
Israeli Startup Beyond Verbal Raises $3M
September 1, 2016 | Israeli startup Beyond Verbal, which offers emotion analytics based on vocal intonations, has raised $3 million in a financing round led by Chinese technology giant Kuang-Chi. Beyond Verbal focuses on raw vocal modulations – probably the most expressive output our body produces. This non-intrusive technology enables wearables and digital health applications to better understand consumer interactions. Founded in 2012 by Yoav Hoshen, Yuval Mor, and Yuval Rabin (son of the late Israeli prime minister Yitzhak Rabin), Beyond Verbal has so far raised $10.1 million.

Alibaba, Baidu, Others To Attend GoforIsrael China Conference
August 30, 2016 | The 16th annual GoforIsrael conference will be held outside of Israel for the first time, opening on September 20th at the Grand Hyatt in Shanghai, China, and continuing in Wuhan. The Shanghai portion of the conference is expected to welcome some 1,000 Chinese investors and 100 innovative Israeli companies, including Mobileye, XJet, Tufin, Zerto, WeWork, Emefcy, ReWalk Robotics and many more. Major Chinese investor groups such as Alibaba, Baidu, Tencent and Lenovo, and leading financial entities such as Fosun, PingAn, China Everbright, SAIF Partners, Sinopec, CYND, Sailing Capital, GTJA and GF Xinde, are expected to join the event. The Wuhan portion of the event will feature the participating Israeli companies and 800 additional investors. Organized by Cukierman & Co. Investment House and Catalyst Fund, GoforIsrael is one of the most influential business conferences in Israel. This year's conference focuses mainly on the opportunities between Israel and China as well as on the Chinese appetite for investment. The event is jointly organized with Yafo Capital, an investment house in Shanghai, China.
Israel's Freightos Buys Spanish WebCargoNet
August 31, 2016 | Israeli company Freightos, an international online freight network, has acquired WebCargoNet, a Barcelona-based air-cargo rate management provider. WebCargoNet helps bring air cargo companies online with automated freight routing and pricing, and will complement Freightos' efforts to do the same for air, ocean and trucking, the company said in a statement. The result, said Eytan Buchman, the Israeli company's marketing director, will be the "world's largest freight Big Data database of over 200 million data points," helping more companies take part in the online freight marketplace offered by the Jerusalem-based company.

New Israeli-Chinese Incubator Launches In Tel Aviv
August 30, 2016 | The EastMakers Program, a new incubator focusing on the Internet of Things and hardware, launched Monday night in Tel Aviv at an event at the Google Campus. The program is a collaboration between StartupEast, a three-year-old incubator that helps Israeli companies expand into Asian markets, and MakeMountain, a Shenzhen-based incubator that helps companies in the field of manufacturing. It is the first such program focused on IoT and hardware. The joint venture will choose three startups for its first round, which will receive pre-seed investment and free services equivalent to $100,000, a one-month preparation program, and an incubation period in Shenzhen to help them get their China operations up and running.

Israelis Win MTV Award For Coldplay's 'Up&Up' Video
August 30, 2016 | Vania Heymann and Gal Muggia, the Israeli directors of a popular music video for British rock band Coldplay's song "Up&Up," won an MTV Video Music Award for best visual effects on Sunday night in New York. The song from Coldplay's "A Head Full Of Dreams" album has amassed over 75 million views on YouTube since being posted on May 16. The "Up&Up" video, with its turtle swimming through a New York City subway station and a baby crawling across the wing of a small airplane, was described by Metro UK as "brilliantly weird." Chris Martin, frontman for Coldplay, told the radio station Beats 1, "I think it's one of the best videos people have made." Heymann and Muggia were also nominated for best director, but lost out on that award to the artist of the night, Beyonce, who took home seven awards, including video of the year.

Hebrew U. Launches Israel's 1st Winemaking Degree Program
August 29, 2016 | The Robert H. Smith Faculty of Agriculture, Food and Environment at the Hebrew University of Jerusalem has opened Israel's first academic degree program in wine: the International MSc in Viticulture and Enology. The four-semester MSc program begins on March 2, 2017. Students will gain knowledge and skills at an academic level consistent with leading programs in other wine-producing countries such as France, the United States and Australia, with special emphasis on the Israeli industry. Upon completion, students will earn a world-recognized MSc degree from the Hebrew University of Jerusalem. This is the first MSc-level degree in viticulture and enology to be approved by the National Council for Higher Education (CHE) in Israel.

Israeli Startup SkyGiraffe Raises $6M
August 25, 2016 | Israeli enterprise mobility developer SkyGiraffe announced that it has closed a $6 million funding round.
The round was led by SGVC with participation from Trilogy Equity Partners, Heroku founder and CEO James Lindenbaum, Parse founder and CEO Ilya Sukhar, and Lookout founder and CTO Kevin Mahaffey. The company, which had previously raised $4.5 million from 500 Startups, Microsoft Ventures, Trilogy Equity Partners and angel investors, intends to use the funds to continue expanding its platform, among other growth initiatives. Founded in 2012 by CEO Boaz Hecht and VP of R&D Itay Braun, SkyGiraffe is headquartered in Menlo Park, California, with a research and development center in Ramat Gan, Israel.
Israel's Spacecom Bought By Chinese Co. For $285M
July 10, 2016 | Israel's Spacecom Satellite Communications Ltd., a global fixed satellite operator of the advanced AMOS satellite fleet, is to be fully acquired by Luxembourg Space Telecommunications, which is owned by Chinese communications company Beijing Xinwei Technology, for $285 million. Spacecom, which was founded in 1995 and is based in Ramat Gan, is owned by Eurocom Group, which is controlled by Shaul Elovitch, the controlling shareholder of Bezeq Israeli Telecommunication Co. Ltd.
Mobileye, Delphi To Develop Self-driving System By 2019
August 24, 2016 | Israeli company Mobileye, which develops vision-based driver assistance systems that help prevent collisions, said Tuesday that it is developing a self-driving system with Delphi Automotive that will be ready for production by 2019. Mobileye cut its ties with Tesla Motors last month after the electric carmaker's Autopilot system faced scrutiny from regulators following a fatal accident in early May. Tesla's Autopilot system uses EyeQ chips from Mobileye to help with image analysis for steering and staying in lanes. Mobileye and Delphi said their technology would cater to automakers that may not necessarily want to develop their own self-driving systems. Mobileye, which was founded by Prof. Amnon Shashua and Ziv Aviram in 1999, recently teamed up with BMW and Intel to manufacture a driverless car by 2021.
Mobile Ad Co. MassiveImpact Bought By Taiwan's GMobi
August 9, 2016 | Taiwan's General Mobile Corporation (GMobi) has acquired Israeli mobile ad company MassiveImpact. No financial details were disclosed, but market sources believe the acquisition was worth several tens of millions of dollars. Founded in 2005, MassiveImpact is a leading performance advertising platform specializing in social media promotion of mobile applications; it has raised $9.3 million and has 80 employees. MassiveImpact reaches over 1 billion users in 190 countries, with blue-chip clientele including leading tech, retail and gaming brands Gameloft, Citibank, AIA, Tencent, and Baidu, among others. The acquisition of MassiveImpact will complement existing mobile technology services provided by GMobi such as Firmware-Over-The-Air (FOTA) updates, secure mobile payments, and mobile advertising. For GMobi customers, the MassiveImpact acquisition means greater value via optimized targeting, content delivery, and end-user experiences through the company's Real Time Performance (RTP) platform.

Israeli Medalist Auctioning Olympics Name Tag For Hospital
August 24, 2016 | Olympic bronze medalist judoka Yarden Gerbi has put her name tag up for auction on eBay, saying she will use the money to support the pediatric cancer unit at a Tel Aviv hospital. As of now, the top bid stands at $13,100, or NIS 49,421.09, though with five days left in the auction it will likely rise. Gerbi, whose bronze during the Rio 2016 Games made her just the second Israeli woman to win a medal at the Olympics, says donating the proceeds from the ID tag to the children's cancer ward at Sourasky Hospital will make it "more special for society." She says she will dedicate the tag to the winning bidder and sign it for them.

Mayo Clinic, IDCBeyond Collaborate For Entrepreneurship Training
August 23, 2016 | Experts from the Mayo Clinic, an American medical practice and research center that is the largest integrated nonprofit medical practice in the world, will lead the teaching of medical challenges within IDCBeyond, the new entrepreneurship program at IDC Herzliya. The cooperation will be led by Amir Lerman, M.D., professor of medicine at Mayo Clinic and director of its cardiovascular research center, who was recently appointed head of Mayo Clinic's new initiative for discovery, investment and cooperation with Israeli companies and technologies. According to Nava Swersky Sofer, managing director of IDCBeyond at IDC Herzliya, "The new program is designed to train and qualify participants in developing and establishing innovative ventures by implementing technologies to address global challenges which characterize the 21st century. The cooperation with Mayo Clinic will expose students to the forefront of technology and innovation, and pinpoint the issues facing health systems worldwide. This cooperation opens a number of possibilities for entrepreneurship students, from exploring new horizons to establishing innovative ventures."
Israel's Wix To Sponsor New York Yankees
August 22, 2016 | Israeli DIY-website platform company Wix.com announced a sponsorship deal with Major League Baseball's New York Yankees. As part of the cooperation package, Wix will be the team's official website design sponsor and partner for 2016. The partnership will include a campaign to find the New York Yankees' number one fan; the winner will build a site together with Wix at the TheBiggestYankeeFan.com domain. Wix, founded in 2006 by CEO Avishai Abrahami, CTO Giora Kaplan and VP client development Nadav Abrahami, has developed a platform enabling users to easily set up and design their own websites. This is not Wix's first involvement in sport: six months ago, Wix announced a sponsorship deal with top English soccer team Manchester City, and before that the company twice advertised during the Super Bowl, the biggest sporting event in the US.

Smartphone Repair Startup CellSavers Raises $15M
August 22, 2016 | Israeli startup CellSavers, which provides an on-demand repair platform for smartphones and other mobile devices, has completed a $15 million financing round led by Carmel Ventures, with participation from Sequoia Capital Israel. The current funding follows the company's $3 million seed round led by Sequoia Capital in December 2015. CellSavers' platform is based on an end-to-end technological and operational solution that enables the company to match consumers and skilled professionals in real time. CellSavers was founded in 2015 by CEO Eyal Ronen and president Itai Hirsch.

Cyber Security Co. Intsights Raises $1.5M
August 22, 2016 | Indian software company Wipro has invested $1.5 million in Israeli cyber security startup Intsights. In a filing to the Bombay Stock Exchange, Wipro said it was acquiring a minority stake in the startup. Founded in 2015 by CEO Guy Nizan, VP intelligence Alon Arvatz and CTO Gal Ben David, the Herzliya-based company infiltrates the cyber-threat underworld to detect and analyze planned or potential attacks and threats. The company also provides advance warning and customized insight concerning potential cyber attacks, including recommended steps to avoid or withstand them, and delivers in-depth analysis of cyber threats originating from in-house sources, third-party sources or threat actors. Intsights raised $1.8 million in October 2015 from Glilot Capital Partners. Wipro made the investment through Israeli venture capital firm TLV Partners.

Premature Baby Nutrition Co. Nutrinia Raises $30M
August 21, 2016 | Israeli biotech startup Nutrinia has closed a $30 million Series D financing round to fund two pivotal trials for registration. TPG Biotech led the investment, joined by H.I.G. BioHealth Partners and WuXi Healthcare Ventures, as well as existing investors including OrbiMed, Pontifax and others. The Ramat Gan-based company is developing a proprietary oral formulation of insulin for intestinal malabsorption in preterm newborns and Short Bowel Syndrome (SBS) in infants. Nutrinia will use the proceeds to initiate two pivotal registration trials in separate indications related to acceleration of gut maturation and adaptation: intestinal malabsorption in preterm newborns born between 26 and 32 weeks' gestational age, and infants with SBS who are under 12 months old.
Nutrinia was founded by Professor Naim Shehadeh, head of the Pediatric Diabetes Clinic at the Rambam Medical Center in Haifa, at the NGT incubator in Nazareth. The incubator was supported by the Jacobs family, who invested in the company's earlier round along with Maabarot.
Technion's Formula Student Team Finishes In Top 10 In Europe
August 21, 2016 | The Technion's fourth Formula Student team finished in the top ten twice in the Formula Student competitions held this summer in Europe. In the competition in Hungary, the team finished in 9th place (out of 34 universities), and in the Czech Republic in 8th place (out of 42). Ranking was based on a number of criteria, including business plan presentation, circling the track, a 22 km endurance heat, driving in figure eights, fuel efficiency and acceleration over a distance of 75 meters. The Technion's Formula car, which competed in the combustion engine category, achieved especially good results in the acceleration heat, the 22 km heat and driving in figure eights – heats that were held on a wet course in the Czech competition. According to team leader Evgeny Guy, a student, "In both competitions the car performed as planned with no unusual problems. All in all, this was our most successful season to date, and we received warm compliments not only from the judges but also from other teams."
Technology to monitor bird sounds, impacts of environmental change
05/31/2012 CORVALLIS, Ore. – Researchers at Oregon State University have created a new computer technology to listen to multiple bird sounds at one time, to identify which species are present and how they may be changing as a result of habitat loss or climate change.
The system, one of the first of its type, should provide an automated approach to ecological monitoring of bird species that is much more practical than a human sitting in the field, hours on end.
“It’s difficult to hear and identify even one or two bird species at a time, and when you have many of them singing at once it’s even more difficult,” said Forrest Briggs, a doctoral student in computer science at OSU.
“Birds are important in themselves, but also an early warning system of larger changes taking place in the environment,” Briggs said. “Now we can tell down to the second when a bird arrives, leaves, when and where it’s choosing to nest, that type of information. It’s just not practical to do that with human monitoring.”
The “multi-instance multi-label” machine learning system developed at OSU, researchers said, could ultimately be used to identify not just bird sounds but many other forest noises – everything from wind to raindrops or a falling tree. It could also be used with other animal species, including grasshoppers, crickets, frogs, and marine mammals. The research was supported by the National Science Foundation and the OSU College of Engineering.
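The release includes no code, but a minimal sketch may help make the general “multi-instance multi-label” idea concrete: each recording is treated as a bag of short-segment feature vectors, and a separate binary classifier per species is trained on pooled bag-level features. This is a common MIML baseline under stated assumptions, not the OSU team's actual method, and all names below are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each recording is a "bag": an (n_segments, n_features) array of acoustic
# features (e.g., spectrogram statistics), labeled with the set of species heard.
def pool_bag(bag):
    # Mean/max pooling collapses a variable-length bag into one fixed vector.
    return np.concatenate([bag.mean(axis=0), bag.max(axis=0)])

def train_miml(bags, label_sets, n_species):
    X = np.stack([pool_bag(b) for b in bags])
    models = []
    # Binary relevance: one classifier per species label.
    for s in range(n_species):
        y = np.array([s in labels for labels in label_sets])
        models.append(RandomForestClassifier(n_estimators=100).fit(X, y))
    return models

def predict_species(models, bag):
    x = pool_bag(bag).reshape(1, -1)
    # Return the set of species whose classifier fires on this recording.
    return {s for s, clf in enumerate(models) if clf.predict(x)[0]}
```

Real MIML systems use more sophisticated bag-level learning than this pooling shortcut, but the input/output structure — many unlabeled instances per recording, many species labels per recording — is the same.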
“It would not be reasonable for a person to count birds once per minute, 24 hours a day, for three months, but we aim to obtain similar results with acoustic surveys,” the researchers wrote in a recent study published in the Journal of the Acoustical Society of America.
The error rate of this technology is already similar to that achieved by human experts, Briggs said. In one day of testing, for instance, it produced 548 10-second recordings of sounds from 13 different bird species. The system is also omni-directional, meaning the microphones do not have to be pointed directly at the sound in question to capture it accurately, a limitation of some previous technology.
Researchers are still working out some issues, including interference caused by rain, not to mention people heard partying in the woods, and what appeared to be the bite mark of a bear on the microphone.
Media contact: David Stauth
Source: Forrest Briggs
[Photo available for download: “Bird monitoring”]
Online video in Australia: exploring audiovisual fiction sites
Download PDF (72 pages) (Adobe Acrobat PDF, 1 MB)
Curtis, Rosemary; Given, Jock; McCutcheon, Marion
Online video has grown rapidly in recent years. One in five Australians (14 years and over) reported viewing online video via a PC 'in the last four weeks' in 2010, nearly double the figure in 2008, according to Roy Morgan. Other researchers estimate much higher proportions, but of different groups. Across the whole population, Australians have been adding new screen activities to old ones.

Australians get audiovisual stories online in many ways and from many places. Different types of online video service based in Australia and overseas use different delivery systems and retail business models to deliver many kinds of content. Measuring use of these services requires different tools that are not easy to reconcile and are being constantly developed by research companies. This makes analysis of the online video sector, at this stage of its evolution, more complex than other distribution sectors like cinema, television and DVD, where there is more consensus about what and how to measure, and what the metrics mean.

This project concentrates on websites. Our primary measurement tool was Nielsen NetView, which provides data on the frequency and duration of access to particular URLs. The Interactive Advertising Bureau of Australia (IAB) appointed the Nielsen Company as the sole and exclusive preferred supplier of online audience measurement services in Australia in May 2011. In October 2011, Nielsen began releasing data using a new hybrid methodology, known as Nielsen Online Ratings. We also received some data about traffic to the ABC's websites and catch-up TV service collected by WebTrends, some data and analysis on the frequency and duration of access to particular URLs from Experian Hitwise, and gathered information from secondary sources about other places where Australians go for audiovisual stories.

This project identified 25 sites for detailed analysis. Three main measures were used to analyse the selected sites: (1) the size of the 'unique Australian audience' visiting each site each month; (2) the total amount of time ('total minutes') spent on each site by these visitors; and (3) the average time spent on the site per visitor. This data was obtained for the months June 2010 to June 2011. Demographic data was also supplied for the month of June 2011.
Swinburne University of Technology. Faculty of Life and Social Sciences. The Swinburne Institute for Social Research
Spreading Fictions: Distributing Stories in the Online Age
ABC; Audiovisual distribution; Audiovisual fiction sites; Australia; Australian Broadcasting Corporation; Online video; Screen Australia
Swinburne Institute for Social Research. Swinburne University of Technology
Publisher URL: http://www.swinburne.edu.au/research/institute-social-research/
Copyright © Swinburne University of Technology, April 2012.
http://purl.org/au-research/grants/ARC/LP100200656 | 科技 |
Journal American Rhododendron Society
Current Editor: Dr. Glen Jamieson, ars.editor@gmail.com
The International Rhododendron Union
Ralph Sangster
The inaugural meeting of the International Rhododendron Union took place in Seattle, Washington, on May 1, 1985. The meeting was well attended and included representatives of the national rhododendron societies of America, Australia, Canada, Germany, Japan, New Zealand, Sweden and the United Kingdom. The Azalea Society of America, the Pacific Rhododendron Society and the Rhododendron Species Foundation were represented as were the botanic gardens of Edinburgh, Kew and Kumming.
The philosophy of the IRU developed at the International Rhododendron Conference held in New York in 1978, with the expressed need for better communication between research workers and laymen interested in the genus Rhododendron. The matter was further discussed at the International Rhododendron Conference in Edinburgh, 1982, when a steering committee was designated and assigned the task of getting an international organization "off the ground".
Briefly, the IRU is to provide a communication center to receive and distribute information relating to the Genus Rhododendron gathered from national societies, research institutions, botanic gardens and so on.
The objectives of the IRU are as follows:
1. To provide an international forum for communication about the genus Rhododendron.
2. To collate, index and distribute literature relating to Rhododendron.
3. To encourage:
(a) exploration and the introduction of new plant material from the wild
(b) research relating to Rhododendron
(c) preservation of the Rhododendron habitat.
The membership is open to any organization or individual interested in the genus Rhododendron.
At the inaugural meeting in Seattle, the proposed IRU constitution as prepared by the steering committee was discussed at length. Those present agreed to accept the stated objectives and membership clauses. The main thrust of discussion concerned funding of the organization and the amount of the proposed membership dues. It was agreed that funding is to be from donations until membership fees are determined.
Whenever possible, national societies' facilities will be utilized. The boards of the national societies of America, Australia, Sweden and New Zealand have passed resolutions contributing funds towards the administration of the IRU. Other societies have indicated their willingness to assist.
At the Seattle meeting, Ralph Sangster was elected honorary Chief Executive Officer and Chris Brickell honorary Secretary-Treasurer. They have the power to co-opt other honorary officers as needed. The executive officers met in London in June, 1985, to discuss setting up the administration of the IRU and putting in motion the ground work necessary for achieving the IRU objectives.
As is the way with setting up an international organization, the start is always slow. However, those who are interested in the genus Rhododendron can now look forward to improved communications on rhododendron matters.
Happy Perihelion!
Today, January 4, at 1200 UTC (7 am EST), the Earth was at perihelion. This is when we were closest to the Sun in our orbit around the Sun. At perihelion, the center of the Earth was a little over 147 million km (about 91.4 million miles) from the center of the Sun. Why do we care?
The Earth orbits the Sun along an ellipse. That means sometimes we’re farther from the Sun and sometimes closer. The average distance of the Earth from the Sun is about 150 million km (about 93 million miles, one Astronomical Unit or AU). Perihelion is the starting time for that orbit. The distance of perihelion is also important. The perihelion distance of the Earth is about 1.67% closer than the average distance.
When the Earth is closer to the Sun it receives more sunlight, and when it is further away it receives less. That means the timing and distance of perihelion can affect our climate. That doesn't mean that being closer to the Sun automatically means warmer weather. Our seasons are caused by the tilt of the Earth's rotation axis compared to our orbit around the Sun. In January the northern hemisphere of the Earth is tilted away from the Sun and the northern latitudes have winter. At the same time those living in southern latitudes experience summer. Because the tilt determines the seasons, we mark our seasons by the solstices and equinoxes.
But the changing distance does affect our climate. It appears to make southern winters a little less frigid and northern summers a little milder. In the past the difference was more dramatic. The timing of perihelion in the year changes slowly. Right now perihelion happens in northern winter, while 11,000 years ago it happened in northern summer. That changed the severity of the seasons. If we wait even longer (several hundred thousand years) the shape of the Earth’s orbit also changes, with the eccentricity changing from nearly 0 (almost circular) to the current value of 0.0167 to 0.06. A circular orbit means the distance to the Sun does not change, while a more elliptical orbit means the amount of sunlight hitting the Earth changes even more during the year. Right now the amount of sunlight hitting the Earth at perihelion is about 6.5% greater than what hits the Earth at the furthest distance. At an eccentricity of 0.06 that would increase to 27%.
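As a quick back-of-the-envelope check (my own arithmetic, not from the original post), those percentages follow from the inverse-square law applied at the perihelion distance a(1-e) and the aphelion distance a(1+e):

\[
\frac{F_{\text{perihelion}}}{F_{\text{aphelion}}}
= \left(\frac{a(1+e)}{a(1-e)}\right)^{2}
= \left(\frac{1+e}{1-e}\right)^{2}
\]

Plugging in the current eccentricity e = 0.0167 gives a ratio of about 1.07, roughly the ~6.5% figure quoted above, and e = 0.06 gives about 1.27, i.e., the 27% figure.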
These are just two of the Milankovitch cycles that were developed to explain glaciations.
SDO instruments were built to allow for the changing distance and apparent size of the Sun during a year. But it’s nice to know that those changes do other things as well.
Have a Happy Perihelion Day!
Local Visionary Awards
Street FightHyperlocal Marketing, Tech, Retail, Commerce, Content Ad Tech
Brand Battle, sponsored by Brandify
Local Digital Strategy Series, sponsored by Mediative
Join Yelp, Google, Lyft & more in NYC at the hottest event in local — Street Fight Summit, October 25th
Nextdoor CEO on Patch, Facebook, and Building Social for the Real World August 8, 2012 by Steven Jacobs 2 Comments Filed Under: Interviews
July was a big month for Nextdoor’s CEO, Nirav Tolia. The dot-com veteran became a father, spoke at Allen & Co.’s annual bigwig brainstorm in Sun Valley, and announced a major round of funding, which valued his 22-month old startup at $100 million. It’s a sort of arrival for Tolia, who has been working his way back since being embroiled in a resumé controversy that bubbled up during the 2004 merger of Epinions and DealTime. Tolia bounced back in 2007, securing an entrepreneur-in-residence position with Nextdoor investor Benchmark Capital.
Tolia was an early pioneer of user-generated content, and his launch of Nextdoor in 2010 sought to apply some of these same dynamics to real-world communities. The company, which spent the better part of a year in stealth mode, builds hyperlocal social networks that allow residents, and only residents, to interact with others within their neighborhood. The closed nature of each network makes scaling the project a daunting task, but the San Francisco-based company has managed to set up 4,050 communities in less than two years.
Street Fight caught up with Tolia recently to discuss raising money in local; developing technology for real-world communities; and the increasingly blurred line between information and social connectivity in the local space.
Nextdoor landed term sheets from six investors in this latest round. What’s the sentiment among the investment community toward local plays?
I found that there was quite a negative stigma against local, primarily because there have been a lot of venture investments in local companies that have gone nowhere. We’ve of course seen a handful become very successful – namely, Yelp, Open Table, Groupon (depending on how successful you think it is at any particular time). But there are many, many more that are in the graveyard rather than the winner’s circle.
When we tell our story [to investors], we don’t simply say were a local company – we say we’re a tech company that sits at the intersection of mobile, social and local. That’s important, because it really puts us in a different light in the way investors think about what we do. It plays to something we think is true: we are taking the best practices of social networking, or social software, and applying those to the local space. We would never lead with, “Hey, we’re a local company.” We’re a builder of communities. It just so happens that the online communities we are building and enabling are on a neighborhood level.
As someone who has spent his career developing digital communities, what’s unique about building for the real world?
Building a local play requires much more hand-to-hand combat. We talk a lot about getting liquidity, which just means having some scale of audience. When we launch an online community, each one of those is its own network — its own littler silo that needs to get liquidity. That really underscores how hard it is to make this work. We’re not starting a site, say, about Brittney Spears, where a user is a user is a user.
Virality is also a huge challenge. If you’re seeing someone in person everyday, you don’t necessarily need his or her email address — so you can’t easily invite him or her to an online service. We’ve invested in things like flyers as well as direct mail technology, in which you can click on houses in your neighborhood and invite them to the service with an actual, physical postcard.
But the benefit is that it’s the real world. The connections we have with people in the real world are immensely more powerful than those which we have online. It’s a lot harder to get it going, but when you do get established it’s way more valuable, because the switching costs are much higher.
Facebook has kept its local strategy close to the vest since shutting down its daily deals and check-in product last summer. Where does Facebook fit into Nextdoor’s competitive landscape today — as well as in the future?
We’re always hyperaware of Facebook’s scale. You can’t count them out in any direction — whether that’s local commerce or community, or whether they build their own venue platform or a search engine. They do everything. When you have almost a billion users, you have a strategic point of leverage. It’s like in the old days when Google had so much traffic that the fear was that instead of sending the traffic, they would build their own property, and send traffic to themselves. And I think that’s sort of the concern that we always have and always will have [with Facebook].
But it’s our key belief that local is not a tack-on thing. We believe it’s different. I personally had some concerns that Facebook would consume LinkedIn and even Twitter. But what I realized is that the way I interact with people on LinkedIn varies from the way I interact with people on Facebook. The use-cases are just different. On Nextdoor, you connect with a bunch of people that you may have never met. The bond is geography — not friendship, employment or interest. It’s such a different modality because we’ve built everything around location.
Patch appears to be making a move toward a more horizontal, community platform model. How does what they’re planning compare to your vision for Nextdoor?
We’re not focused on the type of editorial, top down, one-to-many approach which a Patch is looking to build. We’re about many-to-many communication. We have no external information coming into these communities — our users create everything. Sometimes it’s harder because we have to wait for our users to create content rather than plugging in the crime reports or finding a way to automatically feed local information into these neighborhoods. We say “look, if there’s a crime report that a neighbor feels like posting, then great – but we’re not in the business of deciding that.”
How do you see social networks and media evolving in terms of the ways in which local information is both consumed and produced?
From a media perspective, [one-to-many communications] is how we used to get our local information — people would watch their local news, they would listen to the radio, or read the paper. Only a few of those companies have managed to stay economically viable, and the ones that are sustaining have become almost syndication services. The traditional one-to-many approach to consuming and distributing local information has really started to dry up.
I think that was the premise for most types of local news plays, whether it’s Topix or Patch, or even Everyblock — and I think that’s a legitimate one. But media is quite different. We are a social network, not an information provider. We’re about connecting people, not necessarily about broadcasting information. The thing that we’re trying to address is not the decline of local media, but the reality that we don’t know our neighbors anymore.
But social connectivity (through the distribution of information) was once the bread-and-butter of local media, right? Do you believe that the motivation people have to read hyperlocal news is the same motivation people have to join Nextdoor?
No, I think they’re different. I think local news is about consuming information that comes from a trusted source around you. I think that what were trying to tap into is a hungering for a connection with people in your neighborhood. We’re all about the connection with people. That’s what a social network is about.
Let me actually put it this way: I think that Patch and Twitter are in a much better position to broadcast information. Twitter is potentially hugely powerful. From a broad standpoint, I can be visiting Kansas City and notice there is a forest fire, and post that on Twitter, and that’s news. That’s a very powerful source for local information.
Tell us about the most interesting conversation you had at the Allen & Co. conference? I was approached by a handful of foreign CEOs and heads of state that wanted to put Nextdoor in their countries – not for local information, and I want to be very clear about that, but the notion of being able to reconnect with your community. We learned pretty quickly that it’s not just an American initiative. There are powerful ways to use these tools outside of the U.S.
Steven Jacobs is Street Fight’s deputy editor.
Filed Under: Interviews Tagged With: Benchmark Capital, DealTime, Epinions, NextDoor, Nirav Tolia, Shopping.com
chatterman
And what’s the business model? Users will be annoyed when you start advertising and SMBs don’t have the time to promote their businesses. I give it 2 years. The reason I don’t know my neighbors is because I don’t want to know them. There’s no problem to be solved or someone else would have done it already. No stickiness, no revenue. No chance.
Concern Reader
“We have no external information coming into these communities — our
users create everything. Sometimes it’s harder because we have to wait
for our users to create content rather than plugging in the crime
reports or finding a way to automatically feed local information into
these neighborhoods. We say ‘look, if there’s a crime report that a
neighbor feels like posting, then great – but we’re not in the business
of deciding that.'”
Boy, after reading that I really have no desire to check out this new site. I’m not going to rely on people who are not trained in reporting the news to find out what’s happening in my area.
New! Free White Paper
Produced by Street Fight Insights for Advice Local.
Report: Survey Results + Analysis
Find out where hyperlocal media and commerce are headed in this FREE report!
Survey, Analysis & Strategy
Street Fight Daily delivers top headlines from around the web every morning. Start the day smarter. Subscribe now!
Add your email address...
Follow Street Fight
Street Fight!
In this monthly series, enterprise competitors duke it out for dominance in location presence. Read more.
Sign Up for Street Fight Membership
Twenty-four months of free access to Street Fight Insights research and tickets to events in New York and San Francisco. Learn more and sign up now!
Copyright © 2016 Hyperlocal Industries | 科技 |
4 August 2009Earl LanePrintEmail
NewsRichard Pierre Claude, professor emeritus of government and politics at the University of Maryland, was honored at a recent meeting of the AAAS Science and Human Rights Coalition as a founding father of efforts to get scientists to take up important work on human rights around the globe.
Richard Pierre Claude
Claude’s award-winning 2002 book, “Science in the Service of Human Rights,” is considered a classic in the field. He is also a founding editor of the journalHuman Rights Quarterly, now in its 28th year. Colleagues celebrated Claude’s career and legacy in a 23 July opening plenary session of the Coalition meeting.
Mona Younis, director of the AAAS Science and Human Rights Program, said Claude “arguably has helped bring more science and scientists to human rights than any other person.” And, she added: “He embodies what we hope to accomplish with this coalition.”
The new coalition, the result of a two-year deliberative process involving about 20 scientific organizations, was launched 14 January 2009. It now has 26 member organizations, 15 affiliated organizations and 36 scientists affiliated as individual members. The coalition wants to spur communication and partnerships among scientific organizations and human rights groups, with the goal of expanding scientists’ contributions to human rights.
Mona Younis
Elena Nightingale, scholar-in-residence at the Institute of Medicine and a former chair of the AAAS Committee on Scientific Freedom and Responsibility, said that “many of his [Claude's] ideas from 30 years ago have come to pass” in the new Science and Human Rights Coalition.
While scientists once became involved with human rights largely by writing letters on behalf of colleagues being harassed by repressive regimes, Claude encouraged them to use their expertise to promote the cause of human rights more generally around the globe. Skills in collecting and analyzing data, for example, can prove helpful in detailing rights abuses—such as deportations, ethnic cleansing, and systematic detention—that formerly might be revealed only through anecdotes.
Eric Stover, director of the Human Rights Center at the University of California, Berkeley, said use of scientific techniques, such as forensic and genetic analysis in identifying victims of mass atrocities or use of epidemiology to track the social and medical consequences of land mines, has struck a chord with many scientists.
“Scientists, physicians and health professionals want to use their skills as opposed to simply writing letters to government authorities,” said Stover, who from 1980 to 1990 headed what was then called the AAAS Clearinghouse on Science and Human Rights. He applauded the new coalition’s multidisciplinary approach.
Allen Keller
Allen Keller, associate professor of medicine at New York University and director of the Bellevue/NYU Program for Survivors of Torture, said he learned from Claude the essential role science can play in documenting and disseminating details about human rights violations.
“It is Richard who has shined the light for all of us,” said Keller, who also learned from Claude that “if we don’t speak out” about human rights issues, “don’t presume that somebody else will.” Citing his own work with torture victims, Keller said that “as scientists, we need to very clearly and articulately debunk the myth that torture is effective.”
Richard Allen White, senior fellow at the Council on Hemispheric Affairs in Washington and author of “Breaking Silence: The Case That Changed the Face of Human Rights,” told how Claude had encouraged him to write about a landmark case involving the death of a 17-year-old Paraguayan, Joelito Filártiga, who had been taken from his home by police, brutally tortured and murdered. The police officer responsible for Filártiga’s death eventually was arrested in Brooklyn, and a federal appeals court held that the man could be sued in the U.S. for violating international law (even though the case involved acts committed by one non-American against another non-American).
White said Claude’s house had become a sort of “human rights central,” where he and others could stay for weeks or even many months when they came back to the U.S. from the field. “I was there, on and off, for 4 years,” White said.
Claude, responding to those who came to celebrate his work, mentioned three themes from his own development that he said may have relevance for the coalition: political activism, multidisciplinary approaches to issues, and the importance of human rights education (including for scientists, technicians, and engineers).
His first venture into activism was in 1960 when he was arrested during a desegregation sit-in at a Tallahassee, Florida, lunch counter. “That really changed my life,” Claude said. “In many ways, it set my professional compass.” Later, as a graduate student at the University of Virginia, he helped desegregate several local movie theaters.
In a session on ethical dilemmas in science practice, Leslie Harris of the Center for Democracy and Technology noted that the Internet has “allowed people to organize for changes that would not be conceivable a mere 15 years ago.” But while technology has the potential to change the balance of power in repressive countries, she said, it is a double-edged sword. Cell phones and social networking services such as Twitter can rapidly spread information in a society, as happened did during recent protests in Iran. But such technology also can be used by governments to track citizens and their activities, Harris said. It is important for researchers to think about privacy and embed safeguards into the technologies they are creating, she said.
In another session, participants received an update on an initiative by some coalition members to highlight Article 15 of the International Covenant on Economic, Social and Cultural Rights (ICESCR), which requires states to recognize the right of everyone to enjoy the benefits of scientific progress and its applications. Article 15 also requires governments to recognize the benefits of international contacts and co-operation in science. One of the sessions discussed international standards and guidelines regarding a scientist’s right to travel as well as how attendees at scientific meetings can navigate the U.S. visa process.
Meeting attendees also heard a report on the deliberations of the Coalition Council—the coalition’s policy-setting body—which met for the first time on 23 July. The council agreed that member societies that wish to do so will be welcome to develop and sign public statements, but that these will not be a focus of the coalition’s work and will not be issued in the name of the coalition. Rather, the coalition is committed to “getting work done” through an active membership, Younis said.
“It may be difficult to come to one conclusion and one voice on every issue that we entertain,” said Paula Skedsvold, executive director of the Federation of Associations in Behavioral & Brain Sciences and a member of the coalition’s steering committee. “We will not be asking organizations to do anything that they don’t want to do.”
Douglas Richardson, chair of the steering committee and executive director of the Association of American Geographers, said that the challenge now is to turn all the organizational work into activities that “in the end, produce results and make a difference.”
[Claude encouraged] colleagues to get involved if they find some basic human right is threatened. But he also cautioned them against proselytizing students. “Better to lead by example,” he said.
During the Coalition meeting, representatives of scientific organizations, scientists, lawyers, human rights advocates and others gathered in sessions and workshops to discuss technologies being used to promote human rights and some of the strategies for fostering more participation by scientists. | 科技 |
Pan-STARRS telescope spots new distant comet
A preliminary orbit shows that the comet will come within about 30 million miles (50 million kilometers) of the Sun in early 2013.
By University of Hawaii at Manoa's Institute for Astronomy, Honolulu | Published: Monday, June 20, 2011
Astronomers at the University of Hawaii at Manoa have discovered a new comet that they expect will be visible to the naked eye in early 2013.
Originally found by the Pan-STARRS 1 telescope on Haleakala, Maui, on the night of June 5/6, UH astronomer Richard Wainscoat and graduate student Marco Micheli confirmed it as a comet the following night using the Canada-France-Hawaii Telescope on Mauna Kea.
A preliminary orbit computed by the Minor Planet Center in Cambridge, Massachusetts, shows that the comet will come within about 30 million miles (50 million kilometers) of the Sun in early 2013, about the same distance as Mercury. The comet will pose no danger to Earth.
Wainscoat said, “The comet has an orbit that is close to parabolic, meaning that this may be the first time it will ever come close to the Sun, and that it may never return.”
The comet is now about 700 million miles (1.2 billion km) from the Sun, placing it beyond the orbit of Jupiter. It is currently too faint to be visible without a telescope with a sensitive electronic detector.
The comet is expected to be brightest in February or March 2013, when it makes its closest approach to the Sun. At that time, the comet is expected to be visible low in the western sky after sunset, but the bright twilight sky may make it difficult to view.
Over the next few months, astronomers will continue to study the comet, which will allow better predictions of how bright it will eventually get. Wainscoat and UH astronomer Henry Hsieh cautioned that predicting the brightness of comets is notoriously difficult, with numerous past comets failing to reach their expected brightness.
Making brightness predictions for new comets is difficult because astronomers do not know how much ice they contain. Because sublimation of ice (conversion from solid to gas) is the source of cometary activity and a major contributor to a comet's overall eventual brightness, more accurate brightness predictions will not be possible until the comet becomes more active as it approaches the Sun and astronomers get a better idea of how icy it is.
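To make that uncertainty concrete, the sketch below applies the standard empirical comet-brightness law, m = H + 5 log10(Δ) + 2.5 n log10(r), where r is the Sun-comet distance and Δ the Earth-comet distance in astronomical units. This is a generic textbook formula, not the UH team's actual model, and the absolute magnitude H and activity index n used here are purely illustrative assumptions; they are exactly the ice-dependent unknowns the paragraph above describes.

```python
import math

def comet_total_magnitude(H, n, r_au, delta_au):
    """Standard empirical brightness law for an active comet:
    m = H + 5*log10(delta) + 2.5*n*log10(r), with r the Sun-comet
    and delta the Earth-comet distance, both in AU."""
    return H + 5 * math.log10(delta_au) + 2.5 * n * math.log10(r_au)

# Hypothetical values: H and n are assumptions, chosen only to show
# how strongly the prediction swings with the activity index.
for n in (2, 4, 6):
    m = comet_total_magnitude(H=5.5, n=n, r_au=0.3, delta_au=1.1)
    print(f"n = {n}: predicted magnitude {m:.1f}")
```

Running this shows the prediction swinging by several magnitudes, a factor of more than a hundred in brightness, which is why forecasts for a comet of unknown iciness are so unreliable.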
The comet is named C/2011 L4 (PANSTARRS). Comets are usually named after their discoverers, but in this case, because a large team, including observers, computer scientists, and astronomers, was involved, the comet is named after the telescope.
C/2011 L4 (PANSTARRS) most likely originated in the Oort Cloud, a cloud of comet-like objects located in the distant outer solar system. It was probably gravitationally disturbed by a distant passing star, sending it on a long journey toward the Sun.
Comets like C/2011 L4 (PANSTARRS) offer astronomers a rare opportunity to look at pristine material left over from the early formation of the solar system.
The comet was found while searching the sky for potentially hazardous asteroids — ones that may someday hit Earth. Software engineer Larry Denneau, with help from Wainscoat and astronomers Robert Jedicke, Mikael Granvik, and Tommy Grav, designed software that searches each image taken by the Pan-STARRS 1 telescope for moving objects. Denneau, Hsieh, and UH astronomer Jan Kleyna also wrote other software that searches the moving objects for comets’ telltale fuzzy appearance.
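As a rough illustration of the idea (not the actual Pan-STARRS pipeline, which is far more sophisticated), a minimal moving-object check can be sketched in a few lines of Python: subtract two registered exposures of the same field and flag pixels whose residuals stand out from the noise. All values here are invented toy data.

```python
import numpy as np

def flag_moving_sources(image_t1, image_t2, threshold=5.0):
    """Subtract two registered exposures taken a short time apart.
    Static stars cancel; anything that moved between the exposures
    leaves a paired negative/positive residual."""
    diff = image_t2.astype(float) - image_t1.astype(float)
    noise = np.std(diff)
    return np.argwhere(np.abs(diff) > threshold * noise)

# Toy example: a 100x100 "sky" with one source that moves 3 pixels.
rng = np.random.default_rng(0)
sky1 = rng.normal(100, 2, (100, 100))
sky2 = sky1 + rng.normal(0, 0.5, (100, 100))
sky1[50, 50] += 500      # source position at the first epoch
sky2[50, 53] += 500      # same source, shifted, at the second epoch
print(flag_moving_sources(sky1, sky2))  # flags (50, 50) and (50, 53)
```

A second pass, analogous to the fuzziness check described above, would then examine each flagged source's radial light profile to separate point-like asteroids from extended, comet-like objects.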
The comet was identified by this automated software.
[Image caption: The Pan-STARRS 1 telescope on Haleakala, Maui, found a new distant comet, designated C/2011 L4, on the night of June 5/6. Credit: Henry Hsieh, PS1SC]
No. 11 - Fall 1988
© 1988, Astronomical Society of the Pacific, 390 Ashton Avenue, San Francisco, CA 94112
Horoscopes Versus Telescopes: A Focus on Astrology
A Note from the Editor:
The revelation in 1988 that First Lady Nancy Reagan consulted a San Francisco astrologer in arranging the president's schedule may have surprised or amused many teachers and parents who pay little attention to this old superstition. Unfortunately, belief in the power of astrology is much more widespread among our students than many people realize. A 1984 Gallup Poll indicated that 55% of American teenagers believe that astrology works. Astrology columns appear in over 1200 newspapers in the US; by contrast, fewer than 10 newspapers have columns on astronomy. And all around the world, people base personal, financial, and even medical decisions on the advice of astrologers. Furthermore, astrology is only one of a number of pseudo-scientific beliefs whose uncritical acceptance by the media and the public has contributed to a disturbing lack of skepticism among youngsters (and, apparently, presidents) in the U.S. Many teachers feel that it is beneath our dignity to address topics like this in our courses or periods on science. Unfortunately, by failing to encourage healthy doubt and critical thinking in our children, we may be raising a generation that is willing to believe just about any far-fetched claim printed in the newspapers or reported on television. We therefore devote this issue of The Universe in the Classroom to information about debunking astrology and using student interest in such "fiction sciences'' to help encourage critical thinking and illustrate the use of the scientific method. Andrew Fraknoi Some Thought-Provoking Questions About Astrology For those who follow newspapers or magazine columns on astrology, it's useful to begin by asking how likely it is that 1/12 of the world — over 400 million people for each sign of the zodiac — will have the same kind of day? This question sheds some light on why the predictions in astrology columns are always so vague that they can be applied to situations in almost everyone's life. Why is it the moment of birth, rather than the moment of conception, which is the critical one for calculating a horoscope? To figure this one out, it's helpful to know that when astrology was first set up thousands of years ago, the moment of birth was considered a magic time. But today, we understand that birth is the culmination of roughly nine months of complex, intricately orchestrated development inside the womb. Many aspects of a child's personality are set long before the time of birth. The reason the astrologers still adhere to the moment of birth has little to do with astrological "theory''. The simple fact is, almost everyone knows his or her moment of birth — but it is difficult (and perhaps embarrassing) to find out one's moment of conception. "Serious'' astrologers claim that the influence of all the major bodies in the solar system must be taken into account to arrive at an accurate horoscope. They also insist that the reason we should believe in astrology is because it has led us to accurate predictions or personality profiles for many centuries. But anyone who knows the history of astronomy can tell you that the most distant known planets — Uranus, Neptune and Pluto — were not discovered until 1781, 1846, and 1930, respectively. So why weren't all the horoscopes done before 1930 incorrect, since the astrologers before that time were missing at least one planet from their inventory of important influences? 
Moreover, why did the problems or inaccuracies in early horoscopes not lead astrologers to "sense'' the presence of these planets long before astronomers discovered them? All the long-range forces we know of in the universe get weaker with distance (gravity is an excellent example). Yet for astrology it makes no difference whether Mars is on the same side of the Sun as we are (and therefore relatively near us) or way on the other side — its astrological influence (force) is the same. If some influence from the planets and the stars really did not depend on how far away the source of the influence was, it would mean a complete revolution in our understanding of nature. Any such suggestion must therefore be approached with extreme skepticism. Furthermore, if the astrological influences do not depend on distance, why don't we have to consider the influences of other stars and even galaxies in doing a horoscope? What inadequate horoscopes we are getting if the influence of Sirius and the Andromeda Galaxy are omitted! (Of course, since there are billions of stars in our Galaxy and billions of other galaxies, no astrologer could ever hope to finish a horoscope that took all their influences into consideration.) Even after thousands of years of study and perfecting their art, different schools of astrology still vehemently disagree on how to cast a horoscope and — especially — on how to interpret it. You can have your horoscope cast and read by different astrologers on the very same day and get completely different predictions, interpretations, or suggestions. If astrology were a science — as astrologers claim — you would expect after all these years that similar experiments or calculations would begin to lead to similar results. What's the Mechanism? But even if we put such nagging thoughts about astrology aside for a moment, one overriding question still remains to be asked. Why would the positions of celestial objects at the moment of our birth have an effect on our characters, lives, or destinies? What force, what influence, what sort of energy would travel from the planets and stars to all human beings and affect our development or fate? One can see how the astrological world view might have been appealing thousands of years ago when astrology first arose. In those days, humanity was terrified of the often unpredictable forces of nature and searched desperately for regularities, signs, and portents from the heavens that would help them guide their lives. Those were days of magic and superstition, when the skies were thought to be the domain of gods or spirits, whose whims humans had to understand — or at least have some warning of — if they were to survive. But today, when our spacecraft have traveled to the planets and have explored them in some detail, our view of the universe is very different. We know that the planets are other worlds and the stars other Suns — physical bodies that are incredibly remote and mercifully unconcerned with the daily lives of creatures on our small planet. No amount of scientific-sounding jargon or computerized calculations by astrologers can disguise this central problem with astrology — we can find no evidence of a mechanism by which celestial objects can influence us in so specific and personal a way. Introducing Jetology Let's take an analogy. Imagine that someone proposes that the positions of all the jumbo jets in the world at the moment that a baby is born will have a significant effect on the child's personality or future life. 
Furthermore, for a fee, a "jet-ologer'' with a large computer might offer to do an elaborate chart showing the positions of the planes at the right time and to interpret the complex pattern of the plane positions to help you understand their influence on your life. No matter how "scientific'' or complex the chart of jet positions turned out to be, any reasonably skeptical person would probably ask the "jet-ologer'' some rather pointed questions about why the positions of all these planes should have any connection with someone's personality or with the events that shape human lives. (Students might enjoy inventing other such "sciences'' and making an elaborate set of rules for them.) In the real world, it is quite simple to calculate the planetary influences on a new-born baby. The only known force that is acting over interplanetary distances in any significant way is gravity. So we might compare the pull of a neighbor planet like Mars with other influences on the baby. It turns out that the gravitational pull of the obstetrician is significantly greater than that of Mars. (And the hospital building — unless the baby happens to be in the exact geometric center of it — has an even greater pull than the doctor!) For those classes that would like to work out such calculations for themselves, formulas and examples can be found in the book by Culver & Ianna cited in the Resource Corner. Testing Astrology Some astrologers argue that there may be a still unknown force that represents the astrological influence. Suppose we give them the benefit of the doubt and assume that there is something connecting us to the heavens, even if we do not know what it is. If so, astrological predictions — like those of any scientific field — should be easily tested. If astrology predicts that Virgo and Aries are incompatible signs — to take a simple example — then if we look at thousands of marriage and divorce records, we should see more Virgo-Aries couples getting divorced and fewer of them getting married than we would expect by chance. Astrologers always claim to be just a little too busy to carry out such careful tests of their efficacy, so in the last two decades scientists and statisticians have generously done such testing for them. There have been dozens of well-designed tests all around the world, and astrology has failed all of them. (See the Resource Corner for more on these tests and the Activities Corner for some experiments you can do with your students.) For example, psychologist Bernard Silverman of Michigan State University looked at 2,978 marriages and 478 divorces for 1967 and 1968 to see if "compatible'' astrological signs were more likely to get and stay together. He found that there was no correlation — compatible and incompatible signs got married and divorced equally often. In another test, staff members at the U.S. Geological Survey analyzed 240 earthquake predictions by 27 astrologers and found that they were less accurate than one could be by simply guessing! And so each of the tests has gone. In addition, astronomers Roger Culver and Philip Ianna (reference) tracked the specific published predictions of well-known astrologers and astrological organizations for a period of five years. Out of over 3000 specific predictions (including many about politics, film stars, and other famous people) in their sample, only about 10% came to pass. 
If reading the stars has led astrologers to incorrect predictions nine times out of ten, they hardly seem like reliable guides to the uncertainties of life or the affairs of our country. Perhaps we should let those beckoning lights in the sky awaken our students' interest in the real (and fascinating) universe beyond our planet, and not permit them to be tied to an ancient fantasy left over from a time when we huddled by the firelight, afraid of the night. Activity Corner One of the best ways to get students to think about the validity of astrology is to have them test astrological predictions for themselves. Here are a few practical activities to get you started; you and your students may be able to suggest other tests and projects yourselves. (Let us know if you think of some good ones.) For many of these tests, it is useful to gather a large sample of data for statistical purposes. In some schools, where one class does not have enough students or time to gather all the necessary data, other classes and family members have sometimes been drawn into the study. 1. Same Day, Different Horoscopes
If your town has a good newsstand and the class budget permits, have students buy as many newspapers and magazines with astrology columns as possible. Have students compare the predictions and statements of different astrologers for the same sign. How many disagree? How many contradict each other?
2. Mixed-up Horoscopes
Cut out the 12 horoscopes from a newspaper (preferably one that the students are not likely to have seen) and, after making a master copy for yourself, cut off the dates and zodiac designations from each paragraph. Mix them up, give each a number, and then distribute the unlabeled paragraphs to each student on the next day. Ask students to list their birthdays and then to select the one paragraph that best applied to them yesterday. After all the papers have been collected, mix them up and give them back so each student gets someone else's paper. Then put the dates the astrologers specified for each paragraph on the blackboard and have the students total how many hits and misses there were. How many hits would students predict by chance?
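For the chance baseline, a quick simulation (a minimal sketch; the class size and trial count are arbitrary choices) models the null hypothesis that each student is effectively guessing among the 12 paragraphs:

```python
import random

def simulate_class(num_students=30, num_trials=10_000):
    """Each 'student' picks one of 12 unlabeled horoscope paragraphs
    at random; a hit means picking the paragraph written for their
    own sign. Returns the average number of hits per class."""
    total = 0
    for _ in range(num_trials):
        total += sum(1 for _ in range(num_students)
                     if random.randrange(12) == 0)
    return total / num_trials

print(f"Expected hits by chance: {simulate_class():.1f} of 30 students")
```

Analytically, 30 students times a 1-in-12 chance gives 2.5 expected hits, so a class count of two or three hits is exactly what guessing predicts.
3. Professions and Astrology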
Even astrologers who disdain the newspaper horoscopes (because they deal only with the position of the Sun and not other celestial bodies) will often claim that the Sun sign is connected with a person's choice of profession. Many astrology books specify which signs are the most likely to select a given profession. For example, Leos might be more likely to go into politics and Virgos into science. Once the class looks through some astrology books and finds such "hypotheses'', they can then begin testing them. One test would be for the class to send a survey to people in the profession they selected, asking for their birth dates. (You should be sure the students explain why they want the information, discuss the approach, and enclose a stamped self-addressed envelope.) Another way to gather data — at least for people who are well-known — is to look in leadership directories, such as Who's Who in American Politics, and correlate birthdays and professions. It's important to gather enough examples so that statistical quirks begin to average out in your sample. Large-scale tests like these have revealed no correlation between signs and professions — the members of a given profession are pretty evenly spread among all the signs of the zodiac. (Thanks to Diane Almgren, Broomfield, CO; Daniel Helm, Phoenix, AZ; and Dennis Schatz of the Pacific Science Center in Seattle, WA for suggestions.)
"The fault, dear Brutus, is not in our stars,
But in ourselves...''
Shakespeare's Julius Caesar, Act 1, Scene 2
Resource Corner
To read more about astrology, we suggest:
Astrology: True or False by Roger Culver and Philip Ianna (1988, Prometheus Books, 700 E. Amherst St., Buffalo, NY 14215) — The best book on this subject.
"A Double-Blind Test of Astrology'' by Shawn Carlson, Nature, vol. 318, p. 419 (5 Dec 1985) — A report on a sophisticated test of astrologers in a scientific journal.
Science and the Paranormal, ed. George Abell & Barry Singer (1981, Scribners) — A general introduction to debunking a number of "fiction sciences''.
Science Confronts the Paranormal and Paranormal Borderlands of Science (Prometheus Books) — Two excellent collections of articles from The Skeptical Inquirer magazine that provide superb ammunition to use against many pseudoscientific claims, including UFO's as extraterrestrial spacecraft and ancient astronauts coming here to help us start civilization (because our ancestors were too dumb to do it themselves!)
BGCI provides a global voice for all botanic gardens, championing and celebrating their inspiring work. We are the world's largest plant conservation network, open to all. Join us in helping to save the world's threatened plants.
Africa Home
African Botanic Gardens Network
Strategic Framework
2006 Work Programme
History of the ABGN
Action and Resources
SABONET
PlantSearch
GardenSearch
Sign up to Cultivate
Receive news and updates
Africa > History of the African Botanic Gardens Network
History of the African Botanic Gardens Network
At the 5th International Congress of Botanic Gardens Conservation International (BGCI) held at Kirstenbosch National Botanical Garden, Cape Town, South Africa in September 1998, the African representatives called on the BGCI to redevelop an African focus, including an African Botanic Gardens Newsletter. This was to replace theTropical Africa Botanic Gardens Bulletin that was initiated in 1989. This Bulletin was to be published yearly, but unfortunately only four editions appeared between 1989 and 1995. The idea of establishing an African Botanic Gardens Network (ABGN) was put forward again at the World Botanic Garden Congress held in Asheville, North Carolina, USA, in June 2000.The success of the discussions between African botanic garden representatives and Fiona Dennis of the BGCI in Asheville brought about the publication of the first issue of thenew African Botanic Gardens Network Bulletin in October 2000. Prior to the Asheville meeting in June 2000, Nouhou Ndam of Limbe Botanic Garden in Cameroon had in 1995 also called for the biological institutions of Africa to form a network.In March 2001 it was agreed at the inaugural meeting of the SABONET - supported Southern African Botanical Garden Network to hold the first Congress Steering Committee meeting in Aburi Botanic Gardens, Ghana. Consequently, the follow-up meeting to the African Botanic Gardens Network inaugural meeting in Asheville was held at the Aburi Botanic Gardens, Ghana, in June 2001. Hosted by George Owusu-Afriyie (Ghana), the historic meeting was attended by representatives from Cameroon (Christopher Fominyam), South Africa (Christopher Dalzell and Christopher Willis), Theophilus Agbovie of Aburi Botanic Gardens and Fiona Dennis of BGCI.The conclusion of this meeting was an agreement to hold the first African Botanic Gardens Congress in 2002. Exactly a year after the Aburi meeting, the Steering Committee met again at Aburi Botanic Gardens, Ghana, from 11-14 June 2002, to evaluate the progress that had been made. Regional Co-ordinators present at this meeting were Fiona Dennis (representing BGCI and North Africa), George Owusu-Afriyie (West Africa), Christopher Fominyam (Central Africa), Christopher Dalzell and Christopher Willis (Southern Africa) and William Wambugu (East Africa). The Aburi meeting confirmed that the Congress would be held at the Durban Botanic Gardens, South Africa, from 24-29 November 2002. This is very historic and timely, coinciding within a period of a few months with the birth of the African Unionin Durban, South Africa, where African Heads of State were present to welcome the new African Union.From 24-29 November 2002, 67 delegates representing 23 African countries and various non-African delegates attended the first ever African Botanic Gardens Congress in Durban Botanic Gardens, South Africa, to establish the new African Botanic Gardens Network (ABGN). Africa, defined as continental Africa and the surrounding islands, was divided into six regions, namely North Africa, West Africa, Central Africa, Eastern Africa, Southern Africa and the Indian Ocean Islands.The Congress produced the Strategic Framework, Action Plan, set out the management structure and officially established the steering committee and secretariat of the ABGN. The ABGN also currently produces a news bulletin. | 科技 |
By Mrityunjay Singh
from Macro to Nanoscale Edited by Mrityunjay Singh
Tatsuki Ohji
Rajiv Asthana
Normal Price: $274.95
Your Price: $247.45 AUD, inc. GST
You Save: $27.50! (10% off normal price)
Plus...earn $12.37 in Boomerang Bucks
Availability: Available to Backorder, No Due Date for Supply
Ceramic Integration and Joining Technologies by Mrityunjay Singh
This book joins and integrates ceramics and ceramic-based materials in various sectors of technology. A major imperative is to extract scientific information on joining and integration response of real, as well as model, material systems currently in a developmental stage. This book envisions integration in its broadest sense as a fundamental enabling technology at multiple length scales that span the macro, millimeter, micrometer and nanometer ranges. Consequently, the book addresses integration issues in such diverse areas as space power and propulsion, thermoelectric power generation, solar energy, micro-electro-mechanical systems (MEMS), solid oxide fuel cells (SOFC), multi-chip modules, prosthetic devices, and implanted biosensors and stimulators. The engineering challenge of designing and manufacturing complex structural, functional, and smart components and devices for the above applications from smaller, geometrically simpler units requires innovative development of new integration technology and skillful adaptation of existing technology.
Buy Ceramic Integration and Joining Technologies book by Mrityunjay Singh from Australia's Online Bookstore, Boomerang Books.
Technology, engineering, agriculture
Mechanical engineering & materials
(241mm x 162mm x 44mm)
Imprint: John Wiley & Sons Ltd Publisher: John Wiley and Sons Ltd
Publish Date: 18-Nov-2011 Country of Publication: United Kingdom Books By Author Mrityunjay Singh
Advanced and Refractory Ceramics for Energy Conservation and Efficiency, Hardback (September 2016)
Ceramics for Energy Conversion, Storage, and Distribution Systems, Hardback (August 2016)
Ceramics for Environmental Systems, Hardback (August 2016)
Additive Manufacturing and Strategic Technologies in Advanced Ceramics, Hardback (August 2016)
» View all books by Mrityunjay Singh
» Have you read this book? We'd like to know what you think about it - write a review about Ceramic Integration and Joining Technologies book by Mrityunjay Singh and you'll earn 50c in Boomerang Bucks loyalty dollars (you must be a member - it's free to sign up!)
Author Biography - Mrityunjay Singh
Mrityunjay Singh, PhD, FASM, FACerS, FAAAS, is Chief Scientist at Ohio Aerospace Institute, NASA Glenn Research Center, and is actively involved in various activities related to processing, manufacturing, joining and attachment technologies. He is Academician of World Academy of Ceramics and Governor of Acta Materialia, Inc. He has authored or coauthored more than two hundred thirty publications, edited forty two books/journal volumes, and holds several patents and technology transfers to industries. He is recipient of numerous (more than forty) national and international awards from all over the world including four R&D 100 awards, NASA Public Service Medal, and NASA Exceptional Space Act Award for outstanding and extraordinary contributions to various NASA programs. Tatsuki Ohji, PhD, FACerS, is the Prime Senior Research Scientist at Japan's National Institute of Advanced Industrial Science and Technology (AIST). His research interests include characterization of ceramics, ceramic composites and porous materials, design of advanced ceramics, and green ceramic manufacturing. He has authored or coauthored more than 320 scientific papers and nine books, and holds more than forty patents. He is a fellow of the American Ceramic Society, Academician of the World Academy of Ceramics, a Governor of Acta Materialia, Inc., and editorial board member of many international journals. Rajiv Asthana, PhD, FASM, is a professor at the University of Wisconsin-Stout where he teaches in the manufacturing engineering and engineering technology programs. His research interests include ceramic/metal joining, high-temperature capillarity and cast metal-matrix composites. He is the author or coauthor of four books and 150 refereed papers, an Associate Editor of Journal of Materials Engineering & Performance, guest editor and editorial board member of several refereed journals, and a fellow of ASM International. Sanjay Mathur, PhD, is professor at the Institute of Inorganic Chemistry at the University of Cologne, Germany. The major research thrust of his group include chemical synthesis of functional nanostructures and their processing for product and device applications. He was an Alexander von Humboldt Fellow at the Saarland University, where he accomplished his Habilitation. He has authored or coauthored over 150 publications, edited a book, holds several patents, and is and Associate Editor of International Journal of Applied Ceramic Technology and Nanomaterials.
Recent books by Mrityunjay Singh
Recent books by Tatsuki Ohji
» View all books by Tatsuki Ohji
Recent books by Rajiv Asthana
» View all books by Rajiv Asthana | 科技 |
2016-40/3982/en_head.json.gz/11626 | When will Catch Up TV be treated equally?
Adam Turner
The networks discriminate against us depending on our viewing device.
Australia's online Catch Up TV services are slowly finding mainstream acceptance as they're built into more and more mainstream entertainment devices such as televisions and Blu-ray players. Sony TVs and Blu-ray players will offer access to Ten's Catch Up TV offerings as of June, sitting alongside the SBS, Seven's Plus7 and the ABC's iView. Meanwhile some Samsung and LG gear offers access to iView and Plus7. Samsung is also introducing a Foxtel app, similar to the limited T-Box and Xbox 360 Foxtel services, which will even include some Olympic coverage.
Catch Up TV. The rise of all these new Catch Up TV options is exciting, until you sit down on the couch to watch them and realise the show you want is missing. Plus7 is perhaps the worst culprit, with the television/Blu-ray service offering far fewer programs than the browser-based service you can watch freely on your computer.These restrictions are perhaps in part related to rights issues, but it's more likely that the commercial networks are determined to ensure that Catch Up TV poses no real threat to free-to-air -- especially when you're sitting in front of your television. The networks want to offer just enough to be able to claim they're giving the people what they want, without actually giving the people what they want. Then they wonder why people still turn to BitTorrent.What's most frustrating is the seemingly arbitrary distinction between viewing devices such as smartphones, tablets, computers, set-box boxes and televisions. Along with these you've got distinctions regarding how and where we watch content, with major players fighting some new technologies even though they fall within current copyright laws. Of course this has been highlighted by the Optus TVNow case and it's going to be interesting to see how next year's government review of copyright law plays out.If the networks are giving away Catch Up TV anyway, why make it hard for us to watch it how we want to watch it? In actual fact they're not "giving" it away, because Catch Up TV services often include advertisements. So why not whack in a few ads and make the shows we want to watch available on AV gear as well as browsers? Why not build your audience rather than alienate it?For years viewers have called on content providers to change their view of the world. To forget about the underlying technology, which is constantly evolving, and focus on what we can watch rather than how we watch it. Hopefully this is the approach that the review of Australian copyright law will take. Yet the major media players have been dragged into the internet age kicking and screaming, and are still caught up in the old world "we talk, you listen" broadcasting mentality.A few years ago Australia started counting time-shifted content in the ratings results. Perhaps it's time to do the same with Catch Up TV. In some ways the Catch Up audience is more valuable, because viewers can't as easily skip the advertisements as they can when time-shifting on a PVR. Then again the networks do their best to foil our attempts at PVR time-shifting, so they're unlikely to give a stuff about Catch Up TV audiences.You can't teach old dogs new tricks, but you can cut them out of the picture. How do you deal with broadcasting restrictions? | 科技 |
Enterprise search will be a new software layer that lets firms exploit their digital assets. And Microsoft has its foot in the door, says Mike Altendorf
Some people might be a bit confused by Microsoft’s recent $1.2bn acquisition of Norwegian enterprise search company Fast Search & Transfer, but it makes sense given that everyone is looking to see what the next generation of search is going to look like, and with Fast, Microsoft might have the answer.
I must admit that, although I always respected the company, I never saw Google becoming so successful or maintaining its lead in search technologies in the way that it has. Google is very good at returning answers on basic searches but it doesn’t go further; it is consumer-focused and the application of search technology within an organisation is going to be absolutely key. Everyone can do basic search now but the business of going further and understanding the ‘semantic web’, that is using natural language for searching and understanding the tone and context of your request, will be critical to the next generation of search. Obviously, the core aspect of finding, classifying and indexing data is important but getting a system that understands emotion, background and other clues will be the necessary abilities that set the next search leader apart.
I’m an absolute believer in search as the next big wave in enterprise software. Think of it as the new middleware: it will become a layer that will sit across multiple systems such as CRM and ERP to help information workers make informed decisions. It will change and even replace knowledge management systems that never achieved traction because they were just too difficult and unwieldy to use. It will inform decision making as an adjunct to business intelligence, CRM and ERP systems. This is where it will be most powerful because data in these systems is semi-structured and therefore easier to search in a meaningful way. It will also replace portals that are really just glorified document management systems. If you’re a clothing retailer you’ll be able to see what sizes, colours and styles sell best, and where and in what conditions, for example. For vendors, the sell will be easy because firms will be able to demonstrate very fast productivity benefits. With Fast and the complementary acquisition of search startup Powerset, Microsoft has some of the intellectual property needed to mine this seam. Powerset has an interesting approach to search that involves natural language processing rather than searching for keywords. It’s the next generation of the web and it will provide a more intuitive way to search. It’s all great stuff but none of it works that well yet even though it’s interesting to see Powerset work against a subsection of the web in the form of Wikipedia as its proof of concept.
I’m a great fan of Fast and I think it will give Microsoft an exponential leap in the search space. By tying search into SQL Server, SharePoint and .Net, Microsoft should be able to scale the technology and improve the search experience.
2015 CIO Summit speakers announced - CIO Plus exclusive
Get ahead of CIO digital debate
CIO 100 judging panel biggest and best yet
Diversity challenge remains despite seeds of youth in CIO 100
CIOs must shake off service provision model
2014 CIO 100 celebration in pictures
Google is pushing hard in the enterprise with its search appliance servers and it could fare well, but what it hasn’t got is the right commercial models in place. It’s a plug-in rather than a solution, and Google needs to partner, verticalise and open up its APIs. The more it opens up and lets others tap into the core engine, the better chance it will have. Google has recruited very heavily in the UK but a lot of its people have media and consumer backgrounds and they work on arm’s-length relationships. It goes back to culture, and search needs an enterprise software culture.
The bigger question is what the other enterprise players will do. Oracle and SAP will have to get involved and Autonomy, Endeca and Exalead are likely candidates for acquisition. IBM is already in search to a certain extent with its OmniFind product and Yahoo relationship, and HP will probably follow. Whoever wins, search is going to be enormous in the enterprise.
On an entirely separate note, it’s been very interesting to see the recent rise in popularity of low-cost laptops. The Asus Eee PC has obviously been hugely successful but a lot of the credit has to be given to the One Laptop Per Child (OLPC) project. It has been a fantastic venture even if it did not quite deliver on the promise of a laptop costing $100. The technologies and thinking behind the OLPC project are the real progenitors of the growing range of low-cost PCs you see now. In fact, the only thing that has held back some of these systems has been the insistence on running Linux. The way forward on these laptops is Windows because, if you look at the way people use their laptops, it’s often to work on PowerPoint presentations or similar productivity application tasks. For CIOs, the rise of this new portable category provides an opportunity to equip more of the workforce with mobility, and that’s a powerful change in its own right.
EcoSolutions: The U.N. answers your climate questions
Special Report: Planet in Peril
The United States objected to the specific guidelines, saying including them was moving the process too quickly and would preempt any future negotiations. The European Union wanted an agreement to require developed countries to cut their emissions by 25 to 40 percent of 1990 levels by 2020. The United States, Japan and Canada oppose those targets. The latest draft of the agreement removes the specific figures and instead, in a footnote, refers to the scientific study that supports them. Markey lambasted the Bush administration for initially opposing the guidelines, saying it was operating on "basis of denial and obfuscation." "Not since Emperor Nero tried footnoting firefighting through more fervent fiddling have we seen such a transparently vain effort to avoid the inevitable," he said. While the EU and the United States appeared to have ended their impasse, India raised objections to other parts of the agreement, notably the contributions developed nations would make to help developing nations clean up their emissions problems. Environmental groups said the new pact makes the agreement less forceful than it might have been, but agreed that it is probably the best that could be had given the Bush administration's staunch objections. Ban, who attended the conference earlier this week, but left for a visit to East Timor, announced Saturday he was unexpectedly returning to Bali to help shepherd the talks to a conclusion. At one point Saturday, Ban took the podium to urge compromise. "Frankly, I'm disappointed at the level of progress," he said. Without specifics, however, some believed the final agreement would amount to failure. "Let me underline once again that the Bali road map must have a clear destination," said Stavros Dimas, the EU environment commissioner. But Rajendra Pachauri, who heads the U.N.'s Intergovernmental Panel on Climate Change, said such a stance would ignore the other progress being made at the conference. He said simply having a strong statement paving the way for future action would suffice. "I wouldn't term that a failure at all," Pachauri said. "I think what would be a failure is not to provide a strong road map by which the world can move on, and I think that road map has to be specified with or without numbers. If we can come up with numbers, that's certainly substantial progress, and I hope that happens." The U.N. Framework Convention on Climate Change passed the Kyoto Protocol 10 years ago, with the goal of limiting greenhouse gas concentrations in the atmosphere. The United States was the only one among 175 parties -- including the European Union -- to reject it. E-mail to a friend CNN's Dan Rivers contributed to this report.
All About Kyoto Protocol • Global Climate Change | 科技 |
Published on Monday, May 04, 2015byCommon DreamsEPA Coal Rules Could Save Thousands of Lives Per Year: Study
'The bottom line is, the more the standards promote cleaner fuels and energy efficiency, the greater the added health benefits,' says Syracuse researcherbyDeirdre Fulton, staff writer 0 Comments"The Power" A coal fired power plant on the Ohio River just West of Cincinnati. (Photo: Robert S. Donovan/flickr/cc)In the first peer-reviewed study of its kind, researchers have found that emissions-curbing policies such as the Environmental Protection Agency's proposed Clean Power Plan would "provide immediate local and regional health co-benefits" and could prevent up to 3,500 premature deaths per year.
The EPA's Clean Power Plan aims to cut emissions of carbon dioxide largely through stricter limits on the coal-fired power plants, one of the country’s largest sources of greenhouse gas pollution. The standards, according to the White House and others, represent a significant opportunity to help minimize the growing consequences of climate change.
But the new study, published Monday in the journal Nature Climate Change, found that more energy efficiency reduces the emission not only of carbon, but also of other pollutants, such as sulfur dioxide and nitrogen oxides, which create soot and smog and in turn have the biggest effect on health.
By modeling three different emissions scenarios, the researchers from Harvard and Syracuse universities, as well as several other institutions, calculated that the changes in the EPA rule could prevent 3,500 premature deaths a year and more than 1,000 heart attacks and hospitalizations from air-pollution-related illness.
According to the New York Times, "The scenario with the biggest health benefit was the one that most closely resembled the changes that the Environmental Protection Agency proposed in a rule in June. Under that plan, reductions in carbon emissions for the plants would be set by states, and would include improvements to energy efficiency of, for example, air-conditioners, refrigerators and power grids."
The EPA rules, which some environmentalists criticized as "too little, too late," are due to be finalized in mid-summer.
As the Washington Post reports, enacting the proposal will be a challenge:
The finding comes as the Obama administration deliberates over the final shape of the proposed rules, which have drawn a fierce backlash from the Republican-controlled Congress. GOP lawmakers are gearing up to battle the measures on Capitol Hill and in the courts, and Senate Majority Leader Mitch McConnell (R-Ky.) has written letters to the governors of all 50 states urging them not to support the regulations. McConnell has called the proposals harmful to the coal industry and the economy.
But the standards would be a boon for public health, according to the researchers.
"The bottom line is, the more the standards promote cleaner fuels and energy efficiency, the greater the added health benefits," lead author Charles Driscoll, a professor of environmental systems engineering at Syracuse, told the Post.
"Ironically," wrote David Doniger, director of the Natural Defense Resources Council's climate and clean air program, "the most life-saving will take place in the very states where many elected officials and political candidates most adamantly oppose the Clean Power Plan: Kentucky, Indiana, Ohio, Texas, West Virginia. It's the very states where coal use is highest, that public health benefits most. (Mitch McConnell, please take note.)"
Doniger continued:
Even more health benefits come from curbing power plants' carbon dioxide and mitigating climate change. Climate change is already causing heat-related deaths and deaths from other extreme weather events. Hurricane Sandy, for instance, caused at least 117 deaths, 45 percent from drowning.The opportunity to save thousands of lives doesn't come to us every day. With all that's at stake, I hope the EPA thinks big as it decides this summer on the final Clean Power Plan targets.
Or perhaps, as one Daily Kos blogger declared, the Nature study "should be nailed to the door of every naysayer in Congress."This work is licensed under a Creative Commons Attribution-Share Alike 3.0 LicenseShare This Article
How to Make City Streets More Friendly Nation Ready to Finally End 'Detrimental and Deeply Unjust' Hyde Amendment Over 200 Groups Demand EPA Revise Dangerously Flawed Fracking Study Will the Gulf of Mexico Remain a Dumping Ground for Offshore Fracking Waste? More in: U.S., EPA, Coal, Pollution, Public Health | 科技 |
Pages1234next ›last »Michael Dell has prevailed in the hard fought $24.8 billion leveraged buyout battle to keep control of Dell, the company he founded 29 years ago from his dorm room at the University of Texas.
"I am pleased with this outcome and am energized to continue building Dell into the industry's leading provider of scalable, end-to-end technology solutions," said Michael Dell, chairman and CEO of Dell, in a press statement. "As a private enterprise, with a strong private-equity partner, we'll serve our customers with a single-minded purpose and drive the innovations that will help them achieve their goals."
With the backing of private equity firm Silver Lake Partners and shareholders, Michael Dell will take his company private, remain CEO, and own 75 percent of the company. In additon, he will attempt to turn Dell into a solutions provider, with increased focus on security software, big data services, and enterprise "IT in a box" hardware and software solutions.
"I'm happy this chapter in Dell's history is over and there is no longer a cloud of uncertainty hanging over Dell," said Michael Goldstein, president and CEO of LAN Infotech, a Dell partner based in Fort Lauderdale, Fla. "Partners can now move forward and don't have to answer questions from customers about 'that Dell situation,'" he said. "I only see good things coming from this."
[Related: 7 Mounting Challenges Facing Dell]
In the loser's corner is activist investor Carl Icahn and Southeastern Asset Management, which fought equally hard for the past six months to take control of the struggling PC maker. Icahn, who owns a 9 percent stake in Dell, pushed Dell to raise his bid and offered his own deal. On Monday Icahn sent up a white flag and gave up on his fight to take control of Dell.
"Dell dodged a bullet (with Carl Icahn) and now has a chance to reinvigorate the company around a unified vision Michael has created over the last 12 months that centers around enterprise servers, security and business services," said principal analyst Tim Bajarin, at Creative Strategies.
The key to Michael Dell's win came the day Icahn was denied his day in a Delaware court by a judge that ruled against Icahn and said that a proposed shareholder vote change put forth by Michael Dell could not be struck down. After several failed attempts, Michael Dell and Silver Lake submitted a sale price at $13.75 per share, threw in a special dividend of 13 cents per share and in exchange convinced Dell's board to change the shareholder voting rules in their favor.
US and Russian archaeologists find liverwort, antique ice skate, evidence of early trekkers By
Yereth Rosen, Special to The Christian Science Monitor
NOME, ALASKA
— For years the shores of the Bering Sea, host to some of the world's richest archaeological sites, have remained virtually unexplored.The area's isolation, serious erosion, and tense cold-war relations inhibited any thorough research of the Bering Strait. But now, with the end of the cold war, a new program that is bringing scientists from the US and Russia to the shores of the Bering Sea has begun to answer longstanding questions about one of the most important crossroads in the world.Archaeologists and anthropologists believe the first inhabitants of the Western Hemisphere migrated to the Americas thousands of years ago over a land bridge between present-day Alaska and Siberia.
"We're talking about populating half of the world," says Bob Gal, a US National Park Service archaeologist in Kotzebue, Alaska. But figuring out how and exactly when that occurred has been difficult. "We haven't had the ability, because of the world political situation, to really tackle that problem."
Now the Beringian Heritage Project, an outgrowth of a 1990 pact between former President Bush and former Soviet Premier Mikhail Gorbachev, is seeking to make up for lost time.Researchers from the University of Alaska at Fairbanks and other universities, the Smithsonian Institute, the US Interior Department, and Russia are taking part in the project, coordinated by the National Park Service (NPS).Among the project's noteworthy accomplishments are discovering Alaska's oldest known sites of human habitation, which date before 10,000 years ago; discovering previously unknown plants; collecting new data on the region's geology; and renewing cultural ties between Inuit cousins on both sides of the sea, separated by the cold war's "Ice Curtain."The process of reconstructing the prehistoric landscape remains a challenge, though. Sprawling stretches of Alaska's wilds are largely unexplored. The majority of the state's 1,600 archaeological sites are concentrated near the state's roads and along the trans-Alaska pipeline, thanks to modern laws that protect historic resources from construction's impacts."It's sad to say that development is one of the best friends of archaeology," says Steve Klinger, an NPS archaeologist. "Maybe 1 percent of the state's been looked at by an archaeologist."One recent breakthrough was the discovery of 17,000-year-old volcanic ash on the Seward Peninsula. Excavations began in 1993. Underneath the ash was an intact landscape, preserved as it was when the land bridge existed.Mr. Klinger and Mr. Gal say scientists believe the region was then a tundra steppe with herds of grass-eating big game and flocks of waterfowl that gathered at glacial lakes that no longer exist. It may have been more hospitable to hunters and gathers than the harsh landscape that exists today, according to evidence gathered so far. "It was a fairly rich environment for guys that knew how to take advantage of it," Klinger says.Even discoveries about the more-recent past are enlightening. Archaeologists have been sifting through remains of nearby turn-of-the-century gold-rush camps. Among the relics uncovered: ice skates, apparently a popular mode of transportation along frozen rivers.Botanists working with the Beringian project have found dozens of lichens, mosses, and other plants not known to exist in Alaska or North America. One moss-like liverwort, discovered by a Russian botanist, was new to science.Researchers are also collecting oral histories to preserve village elders' recollections and native languages. Anthropologists have set up exchange programs to help relatives on opposite sides of the strait to renew contacts.But plans to create the international peace park envisioned by Mr. Bush and Mr. Gorbachev in 1990 aren't going as well.The Park Service's plan was to gain "world heritage" status for the Bering Land Bridge National Preserve and some other nearby areas, along with a yet-to-be-established park on the Russian side. The title, bestowed by the United Nations, is largely honorary.But as economic and land-status problems have stalled the Russians, US park proponents have hit a wall of political opposition in Alaska.In Nome, a popular restaurant used to post tongue-in-cheek promotions for "grilled spotted owl" and people are unashamed to store industrial equipment or surplus household goods in their front yards. Here, new conservation measures are viewed suspiciously."It's difficult enough to deal with your own government, let alone other governments," says Nome Mayor John Handeland. 
"The Alaskan people are independent and want to stay that way."Many Nome-area residents are leery of initiatives they say might hinder mining and other development. In response to those concerns, Alaska's Rep. Don Young (R), chairman of the House Resources Committee, has introduced a bill that would grant Congress veto power over any new world heritage site in the US.Still, scientists remain hopeful about international recognition of Beringia's significance as a biological and cultural crossroads. If it is granted, "It'll make it easier for politicians to help and perhaps make it a little more difficult for them to hinder," Gal says.
Levitating magician: A viral Pepsi ad shows an English magician apparently levitating alongside a double-decker bus. How are we so easily fooled by magic? By
Eoin O'Carroll, Staff
of A TV commercial posted online shows an English magician named Dynamo apparently levitating off the side of one of London's iconic double-decker buses, as amazed onlookers gape, point, and, because this is 2013, shoot photos and video with their phones. He then slides off the bus, produces a can of Pepsi Max, opens it, and takes a sip.It was posted on Monday, and by Thursday afternoon, it already had more than 2.3 million views. Have a look at the video at the top of this page, if you haven't done so already."If you can just take a moment to look at things from a new perspective," says Dynamo in his gentle Yorkshire accent, "you might see the world in a whole new light." Recommended:Are you scientifically literate? Take our quiz
So how did Dynamo do it? Here, we reveal his secret: He had the soda can in his pocket the whole time.
OK, we're not going to say how Dynamo floated alongside the bus: Exposing the secrets of individual magicians serves only to diminish the entertainment. (It can also ruin their livelihoods, and why would we want to do that?) So instead, we'll just give away how every magician everywhere performs every illusion. And we'll share some cognitive psychology with you along the way.

At the heart of every illusion is misdirection, the manipulation of the audience's attention. "Everyone knows what attention is," wrote William James in his seminal 1890 work, "Principles of Psychology": "It is the taking possession by the mind, in clear and vivid form, of one out of what seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others, and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German."

And skilled magicians are the ultimate Zerstreutheitmeisters. Gesturing hands, shiny props, dazzling spotlights, flying doves, "assistants" in sparkly outfits, and, in Dynamo's case, waggling feet and a smartphone are all expertly deployed to take your mind off of where the "magic" – usually a fairly straightforward mechanism – is really happening. And we fall for it almost every time.

We get fooled for two big reasons. The first is that we aren't able to take in all of the stimuli in our environment at once. You might think that your eyes are merely windows to the outside world, but the picture that you're seeing right now is mostly a simulation. As you focus your attention on these words, the rest of your visual field is sketched out in only the barest detail. It only appears like a rich vista because your brain is constantly filling in the gaps, not with what it actually perceives, but with what it expects to perceive.

Want proof? Place your left hand over your left eye. Extend your right arm forward, with your index finger raised. Now, staring at a point straight ahead, and not at your finger, slowly move your arm to the right. When your arm is at an angle of about 15 degrees, the tip of your finger will vanish. Presto!

This happens because of a human blind spot, the point at which the optic nerve attaches to the retina. There are no photoreceptors on that part of the eye. But most of us are never aware that we have a pair of empty spots hovering right in front of us. Your brain just fills in the gap with whatever is in the background. You see what you expect to see.

Magicians don't usually exploit this physiological blind spot, but they exploit our cognitive blind spots all the time. For instance, you watch as a conjurer appears to pass a coin from the right hand to the left. Your attention, honed by thousands of generations of your rock-throwing forebears, leads the target, missing the sleight. You're surprised when the magician reveals the left hand to be empty. Then, while you're looking at the left hand, you fail to notice the right hand slipping the coin into a pocket. The human mind naturally assumes that the background, that is, everything but the empty hand that you're staring at, remains static.

The other big reason that we are so easily fooled by magicians is that humans are intensely social animals.
In 1969, the psychologist Stanley Milgram conducted an experiment in New York City that involved volunteers staring up at the sky, gazing at nothing in particular. With one person looking up, about one in 25 passers-by also looked up. When five people looked up, about one in five followed suit. As the number of upward-gazing volunteers grew, so did the upward-gazing passers-by. Milgram described this behavior as a "contagion."

Magicians are masters of contagious attention. A paper published in the journal Current Biology in 2006 had subjects watch the Vanishing Ball illusion, a simple trick in which the magician tosses a ball in the air a few times, only to have it apparently vanish in midair: In the final "toss," the ball actually remains in the magician's hand. But our attention follows the gaze of the magician. When the magician's gaze followed the trajectory of the toss, more than two thirds of the subjects said they saw a ball where there wasn't one. Most even suspected that there was someone at the top of the screen catching the ball. But when the magician looked straight ahead, only a third were fooled. It's the social cues, not the sleight of hand, that make the trick work.

The subjects had cameras tracking their eyes. Interestingly, the researchers found that the observers' eyes only followed the actual tosses, not the imaginary one. It wasn't their eyes that were fooled. It was their minds.

So the next time you attend a magic show, here's how to see through the illusions. First, keep an eye on the magician's other hand, the one that doesn't seem to be doing anything interesting. Second, never look where the magician is looking. Or better still, don't. Just sit back and enjoy the show. After all, you paid money to be entertained, not disillusioned.

[Editor's note of full disclosure: The writer's mother is a magician, thus his reluctance to completely spill professional secrets.]
Rob Enderle
About this time last year, if you had tried to convince a major company to produce a 7-inch tablet, they would have laughed in your face. At the time, these tablets were seen as “tweener” products that wouldn’t and couldn’t sell. Only idiots would bring them to market. You’d probably lose your shirt.
Then the Kindle Fire shipped and spiked hard in the holiday season. Earlier this year, Google announced the Nexus 7, and proceeded to sell buckets of them. Just weeks ago, Amazon followed up its original hit with the Kindle Fire HD 7, which is even more advanced, better priced, and reportedly selling at even a more aggressive pace. Around a month from now, the iPad mini will ship in the same size, and suddenly we’ll be up to our armpits with products that are successfully selling, in a segment that many thought was stupid just a year ago.
We also thought big screen phones were stupid before the iPhone shipped successfully. And even Steve Jobs said tablets were stupid before Apple shipped the iPad. So how does an idea go from idiotic to brilliant? And how come Apple didn’t lead the way on this one?
We don’t like “different”
It often takes us a while to get comfortable with a new idea. For instance, one of the most unsuccessful cars in history was the Ford Edsel, which was based on massive market research, but bombed. Yet compared to cars that came out a few years later, the Edsel just appears ahead of its time.
Microsoft had tablets long before Apple did; Philips was showcasing smartphones like Apple’s in the 1990s but couldn’t get support to bring them to market; and LG had the Prada in 2007, which Apple basically ripped off.
But once we see something, we slowly get comfortable with the idea. Eventually, it isn’t so different, and we have a new market. Apple isn’t the first to see an opportunity; it’s often just the first to sense when the market is ready for it, and the first willing to spend the time and money to ride the resulting wave.
Apple seemed to time the iPhone and iPad perfectly in that regard, but with the 7-inch tablets, both Google and Amazon beat it to the punch.
Kindle blazed the trail
The process of making consumers comfortable with 7-inch devices started with e-readers, and Amazon led that trend. In fact, Steve Jobs thought e-readers were stupid at the time, because he believed no one read anymore. While he was right in that e-readers never became the success that smartphones or the iPad did, they sold enough to familiarize people with this smaller size. When the Kindle Fire showed up on an e-reader vector, it sold. When the Nexus 7 showed up, it expanded the vision for this form factor to a full tablet, and the market suddenly woke up to the advantages of 7-inch tablets, which are actually in many ways better than their 10-inch siblings. We should have seen this coming, because when 7-inch and 10-inch e-readers hit the market, buyers gravitated toward the smaller product the same way.
What makes the 7-inch product better
There are a number of advantages to 7-inch tablets. They cost about half as much as 10-inch tablets. They’re vastly more portable and actually fit in jacket pockets and purses. They’re far lighter, which makes them much more comfortable for personal entertainment. Mostly, they don’t try to be laptop computers, which is the curse of their larger cousins.
To me, the 10-inch iPad is now the real “tweener” product: too small to be a laptop, too big to be truly portable.
Nexus 7 vs. Kindle Fire HD
Following that logic, while both the Nexus 7 and Kindle Fire HD are good products, the Kindle Fire is more Apple-like in its focus on the user experience. That makes it, in my opinion, the better product. It may not have all of the sensors that the Nexus does, but chances are you won’t use them anyway if you already own a smartphone. Amazon focused on things like a more expensive case and better screen instead. As a result, the Kindle Fire HD diverges sharply from the Nexus 7, is easier to use, and does core things better. But it doesn’t do as many things, so the two products likely appeal to different audiences.
Interestingly, the team behind the Kindle Fire is largely from Microsoft. They licensed Microsoft’s technology, stole the OS from Google, and executed an Apple-like marketing strategy, right down to the high-profile announcement. That alone is pretty damned amazing.
Missing the boat
I think 7-inch tablets will be the big thing in the fourth quarter, which leaves Apple uncharacteristically behind. Not only did Apple miss the chance to set the bar here, it will be third to market with a very similar product, which doesn’t bode well for its ability to seize market leadership in this segment. Apple no longer leads with the iPhone either: Where vendors used to follow and copy the iPhone, now they are leading it with larger screens, better antennas, and faster radios. The iPod, meanwhile, is last decade’s news. That “Apple TV” thing better be a hit, or Apple could be screwed.
Guest contributor Rob Enderle is the founder and principal analyst for the Enderle Group, and one of the most frequently quoted tech pundits in the world. Opinion pieces denote the opinions of the author, and do not necessarily represent the views of Digital Trends. | 科技 |
We don’t usually get too in-depth with science coverage here at Digital Trends, but this breakthrough certainly deserves mention: NASA announced today that it has confirmed the discovery of the first known planet that could have liquid water on its surface, thus making it potentially habitable for alien life. The planet, dubbed Kepler-22b, is one of more than 1,000 new planet candidates discovered during the Kepler mission.
The reason Kepler-22b is particularly notable is that it orbits in the middle of the so-called “habitable zone” of its star, which NASA calls “sun-like,” but is slightly smaller and cooler than our sun. (Earth orbits in a similar location to our sun.) Two other planets have been found in the habitable zones of their stars, but they orbit on the outer edges of what could be classified as habitable.
“This is a major milestone on the road to finding Earth’s twin,” said Douglas Hudgins, Kepler program scientist. “Kepler’s results continue to demonstrate the importance of NASA’s science missions, which aim to answer some of the biggest questions about our place in the universe.”
Kepler-22b measures roughly 2.4 times the radius of Earth, and is located approximately 600 light-years away. Its year — the amount of time it takes a planet to orbit its sun — is nearly the same as Earth’s, taking 290 Earth-days to make a full orbit. The planet, along with 48 other planets that may also lie in a habitable zone, was discovered using a number of ground-based telescopes, which monitor the Cygnus and Lyra constellations, home to more than 150,000 stars.
To confirm that a heavenly body is, in fact, a planet, Kepler scientists watch to see if a planet candidate crosses in front of a particular star, something known as a “transit.” In order for a planet candidate to be confirmed as a planet, it must transit a minimum of three times. It was through this scrutiny that Kepler-22b achieved full planet status.
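To make the rule concrete, here is a minimal sketch of how such a criterion could be checked in code. It is purely illustrative; the flux values, threshold, and function names are assumptions for the sketch, not the actual Kepler pipeline:

```python
# A minimal, hypothetical sketch of the "three transits" rule described above.
# The flux values, threshold, and function names are illustrative assumptions,
# not actual Kepler pipeline code.

def count_transits(flux, baseline=1.0, dip_threshold=0.999):
    """Count distinct dips in a star's brightness (normalized flux).

    A "transit" is counted each time the flux falls below the threshold
    after having been at or above it (one count per distinct dip).
    """
    transits = 0
    in_dip = False
    for value in flux:
        if value < dip_threshold * baseline:
            if not in_dip:          # entering a new dip
                transits += 1
                in_dip = True
        else:
            in_dip = False          # back to normal brightness
    return transits

# Toy light curve: three small dips in an otherwise steady signal.
light_curve = [1.0, 1.0, 0.998, 1.0, 1.0, 0.998, 1.0, 1.0, 0.998, 1.0]

if count_transits(light_curve) >= 3:
    print("Candidate meets the minimum transit count for confirmation.")
```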
Now that Kepler-22b has been confirmed, Kepler scientists will begin to examine it more closely to find out what it is made of (solid, liquid, or gas), and whether or not it has an atmosphere that could actually support life as we know it.
[Images via NASA/Ames/JPL-Caltech] | 科技 |
Following this morning’s announcement that Apple would be returning some of its nearly $100 billion war chest to investors in the form of a dividend and stock buybacks, we finally have the missing bit of context that we’ve been pining for: Apple’s not sweating a few billion here or there, because it sold 3 million new iPads this weekend. That’s right: One million iPads a day. Or 2 million iPads on Friday, and none on Saturday, and another million on Sunday. Or…It doesn’t matter. Selling 3 million iPads is as unequivocal a message as the public can muster that tablets — whether to supplement PC use, or to replace them completely — are here to stay. And it’s also a clear sign that Apple’s current policy of incremental upgrades is working; even if the blogosphere was less than overwhelmed by the inclusion of a faster processor, LTE broadband capability, and a higher-quality display, it’s now apparent that consumers were more than willing to pay a premium for those new features.
Historic Sales
To put this number in perspective, Apple’s iPhone 4S — another device that was viewed by many as a minor upgrade to the iPhone 4 — sold 4 million devices during its launch weekend, according to Beta News, and was largely seen as an overwhelming success. But that number included online pre-orders — actual weekend sales were closer to 3 million.
And the iPad 2, which was a significant upgrade to the original iPad, sold an anemic 1 million units over the course of its first weekend, according to Reuters, all things being equal — although that number may have been substantially higher if the device hadn’t sold out. The first iPad, which was a divisive product when first introduced — the late Steve Jobs was initially criticized for trying to fill a usage void that didn’t actually exist — sold about 300,000 in its first weekend of sales, taking almost 80 days to reach the 3 million sales mark. As of December 2011, about 55 million iPads had been sold worldwide.
“The new iPad is a blockbuster with three million sold ― the strongest iPad launch yet,” said Philip Schiller, Apple’s senior vice president of Worldwide Marketing, in a statement Monday evening. “Customers are loving the incredible new features of iPad, including the stunning Retina display, and we can’t wait to get it into the hands of even more customers around the world this Friday.” It is unclear from Apple’s statement if the new iPad’s sales numbers include pre-orders as well.
Billion-dollar weekend
By way of comparison, Amazon’s Kindle Fire tablet, which retails for $199 and features a smaller 7-inch screen as compared to iPad’s nearly 10-inch LCD, is estimated by Bloomberg to have sold between 3-6 million devices during the 2011 holiday season; Amazon has only gone on record stating that “millions” of those tablets had so far been sold. Because of its low price point, the Fire has been heralded by some as the closest thing to a true iPad competitor, and rumors of an as-yet-unconfirmed 7-inch iPad may lend some credence to that notion.
But with a suggested retail price of $499 for an entry-level 16 gigabyte Wi-Fi only model, sales of 3 million iPads should equal revenues of nearly $1.5 billion ― at minimum ― meaning potentially more revenue in a single weekend than Amazon’s Kindle Fire has generated since its release. The sales numbers for the previous-generation iPad 2, which got an unexpected price reduction, were not disclosed. AT&T announced this morning that it had also set records with its iPad sales, although it did not disclose those numbers.
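The back-of-the-envelope math behind that figure is simple enough to verify. Here is a quick sketch using only the numbers quoted in this article, assuming, as the minimum case, that every unit sold at the entry-level price (pricier models would only push the total higher):

```python
# Back-of-the-envelope revenue estimate using the article's own figures.
# Assumes every unit sold at the $499 entry-level price, which is the
# minimum case; the actual sales mix would raise the total.

units_sold = 3_000_000
entry_price = 499  # USD, 16 GB Wi-Fi model

revenue = units_sold * entry_price
print(f"Minimum weekend revenue: ${revenue / 1e9:.2f} billion")  # ~ $1.50 billion
```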
Apple’s success with its new tablet may owe in large part to its widespread availability: On Friday, it was in stock in 12 countries, as well as online. On March 23, it will become available in 26 more. | 科技 |
Tablets are invading our schools, our offices, and our homes. Back in June, we heard that nearly a third of U.S. Internet users already own a tablet. The vast majority of them have an iPad, but Apple is no longer the only company at the table and we’re seeing more and more tablets announced every day. If you’ve yet to take the tablet plunge, or perhaps you’re looking to snag a second tablet or upgrade, we’ve got a roundup of the best upcoming tablets (and rumored tablets) for you. If you can’t wait, be sure to check our list of the our favorite tablets currently on the market.
Samsung Galaxy Note 8.0

Rumors gave way to confirmation from mobile boss JK Shin that the Galaxy Note 8.0 will be shown off at MWC this year. You can probably guess how big the screen is going to be (the clue is in the name). Samsung seems to be intent on introducing devices at every possible size, from small budget smartphones through phablets to large tablets. We expect it to run Android 4.1 Jelly Bean or later and sport a quad-core processor. The resolution is likely to be 1280×800 pixels and the big draw (if you’ll pardon the pun) is the inclusion of Samsung’s super stylus – the S-Pen.
Sony Xperia Tablet Z

Check out our review of the Sony Xperia Tablet Z. Sony’s flagship Android smartphone, the Xperia Z, grabbed the headlines at CES this year and the company has already confirmed the existence of an Xperia Tablet Z. With a 10.1-inch Reality Display at 1920×1200 pixels, a 1.5GHz quad-core processor, support for LTE and NFC – not to mention Android 4.1 Jelly Bean and an 8.1-megapixel camera – this is worth checking out. It’s a svelte slate that’s only 6.9mm thick and weighs in at just less than 500 grams. If you aren’t already excited then maybe the fact that it’s waterproof and dustproof will be the clincher.
LG Tab Book
Here’s a Windows 8 tablet from LG with a built-in keyboard and support for 4G LTE. It will run full Windows 8 on an Intel Core i5 processor, and we’re expecting an 11.6-inch display with a 1366×768 pixel resolution. It could be the ideal solution for students and mobile employees seeking something smaller than a laptop, but not ready to give up a physical keyboard. A button at the side allows the screen to slide up at an angle to reveal the keyboard, but that does also mean this is a chunky-looking and relatively heavy tablet.
Nokia Windows Tablet
We know Nokia and Microsoft are the best of pals, so could a Nokia tablet be on the cards? There have been persistent rumors about a 10-inch, dual-core Nokia tablet running Windows RT. The Finnish mobile giant was apparently keen to see how Windows tablets performed in the market before committing and so it might never make it out of the R&D department. Nokia does have a well-deserved reputation for quality hardware, so it would be interesting to see the company dive into the tablet market.
ZTE V81
The iPad mini refused to conform to the 7-inch small tablet brigade on Android and now we are seeing more Android tablets with that extra inch. The ZTE V81 wants a head-to-head and it boasts an 8-inch display with a 1024×768 pixel resolution, a 1.4GHz dual-core processor, 1GB of RAM, Android 4.1 Jelly Bean, and 4GB of storage. Thankfully, it does have a microSD card slot and there’s also GPS, Bluetooth, and a 2-megapixel rear-facing camera alongside a VGA front-facing camera. It doesn’t sound irresistible, but ZTE will be looking to deliver that knockout blow on price and we expect it to be very affordable.
Asus Fonepad

We’ve had two Padfones from Asus already, so it’s about time we got a Fonepad. While the Padfone is a tablet shell that you can slide a separate smartphone in and out of, the Fonepad is apparently a more conventional hybrid device which combines smartphone features into a tablet. Early rumors point to an Intel Atom Lexington processor, a 7-inch 1280×800 pixel screen, Android 4.1 Jelly Bean, a 3-megapixel camera, and 16GB of storage, which would suggest a budget price. It should also have cellular support, but we’re talking 3G, not LTE.
On the heels of the budget Iconia B1, a 7-inch Android tablet for under $150, we’re expecting to see 8-inch and 10-inch Android tablet releases from Acer. It looks like the MediaTek MT6589 quad-core processor will be powering them, but we don’t have any other details yet. The specs may not match the cutting edge tablets, but what we can expect is extremely competitive pricing.
Polaroid M7 and M10
A pair of Android tablets from Polaroid might sound unlikely, but that’s exactly what the M7 and M10 are. Those numbers refer to the screen size and both tablets have a 1280×800 pixel resolution, run Android 4.1 Jelly Bean, sport 2-megapixel front-facing cameras, and have microSD card slots. The M7 has a dual-core processor and 8GB of storage. The M10 goes quad-core and has 16GB of storage, a 5-megapixel rear-facing camera and HDMI out. These are budget devices at an eyebrow-raising $130 and $230 respectively.
Lenovo ThinkPad Helix
If the Lenovo Ideacentre Horizon strikes you as a bit big for a tablet at 27 inches then perhaps you’d prefer this stylish hybrid. The ThinkPad Helix has an Intel Core i7 processor, runs Windows 8, and has an 11.6-inch full HD 1080p display. If you want to go to town you can get 8GB of RAM, a 256GB SSD, NFC support, and LTE connectivity. There’s a keyboard attachment that doubles up as a stand and offers additional ports, or you can just take the screen and use it like a conventional tablet. It also has a fancy stylus, or digitizer pen, with some handy functionality.
Panasonic 4K 20-inch Tablet
Is 20 inches too big for a tablet? Before you answer that take a look at Panasonic’s 4K tablet. It is basically a Windows 8 all-in-one with an incredibly gorgeous display. It will be available in Intel Core i5 or i7 varieties and it has an accelerometer and support for a Bluetooth pen which could make it a worthy contender for artists. Is it practical? No, not really. It weighs 2.4 kg and it’s over 10mm thick. Inside there’s 4GB of RAM, a 128GB SSD, and a 720p HD camera. The resolution is 3840×2560 pixels and it can handle 10-finger gesture control. We anticipate an off-putting price tag.
There’s no indication exactly when we might expect to see a new version of the iPad Mini, but you can bet that we will. Reports suggest that it has been selling well and possibly even cannibalizing sales of Apple’s larger iPad line. The big complaint with the original was the screen resolution and so we expect the follow up to have a Retina display. There have also been rumors about a thicker body which suggests a beefed up battery (likely to handle the high-resolution screen). With the iPad Mini still flying off the shelves it may be a while before the new version makes an appearance.
Are there any other new tablets on the way that you’re excited about? Post a comment let us know about your upcoming tablet pick. | 科技 |
15 July 2016
Half-time for MASCOT – half the journey is completed
Credit: DLR (CC-BY 3.0)
The MASCOT spare Flight Unit (FS) during further testing in Bremen
On 3 December 2014, the French-German MASCOT asteroid lander was launched with its carrier probe Hayabusa2 from Tanegashima, an island about 40 kilometres south of the Japanese mainland. With MASCOT halfway to its destination, we look back on all that has happened since the launch.
At the beginning of 2015, MASCOT's spare flight unit, the so-called Flight Spare (FS), was refurbished and made ready. On Earth, this identical 'twin' of the asteroid lander serves as a reference system for the flight unit, the Flight Model (FM). The spare unit underwent the same qualification tests as the flight model and can also be used for advanced unit tests that were no longer possible for the FM due to scheduling constraints. These additional tests mainly focused on getting the best possible performance out of the system and on precisely calibrating the parameters required for the landing in October 2018. To achieve this, the scientific instruments on MASCOT performed a series of measurements.
One last look - farewell, MASCOT
Credit: DLR
Applying the final layers of protection prior to the launch
The last adjustments have been made and the final functionality tests have been completed. Following the successful installation of MASCOT into the Hayabusa-2 spacecraft in Sagamihara, the final preparations have taken place at the Tanegashima launch complex in Japan. The attachment of the solar sails – carefully folded up above MASCOT for the launch – offers the last opportunity to see MASCOT.
Now, the development team must take a step back – it is a strange feeling. For two and a half years, we have been nurturing MASCOT, seeing it grow, teaching it plenty. But now it is time to let go, in the truest sense of the word, and send it on its difficult mission. Unfortunately, we cannot accompany it.
So how do you deal with the departure of an object that is not alive in a biological sense, yet contains the personalities of so many people who have guided it so dearly throughout its development?
First test on Japanese soil
The MASCOT asteroid lander will be delivered to the Japanese space agency JAXA at the start of next year. It will be integrated into the Hayabusa-2 spacecraft and prepared for launch, scheduled for late 2014. There is still a long way to go, but there is little time!
MASCOT: A 'shoebox' with complex inner workings
The 'small' asteroid lander MASCOT will set off for asteroid 1999 JU3 on board the Japanese Hayabusa-2 mission at the end of 2014. Although from the outside it seems to be the size of a shoebox, the lander's stature is deceiving! Its sophisticated and highly developed payload, and its powerful communication and computing system, make MASCOT a high-tech, albeit very compact, autonomous spacecraft, perfectly equipped to cope with the arduous and long mission it faces.
Creative Commons: What is it? Should I Apply This to My Music?

Published: Fri January 30, 2009

Creative Commons has been described as being at the forefront of the copyleft movement, which seeks to support the building of a richer public domain by providing an alternative to the automatic "all rights reserved" copyright, dubbed "some rights reserved."
David Berry and Giles Moss have credited Creative Commons with generating interest in the issue of intellectual property and contributing to the re-thinking of the role of the "commons" in the "information age". Beyond that, Creative Commons has provided "institutional, practical and legal support for individuals and groups wishing to experiment and communicate with culture more freely".
Jamison Young is a full-time artist who has refused to sign up with big record labels. Instead, he believes in giving away his music for free using Creative Commons, and that has surprisingly helped him sell more records. This may sound contradictory, but Jamison says that giving away your music as free downloads vastly increases the number of people who listen to it, and they in turn refer their friends. This spreads word about the album, and quite a few of those listeners buy it off the store shelves. It is a technique that gives lesser-known musicians a fighting chance against the most established artists.
Jamison has written, sung, produced and marketed his own album, called Shifting Sands of the Blue Car, the music for which is freely available for download on his website and at MySpace.
Jamison is an Australian now living in Europe. Over the past year, he has performed in Australia, France, Germany, the Czech Republic, Austria, Thailand, Switzerland and the U.S.
Jamison launched a new project called Hungry Artists Feed Hungry People, with a portion of the sales proceeds going to help poor people in third world countries.
A Creative Commons license was first tested in court in early 2006, when podcaster Adam Curry sued a Dutch tabloid that published photos from his Flickr page without permission. The photos were licensed under the Creative Commons Non-Commercial license. While the verdict was in favour of Curry, the tabloid avoided having to pay restitution to him as long as they did not repeat the offense. An analysis of the decision states, "The Dutch Court's decision is especially noteworthy because it confirms that the conditions of a Creative Commons license automatically apply to the content licensed under it, and bind users of such content even without expressly agreeing to, or having knowledge of, the conditions of the license."
The six standard Creative Commons licences use from one to three licence elements, selected from four available elements: Attribution, NoDerivs, NonCommercial, and ShareAlike. Attribution is common to all six 2.0 licences. There are several 1.0 licences which don’t include the Attribution requirement, but Creative Commons decided to drop them in the 2.0 round of improvements, because stats indicated that 97-98 percent of CC-licenced works used the Attribution element.
Basically, Attribution says that the licensee must give you credit as the original author of the work. They can’t pass it off as their own, and if they make a derivative work (where permitted) they have to credit you for your contribution.
Which brings us to NoDerivs. If you don't want anyone to make any use of your work beyond listening to it and copying and sharing it, then you need to choose a licence which specifies NoDerivs. Incidentally, the synchronisation of a music- or sound-based work to a moving image is considered derivative for the purposes of the licence!
The third element is NonCommercial. If you don’t want anyone to make money off of your work without first doing a deal with you, then you need to choose a licence which specifies NonCommercial. The Commons Deed simply states: “You may not use this work for commercial purposes.” The Legal Code, as you would expect, is more wordy on the subject of commercial use: the licensee can’t use the work “in any manner that is primarily intended for or directed toward commercial advantage or private monetary compensation.” It also says file-sharing is OK, “provided there is no payment of any monetary compensation in connection with the exchange of copyrighted works.” In addition, depending on whether or not you select the NonCommercial element, the Legal Code has a section on performance, mechanical and web-casting rights and statutory royalties where you as licensee either waive or reserve the exclusive right to collect, either individually or via a relevant collecting society, any royalties for the work. Interestingly, Textone, while using a licence with a NonCommercial element, explicitly state at one point on their web site (but not next to the licence button) that they allow playback and mixing of their releases during a for-profit DJ performance “since so much of the underground scene is dependent on DJ/performance fees for subsistence.”
The final, fourth licence element is ShareAlike. In the concise terms of the Commons Deed, this means: “If you alter, transform, or build upon this work, you may distribute the resulting work only under a licence identical to this one.” So with an Attribution-ShareAlike licence, someone could use your work in their own and release the result commercially, but their work would then fall under the same licence. So you, or anyone else, could use their work in turn.
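To make the combinations concrete, here is a minimal sketch of how the four elements compose into the six standard licences. The element and licence names come from the text above; the data structure and permission checks are illustrative assumptions, not an official Creative Commons tool:

```python
# A minimal sketch of how the four licence elements described above combine
# into the six standard 2.0 licences. The element names come from the
# article; the data structure and permission logic are illustrative only.

STANDARD_LICENCES = {
    "BY":       {"Attribution"},
    "BY-SA":    {"Attribution", "ShareAlike"},
    "BY-ND":    {"Attribution", "NoDerivs"},
    "BY-NC":    {"Attribution", "NonCommercial"},
    "BY-NC-SA": {"Attribution", "NonCommercial", "ShareAlike"},
    "BY-NC-ND": {"Attribution", "NonCommercial", "NoDerivs"},
}

def allows(licence, use):
    """Rough permission check: every 2.0 licence requires credit;
    NoDerivs blocks remixing; NonCommercial blocks paid use."""
    elements = STANDARD_LICENCES[licence]
    if use == "remix":
        return "NoDerivs" not in elements
    if use == "commercial":
        return "NonCommercial" not in elements
    return True  # copying and sharing are allowed by all six

print(allows("BY-NC-ND", "remix"))      # False - NoDerivs forbids it
print(allows("BY-SA", "commercial"))    # True - but ShareAlike then applies
```

Note how ShareAlike never shows up in the permission check itself: as the article explains, it does not forbid a use, it attaches a condition (the derivative work must carry the same licence).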
The CC Music Sharing licence has its own button (’Share Music’ CC) and its own Commons Deed which states that “The owner of this music retains his or her copyright but allows you to: download, copy, file-share, trade, distribute, and publicly perform (eg. web-cast) it.” It also specifices Attribution, NonCommercial and NoDerivs. In other words: share it but don’t sample it, alter it, or make money from it, and don’t take away my credit. The Legal Code is the usual Attribution-NonCommercial-NoDerivs 2.0 licence, but if you want to make it clear that your music’s shareable (and that’s all) then this is the one to use.
If you want to make a living out of your (copyright-owned) music, it’s more complicated. There’s no hard and fast business case for going the Creative Commons route. If you’re looking to build a fan base, it could be something to try. You could always dip a toe in the water by licensing one or two tracks. Read the Magnatune box on the previous page and look over their web site. CC-licensed music doesn’t have to mean no-pay music; also, look at the way the NonCommercial CC element feeds into a commercial licensing revenue stream on the site.
http://creativecommons.org/learn/licenses/sampling | 科技 |
WASHINGTON — An unusual and widely felt 5.6-magnitude quake in Oklahoma in 2011 was probably caused when oil drilling waste was pushed deep underground, a team of university and federal scientists concluded. That would make it the most powerful quake to be blamed on deep injections of wastewater, according to a study published Tuesday by the journal Geology. The waste was from traditional drilling, not from the hydraulic fracturing technique, or fracking. Not everyone agrees, though, with the scientists’ conclusion: Oklahoma’s state seismologists say the quake was natural. The Nov. 6 earthquake near Prague, Okla., injured two people, damaged 14 houses and was felt for hundreds of miles in 14 states, according to the U.S. Geological Survey. It was the largest quake in the central part of the country in decades and largest in Oklahoma records, experts said. The study by geophysicists at the University of Oklahoma, Columbia University and the USGS says that a day earlier there was a slightly smaller quake in an old oil well used to get rid of wastewater, right along a fault line. That smaller quake triggered the bigger one, and a third smaller aftershock. The location of the tremors right at the spot where wastewater was stored, combined with an increased well pressure, makes a strong case that the injections resulted in the larger quake, they said. This area of Oklahoma had been the site of oil drilling going back to the 1950s, and wastewater has been pumped into disposal wells there since 1993, the study authors said. Water and other fluids used for drilling are often pumped more than a mile below ground. The report said there was a noticeable jump in the well pressure in 2006. USGS geophysicist Elizabeth Cochrane described the pressure increase from injections as similar to blowing more air in a balloon, weakening the skin of the balloon “We have a lot of evidence that certainly leads us to believe” the quake was caused by the injections, said Cochrane, a study co-author. The evidence isn’t as complete as other smaller earthquakes that have been linked conclusively to injections of waste, such as those in Arkansas, Colorado and Nevada, said co-author Heather Savage of Columbia. But with the quake at the “right place” at the well, the increased pressure and the other smaller quakes across the region triggered by injections, “it becomes compelling,” she said. A National Academy of Sciences study last year documented 60 small injection induced quakes in the United States in the last 90 years, mostly in California, Texas, Colorado, Oklahoma and Ohio. In a statement, the Oklahoma Geological Survey said the interpretation that best fits the data is the quake “was the result of natural causes” but needs further study. The state officials cited new 3-D seismic data, a time lag between injection and the quakes, and the orientation of the faults to say it was natural not induced. Just being in the right place isn’t proof enough, said Austin Holland, seismologist for the state agency. There are few places in Oklahoma where you can have an earthquake that’s not near an injection well, he said. Three outside scientists contacted by The Associated Press said the researchers made a strong case for a likely man-made cause. “I think they made the case that it is possible; it’s probably even more than possible,” said Steve Horton, director of the Center for Earthquake Research and Information at the University of Memphis. 
“They have a very reasonable conclusion.” —— Online: Geology journal: http://geology.gsapubs.org/ Tags: Environment, World News | 科技 |
Full-HD Voice: Understanding the AAC codecs behind a new era in communication
By HP Baumeister, Manfred Lutzky, and Matthias Rose, Fraunhofer | January 22, 2013
We have grown accustomed to "HD Everywhere" by consuming high-fidelity content in most aspects of our lives. State-of-the-art audio and video codecs such as MPEG AAC and H.264 have set our expectations by assuring the highest rich media quality at very low bit rates. These codecs enable high-quality multimedia content for Digital TV, online streaming, media stores such as iTunes, video games, and many other state-of-the-art media services and applications. The only real exception to the omnipresence of high-quality sound is the phone call, which is still largely tied to the limitations of technologies derived from the last century. With Full-HD Voice, a new era of audio quality for the telecommunications market has begun. Unlike Plain Old Telephone Services (POTS), ISDN and mobile phone calls, Full-HD Voice offers an unsurpassed level of quality, resulting in calls that sound as clear as talking to someone in the same room, or listening to high-quality digital audio. The current high-quality codec family behind Full-HD Voice is Enhanced Low Delay AAC (AAC-ELD). In addition to the millions of calls already being made today using AAC-ELD, this technology is set to enable many new Full-HD Voice applications, including telepresence at home and mobile rich media telephony.
This paper explains the advantages and opportunities of Full-HD Voice, including how AAC-ELD meets the high quality requirements users expect today. Full-HD Voice already leads the industry with the latest communication advancements and is set to drive future innovations.
The Gap in Quality Expectations
It is no secret that the vast majority of phone calls sound muffled compared to other sources of audio. Calls today have shortcomings that can make it difficult to understand conversations, especially in noisy or reverberant environments, when listening to talkers with soft or whispered speech, or when following conversations in a non-native language or with an accent. For example, the low audio bandwidth makes distinguishing between certain consonants such as "f" and "s" quite difficult. Both share a similar low-frequency spectral envelope, but the "s" phoneme is characterized by its significant energy in the 10-kHz frequency range [Figure 1].
Figure 1: Typical spectrum of speech, here: s-phoneme.
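For readers who want to see this effect on a recording of their own, the following is a rough, illustrative sketch (not from the paper) that estimates how much of a sound's energy falls above the 3.4-kHz narrowband cutoff; the synthetic test signal stands in for a recorded sibilant:

```python
# An illustrative way to see the effect described above: compare how much
# of a phoneme's energy lies above the 3.4-kHz narrowband cutoff. The
# signal below is a synthetic stand-in; any short mono recording of an
# "s" or "f" sound would do.
import numpy as np

def band_energy_ratio(signal, sample_rate, cutoff_hz=3400.0):
    """Fraction of spectral energy above cutoff_hz (0.0 to 1.0)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return spectrum[freqs > cutoff_hz].sum() / spectrum.sum()

# Toy stand-in for an "s" phoneme: white noise run through a first
# difference, which pushes energy toward high frequencies the way
# sibilants do.
rate = 44100
noise = np.random.randn(rate // 10)   # 100 ms of noise
sibilant_like = np.diff(noise)        # crude high-frequency emphasis

print(f"Energy above 3.4 kHz: {band_energy_ratio(sibilant_like, rate):.0%}")
```

A narrowband call simply discards that high-frequency share, which is exactly why "f" and "s" become hard to tell apart.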
Communication systems based on speech codecs with low audio bandwidth are unsuitable for sharing music, singing and ambient sounds. In addition, delays lead to involuntary interruptions, impacting a natural conversation flow. Many calls require repetition and even "spelling bees" to determine what someone is trying to communicate. These problems challenge the patience of call participants, can cause frequent misunderstandings, frustration and make phone calls simply exhausting. In addition, they limit phone calls to speech only, shutting out more natural communication options that include multiple talkers, ambience sounds, or music.
The low conversational audio quality stands in contrast to the multimedia capabilities available with the latest smartphones or tablets. Practically all of these devices can play very high quality audio from mp3 or AAC files when using headphones, in conjunction with external or internal speakers. In addition, virtually all smartphones and tablets have a camera function, recording not only HD (or even Full HD = 1080p) video but also near-CD-quality audio. Although the built-in microphones can capture a very high level of quality, in most cases only the camcorder function takes full advantage of this. With all these high-quality capabilities and components used in modern smartphones, tablets and other consumer electronic devices such as TVs, why is it that phone calls still sound like they did in the last century?
The Full Audio Spectrum: Evolution to Full-HD Voice
Most phone calls employ speech codecs, not audio codecs. This simple fact is the basis of many issues. Audio codecs model the human auditory system. However, speech codecs model the human speech system and therefore can only reproduce the human voice reasonably well. Background noise, multiple voices, music and sounds are beyond the capabilities of a speech codec, rendering these sounds highly distorted or even unrecognizable. Speech codecs are used in many POTS, ISDN and all mobile phone calls. In addition, these calls have an upper frequency limit of 3.4 kHz, also known as "narrowband". Because people are able to hear audio signals up to much higher frequencies (between 14 and 20 kHz at normal listening levels), most phone services are simply deleting at least three quarters of the audible spectrum. This causes the muffled sound in everyday telephony.
The "HD Voice" services recently introduced by some telephony providers have raised audio bandwidth from 3.4 to around 7 kHz. The speech codecs used in these services provide audibly better quality compared to legacy calls, but still only transmit less than half of the full audible audio spectrum.
Now, "Full-HD Voice" is available, delivering a new level of performance for voice calls and raising the quality to the level experienced in most digital media today. High-quality audio codecs specifically tuned for communication applications are widely available. This makes Full-HD Voice no longer a future promise, but a reality today. Audio codecs, such as AAC-ELD, provide the same or even lower coding latencies than speech codecs, and open up the full audio spectrum, up to 14 to 20 kHz [Figure 2]. Consequently, Full-HD Voice-enabled audio codecs make optimal use of the multimedia hardware built into phones today. This will bring a stunning, new high-quality experience to our everyday communication.
Figure 2: Bandwidth comparison of POTS, HD Voice and Full-HD Voice.
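As an illustration of what these bandwidth tiers sound like, the sketch below low-pass filters a recording at each cutoff so the results can be compared by ear. It is a demonstration under assumptions (a mono WAV input with a placeholder file name), not a codec implementation:

```python
# A rough way to hear the three bandwidth tiers compared in Figure 2:
# low-pass an audio file at 3.4 kHz (narrowband), 7 kHz (HD Voice), and
# 20 kHz (Full-HD Voice) and listen to the results. The file name is a
# placeholder and the input is assumed to be a mono WAV file.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfilt

def band_limit(samples, rate, cutoff_hz):
    """Apply an 8th-order Butterworth low-pass at cutoff_hz."""
    sos = butter(8, cutoff_hz, btype="low", fs=rate, output="sos")
    return sosfilt(sos, samples)

rate, audio = wavfile.read("speech.wav")      # placeholder input file
audio = audio.astype(np.float64)

for label, cutoff in [("pots", 3400), ("hd", 7000), ("full_hd", 20000)]:
    # Keep the cutoff safely below the Nyquist frequency.
    limited = band_limit(audio, rate, min(cutoff, rate / 2 - 1))
    wavfile.write(f"speech_{label}.wav", rate, limited.astype(np.int16))
```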
Stereotypes 'change' brains
5,000 boys and 40 girls completed Level 3 engineering apprenticeships last year, Professor Gina Rippon said
Science is losing out because of the mistaken belief that "men are from Mars and women from Venus", a leading neuroscientist has claimed.
Professor Gina Rippon said it was time to debunk the myth that gender differences are hard-wired into our brains.
In reality, there was no significant difference between the brains of a girl and boy in terms of their structure and function, she stressed.
But experiences and even attitudes could change the "plastic" brain on a physical level, causing its wiring to alter.
It was this that led girls and boys from an early age to head in different directions, said Prof Rippon, from Aston University.
While girls tended to gravitate towards fields of communication, people skills and the arts, boys were more likely to become scientists and engineers.
Even when girls went into science, they mostly chose careers at the "softer" end of the subject, such as biology, psychology and sociology, rather than physics and maths.
Speaking ahead of this year's British Science Festival, taking place at the University of Birmingham next week, Prof Rippon said: "We're stuck in the 19th century model of the 'vacuum packed' brain, the idea that we're born with a brain that gives us certain skills and behaviours. "The brain doesn't develop in a vacuum.
"What we now know is that the brain is much more affected by stereotypes in the environment and attitudes in the environment, and that doesn't just change behaviour, it changes the brain."
Last year, 5,000 boys in the UK completed Level 3 engineering apprenticeships, but only 40 girls, Prof Rippon pointed out.
Boys taking physics A level also vastly outnumbered girls. But Prof Rippon insisted this was nothing to do with innate differences in the way the brains of girls and boys worked. Rather, it was likely to be the result of their brains being altered by experience.
One of the most often quoted examples of gender difference is spatial ability - the ability to understand the relationships between different objects in space.
Boys are said to be naturally more spatially gifted.
But if girls aged six to eight are given the tile-matching puzzle game Tetris, their brain wiring changes and their spatial ability improves, Prof Rippon said.
She added: "It's quite clear that spatial cognition is very much involved with experience, whether or not you have experience of manipulating objects as opposed to just observing them.
"This goes back to 'toys for boys'.
"From a very early age, boys have a lot more experience with manipulating objects."
Research had shown that as women attained greater access to education and power, gender differences began to disappear.
Prof Rippon was also dismissive of evolutionary psychologists who claimed the way men and women thought was largely the result of natural selection.
"The idea that women like the colour pink because it made them better able to pick berries - it's nonsense," she said.
Ill-conceived attempts to "fix" the problem of girls not going into science were likely to backfire, Prof Rippon argued.
One infamous example of this was the European Commission's Science Is A Girl Thing video released in 2012 which was swiftly dropped "because it was so awful".
"It showed girls in lab coats testing lipstick and giggling a lot," Prof Rippon said.
She added: "Science is something everybody should engage with.
"Let's not make science girly.
"Let's make science interesting to anyone." | 科技 |
F-Gas Regulation
Chemical Families
Home / Kyoto Protocol / Action on Climate Change post 2012
Ozone Depleting Substances REACH Post-Kyoto Protocol Kyoto Protocol Action on Climate Change post 2012 Energy Efficiency Waste MAC Directive 40/2006 Process of the 2014 Regulation Adoption Related Links
DG ENV Page on future Actions on Climate
Commission proposes an integrated energy and climate change package to cut emissions for the 21st Century
EFCTC: Action on Climate Change post 2012
The Kyoto Protocol to the UNFCCC is only a first step to address the serious global threat of climate change. The ultimate goal of the UNFCCC is to stabilise atmospheric concentrations of greenhouse gases at a level that prevents dangerous human interference with the climate system.
The 2005 COP agreed to start negotiations to extend the Kyoto Protocol beyond 2012, and to launch a Dialogue under the broader UNFCCC, to include non-Kyoto signatories, including Australia and the United States.
The Dialogue has been launched to exchange experiences and analyse strategic approaches for long-term cooperative action to address climate change.
The US insisted that the talks "will take the form of an open and non-binding exchange of views, information and ideas […] and will not open any negotiations leading to new commitments". Information on : http://unfccc.int/meetings/dialogue/items/3668.php
On 10 January 2007 the European Commission set out proposals and options for keeping climate change to manageable levels in its Communication "Limiting Global Climate Change to 2° Celsius: The way ahead for 2020 and beyond."
The Communication, part of a comprehensive package of measures to establish a new Energy Policy for Europe, is a major contribution to the ongoing discussions at international level on a future global agreement to combat climate change after 2012, when the Kyoto Protocol's emissions targets expire.
The Communication proposes a set of actions by developed and developing countries that would enable the world to limit global warming to no more than 2°C above pre-industrial temperatures. The Commission proposes to reduce GHG emissions unilaterally by 20% by 2020 (compared to 1990).
The communication on climate change, which is to be discussed during the Spring Summit, calls on the EU to “take on a firm independent commitment to achieve at least a 20 percent reduction of GHG emissions by 2020” (compared to 1990 levels).
To achieve this objective, the Commission pushes forward a number of policy measures, such as reducing CO2 emissions from transportation, improving energy efficiency, or “extend[ing] the Emission Trading Scheme (ETS) to other gases and sectors.”
The February 2007 Environment Council adopted similar conclusions, and the March 2007 Spring European Council confirmed this views, and confirmed that EU would set the target of cutting 20% of the EU’s greenhouse gas emissions by 2020, even willing to put this goal up to 30% if the US, China and India make similar commitments.
The emissions reduction objective would be mainly achieved through energy policy. Priority measures will concern renewable energies, expanding the EU emissions trading scheme, etc.
Energy consumption should be reduced by 20 per cent compared to Business-as-Usual 2020 projections for 2020, primarily through implementation of the EU action plan on energy efficiency.
These decisions are non-legislative and the Commission will have to make proposals for the burden-sharing agreement to set individual national targets, and for the implementation of the Council decisions.
By Michael Y. ParkPublished August 01, 2014
Print Technology in the kitchen. (iStock)
Where’s our frickin’ kitchen robot? If you were raised in the late 20th century on a cultural diet of Jetsons cartoons, you probably feel more than a little bit ripped off in the year 2014: No jet packs. No foods in pill form. No three-hour-a-day jobs that consist entirely of pressing buttons. And perhaps worst of all? No lovable, back-talking kitchen computer intelligences tossing out sass and electronic wisdom alongside alien pizza pies. So where did we go wrong? It Started with a Lie Consumers first got a chance to have a kitchen computer of their own in 1969, seven years after Jane Jetson first prepared dinner by inserting a punchcard into a towering contraption attached to the family table—and two years after the Philco Ford company produced a short film, 1999 AD, positing that the American family would be serving up computer-planned meals of cold roast beef and no-cal beer from Philco-brand tubes. In the winter of 1969, the Neiman-Marcus catalog offered the Honeywell Kitchen Computer, a $10,600 contraption that would plan complete family menus based around a main course. “Her soufflés are supreme, her meal planning a challenge?” the catalog condescended. “She’s what the Honeywell people had in mind when they devised our Kitchen Computer.” Not sold? The 100-pound computer also came with a cutting board and several recipes pre-programmed. “But it was a joke,” says Paul Atkinson, professor of design and design history at Sheffield Hallam University in England. The Honeywell Kitchen Computer—really just a commercial Honeywell 316 computer gussied up in modern red-and-white paint—was a marketing gimmick, the latest in a series of impossibly extravagant, high-priced items in the Neiman-Marcus catalog that were meant to generate talk rather than sales. For Honeywell, it helped shed its image as a fusty old brand, while for Neiman-Marcus it meant free publicity. The Honeywell Kitchen Computer, 1969 Maybe too much publicity, actually: Public interest proved to be so high that the company was forced to actually produce 20 of the monstrosities, which required a (complimentary) two-week programming course to use, and which communicated only in a form of binary code via a series of flashing lights. It’s not clear whether any of them actually sold, but Honeywell made history by creating not only the first real kitchen computer but the first computer for the consumer market. Glorified Recipe Boxes In the decades that followed, the personal-computer revolution truly took off, but for the most part, kitchen programs were nothing more than crude spreadsheets or primitive databases—remember The Recipe Manager, or your dad’s frustrated attempts to repurpose AppleWorks? Nearly 20 years after the Honeywell, the closest consumers could get to a kitchen computer that wasn’t a glorified spreadsheet was in Hyde Park, N.Y., at the Culinary Institute of America’s St. Andrew’s Cafe. The restaurant prided itself on being the first commercial restaurant to use a computer to calculate nutritional data for its meals. “People came in for lunch or dinner, and at the conclusion of the meal, we dropped the nutrient analysis at their place setting,” says Mark Erickson, the provost of the CIA. “It was like, ‘By the way, you had a really enjoyable lunch, and it turns out it was also nutritional.’” It took the Internet for things to finally heat up. 
In 1995—when the closest to computerized cooking many people came was watching cooking videos on CD-ROMs—Epicurious.com launched its website, and remains one of the most popular online recipe databases today. (Disclosure: I wrote regularly for Epicurious.com from 2007 to late 2013.) More ambitious, in its way, was Digital Chef, a California-based company that allowed users to scale its recipes by number of diners, and was one of the earliest to offer streaming video. Digital Chef may have proved to be too ambitious—the bandwidth available to most users at the time didn’t support video, and the company eventually closed shop. In 1999, Electrolux saw that the future was online, and then somehow figured that that meant people wanted Internet-capable computers in their refrigerators. The Screenfridge offered consumers the ability to order food and pull up recipes from a touchscreen mounted on their fridge door, along with video memos and family calendars. Around the same time, in 2000, 3Com introduced the Ergo Audrey, an Internet-capable computer meant specifically for the kitchen, which computer makers were finally recognizing as the nerve center of the home. Audrey wasn’t a cooking computer, but was meant to be the family’s portal to the World Wide Web, sturdy enough to stand up to kitchen-table abuse, and supplied with such relatively new technologies as web browsing, e-mail, and synchronized calendar and datebook. It also came in colors such as “linen,” “meadow,” and “sunshine.” The line failed before planned roll-outs of other computers geared to other parts of the home. “It wasn’t marketed very well,” says Atkinson. “Just try convincing people to buy five computers, one for each room, instead of one general computer.” After that, the concept of a dedicated kitchen computer seemed to fizzle out. “Computers have become smaller and smaller and become handheld,” Atksinon says. “You can pull up recipes on your smartphone. So why use a separate computer for it?” Now We’re Cooking Enter the age of the app. In 2008–2009, according to Apple spokeswoman Anna Matvey, the first food apps included Urbanspoon and 20-Minute Meals—Jamie Oliver. Today, the most popular iPhone food apps include The Photo Cookbook, Epicurious, AllRecipes Dinner Spinner, and How to Cook Everything. On the Internet, cooking sites have proliferated in the age of the mainstream gastronome, restaurant Instagrammer, amateur chef, and kitchen blogger. Hundreds of thousands of recipes are available to anyone in a heartbeat, and for free. And the future promises even more computer-assisted innovations—like, for instance, Chef Watson with Bon Appétit, our new collaboration with IBM. Erickson, for one, predicts that 3-D printing will transform our meals. “Ten years from now, we’re going to look at computer devices that will do things of significant quality and importance in food,” he says. In other words, you could argue that we have finally reached the Golden Age of the food computer. And it hasn’t escaped notice that, now that computers are finally living up to their potential in helping us prepare meals, we humans are ironically doing more actual work—by spending more time in the kitchen. “Maybe someday Rosie will be there to help out with the cooking,” Erickson says. “But just with the prep.” More from Bon Appetit Health Food Myths Debunked 5 Frozen Cocktail Recipes (No Cabana Boy Needed) The Best Ultimate Classic Perfect Recipes 7 Common Fried Chicken Mistakes—and How to Avoid Them | 科技 |
GameRevolution's Top 25 Wii Games
Posted on Wednesday, December 26 @ 11:00:00 PST by Alex_Osborn
15. Punch-Out!!
Fans of the NES boxing classic will find much to love about Punch-Out!! on the Wii. Not only does it make excellent use of the Wiimote and nunchuck, it also supports the Wii balance board should players opt for the full-body experience. Of course, if you'd like, you can play it just like you did years ago on the NES with a traditional control scheme as well. Beloved characters like Glass Joe and King Hippo are back, so much of the retro goodness from the old days has carried over to the Wii, providing a challenging experience that will please not only longtime fans, but also newcomers looking to join the fight.
14. Super Paper Mario
For quite a while, the Paper Mario series had stuck to a pretty formulaic structure since its inception on the Nintendo 64. It wasn't until the launch of Super Paper Mario that fans got a taste of what other kinds of experiences this series has to offer. This RPG-platformer hybrid succeeds in dipping into two very different genres and making it work. The controls are tight, the puzzles are fantastic, and the ability to switch between 2D and 3D makes for some pretty incredible experiences. Those looking for a Mario game that takes things outside the box can't afford to overlook this Wii gem.
13. Wii Sports
It would practically be a crime to omit this family-friendly sports game from our list. Wii Sports will go down in history as one of the most influential games of its time. Not only did it help create a serious amount of buzz for Nintendo, it also managed to appeal to just about every demographic on planet earth. Wii Sports proved that motion controls have a place in the industry and that games can cater to an incredibly wide audience. There's no doubt that if Wii Sports hadn't launched with the console, the system wouldn't have found nearly as much success.
12. Fire Emblem: Radiant Dawn
If you're a fan of strategy games, there's absolutely no excuse for you to miss out on Fire Emblem: Radiant Dawn. Deep and engrossing gameplay? Check. Compelling narrative? Check. Classic turn-based gameplay? Check. For the longest time Nintendo refrained from bringing the Fire Emblem franchise to North America, but now that they have, it's hard to imagine a Nintendo console without it.
11. The Last Story
Hironobu Sakaguchi, the man behind a little franchise called Final Fantasy, crafted one of the most amazing RPGs of this past generation with The Last Story. While it still looks very much like a traditional Japanese role-playing game, the new approach to combat makes it feel like a completely fresh and new experience. The story is great, the characters are memorable, and the production values are top notch... for a Wii game. If you're looking for a polished RPG on the Wii that is worthy of your precious time, The Last Story is a no-brainer.
Tags: Nintendo, Wii, Mario, Zelda, Metroid, Kirby, Donkey Kong, Super Smash Bros., Retro Studios, Top 25 Games, Best of GR
Well Managed Fisheries Vital for Environmentally Friendly Development in Poor Parts of the Globe
Studies from Mauritania, Argentina and Senegal Point to Pitfalls and Possible Solutions for Balancing Trade and Environment Concerns
Geneva/Nairobi, 15 March 2002 - Catches of some key fish stocks have been falling sharply off the west coast of Africa, with the decline linked to over-fishing by foreign fleets.
A preliminary study of Mauritania, where European Union, Japanese and Chinese boats have been given access to fishing grounds, has found that catches of octopus have halved in the past four years and that some species, such as sawfish, have completely disappeared.
Details of the report will be released today at a fisheries workshop in Geneva organised by the United Nations Environment Programme (UNEP) where delegates will be discussing the links between international trade and subsidies, and their social and environmental impacts.
The meeting will also review a second preliminary report on fisheries in Bangladesh, which indicates that marine stocks there can support more fishing. Fish stocks in Bangladesh's waters could generate employment and millions of dollars of foreign exchange earnings for one of the world's poorest countries. But the findings from Mauritania, alongside other UNEP-commissioned country studies of Senegal and Argentina, show that strict safeguards must be in place before fishing activities are increased, or foreign fleets are invited in.
Otherwise Bangladesh could find that its stocks too become vulnerable to over-exploitation, inflicting economic costs and putting at risk much needed food supplies for its own people, rather than generating income.
The UNEP fisheries workshop is being attended by some 100 delegates from countries around the world and bodies including the World Trade Organization (WTO), the United Nation's Food and Agricultural Organization, the Organisation for Economic Cooperation and Development, the International Centre for Trade and Sustainable Development and the World Wide Fund for Nature. The workshop comes at the start of a new round of world trade talks, which will include negotiations on reducing fisheries subsidies.
The environmental effects of trade are also now firmly on the agenda following decisions made at the WTO Ministerial Conference that took place in Doha last November.
Hussein Abaza, Head of UNEP's Economic and Trade Branch, said: "We are slowly amassing a wealth of hard facts about the complex relationship between trade liberalization, subsidies and their environmental and social impacts, especially in the area of fisheries. It is becoming clear that developing countries stand to gain a great deal from trade in fisheries products, but only if trade and fisheries policies are reformed to support sustainable management of these resources. The country studies we have commissioned, including this new one from Mauritania, not only shed important light on the damage that can be caused by unregulated trade liberalization, but offer pointers to the actions needed so that trade in fish contributes to development and sustains marine ecosystems."
Klaus Toepfer, Executive Director of UNEP, said: "Fisheries represent, to many developing countries like Bangladesh, a real opportunity for economic development. The fish stocks in many developed country waters have been severely depleted as too many, often heavily subsidised, fleets chase too few fish. As a result they are looking elsewhere for catches. It is vital that the unsustainable fishing of the past and the present is not exported to the developing world."
"The new round of WTO talks offers a golden opportunity to marry trade liberalization with poverty reduction and environmental safeguards. I believe UNEP's work, comparing and contrasting the differing fortunes of poorer nations in this difficult area, can inform the trade talks and lead to a successful and more sustainable outcome," he added.
Mauritania
The case of Mauritania highlights the difficulties facing developing countries who, in the push to generate much needed foreign exchange earnings to help fight poverty and assist in economic development, license foreign fleets to use their fishing grounds.
Fishing vessels from the European Union, Japan and China have, for nearly two decades, had increasing access to Mauritanian waters, fishing grounds where the target species include octopus and shrimp. There are now an estimated 251 industrial, factory-style, foreign vessels operating there.
The preliminary country report, compiled for UNEP by the National Oceanographic and Fisheries Research Centre in Nouadhibou, shows that giving this access has had significant impacts on the marine environment.
Over-fishing due to a failure by some fishing boats to comply with the rules, lack of enforcement and a shortage of fisheries protection boats alongside other factors, have led to a dramatic fall in catches as fish stocks are over-exploited.
For example catches of octopus have fallen by more than 50 per cent in the past four years.
Local employment in the fishing sector has also been hit as a result of the over-fishing and over capacity in the foreign fleets. The number of people employed in the traditional octopus fishery in Mauritania has fallen from a peak of nearly 5,000 in 1996 to around 1,800 now.
Current regulations are allowing the EU shrimp boats to use a smaller mesh size of 50mm for their nets when compared with the native Mauritanian boats, the preliminary report says. This, the report argues, is leading to the accidental capture of other marine species. This by-catch, which may have important impacts on the marine environment and the supply of traditional fish to local markets, can amount to as much as 58 per cent of the catch of the EU boats.
Other, unforeseen, impacts on local livelihoods and the national economy have been occurring. In 1998 Japan cancelled orders from Mauritania for octopus, choosing to buy, from Spain, cheaper catches landed in Europe from Mauritanian waters.
The report concludes that Mauritania has made far less out of granting foreign boats access to its waters than had previously been supposed.
" In granting European?.. boats the right to fish in Mauritania, the Government is using economic arguments that take into account only what return that may bring the country, and not what is might cost," it says.
Mr Toepfer said: "We hope that, by bringing to the attention of delegates the experiences of Mauritania, Senegal and Argentina, we can assist developing nations like Bangladesh in formulating policies that permit a sustainable development of their marine fisheries."
Next week, March 19 to 20, UNEP will be holding a second workshop just days before the WTO negotiations on trade and environment on March 22.
Mr Abaza added: "This capacity building workshop, to which we are bringing delegates from developing countries across the world, will help these nations develop the combination of trade, environment and development policies needed to maximise the benefits of trade. We hope the new round of WTO talks will eventually deliver a legally binding agreement that increases developing countries trading opportunities, respects the environment and helps deliver sustainable development to all the peoples of the world," he said.
Notes to Editors: The fisheries subsidies workshop will take place March 15 in the Palais des Nations in Geneva. Details of the country reports (including Argentina and Senegal as well as preliminary reports on Mauritania and Bangladesh), the agenda of the meeting and background papers can be found at http://www.unep.ch/etu/etp/events/upcming/15March_fisheries.htm.
A press release on the Argentina and Senegal studies, dated 27 December 2001, can be seen at the UNEP main web site address http://www.unep.org or http://www.unep.org/Documents/Default.asp?DocumentID=227&ArticleID=2991
The workshop, Capacity Building on Environment, Trade and Development, will take place at the same venue on March 19 to 20.
The Committee on Trade and the Environment of the WTO will start formal negotiations on trade and the environment, in Geneva on March 22.
For More Information Please Contact: Nick Nuttall, UNEP Head of Media, on Tel. 254 2 623084, Mobile. 254 733 632755, E-mail. nick.nuttall@unep.org or Hussein Abaza on Tel: 41 22 917 8298, E-mail: hussein.abaza@unep.ch
UNEP News Release 2002/16, Friday 15 Mar 2002
Surya Grahan July 2010 Not Visible in India – Surya Grahanam July 11, 2010
A Total Solar Eclipse – Purna Surya Grahan – will take place on July 11, 2010. It is not visible in India. The eclipse will be visible only in parts of South America, Cook Islands, French Polynesia etc. Hindu Panchangs and calendars have ignored the Suryagrahanam. As per Indian time, the eclipse will take place from 23:45:00 hrs on July 11, 2010 to 02:20:00 hrs on July 12, 2010. The GMT or Universal Time of the eclipse is from 18:15 UT to 20:52 UT.
According to NASA, the Suryagrahan on July 11 is a 'Total Solar Eclipse' and the Moon's umbral shadow crosses the South Pacific Ocean where it makes no landfall except for Mangaia (Cook Islands) and Easter Island (Isla de Pascua). The path of totality ends just after reaching southern Chile and Argentina. The Moon's penumbral shadow produces a partial eclipse visible from a much larger region covering the South Pacific and southern South America.
The path of the Surya Grahan (Image from NASA)
Usually, Hindus do not perform any work during Surya Grahan, and they purify themselves by taking a bath and chanting the Ashtakshara Mantra dedicated to Shri Krishna, Lord Vishnu and Shiva. The Ashtakshara mantra is 'Shri Krishna ha sharnam mama.' Generally most Hindu temples remain closed during the period of a solar eclipse, and temples reopen the next morning after special pujas and rituals. Hindus in large numbers take a holy dip in rivers like the Ganges and at other tirths, especially at Brahmasarovar in Kurukshetra.
Note: This Surya Grahan is of no importance as per Hindu astrology and calendars. There is no need to follow any rules and rituals.
sepal, a modified leaf, part of the outermost of the four groups of flower parts. The sepals of a flower are collectively called the calyx and act as a protective covering of the inner flower parts in the bud. Sepals are usually green, but in some flowers (e.g., the lily and the orchid) they are the same color as the petals and may be confused with them. In some groups of plants (e.g., the marsh marigold and the anemone) they are absent. The small green leaflike structures at the base of the flower head in the aster family are not true sepals but bracts; the sepals are modified into a circle of tiny white hairs on the ovary (the pappus; see aster). The sepals are sometimes fused into a tube around the base of the petals, as in the mint family. The Columbia Electronic Encyclopedia, 6th ed. Copyright © 2012, Columbia University Press. All rights reserved.
2.7.3.3 Droughts and wet spells
In the SAR, an intensification of the hydrological cycle was projected to occur
as the globe warms. One measure of such intensification is to examine whether
the frequency of droughts and wet spells is increasing. Karl et al. (1995c)
examined the proportion of land areas having a severe drought and a severe moisture
surplus over the United States. Dai et al. (1998) extended this analysis to
global land areas using the water balance approach of the Palmer Drought Severity
Index. Long-term global trends for 1900 to 1995 are relatively small for both
severe drought and wet area statistics. However, during the last two to three
decades, there have been some increases in the globally combined severe dry
and wet areas, resulting from increases in either the dry area, e.g., over the
Sahel, eastern Asia and southern Africa or the wet areas, e.g., over the United
States and Europe. Most of the increases occurred after 1970. Except for the
Sahel, however, the magnitude of dry and wet areas of the recent decades is
not unprecedented during this century, but it should be noted that rainfall
in the Sahel since the height of the drought has substantially increased. In
related work, Frich et al. (2001) found that in much of the mid- and high latitudes,
there has been a statistically significant increase in both the number of days
with precipitation exceeding 10 mm per day and in the number of consecutive
days with precipitation during the second half of the 20th century.
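The "proportion of land area" statistic used in these studies is simple to
compute from gridded drought-index values. The sketch below is a minimal
illustration rather than the authors' actual method; the ±3 "severe"
thresholds and the equal-weight example are common conventions for the Palmer
index, not details taken from the papers cited above.

```python
import numpy as np

def severe_area_fractions(pdsi, area_weights, dry_thresh=-3.0, wet_thresh=3.0):
    """Area-weighted fractions of land in severe drought / severe moisture surplus.

    pdsi: Palmer Drought Severity Index values, one per land grid cell.
    area_weights: cell areas (e.g., proportional to the cosine of latitude).
    PDSI <= -3 is conventionally 'severe drought'; PDSI >= +3 'very wet'.
    """
    pdsi = np.asarray(pdsi, dtype=float)
    w = np.asarray(area_weights, dtype=float)
    total = w.sum()
    dry_fraction = w[pdsi <= dry_thresh].sum() / total
    wet_fraction = w[pdsi >= wet_thresh].sum() / total
    return dry_fraction, wet_fraction

# Toy example: five equal-area cells, one in severe drought, one very wet.
dry, wet = severe_area_fractions([-4.1, -1.0, 0.5, 3.2, 1.9], np.ones(5))
print(f"severe drought: {dry:.0%}, severe moisture surplus: {wet:.0%}")
```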
Recent changes in the areas experiencing severe drought or wet spells are closely
related to the shift in ENSO towards more warm events since the late 1970s,
and coincide with record high global mean temperatures. Dai et al. (1998) found
that for a given value of ENSO intensity, the response in areas affected by
drought or excessive wetness since the 1970s is more extreme than prior to the
1970s, also suggesting an intensification of the hydrological cycle.
2.7.3.4 Tornadoes, hail and other severe local weather
Small-scale severe weather phenomena (SCSWP) are primarily characterised by
quasi-random temporal and spatial events. These events, in turn, have local
and regional impacts, often with significant damage and sometimes loss of life.
Tornadoes and thunderstorms and related phenomena such as lightning, hail, wind,
dust, water spouts, downpours and cloudbursts belong to this group. In the light
of the very strong spatial variability of SCSWP, the density of surface meteorological
observing stations is too coarse to measure all such events. Moreover, areally
consistent values of SCSWP are inherently elusive. Statistics of relatively
rare events are not stable at single stations, observational practices can be
subjective and change over time, and the metadata outlining these practices
are often not readily available to researchers. For these reasons, monitoring
the occurrence of local maxima and minima in smoothed SCSWP series, as well
as checking for trends of the same sign for different but related SCSWP (e.g.,
thunderstorms, hail, cloud bursts), are important for checking inconsistencies.
Because of the inherent difficulty in working with these data, there have been
relatively few large-scale analyses of changes and variations in these events.
Nonetheless, a few new regional analyses have been completed since the SAR.
A regional analysis by Dessens (1995) and more recent global analysis by Reeve
and Toumi (1999) show that there is a significant interannual correlation between
hail and lightning and mean minimum temperature and wet bulb temperatures. Using
a three-year data set, Reeve and Toumi (1999) found a statistically significant
relationship between lightning frequency and wet bulb temperature. They show
that with a 1°C increase in global wet-bulb temperature there is a 40% increase
in lightning activity, with larger increases over the Northern Hemisphere land
areas (56%). Unfortunately, there are few long-term data sets that have been
analysed for lightning and related phenomena such as hail or thunderstorms,
to calculate multi-decadal hemispheric or global trends.
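Taken at face value, the quoted sensitivity implies a roughly exponential
scaling of lightning activity with wet-bulb temperature. The relation below is
an illustrative extrapolation of the single 1°C figure reported above, not a
fitted equation from Reeve and Toumi (1999):

\[ \frac{L(T_w + \Delta T)}{L(T_w)} \approx (1.40)^{\Delta T}, \]

so that a 0.5°C rise in global wet-bulb temperature, for example, would
correspond to an increase in lightning activity of about
\(1.40^{0.5} - 1 \approx 18\%\).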
Figure 2.38: Annual total number of very strong through violent (F3-F5)
tornadoes reported in the USA, which are defined as having estimated wind
speeds from approximately 70 to 164 m s-1. The Fujita tornado
classification scale was implemented in 1971. Prior to 1971, these data
are based on storm damage reports (National Climatic Data Center, NOAA).
A regional analysis assessed the temporal fluctuations and trends in hail-day
and thunder-day occurrences during a 100-year period, from 1896 to 1995, derived
from carefully screened records of 67 stations distributed across the United
States. Upward hail day trends were found in the High Plains-Rockies and the
south-east, contrasting with areas with no trend in the northern Midwest and
along the East Coast, and with downward trends elsewhere (Changnon and Changnon,
2000). The major regions of decrease and increase in hail activity match regions
of increased and decreased thunder activity for 1901 to 1980 well (Changnon,
1985; Gabriel and Changnon, 1990) and also crop-hail insurance losses (Changnon
et al., 1996; Changnon and Changnon, 1997). Overall, hail frequency shows
a general decrease for most of the United States over the last century, with
increases over the High Plains, the region where most of the crop-hail damage
occurs in the United States. So, despite an increase in minimum temperature
of more than 1°C since 1900 and an increase in tropospheric water vapour
over the United States since 1973 (when records are deemed reliable), no systematic
increase in hail or thunder days was found.
In south Moravia, Czech Republic, a decreasing linear trend in the frequency
of thunderstorms, hailstorms and heavy rain from 1946 to 1995 was related to
a significant decrease in the occurrence of these phenomena during cyclonic
situations, when 90% of these phenomena occur in that region (Brázdil
and Vais, 1997). Temperatures have increased in this area since 1946.
Since 1920, the number of tornadoes reported annually in the United States
has increased by an order of magnitude, but this increase reflects greater effectiveness
in collecting tornado reports (Doswell and Burgess, 1988; Grazulis, 1993; Grazulis
et al., 1998). On the other hand, severe tornadoes are not easily overlooked.
Restricting the analysis to very strong and violent tornadoes results in a much
different assessment (Figure 2.38) showing little
long-term change, though some years like 1974 show a very large number of tornadoes.
Furthermore, consideration of the number of days with tornadoes, rather than
number of tornadoes, reduces the artificial changes that result from modern,
more detailed damage surveys (e.g., Doswell and Burgess, 1988). The data set
of ‘significant’ tornado days developed by Grazulis (1993) shows a
slow increase in number of days with significant tornadoes from the early 1920s
through the 1960s, followed by a decrease since that time.
2.7.4 Summary
Based on new analyses since the SAR, it is likely that there has been a widespread
increase in heavy and extreme precipitation events in regions where total precipitation
has increased, e.g., the mid- and high latitudes of the Northern Hemisphere.
Increases in the mean have often been found to be amplified in the highest precipitation
rates. In some regions, increases in heavy rainfall have been identified
where the total precipitation has decreased or remained constant, such as eastern
Asia. This is attributed to a decrease in the frequency of precipitation. Fewer
areas have been identified where decreases in total annual precipitation have
been associated with decreases in the highest precipitation rates, but some
have been found. Temperature variability has decreased on intra-seasonal and
daily time-scales in limited regional studies. New record high night-time minimum
temperatures are lengthening the freeze-free season in many mid- and high
latitude regions. The increase in global temperatures has resulted mainly from
a significant reduction in the frequency of much below normal seasonal mean
temperatures across much of the globe, with a corresponding smaller increase
in the frequency of much above normal temperatures. There is little sign of
long-term changes in tropical storm intensity and frequency, but inter-decadal
variations are pronounced. Owing to incomplete data and relatively few analyses,
we are uncertain as to whether there has been any large-scale, long-term increase
in the Northern Hemisphere extra-tropical cyclone intensity and frequency though
some, sometimes strong, multi-decadal variations and recent increases were identified
in several regions. Limited evidence exists for a decrease in cyclone frequency
in the Southern Hemisphere since the early 1970s, but there has been a paucity
of analyses and data. Recent analyses of changes in severe local weather (tornadoes,
thunder days, lightning and hail) in a few selected regions provide no compelling
evidence for widespread systematic long-term changes.
AOL appoints new head of content strategy
By Amy Bennett
ITworld.com – America Online Inc. (AOL) has tapped media veteran Jeff Rowe as its new senior vice president and executive producer in a bid to recharge the waning Internet service with splashier content and services.
Rowe's appointment, announced Monday, puts him in charge of AOL's day-to-day media strategy, where he will make decisions over music, video, news and entertainment offerings delivered to AOL's more than 35 million members.
The new exec will report to AOL Executive Vice President for Programming and Strategy David Lebow. With experience in radio, broadcast, cable and online media, Rowe has a broad base of programming savvy on which to draw, the company said. Among his various posts, Rowe spent six years at NBC working in programming and promotions and was previously vice president of VH1. In the Internet world, Rowe served as general manager of online entertainment service Zap2It.com and led the UltimateTV.com startup.
Rowe is taking the helm of AOL's content strategy as the company gets ready to launch AOL 8.0, due out in mid-October. The company is hoping that the new software, along with a recent push to restart the Dulles, Virginia, Internet unit with executive reshuffling and strategy shifts, will get it up and running again.
Stock of AOL parent company AOL Time Warner Inc. (AOL) was down 4.95 percent to US$11.52 a share in late morning trading Monday.
Amy Bennett — Managing Editor, Strategic Projects
A Warm South Pole? Yes, on Neptune!
These thermal images show a "hot" south pole on the planet Neptune. These warmer temperatures provide an avenue for methane to escape out of the deep atmosphere. Image credit: VLT/ESO/NASA/JPL/Paris Observatory
PASADENA, Calif. -- An international team of astronomers has discovered that Neptune's south pole is much hotter than the rest of the planet. They have published the first temperature maps of the lowest portion of Neptune's atmosphere, which show that this warm south pole is providing an avenue for methane to escape out of the deep atmosphere.
"The temperatures are so high that methane gas, which should be frozen out in the upper part of Neptune's atmosphere, the stratosphere, can leak out through this region," said Glenn Orton of NASA's Jet Propulsion Laboratory, Pasadena, Calif. Orton is lead author of a paper appearing in the Sept. 18 issue of the journal Astronomy and Astrophysics. These findings were made using the Very Large Telescope, located in Chile, operated by the European Organization for Astronomical Research in the Southern Hemisphere (known as ESO).
In the paper, Orton and his colleagues report that Neptune's south pole is hotter than anywhere else on the planet by about 10 degrees Celsius (18 degrees Fahrenheit). The average temperature on Neptune is about minus 200 degrees Celsius (minus 392 degrees Fahrenheit).
Neptune, the farthest known planet of our solar system, is located about 30 times farther away from the sun than Earth is. Only about one thousandth of the sunlight received by our planet reaches Neptune. Yet, the small amount of sunlight Neptune does receive significantly affects the planet's atmosphere.
The astronomers found that these temperature variations are consistent with seasonal changes. A Neptunian year lasts about 165 Earth years. It has been summer at the south pole of Neptune for about 40 years now, and they predict that as winter turns to summer at the north pole, an abundance of methane would leak out of a warm north pole in about 80 years.
"Neptune's south pole is currently tilted toward the sun, just as the Earth's south pole is tilted toward the sun during summer in the southern hemisphere," explains Orton. "But on Neptune the antarctic summer lasts 40 years instead of a few months, and a lot of solar energy input during that time can make big temperature differences between the regions in continual sunlight and those with day-night variations. This is a likely factor in Neptune having the strongest winds of any planet in the solar system; sometimes, the wind blows there at more than 2,000 kilometers per hour [1,240 miles per hour]," said Orton.
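The "one thousandth" figure follows directly from the inverse-square law: the intensity of sunlight falls off with the square of distance from the sun, so at roughly 30 times Earth's distance,

\[ \frac{F_{\text{Neptune}}}{F_{\text{Earth}}} = \left(\tfrac{1}{30}\right)^{2} = \tfrac{1}{900} \approx \tfrac{1}{1000}. \]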
The new observations also reveal mysterious high-latitude "hot spots" in the stratosphere that have no immediate analogue in other planetary atmospheres. The astronomers think this feature was generated by upwelling gas from much deeper in the atmosphere.
Methane is not the primary constituent of Neptune's atmosphere, which, as a giant planet, is mostly composed of the light gases, hydrogen and helium. But it is the methane in Neptune's upper atmosphere that absorbs the red light from the sun and reflects the blue light from the sun back into space, making Neptune appear blue.
The new results were obtained with the Very Large Telescope and its spectrometer for the mid-infrared, operated by the European Organization for Astronomical Research in the Southern Hemisphere (known as ESO).
In addition to Orton, the team of astronomers includes Cédric Leyrat and A. James Friedson of JPL; Therese Encrenaz of Laboratoire d'Etudes Spatiales et d'Instrumentation en Astrophysique, Paris, France; Richard Puetter of the Center for Astrophysics and Space Sciences, University of California, San Diego; and Eric Pantin, of the Centre d'Etudes Atomiques. JPL is managed for NASA by the California Institute of Technology in Pasadena.
News Media Contact
Carolina Martinez 818-354-9382
Jet Propulsion Laboratory, Pasadena, Calif.
carolina.martinez@jpl.nasa.gov
Henri Boffin +49-89-3209-6222 ESO, Garching, Germany hboffin@eso.org
Philip Maddocks: Many in the GOP doubt the importance of the Higgs boson
As President Barack Obama steps up his declarations about the dire consequences of not raising the debt limit, an increasing number of Congressional Republicans have started to shift their focus from the economy to physics, disputing the importance of the Higgs boson and even questioning the contention of theoretical physicists that without the Higgs field, there would be no atoms and no us.
A surprisingly broad section of the Republican Party is convinced that the last missing ingredient of the Standard Model — a set of equations that has ruled particle physics for the last 50 years — does not exist — or, if it does, may not be as important as scientists say.
Some lawmakers have even begun questioning scientists’ view that without the Higgs field elementary particles would move around at the speed of light. Some problems would arise if all particles traveled at the speed of light, they concede.
“But I think the real cosmological challenge, the grave situation that is highly problematic for our nation, is Obamacare moving at the speed of light,” said Sen. Bob Corker, Republican of Tennessee.
Others insist there is no threat at all from anything — even the Affordable Care Act — moving at the speed of light.
“That’s just something scientists like to say. And it really is irresponsible of Dr. Higgs to try to scare us like that,” said Senator Rand Paul, Republican of Kentucky, referring to Peter W. Higgs, who along with François Englert was awarded the Nobel Prize in Physics for their work that led to the discovery of the particle that confers mass on other particles, known as the Higgs boson. “If you don’t have the Higgs, all you’re saying is, ‘We are living in a world of perfect symmetry where everything interesting — like the smell of a rose or a senator from Kentucky — is not dependent on lapses in that simplicity.’ So if you put it in those terms, all these scary terms of, ‘Oh my goodness, the world’s going to end’ — if we are elegant and symmetrical, the world’s going to end?”
“If you propose it that way,” he said, “the American public will say that sounds like a pretty reasonable way of looking at it.”
“We always have enough money to pay our debt service and we seem to have no shortage of mass,” said Sen. Richard Burr, Republican of North Carolina, who pointed to a stream of tax revenue flowing into the Treasury and to his own weight as he shrugged off fears of a cascading financial and cosmological crisis. “You’ve had the federal government out of work for close to two weeks and we went centuries without the Higgs. Both these things are manageable.”
In a news conference, Mr. Obama tried to turn the focus from physics back to finances, saying repeatedly that Dr. Higgs and Dr. Englert were more than capable of defending their own work and that those who doubted the validity of their model were making a huge mistake.
“I’m happy to talk to any of these lawmakers individually and walk them through exactly why this discovery matters in physics and of the importance of the symmetry between two forces, the electromagnetic force and the weak nuclear force, which provides the first step in the chain of reactions that gives the sun its energy — but not until Congress votes on a clean resolution and raises the debt ceiling,” he said.
The turmoil created by the partial shutdown of the federal government and lawmakers’ refusal to accept Nobel-prize-winning work in physics has already sent investors fleeing from stocks to the safe harbor of Treasury bonds, long considered the most secure investment because the full faith and credit of the United States government has never been questioned. If that safe harbor is undermined, most economists have said, the impact could be catastrophic.
The U.S. Chamber of Commerce and the National Association of Manufacturers, both bastions of Republican support, sent letters to Congress on Tuesday urging a rethink on doubts about the Higgs.
Some Republicans trust such warnings.
“Unlike some of my colleagues, I’ve been told by too many people in science that this is important,” said Sen. John McCain, Republican of Arizona. “Of course I’m worried. I am pretty sure that moving at the speed of light would prove to be the final undoing for the Straight Talk Express.”
But the voices of denial persist, with some saying the effect of denying the Higgs is no more consequential than the across-the-board budget cuts known as sequestration.
“I didn’t notice Dr. Higgs saying anything about that,” Mr. Paul noted.
Congressional Republicans have varied arguments as to why the Higgs doesn’t matter — or at least doesn’t matter as much as other issues of existential import.
To Rep. Paul Broun, Republican of Georgia and a candidate for the Senate, it is a question of ranking the evils.
“There are a lot of things that are going to affect our economy and our universe,” he said. “The greatest threat right now is Obamacare. It has already destroyed jobs. It has already destroyed our economy. And if it stays in place as it is now, it will surely destroy the Higgs. I couldn’t make it any plainer than that.”
Philip Maddocks writes a weekly satirical column. He can be reached at pmaddocks@wickedlocal.com.
E-cigarettes hooking more high school kids
By Quentin Fottrell
Published: Aug 27, 2014 1:26 p.m. ET
Study says more teens who try e-cigarettes will try regular smokes
The number of middle and high-school students who have tried so-called "e-cigarettes" has tripled in the past three years, and e-cigarette use may be doubling the number of youth who say they will begin smoking regular cigarettes too, according to a new survey.
The study from the 2011-2013 National Youth Tobacco Survey, released Monday in the journal Nicotine and Tobacco Research, showed that the number of middle and high-schoolers who've tried e-cigarettes, but never conventional cigarettes, shot up to 263,000 in 2013, up from 79,000 in 2011. Even more significant, almost half of those kids surveyed said they planned to smoke regular cigarettes within a year.
The study is likely to add pressure on the Food and Drug Administration to begin regulating those tobacco-like products. Anti-smoking advocates like the American Lung Association said the study shows that e-cigarette use among youth will lead kids to "a lifelong addiction to nicotine and tobacco products," according to Harold Wimmer, national president of the ALA. He called on the Obama administration and the FDA to finalize new regulation to control use of e-cigarettes by the end of 2014 "so that we do not lose another generation of kids to tobacco-caused death and disease."
Federal laws prohibit traditional cigarettes from being marketed to people under 18 years old, but there are no federal limits for e-cigarette makers. Unlike tobacco products, e-cigarettes carry no child-warning labels. Moreover, major tobacco companies Altria Group, Reynolds American and Lorillard have all started producing e-cigarettes, and recent e-cigarette commercials feature TV personality Jenny McCarthy and actor Stephen Dorff.
A new study from the journal Nicotine and Tobacco Research shows an increase in the number of middle and high school students who have used e-cigarettes. The FDA proposals — which also cover pipe tobacco, hookahs and cigars — will outlaw the sale of e-cigarettes to children and, like alcohol, require people to show identification to prove they are 18 years of age or older when they buy them. In the first such regulations for the e-cigarette industry, companies will also have to apply for FDA approval before marketing their products, which critics say vary wildly in quality; they also won’t be allowed to distribute free samples. The FDA has already found that e-cigarettes vary widely in reliability and quality, and didn’t always do what they said on the package. “The FDA found significant quality issues that indicate that quality control processes used to manufacture these products are substandard or nonexistent,” the agency’s consumer advice page states. Cartridges labeled “no nicotine” did contain nicotine, for instance, and three different e-cigarette cartridges with the same label emitted a markedly different amount of nicotine with each puff. Amazon buys Internet game video channel Twitch(2:21)
Amazon has acquired Twitch Interactive, a popular Internet video channel for broadcasting and watching people play video games. A look at the rapid growth of video games as a spectator sport.
The FDA also proposes warnings related to packaging and advertisements. The FDA has authority to issue further regulations to restrict online sales of all regulated tobacco products, an FDA spokesman told Marketwatch.com in May. Currently, about 28 states prohibit “e-cigarette” sales to minors, and legislation is pending in several others. Both the FDA and the American Lung Association notes that e-cigarettes marketed with flavors “can be especially attractive to youth,” including cotton candy, bubble gum, or to mimic the candy Atomic Fireball or even the popular kids’ cereal flavor Froot Loops. Earlier this year, a group of 11 Democratic members of Congress released a report that said e-cigarette flavors such as “Cherry Crush,” “Chocolate Treat” and “Peachy Keen” appeal to minors and should also be restricted. “E-cigarette makers are starting to prey on kids, just like the big tobacco companies,” said Henry J. Waxman, a Democrat from California. Between 2011 and 2012, e-cigarette use nearly doubled from 0.6% to 1.1% among middle school students and from 1.5% to 2.8% among high school students, a 2013 report by the Centers for Disease Control and Prevention found Also see: 10 things e-cigarettes won’t tell you For its part, the e-cigarette industry says it supports federal regulation — up to a point. The Smoke-Free Alternatives Trade Association, which represents the vapor products industry, backs proposals to restrict the sale of e-cigarette products to minors, and says it will support any effort made by legislative agencies and organizations to keep vaporizers out of the hands of underage consumers. But Cynthia Cabrera, executive director of the association, says e-cigarettes are markedly different from other tobacco products, and should not be classified as such. “These products do not contain tobacco, but may or may not contain nicotine derived from tobacco,” she told Marketwatch earlier this year. This article has been updated from a previous version. Other articles by Quentin Fottrell: Diet soda may trim your lifespan Cocaine use is going to pot Treating hangovers is now a billion-dollar industry More from MarketWatch
Fottrell
Quentin Fottrell is a news editor and The Moneyologist columnist for MarketWatch. You can follow him on Twitter @quantanamo.
Altria Group Inc.
U.S.: NYSE: MO
Reynolds American Inc.
U.S.: NYSE: RAI | 科技 |
Senate Gives Alternative Fuels a Boost
For states like Nebraska that lead the nation in the production of ethanol and the development of other bio fuels, there were some anxious moments over Senate provisions that would have limited the development and production of bio fuels by the military. Fortunately, calmer heads have prevailed.
The Senate passed its version of the fiscal year 2013 National Defense Authorization Act. This 'must-pass' legislation is vital to America's armed forces and national security, providing authorization for the Pentagon, pay and benefits for service members, the war in Afghanistan, and national security programs at the Energy Department. Unfortunately, when this legislation was first reported to the Senate floor, there were two provisions that would unnecessarily limit the Department of Defense's ability to use and develop alternative fuels, despite the Pentagon's request that Congress support its initiatives to produce advanced bio fuels and build or retrofit bio refineries.
Giving military leaders the ability to develop and employ alternative fuels is crucial to national security, improves the military's strategic flexibility, and insulates the defense budget against future spikes in the cost of fossil fuels. The U.S. military is the largest single user of oil in the world, consuming more than 355,000 barrels of oil per day. That means that each time the price of a barrel of oil increases by $10, it costs another $1.4 billion a year. While alternative fuels will not supplant fossil fuels entirely, replacing even a fraction of oil and gas will help insulate DOD from rising global oil prices and market volatility.
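The $1.4 billion figure is consistent with simple arithmetic on the consumption number above, assuming that rate of use holds for a full year:

\[ 355{,}000\ \text{bbl/day} \times \$10/\text{bbl} \times 365\ \text{days/yr} \approx \$1.3\ \text{billion per year}. \]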
As such, I was proud to support two amendments introduced by Senators Mark Udall and Kay Hagan reinstating language allowing the Defense Department to purchase and develop new energy technologies and alternative fuels, giving the military the ability to pursue domestic bio fuels and reduce foreign oil use. These commonsense amendments are vital not just for our national security, but for Nebraska's and our nation's economy as well. Increased use of bio fuels for the military would help Nebraska's ethanol industry, which has the capacity to produce more than 2.1 billion gallons of ethanol annually, estimated to be worth more than $6 billion to the state's economy.
These provisions will also further the development of the next generation of advanced bio fuels. Technologies for this next generation are already being developed in Nebraska, at facilities such as Novozymes' advanced manufacturing plant in Blair. In May 2012, Novozymes inaugurated this facility, the largest enzyme plant dedicated to biofuels in the United States. Funded with $200 million in private investment, the plant created 100 career positions and 400 construction jobs, and specializes in enzymes for both the first-generation and advanced biofuel markets.
Now that the Senate has reinstated the Pentagon's bio fuels efforts, I am hopeful we can quickly reconcile our differences with the House, so we can get this crucial legislation to the President's desk. As we move the process forward, I am confident my colleagues will continue to stand behind America's military and our economy by supporting DOD's ability to develop and employ alternative fuels.
An RAF pilot wearing night vision goggles. Photo: Ministry of Defence/Flickr
8 April 2014
Graphene contact lenses could give everyone night vision
Researchers have found a way to make the technology required for seeing in the dark small enough to fit on an eye.
By Ian Steadman
Night vision goggles normally look like this. They're big, they're clunky, and they require space for a battery. They're not particularly graceful, and if you've got one on, it's pretty obvious that you're someone who's trying to skulk around at night. It's just as well that other people won't be able to see them in the dark, because if they could they'd know to sound an alarm somewhere.
A stealthier, more lightweight solution may come as a result of research by electrical engineer Zhaohui Zhong and his team from the University of Michigan. They've been looking at graphene, the single layer of carbon atoms arranged in a crystal structure that many have described as a wonder material for its startling range of qualities: it conducts electricity at room temperature like a superconductor at near-absolute zero; it's incredibly strong, and could be used for building elevators to space; it's as stiff as diamond, yet also extremely elastic; it's the most impermeable substance ever discovered; and it's almost completely transparent, but it can also absorb light across all wavelengths.
That last point makes it a potentially very useful tool for building sensors, yet it does have a weakness - as graphene is essentially two-dimensional, it only absorbs around 2.3 per cent of the light that hits it, even though that absorption spans all wavelengths. The Michigan team were researching ways to overcome this shortcoming, because it would be awesome to be able to build sensors as light, strong and flexible as graphene. And, according to the study, published in Nature Nanotechnology, they may have succeeded in a first step.
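To see just how weak that absorption is, treat each graphene sheet as an independent 2.3 per cent absorber (an idealisation for illustration, not a calculation from the study itself). A single layer transmits 97.7 per cent of incoming light, and even a stack of ten layers would still transmit

\[ (1 - 0.023)^{10} \approx 0.79, \]

absorbing only about 21 per cent, which helps explain why the team pursued signal amplification rather than simply piling up more graphene.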
When light hits graphene it knocks some electrons out from the carbon atoms, causing a slight positive charge. Rigging up a meter to measure this charge isn't going to work, because, per above, only 2.3 per cent of the light hitting it gets absorbed - too small an amount to be picked up on this scale as an electrical charge. So the Michigan team came up with a compromise: two layers of graphene, with a barrier in between. The bottom layer was kept charged with a current, and when light hit the top layer the electrons would jump across the barrier to the already-charged layer, affecting its current. That change in current could be measured, giving an indirect reading for the wavelengths of light hitting the top layer. Zhong said:
The challenge for the current generation of graphene-based detectors is that their sensitivity is typically very poor. It’s a hundred to a thousand times lower than what a commercial device would require. Our work pioneered a new way to detect light. We envision that people will be able to adopt this same mechanism in other material and device platforms.”
Right now, the device the Michigan team has rigged up is about the size of a fingernail, but with further refinement it's not hard to see it being shrunk further. When that happens, Zhong said, "if we integrate it with a contact lens or other wearable electronics, it expands your vision". It's the kind of technology we might expect from one of the less-good Bond films, but contact lenses that allow the user to change the kind of light they see aren't entirely an impossible idea.
Ian Steadman is a staff science and technology writer at the New Statesman. He is on Twitter as @iansteadman.
Giant Iceberg Breaks As Term 'Global Warming' Hits 35
August 8, 2010, 3:00 PM ET
A chunk of ice broke free in the waters of Greenland a few days ago, and it's not just any ice cube: This one's four times the size of Manhattan, containing enough fresh water to supply the entire United States for 120 days. Guy Raz charts the biggest Arctic iceberg in nearly 50 years, and then checks in with Wallace Broecker, who 35 years ago today published a paper that gave a name to one of the most pressing issues of our time: global warming.
GUY RAZ, host: This is ALL THINGS CONSIDERED from NPR News. I'm Guy Raz. Sometimes, climate science and history have a way of overlapping, and in a few moments, we'll hear how an atmospheric physicist stumbled upon data that may have solved the mystery of George Mallory's deadly expedition to Mount Everest in 1924. But first, another climate story. A few days ago, Andreas Muenchow, a professor at the University of Delaware, got a call from some colleagues in Canada. Muenchow had been looking at a specific glacier in Greenland for years. It's called the Petermann Glacier, and on Thursday he found out that a massive chunk of it broke off to form an ice island.
Mr. ANDREAS MUENCHOW (Professor, University of Delaware): This ice island is about the size four times Manhattan.
RAZ: That's 92 square miles, the largest iceberg to form in the Arctic since 1962. And over the next three years, that ice island will melt away into the ocean, 18 cubic kilometers of water.
Mr. MUENCHOW: Which is the equivalent of kind of like about 120 days' worth of what people in the U.S. use out of their tap water for showering and drinking and sprinkling their lawns.
RAZ: Now, these things happen. Icebergs naturally break off into the Arctic. It's just that they're happening with greater frequency today, and Andreas Muenchow doesn't know just yet if this ice break has anything to do with climate change. But you only have to look at satellite pictures of Greenland to see that more of it is melting away faster than at any other time in recorded history. Now, it just so happens that 35 years ago on this day, August 8, a then-obscure geochemist at Columbia University named Wallace Broecker published a journal article that, for the first time, used the term global warming. And as you might expect, it caused a sensation.
Mr. WALLACE BROECKER (Newberry Professor, Department of Earth and Environmental Sciences, Columbia University): I don't think people paid all that much attention to it, actually.
RAZ: Okay, not quite a sensation. The article in question, in the journal Science, passed with barely a mention. It was called "Are We On the Brink of a Pronounced Global Warming?" Broecker was arguing that the Earth was about to get a whole lot hotter, and he was trying to crack a scientific puzzle. Humans were pouring more and more carbon into the atmosphere, from their cars and factories, and it meant that the Earth should have been getting warmer. But at that point, it wasn't.
Mr. BROECKER: And it puzzled me, and so I was looking for a reason, and I came upon a record, a new record from Greenland ice core, which suggested that there were fluctuations, and the latest one was a cooling. So I proposed that perhaps the cooling had been balancing out the CO2 warming, you know, natural cooling.
RAZ: In other words, the cooling was masking the fact that there should have been a warming based on, I guess, science, on physics, that more CO2 in the atmosphere should have been heating the Earth up.
Mr. BROECKER: Exactly, and the cycles that were seen in the ice core from Greenland, one of them had an 80-year period. So the 40 years of plateau, no warming, corresponded to half of the, I should say, 80-year cycle. And so I proposed in the article that if that were the case, then the situation should turn around in short order, and we should see a natural warming, which would join forces with the CO2 warming.
RAZ: In other words, when that natural cooling cycle ended, then there would be a natural heating cycle, but that would join forces with the artificial elements, the carbon that humans were putting into the atmosphere, and that would create a kind of a super-warming?
Mr. BROECKER: Yes, that's what I proposed. And that article was published, of course, in 1975, and lo and behold, in 1976, the temperature started to rise, and it's been rising more or less ever since.
RAZ: Now, your predictions turned out obviously to be correct, accurate. I mean, are you surprised that it was so accurate?
Mr. BROECKER: Well, it was dumb luck, because it turns out that the record in Greenland was a very unusual record, and as we over the years got more and more records, the cycles that showed up so strongly in this Northern Greenland ice core didn't show up anywhere else. So one might say that my prediction was based on, you know, in a sense, nonsense.
(Soundbite of laughter)
Mr. BROECKER: But on the other hand, it turned out to be correct. So maybe there's something going on in my head that I don't understand.
RAZ: That's Wallace Broecker. He's a professor of geochemistry at Columbia University and the man who coined the phrase global warming 35 years ago today. Wallace Broecker, thank you so much.
Mr. BROECKER: Well, I'm pleased to be here.
Copyright © 2010 NPR. All rights reserved. Visit our website terms of use and permissions pages at www.npr.org for further information.
Diamond DA20 Eclipse Free Airplane Paper Model
Posted on Nov 14 2012 at 10:07:32 PM in Artists
This airplane paper model is a Diamond DA20 Eclipse (DA20 C-1 “Eclipse”), designed by Flyingtak1. The scale of the paper craft is 1:200.
The Diamond DA20 is a two-seat tricycle gear general aviation aircraft designed for flight training. In addition to its role as a civil and military training aircraft, it is also used for personal flying by pilot-owners. The first DA20 was the Rotax 912 powered A1 Katana produced in Canada in 1994. It was the first Diamond aircraft available for sale in North America. Production of the Continental IO-240-B3B powered C1 Evolution and Eclipse models began in 1998, also in Canada. Production of the A1 Katana is complete, but the DA20-C1 was still in production in 2010.
The DA20-A1 and C1 are both certified under CAR 523 in Canada and under FAR 23 in the USA. The DA20 is certified in the utility category, and it is permissible to intentionally spin it with flaps in the full up position. In 2004, Diamond received Chinese certification for the DA20. Both models also hold JAA certification.
Although the DA20 is available with instrumentation and avionics suitable for flight under instrument flight rules (IFR), its plastic airframe lacks lightning protection and thus does not qualify for IFR certification.
The DA20 features control sticks, composite construction, a canopy, low-mounted wings, a single fuel tank, a T-tail, and a castering nosewheel. All models have composite airframes constructed of glass- and carbon-fiber reinforced plastic. The nose wheel of the DA20 is not linked to the rudder pedals and turns while taxiing are made with differential braking, with rudder steering becoming more effective as airspeed increases.
The DA20 is equipped with a bubble canopy. Small windows on either side of the canopy can be opened on the ground and in flight to provide cockpit ventilation. This canopy design, however, lets an above-average amount of sunlight into the cockpit, raising its initial temperature. The DA20's seats are reclined and are not adjustable; instead, the rudder pedals adjust fore and aft to accommodate pilots of different heights. The fixed seats provide better occupant crash protection. The seats in the C1 variant have a less obtuse angle but, like the A1's, are not adjustable. Both models are available with cloth or leather seat coverings.
The DA20 possesses a higher glide ratio than many of its competitors. The glide ratio of the DA20-C1 is 11:1, and that of the DA20-A1 is 14:1.
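To see what those glide ratios mean in practice, here is a quick worked example (the 1,000 m starting altitude is an arbitrary illustration, not a figure from Diamond):

```python
# Glide ratio = forward distance travelled per unit of altitude lost.
ALTITUDE_M = 1000  # arbitrary example altitude

for model, glide_ratio in (("DA20-C1", 11.0), ("DA20-A1", 14.0)):
    distance_km = glide_ratio * ALTITUDE_M / 1000
    print(f"{model}: about {distance_km:.0f} km of still-air glide from {ALTITUDE_M} m")
```

From 1,000 m the C-1 can glide roughly 11 km and the A1 roughly 14 km in still air.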
In November 2008 the company announced that it would be offering an Aspen Avionics glass cockpit primary flight display as an option on the DA20. Diamond indicated the Aspen PFD was easy to incorporate into the existing instrument panel design because it mounts in a standard round instrument hole. In October 2009 the company introduced the Garmin G500 glass cockpit as an option.
The DA20 C-1 "Eclipse" is a better-equipped C-1 for private use, with rear windows for improved visibility. Powered by a 125 hp (93 kW) Continental IO-240-B engine, it entered production in 1999.
You can download this aircraft paper model here: Diamond DA20 Eclipse Free Airplane Paper Model Download

Article Information
Author: papercraftsquare-com
Created: Nov 14 2012 at 10:07:32 PM
Updated: Nov 14 2012 at 10:07:32 PM
Category: Artists
2016-40/3982/en_head.json.gz/12751 | The CEOs of the leading information technology companies in Silicon Valley tend to be a lot like their industry: brash, young, and unconventional. Until recently, few of them have had much use for politicians. Nonetheless, as their industry has matured and they have become rich and influential, the entrepreneurs of Silicon Valley have been courted by candidates of both parties. The story of how a Democratic Administration under President Clinton has successfully curried their favor and reaped campaign dollars as a result reveals much about the flow of money in American politics.
In 1992, Bill Clinton won the endorsement and financial backing of several of them. Most notable was John Sculley, then the celebrated CEO of Apple Computer and a Republican. It was Sculley who sat next to Hillary Rodham Clinton during the President's first State of the Union address in January 1993. He embodied one of the incoming Administration's most vivid images, a symbol of a booming industry dominated by America and allied with a young, new President.
Sculley, 55, joined Apple in 1983 from PepsiCo Inc., where he had risen from marketing executive to the company's president. He was lured to the Cupertino, California computer maker by Apple's co-founder Steven Jobs, who reportedly asked him, "Do you want to spend the rest of your life selling sugared water or do you want a chance to change the world?"
As a world changer, Sculley did well at first. Despite Sculley's low-tech background, Apple flourished under his leadership, and in 1985 he solidified his image as Silicon Valley's reigning philosopher-king when he ousted Jobs after a power struggle. He was frequently quoted in the press making visionary predictions about the future of the cyber-revolution his company had helped start.
As the company's star began to fade, however, so did his. By 1993 the company was reeling from a series of business disasters. Chief among them was the Newton hand-held computer, Sculley's pet project. In that year, he resigned from Apple.
Sculley turned his attention to politics in the early 1990s. His first efforts were on behalf of Republican Tom Campbell, who in 1992 was running in California for a US Senate seat. Sculley hosted a fund-raising fete for Campbell at his ranch in Woodside. But by then, Sculley had grown disenchanted with the technology policies of the Bush Administration. He had also become acquainted with Hillary Rodham Clinton, serving with her on a national education council. When Bill Clinton ran for President, Sculley threw his support to him.
A key matchmaker in bringing Clinton and high-tech executives like Sculley together was Sanford (Sandy) Robertson, chairman of Robertson, Stephens & Co., a leading investment firm in San Francisco. Since the late sixties, Robertson has specialized in converting fledgling technology companies from private to public ownership. He has also raised a lot of money for Democratic candidates. In October 1992, Robertson held a fund-raising dinner party for Bill Clinton, Al Gore and 135 Silicon Valley and biotech executives, including Sculley, at his house in San Francisco. The dinner raised $400,000. A top priority in Silicon Valley was the construction of an electronic data network, the so-called "information superhighway." Clinton -- and particularly Gore -- were thinking about that too. Gore envisioned a future driven by bits and bytes traveling across a high-speed network that would connect government, business and schools -- all built with the government leading the charge. Gore likened this to the publicly funded interstate highway system which had been championed by his father, a former Tennessee Senator.
But in Silicon Valley, many leading executives wanted a network built by private companies like their own. At the economic summit convened by President-elect Clinton in December 1992, AT&T chairman Robert Allen and Gore voiced their disagreement on the subject. But by the end of 1993 the Vice-President had changed his view. In a speech on December 21, he declared: "Unlike the interstates, the information superhighway will be built, paid for, and funded principally by the private sector." Remarkably, within two days of Gore's remarks, telecommunications companies with much at stake in the information superhighway contributed $120,000 to the Democratic Party. Federal Election Commission records show receipts of $70,000 from MCI, $25,000 from NYNEX, $15,000 from Sprint, and $10,000 from US West.
As for John Sculley, the years since he was mentioned as a possible member of Clinton's cabinet have been difficult. He lasted only four months as chairman of Spectrum Information Technologies, a New York company. And he left unhappily: he sued Spectrum, alleging he had been deceived about accounting problems.
2016-40/3982/en_head.json.gz/12801
SIGGRAPH: BlackSky shows high-performance computers
VANCOUVER — BlackSky Computing (www.blackskycomputing.com) debuted its line of high performance computing (HPC) solutions at SIGGRAPH. The company is also showing its new Apollo storage system, which is optimized for extreme performance, massive scalability and space savings.

Obsidian is BlackSky's desktop HPC solution, which offers high performance through the choice of either dual Intel Xeon processors with up to 96 GB of DDR3 1333 ECC RAM and up to 12 cores, or a single Intel i7 Sandy Bridge processor with up to 32 GB DDR3 1333 RAM and up to 4 cores. The company offers storage that provides up to 1,400MB/sec sequential read and write speeds, nearly six times faster than a standard SSD. Buyers have a choice of the latest generation of Nvidia video cards, which are specifically optimized for CAD and VFX. Obsidian desktop HPC computers can also come pre-loaded with many popular software packages.

SkyNet is a cloud-based HPC solution for studios that demand performance but don't want the electric, setup, and maintenance responsibilities associated with high-end computing. Files are automatically and securely synchronized between each user's computer and the SkyNet system, so users don't have to worry about uploading and downloading files. SkyNet can be accessed by any computer with an Internet connection.

Hyperion is an enterprise solution that's designed to fit a wide range of computing situations. The 5RU unit can serve as a building block for high performance compute clusters, datacenters, render farms, virtual machines, private clouds, and other applications that require a lot of computers in a small amount of space. Each Hyperion can fit up to 10 compute cards, and each compute card can have up to two Intel Xeon CPUs and up to 96 GB of DDR3 ECC 1333 RAM.

BlackSky's Apollo storage solution can provide up to 180TB of storage in a single RAID array, or 90TB of storage with mirrored redundancy, in a single 4RU enclosure. Multiple Apollo systems can be linked together into a single network-mounted volume, allowing studios to create around a petabyte of mirrored storage in a single server rack. Apollo comes standard with four 1Gb/s Ethernet ports and can be upgraded to dual 10 Gb/s Ethernet ports or dual 40 Gb/s InfiniBand ports.
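To put those Apollo figures in perspective, here is a rough capacity calculation (a sketch using only the numbers quoted above; the 42U rack height is a common industry standard, not a BlackSky specification):

```python
# Rough rack-capacity arithmetic from the quoted Apollo figures.
RACK_UNITS = 42               # assumed standard full-height rack
APOLLO_HEIGHT_RU = 4          # each Apollo enclosure is 4RU
MIRRORED_TB_PER_UNIT = 90     # mirrored capacity quoted per enclosure

units_per_rack = RACK_UNITS // APOLLO_HEIGHT_RU   # 10 enclosures
mirrored_pb = units_per_rack * MIRRORED_TB_PER_UNIT / 1000
print(f"{units_per_rack} enclosures -> about {mirrored_pb:.1f} PB mirrored per rack")
```

Ten 4RU enclosures in a standard rack yields about 0.9PB of mirrored storage — consistent with the company's "around a petabyte" claim.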
2016-40/3982/en_head.json.gz/12869
Rare Whales Make Surprise Visit to Texas Coast
By Jim Forsyth
CORPUS CHRISTI, Texas -- Researchers were surprised this week by the sudden appearance -- and quick disappearance -- of two rare Northern right whales in the busy industrial port of Corpus Christi Bay.
"It's a most extraordinary event," said oceanographer Tony Amos of the University of Texas Marine Science Institute in Port Aransas.
"They're almost unknown in the Gulf of Mexico. Why they would come into the bay, I don't really know."
Amos said a 50-foot (15-meter), 60-ton adult female and her 15-foot (5-meter) calf apparently became lost and popped up on Monday in the bay, which is protected from the Gulf of Mexico by a narrow barrier island.
The calf appeared to be suffering from two slash injuries possibly caused by a ship's propeller, and Amos said the biggest threat to them now was a serious injury from shipping traffic.
By Wednesday, mother and calf were nowhere to be seen, raising hopes their low-frequency sonar helped them find their way home.
"One has to assume that because we haven't seen them they have actually left the bay and the Corpus Christi Ship Channel and are back out in the Gulf," Amos said.
The right whale got its name because 19th-century fishing ships considered the huge, slow-moving creatures to be the 'right whale' to hunt. They have been nearly hunted out of existence.
Fewer than 1,000 are believed to exist, and those tend to spend their winters in warm Gulf Stream waters off the Atlantic coasts of Florida and Georgia.
Officials say they are common in large bays, but are nearly unknown along the western coast of the Gulf of Mexico.
Right whales have been known even to travel up rivers and are frequently found in large estuaries. Their last reported visit to the Texas coast was in 1972, when a single whale beached itself near Freeport.
2016-40/3982/en_head.json.gz/12878
CG launches Smart Grid manufacturing facility in Bangalore, India
13 January 2014
Facility to produce a range of Smart Grid equipment, including substation automation products, distribution automation devices, and protection and control systems, among other parts. CG, an Avantha Group Company, will be based in the Global Village in Bangalore.

According to Laurent Demortifier, CEO and managing director of CG, the Smart Grid devices manufactured in this facility will offer numerical solutions to Indian utilities and industries in the Transmission and Distribution (T&D) segment and provide improvements in the electric grid to make it more efficient and reliable.

CG has invested 80 million INR in this facility, which will employ more than 100 people. In addition to producing primary Smart Grid components, the facility will also offer services such as systems integration, installation, and commissioning. The facility is fully equipped with modern equipment to ensure an annual production capacity of 10,000 units of power line carrier communication terminals and intelligent electronic devices.

According to Demortifier, the opening of CG's Smart Grid manufacturing facility in Bangalore is the result of intense planning with Indian stakeholders: customers, regulators and employees. "The facility will support economic development, foster job creation and boost an understanding of Smart Grid solutions in the energy field."

CG's Smart Grid devices are running successfully in the world's first major Smart Grid deployment at the Iberdrola STAR Project, Bilbao, Spain, where more than 227,000 Smart meters monitor the electricity services. In the past year alone, CG has participated in no fewer than ten Advanced Metering Infrastructure pilot projects in Europe, Asia, and America. CG boasts more than 2 million Smart Meter installations worldwide.
• Energy infrastructure
2016-40/3982/en_head.json.gz/12931 | Web ad group launches privacy education campaign
A group of leading Internet publishers and digital marketing services is launching an online campaign to educate consumers about how they are tracked and targeted for pitches on the Web.
The Internet Advertising Bureau, based on New York, on Thursday unveiled its "Privacy Matters" Web site. The site explains how Internet marketers track where people go and what they do online and then mine that data to serve up targeted ads. The practice, known as behavioral advertising, has raised concerns among privacy watchdogs and lawmakers in Congress.
A number of IAB members plan to run banner spots on their Web pages linking back to the Privacy Matters site. Those include Internet-only players such as Yahoo Inc. and Google Inc. and traditional media outlets such as Walt Disney Co. and The New York Times Co.
The goal of the program, explained IAB Senior Vice President David Doty, is to describe "in plain English" how online advertising works. Among other things, the Privacy Matters Web site offers explanations of demographic targeting, interest group targeting and data-tracking files known as cookies.
The new campaign is part of a broader self-regulatory push by the Interactive Advertising Bureau and other advertising trade groups that want to head off federal regulation.
Rep. Rick Boucher, D-Va., chairman of the House Energy and Commerce Subcommittee on Communications, Technology and the Internet, is currently leading an effort to draft a bill that would impose broad new privacy obligations on Web sites and online advertisers. His proposal will aim to ensure that consumers know what information is being collected about them on the Web and how it is being used, and to give them control over that information. | 科技 |
2016-40/3982/en_head.json.gz/12981

Introduction

For thousands of years people have been looking into the sky and seeing things they simply did not understand. They labeled them as best they could, calling them flaming swords, fiery chariots, twirling shields, and, eventually, flying saucers. They offered descriptions of these bizarre objects that seem, even today, to defy explanation. These early accounts suggest something that is beyond the commonplace and the mundane, something that is extraordinary, a phenomenon that began appearing before there was a written language to record observations of it.

In today's world, there have been hundreds of thousands, if not millions, of reports of flying saucers, cigar-shaped craft, glowing spheres, and bright nighttime lights. These reports differ from their older counterparts only in number. We are still confused about what the reports represent and what they may mean. We describe them in the best language available and use the best of the electronic, optical, and unaided observations as we can. And even with all our sophisticated monitoring, sensing, and detection equipment, we are sometimes as confused about what we've seen as our ancestors were.

That is not to say that we haven't been able to learn something about the phenomenon. We have collected hundreds of thousands of reports, as well as thousands of photographs, of those strange things in the sky. We have an advantage because we can record precisely what was seen. Photographs have allowed us to show our fellows what was in the sky that confused, frightened, astounded, and astonished us.

We have been able to use motion pictures to show the movement of these strange things. And when all else failed, we used illustrations and drawings that represented clearly what we saw. These methods have provided us with a record that allows research to be conducted. To compile this work, we used, literally, hundreds of different sources, ranging from the original reports of the witnesses to the books and magazines that have appeared in the last fifty years. We used the records compiled for Project Blue Book, the official U.S. Air Force investigation. We were also able to use the records originally compiled by the now-defunct civilian inquiries, including those by the National Investigations Committee on Aerial Phenomena (NICAP) and the Aerial Phenomena Research Organization (APRO).

It has been suggested, however, that our view of what the alien spacecraft look like, though once heavily influenced by ancient myths and legends, is now influenced by movies and books. Close Encounters of the Third Kind, the spectacular movie that presented an image of alien visitors coming down to Earth, is supposed to have been responsible for our modern view of what alien visitors look like today.

Others have suggested that flying saucers owe their existence to a misuse of the descriptive term by a newspaper reporter. Businessman Kenneth Arnold described the "crescent-shaped" object he had seen as moving with a motion like that of "saucers skipping across the pond." Within days, there were hundreds of sightings of "flying saucers" rather than the crescents seen by Arnold. Without an illustration to accompany the story, people assumed that the flying saucers were disc-shaped. Of course, as we look at the sighting reports made prior to Arnold's, we note that many did have a disc shape. But we also notice that other shapes seemed to dominate in some eras.
The Great Airship stories were not of flying saucers, but of huge, cigar-shaped objects that resembled the airships built on Earth in the years to follow.

The Scandinavian ghost rockets of 1946 were nearly all shaped like the V-2s that had caused so much fear and destruction at the close of the Second World War. It wasn't until 1947 that these objects began to be reported with any regularity as saucers. However, it must be noted that beginning in the late 1980s, UFOs were being reported as triangular-shaped.

What we see in the survey of the literature, then, is a diversity in the descriptions of alien craft. Although media influence can be seen in some accounts, it also seems that media influences, at least concerning the shape of the craft, have often been ignored. Even when thousands were reporting flying saucers, there were those reporting other, often intriguing shapes. While there is little doubt that the media have influenced the descriptions of alien spaceships, there is also little doubt that many individuals are reporting some craft that were at variance with those media-driven descriptions.

We should note here that we are not suggesting that all the stories are science fiction and based on media accounts, or, on the other hand, that all of them are true. These tales are what the witnesses themselves have reported to the air force or the civilian UFO organizations. Those witnesses may have been influenced by what they have seen in the pop culture world around them. They may have been influenced by television and movies and the reports of others. Or, they may not have been influenced by anything other than their own imaginations or by what they saw during their encounters.

We have gathered what we consider to be a representative sample of the visitor-spaceship reports…

We have also attempted to tell the story of each encounter in sufficient detail to enable the reader to make an intelligent decision about the validity of the report. And if that is not enough, we have tried to provide a variety of sources where additional information or contrary views can be found.

We have also devised a rating system: We have assigned a "reliability" value to each of the case studies. The rating scale ranges from zero, which means no reliability, to ten, which means that there was actual proof that an alien ship was seen.

It might be useful to note the way that we did assign some of the numbers. There are some truly incredible cases that we rated fairly high. The reason for doing so is the number of witnesses involved in those specific events. A case in which a dozen people report the same thing, and corroborate the tale told by the others, demands that we give it a higher reliability rating because of the large witness count.

On the other hand, there are some cases that are almost universally accepted as reliable that we gave low numbers. Again, it is because of the lack of good corroboration in the form of additional, independent witnesses. The number and the credibility of the witnesses must be taken into account.

But it should be noted that the reliability rating reflects our subjective opinion. We are aware that we tend to believe some of the unbelievable stories over other unbelievable stories. It is a factor that comes from our long work in the field and just how wild some of the tales have become.

What we have here is an encyclopedia of the alien spaceships seen around the world. It is a compilation of the testimonies of hundreds of sources.
It is an attempt to inform you about the diversity of the phenomenon and the numbers that have been reported.

So here are the alien craft that have been reported to have visited us in the past. Here are the stories of their visits, based on the testimony of the people who witnessed them. Here is the best analysis we can make of those tales so that you will have a feel for what may just be true and accurate and what is a hoax or a misdirection. And finally, here are the sources of those tales so that you can check them for yourself.

Copyright © 2000 by Kevin Randle and Russ Estes
The Spaceships of the Visitors
An Illustrated Guide to Alien Spacecraft
By Kevin Randle and Russ Estes
The first comprehensive field guide to alien spacecraft, complete with illustrations of more than one hundred spaceships.

In the last decade, the number of reports of alien spacecraft sightings has skyrocketed. However, the phenomenon of alien encounters is not new. Here, for the first time, two UFO experts, Kevin Randle and Russ Estes, cover the history of UFO sightings, from the ancient to the modern, using research from many different sources, including the Air Force and private UFO groups. Each of the more than one hundred entries is based on actual eyewitness accounts and includes:

- A detailed drawing of each spaceship based on photographs or drawings made by the people who saw it with their own eyes
- A concise summary of the facts: where the sighting took place, names of the witnesses, craft type, and such specifics as each craft's dimensions, color, sound, and exhaust mechanism
- A unique reliability rating, on a scale from 0 to 10
- A gripping you-were-there narration of the encounter, along with meticulous documentation of source material

Spaceships of the Visitors is a fascinating and essential reference for anyone curious about alien visitation.
Touchstone | 352 pages | ISBN 9780684857398 | August 2000
Body, Mind & Spirit > UFOs & Extraterrestrials
Kevin Randle
Russ Estes
2016-40/3982/en_head.json.gz/13098 | Robert Trigaux
Report: Solar energy is on the rise but don't look to Sunshine State to lead the way
Robert Trigaux, Times Business Columnist
Thursday, June 16, 2011 8:10am
Wake up and good morning. A new report says solar energy is rising fast (from a small base). Just not in the, uh, Sunshine State.
Don't be confused by the photo, taken in 2009 which shows President Obama in Florida touring FPL's DeSoto Solar Center in Arcadia (AP photo). This solar project is clearly an exception, not a trend in this state. A report released by the Washington-based Solar Energy Industries Association finds the amount of solar energy capacity installed in the United States increased 66 percent in the first quarter of 2011 as panel prices fell and developers took advantage of expiring government incentives. Developers installed 252 megawatts of photovoltaic (PV) power systems in the first quarter, compared with 152 megawatts a year earlier. “Strong demand continues to make solar one of the fastest- growing industries in the United States," Rhone Resch, SEIA’s president, said in a statement. Read more in this Bloomberg story. Of all states, New Jersey was the strongest market for solar, with 42 megawatts of new capacity installed, up 49 percent. The state has a total of 330 megawatts in operation. California is the largest market with 1.1 gigawatts. Installed capacity in the top seven states accounted for 88 percent of all new capacity in the quarter, up from 82 percent. So where is Florida? Left in the dust. It ranked 19th in PV installations in the first quarter of 2011, down from 12th a year ago. But then, those who track the lack of traction for solar in Florida (despite its location for sunshine) are surely not surprised. Like a lot of alternative initiatives and energy innovations, Florida's behind the times in setting policies to support new ways to generate power. The solar study finds commercial and government projects accounted for 59 percent of the first-quarter installations, compared with 44 percent a year earlier. Residential projects were 28 percent and the remaining 13 percent came from utility-scale plants. The cost of installing solar power is falling, driven by lower costs for components, greater economies of scale and streamlined development and installation, the report said. Prices of solar panels in the first quarter fell about 7 percent from a year earlier. First-quarter installation volume also increased after developers rushed to break ground on projects before the end of 2010 when a U.S. Treasury grant incentive program was set to expire. That program, says the Bloomberg story, which reimburses 30 percent of the costs of building solar systems, was extended in December until the end of 2011. Come on Florida. Put a little more sunshine in your energy future. -- Robert Trigaux, Business Columnist, St. Petersburg Times [Last modified: Thursday, June 16, 2011 8:10am]
Tampa Bay business news and insights are brought to you each day by business columnist Robert Trigaux and his fellow business writers. Venture provides an inside look at Tampa Bay companies as well as events, people, deals, triumphs and failures across the Tampa Bay economy.
E-mail Robert Trigaux: trigaux@tampabay.com
2016-40/3982/en_head.json.gz/13114 | How the humble hard drive is made
It's like a 747 flying 0.14mm above the ground
Extracting minerals and making blanks
Polishing, washing and applying the underlayer
Adding data storage layers and final testing
The fact that silicon chips start life as nothing more exotic than sand is amazing enough, but have you ever thought about that other important PC component, the hard disk? Its origins couldn't be more different. The heart of a hard disk – the rotating platter where your data is stored – is made out of an exotic mix of elements including ruthenium and platinum, two of the world's rarest and most expensive metals. Needless to say, this statement doesn't even hint at the complexity involved in transforming rare ores into gigabytes of data storage. The hard disk's high speeds of rotation and the close proximity of the head to the platter means that the processes must be carried out with the ultimate in precision and cleanliness. Add to this the strange properties of magnetic media and the techniques required to achieve the optimum capacity, and the story of how disks are made becomes one that encompasses the fields of mining, metallurgy, chemistry, physics and involves the pinnacle of engineering and manufacturing technology. As a whole, a hard disk is an amazing feat of electronic and mechanical engineering, but two parts – the heads and the platter – stand out for their sheer manufacturing complexity. As the part that actually stores the data, the platter is what many people consider the heart of a hard disk drive – and here we reveal the secrets of its manufacture. Step 1: Mineral extraction and processing Platinum is only the 70th most abundant element in the Earth's crust, making up just three parts per billion. Ruthenium comes two places lower with an abundance of only one part per billion. By way of comparison, silicon – the raw material from which microprocessors are made – accounts for around 27 per cent of the Earth's crust. It's no surprise then that platinum is hugely expensive – today's market price is more than $1,300 per Troy ounce. Turning to ruthenium, the total annual production is just 27 tonnes, an amount that would fit in a 1.3m3 cube. Both are mined predominantly in South Africa. Platinum is one of the noble metals, which means that it's relatively unreactive. Unlike metals such as copper – the main ores of which are compounds – platinum is normally found in its metallic form. This doesn't mean that extracting it from its ore is simple, though, as platinum is normally found mixed with other metals. Obtaining pure platinum involves separating it from the iron, copper, gold, nickel, iridium, palladium, rhodium, ruthenium and osmium that it's invariably found with. Let's just say it's a complicated multistage chemical process that can take up to six months to complete. Fortuitously, though, the ruthenium that's also needed in disk manufacture is a by-product of the process. A deep mine in the Bushveld Complex of South Africa might seem far-removed from a finished hard disk, and in this sense it's an ideal place to start our investigation. But we're not going to need the platinum or the ruthenium until well down the line, so for now we'll put them aside as we move to something more down to earth – and considerably more common. Step 2: Making aluminium blanks The manufacture of a hard disk platter starts with the fabrication of aluminium blanks, which are disks of aluminium alloy onto which the magnetic recording layer will eventually be deposited. High-purity alloy that contains four to five per cent magnesium plus small amounts of silicon, copper, iron and zinc to give it the necessary properties is cast into an ingot weighing seven tonnes. 
The ingot is then heat-treated, hot-rolled and cold-rolled in multiple passes to provide a sheet of the necessary thickness (usually 0.635mm, 0.8mm, 1.0mm, 1.27mm, 1.5mm or 1.8mm – just enough to provide adequate stability while rotating at high speed) from which the blanks will be punched.

GETTING STARTED: Hot rolling mills process aluminium ingots into thin slivers of metal from which disks will be punched

Punching takes place once the alloy sheet has been coiled into large rolls so that a single stamping process produces lots of blanks. This is then followed by a stacked annealing process to reflatten the blanks. Finally the blanks are ground to a high level of precision to achieve the necessary surface and edge finish. Bear in mind that this and all subsequent steps are carried out on both sides of the platter so that it ends up with two recording surfaces.

Step 3: NiP plating

The aluminium blanks are now precision-ground using 'stones' that are composed of PVA and which contain silicon carbide as the abrasive agent. However, even with all the care taken to produce a good finish, the surfaces of the aluminium blanks produced in Step 2 are not yet nearly perfect enough. Because there's a limit to the degree of smoothness to which aluminium alloy can be ground, the next step is to apply a hard coating that will take a better finish.

PERFECT FINISH: The soft aluminium is plated with a hard NiP layer so that it can be polished to an incredible degree of smoothness

This hard coating is an amorphous alloy of nickel and phosphorus (NiP). It's applied by an electroless process in which complex supersaturated solutions containing compounds of nickel and phosphorus react on the surface of the disk to leave the required NiP layer. This layer can now be further refined in the next step of the process.
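As a sanity check on Step 1's claim that a year's worth of ruthenium fits in a cube roughly 1.3m on a side, here is a quick back-of-the-envelope sketch (the density is the standard handbook value for ruthenium; everything else comes from the figures quoted above):

```python
# Does 27 tonnes of ruthenium really fit in a ~1.3 m cube?
RU_DENSITY_KG_M3 = 12_450      # handbook density of ruthenium
ANNUAL_PRODUCTION_KG = 27_000  # 27 tonnes, as quoted in Step 1

volume_m3 = ANNUAL_PRODUCTION_KG / RU_DENSITY_KG_M3
cube_edge_m = volume_m3 ** (1 / 3)
print(f"{volume_m3:.2f} cubic metres -> a cube about {cube_edge_m:.2f} m on a side")
```

The arithmetic comes out to roughly 2.2 cubic metres – a cube about 1.3m on a side – so the quoted figure holds up.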
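The subtitle's 747 analogy is a scaling argument about how close the read/write head flies to the finished platter. Here is a rough reconstruction (the slider length and flying height are typical published figures for modern drives, not numbers taken from this article, so treat the output as illustrative of the order of magnitude rather than a derivation of the 0.14mm figure):

```python
# Scale a hard disk head's flying height up to jumbo-jet proportions.
HEAD_LENGTH_MM = 1.0      # assumed slider length (~1 mm)
FLYING_HEIGHT_NM = 10.0   # assumed head-to-platter gap (~10 nm)
JUMBO_LENGTH_M = 70.0     # approximate length of a Boeing 747

scale_factor = JUMBO_LENGTH_M / (HEAD_LENGTH_MM / 1000)   # ~70,000x
scaled_gap_mm = FLYING_HEIGHT_NM * 1e-6 * scale_factor    # nm -> mm, then scale
print(f"Scaled up, the head 'flies' about {scaled_gap_mm:.1f} mm above the ground")
```

With these assumed numbers the scaled gap comes out below a millimetre – the same vanishingly small order of magnitude as the subtitle's 0.14mm.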
2016-40/3982/en_head.json.gz/13242
Race is on for EU's $1.3 billion science projects
Scientists use an infrared-DIC microscope to do multineuron patch-clamp recording in the Blue Brain team and the Human Brain Project (HBP) laboratory of the Ecole Polytechnique Federale de Lausanne (EPFL), in Lausanne, Switzerland. The Blue Brain team has come together with 12 other European and international partners to propose the Human Brain Project, a candidate for funding under the EU's FET Flagship program. The Blue Brain Project is an attempt to create a synthetic brain by reverse-engineering the mammalian brain down to the molecular level.
BERLIN - Call it Europe's Got Talent for geeks.
Teams of scientists from across the continent are vying for a funding bonanza that could see two of them receive up to €1 billion ($1.33 billion) over 10 years to keep Europe at the cutting edge of technology.
The contest began with 26 proposals that were whittled down to six last year. Just four have made it to the final round.
They include a plan to develop digital guardian angels that would keep people safe from harm; a massive data-crunching machine to simulate social, economic and technological change on our planet; an effort to craft the most accurate computer model of the human brain to date; and a team working to find better ways to produce and employ graphene - an ultra-thin material that could revolutionize manufacturing of everything from airplanes to computer chips.
The two winners will be announced by the European Union's executive branch in Brussels on Jan. 28.
Initially, each project will receive €54 million from the European Union's research budget, an amount that will be matched by national governments and other sources. Further funding will depend on whether they reach certain milestones within the first 30 months, but over a decade it could total €1 billion each.
Securing such vast sums will be made harder by the austerity measures imposed by many financially drained European governments.
Still, the senior EU official overseeing the so-called Future and Emerging Technologies Flagships program is confident the money will be made available and insists the investment is necessary if Europe wants to match the success the CERN labs on the Swiss-French border that have become the world's premier center for particle research thanks to their $10 billion atom smasher.
"Supporting research and development is not a nice-to-have, it is essential because no investment means no chance for a better future," Neelie Kroes told The Associated Press in an email. "And especially during a crisis we all need something positive to look ahead to. Just cutting public expenditure and austerity don't bring new growth and jobs."
Kroes, whose title is European Commissioner for Digital Agenda, believes it will pay off. "By pooling resources across the EU and focusing on the two best projects we get a good shot at a manifold return on the investment," she said. Switzerland, Norway, Israel and Turkey, which are not part of the 27-nation EU, are also partnering in the program.
One explicit aim of the program is to encourage scientists to address not just contemporary problems but also those that could arise in future.
Climate change, ageing societies and a shortage of natural resources all loom large in predictions for Europe's future. So far, solutions to these problems have been limited, partly because of their sheer scope.
"The world of today has become so complex that it's beyond our control," said Dirk Helbing, a professor at the Swiss Federal Institute of Technology ETH in Zurich. Helbing is the coordinator of the FuturICT team that aims to monitor the state of the planet in real time using growing mountains of data now at our fingertips. Anybody will be able to tap into the system to explore possible future scenarios in much the same way as the meteorologists can now forecast the weather with a certain degree of accuracy.
"Think of it as the telescope of the 21st century to help get better insight into problems," Helbing said.
A rival project, led by scientists at ETH's sister school EPFL in Lausanne, focuses less on the planetary and more on the personal. Adrian Ionescu, a professor of nanoelectronics at EPFL, says the boom in mobile devices has concentrated mainly on communication and gaming. His team's Guardian Angels project aims to develop wearable, self-powered gadgets that can warn their users of danger, encourage them to exercise, and collect environmental and health information that could be of use to doctors.
Ionescu claims such devices could save large sums in health care costs by preventing diseases and helping manage them. The components to make them are already available, he said. The key is integrating them all into one system - a process he likened to the effort made by the United States in the 1960s to put a man on the moon.
One of the most promising materials for electronic devices of the future is graphene - the sole focus of a third finalist. It has been touted as a solution to problems as wide-ranging as mopping up nuclear spills, making airplanes more fuel efficient and speeding up computer chips. Russian-born scientists Andre Geim and Konstantin Novoselov received the 2010 Nobel Prize in physics for their experiments with this two-dimensional "wonder material" that's up to 300 times stronger than steel - but much lighter.
The problem is how to manufacture it efficiently.
"There is still quite a bit of research to be done," said Jari Kinaret, professor of applied physics at Chalmers University of Technology in Gothenburg, Sweden.
Kinaret said the long-term funding offered by the EU program would be key to developing what he called a "disruptive technology."
"If you want to create a new technology it does not happen in one or two years," he said. Although Europe, the United States and Asia each produce a third of the scientific papers published on graphene, the number of patents coming out of Europe lags behind.
"We risk that the fruits of research that started in Europe will be harvested elsewhere," he told the AP.
The prospect of Europe losing ground to nimbler rivals plays a prominent role in the arguments put forward by all four projects still in the race.
"If we don't get the funding...we may see some of the European talent move to parts of the world where there is better funding situation, like Singapore," said Kinaret.
Henry Markram said CERN's success was the best example of how pooling European resources can put the continent at the forefront of science. CERN announced last year that it had finally found solid evidence of the elusive Higgs boson particle that scientists have been hunting for 50 years. Markram, a professor of neuroscience at EPFL, says his team wants to do the same for the human brain. "The pharmaceutical industry won't do this, computing companies won't do this, there's too much fundamental science," he said. "This is one project which absolutely needs public funding."
His Human Brain Project plans to use supercomputers to model the brain and then simulate drugs and treatments for diseases that Markram says cost €800 billion each year in Europe alone.
2016-40/3982/en_head.json.gz/13261
NRC: 2 years after Japan, US nuke plants safer
FILE - In this Jan. 14, 2013 file photo, Allison Macfarlane, the chair of the Nuclear Regulatory Commission, talks about her tour of the troubled San Onofre Nuclear Power Station in San Juan Capistrano, Calif. The plant, located between Los Angeles and San Diego, hasn't produced electricity since January 2012 after a tiny radiation leak led to the discovery of excessive wear on hundreds of tubes that carry radioactive water. Macfarlane said in an interview that the agency won't let the San Onofre plant reopen until regulators are certain it can operate safely, which may take several months. She says performance of U.S. plants is "quite good." Macfarlane says all but five of the nation's 104 reactors were performing at acceptable safety levels at the end of last year. (AP Photo/Lenny Ignelzi, File)
By MATTHEW DALY, Associated Press
WASHINGTON (AP) — Two years after the nuclear crisis in Japan, the top U.S. regulator says American nuclear power plants are safer than ever, though not trouble-free. A watchdog group calls that assessment overly rosy.
"The performance is quite good," Nuclear Regulatory Commission Chairman Allison Macfarlane said in an interview with The Associated Press.
All but five of the nation's 104 nuclear reactors were performing at acceptable safety levels at the end of 2012, Macfarlane said, citing a recent NRC report. "You can't engage that many reactors and not have a few that are going to have difficulty," she said.
But the watchdog group, the Union of Concerned Scientists, has issued a scathing report saying nearly one in six U.S. nuclear reactors experienced safety breaches last year, due in part to weak oversight. The group accused the NRC of "tolerating the intolerable."
Using the agency's own data, the scientists' group said 14 serious incidents, ranging from broken or impaired safety equipment to a cooling water leak, were reported last year. Over the past three years, 40 of the 104 U.S. reactors experienced one or more serious safety-related incidents that required additional action by the NRC, the report said.
"The NRC has repeatedly failed to enforce essential safety regulations," wrote David Lochbaum, director of the group's Nuclear Safety Project and author of the study. "Failing to enforce existing safety regulations is literally a gamble that places lives at stake."
NRC officials disputed the report and said none of the reported incidents harmed workers or the public.
Monday marks the two-year anniversary of the 2011 earthquake and tsunami that crippled Japan's Fukushima Dai-ichi nuclear plant. U.S. regulators, safety advocates and the industry are now debating whether safety changes imposed after the disaster have made the nation's 65 nuclear plants safer.
New rules imposed by the NRC require plant operators to install or improve venting systems to limit core damage in a serious accident and set up sophisticated equipment to monitor water levels in pools of spent nuclear fuel.
The plants also must improve protection of safety equipment installed after the Sept. 11, 2001, terror attacks and make sure they can handle damage to multiple reactors at the same time.
Macfarlane, who took over as NRC chairwoman last July, said U.S. plants are operating safely and are making progress on the new rules, which impose a deadline for completion of 2016 — five years after the Fukushima disaster. "So far, industry seems to be cooperating," she said.
The NRC has been working closely with plant operators "to make sure they understand what we are requiring and that we understand about their situation as well," Macfarlane said.
Even so, the U.S. industry faces a range of difficulties. Problem-plagued plants in Florida and Wisconsin are slated for closure, and four other reactors remain offline because of safety concerns. Shut-down reactors include two at the beleaguered San Onofre nuclear power plant in southern California, which hasn't produced electricity since January 2012, when a tiny radiation leak led to the discovery of damage to hundreds of tubes that carry radioactive water.
Macfarlane said the agency won't let the San Onofre plant reopen until regulators are certain it can operate safely, which may take several months.
Joseph Pollock, vice president of the Nuclear Energy Institute, an industry trade association, said plant operators are "working aggressively" to meet the 2016 timeline set by the NRC and have already spent upwards of $40 million on safety efforts. Utilities have bought more than 1,500 pieces of equipment, from emergency diesel generators to sump pumps and satellite phones, Pollock said, and the industry is setting up two regional response centers in Memphis and Phoenix.
The industry expects to meet the 2016 timeline "with the current understood requirements," Pollock said. If the requirements change or new regulations are added, "then obviously we would have to review that," he said.
Even before the new rules are completely in place, the NRC is considering a new regulation related to the Japan disaster: requiring nuclear operators to spend tens of millions of dollars to install filtered vents at two dozen reactors.
NRC staff recommended the filters as a way to prevent radioactive particles from escaping into the atmosphere after a core meltdown. The filters are required in Japan and throughout much of Europe, but U.S. utilities say they are unnecessary and expensive.
The Nuclear Energy Institute said filters may work in some situations, but not all. The group is calling for a "performance-based approach" that allows a case-by-case determination of whether filtering is the best approach to protect public safety and the environment.
"We're not against filtering. It's how you achieve it," said Marvin Fertel, the group's president and CEO.
The filter issue has ignited a debate on Capitol Hill. Lawmakers from both parties have sent out a flurry of dueling letters for and against the proposal. Twenty-eight Republicans in the House and Senate, joined by more than two dozen House Democrats, have sent letters opposing the requirement as hasty and unnecessary.
A dozen Democratic senators and five House members have written letters backing the requirement, which they say will ensure public safety in the event of a Japan-style accident. The five-member commission is expected to vote on the issue in the next few weeks.
"It's not the time to be rash with hasty new rules, especially when the NRC has added 40-plus 'safety enhancements'" to its initial requirements following the Japan disaster, said Sen. David Vitter, R-La., senior Republican on the Senate Environment and Public Works Committee.
Sen. Barbara Boxer, D-Calif., who chairs the committee, said the filters were needed to protect the 31 U.S. nuclear reactors that have similar designs to the ones that melted down in Japan.
The filters "would reduce the amount of radioactive material released into the environment" in a severe nuclear accident, Boxer wrote in a letter signed by 11 fellow Democrats. "These technologies have been demonstrated in nuclear plants around the world."
Boxer, whose committee has held seven oversight hearings since the Japan disaster, has asked the NRC to report to her on the agency's progress implementing the post-Fukushima safety reforms.
"It is vital that U.S. nuclear power plants fully incorporate the lessons learned from this disaster," she said.
2016-40/3982/en_head.json.gz/13317 | HomeTimeEarth OrientationAstronomyMeteorologyOceanographyIce
USNO
News, Tours & Events
Sky This Week
The Sky This Week, 2014 February 4 - 11
USNO Scientific Colloquia
A great week for the urban skywatcher!
The Moon and the planet Mercury, 2014 FEB 1, 23:22 UT
The Moon courses her way through the evening sky this week, seemingly vaulting up from the horizon to join the stars of the Great Winter Circle. First Quarter occurs on the 6th at 2:22 pm Eastern Standard Time. Look for the Moon near the Pleiades star cluster and the bright star Aldebaran on the evening of the 7th. On the 10th she may be found just over five degrees south of bright Jupiter. If you look carefully that evening you should see the second-magnitude star Alhena just a degree south of Luna’s southern limb.
This should be a banner week for the urban skywatcher. Despite the scattered light from thousands of poorly-engineered streetlamps, billboards, and other assorted sources of light pollution, the Moon, Jupiter, and the bright stars of the Great Winter Circle give you plenty of targets to focus on over the next several nights. If I had to choose the best week of the entire year to observe the Moon, this would be it. Luna waxes through the First Quarter phase as she moves through high northern declinations, allowing us to look at her through a minimal amount of terrestrial atmospheric turbulence. The sunrise terminator steadily advances across some of the most interesting formations of the lunar landscape. Early in the week the scene is dominated by many of the vast lava plains that form the so-called lunar "seas". As Luna reaches the First Quarter phase the view begins to shift toward more rugged terrain dominated by enormous impact craters. By the end of the week it once again reveals more subtle landforms as it moves across the vast Oceanus Procellarum, a huge relatively flat lava plain sprinkled with more solitary craters. These flatter areas were formed about 3 to 3.5 billion years ago, yet they represent some of the youngest terrain on the Moon. The crater-packed southern highlands have changed little since the early formation of the Moon, and among the samples returned from this region by the astronauts of Apollo 16 was the "Genesis Rock", a sample of primordial crust thought to be some 4.6 billion years old! If you marvel at the detail visible through a telescope, try to keep in mind the scale of the surface you’re examining. Even in our 12-inch telescope, the smallest features visible under ideal conditions are craters comparable in size to Meteor Crater in Arizona!
You can still catch the elusive planet Mercury early in the week. The fleet planet may be seen in the west-southwestern sky shortly after sunset about 10 degrees above the horizon. Mercury will begin a rapid plunge toward the Sun by the evening of the 8th, and he’ll fade rapidly as he does so. Try to catch him in binoculars before he goes; however, if you miss him, he’ll be back for another good evening show in the latter half of May.
Jupiter now crosses the meridian just before 10:00 pm, so you have the entire evening to observe him at a high altitude in the sky. It should be especially interesting to look at him on the evening of the 10th, when our own Moon will be just a few degrees away. Even though Jupiter presents the largest apparent disc of any planet except Venus, his size compares to an average-sized crater on the surface of the Moon. The difference, of course, is that the Moon is just 384,000 kilometers (239,000 miles) away while Jupiter is 660 million kilometers (410 million miles) distant from Earth. To put this in some sort of perspective, a good six-inch telescope will show the tiny discs of Jupiter's four Galilean moons. On the night of the 10th the moons Io and Europa will be closest to their hulking master. These two moons are comparable in size to our Moon and subtend discs just over one arcsecond in apparent diameter. Our Moon will be just under 1,800 arcseconds in apparent diameter!
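To check the scale comparison above, here is a quick small-angle calculation (diameters and distances are rounded standard values; note that the Moon's apparent size swings between roughly 1,760 and 2,010 arcseconds over its orbit, so the "just under 1,800" figure fits the near-apogee Moon of this particular week, while the mean-distance figure comes out closer to 1,865):

```python
import math

def angular_diameter_arcsec(diameter_km: float, distance_km: float) -> float:
    """Apparent diameter in arcseconds via the small-angle approximation."""
    return math.degrees(diameter_km / distance_km) * 3600.0

print(round(angular_diameter_arcsec(3_474, 384_000)))      # Moon: ~1865"
print(round(angular_diameter_arcsec(142_984, 660e6), 1))   # Jupiter: ~44.7"
print(round(angular_diameter_arcsec(3_643, 660e6), 2))     # Io: ~1.14"
```

At the article's quoted distances, Jupiter spans about 45 arcseconds and Io just over one — against well over a thousand for the Moon.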
Ruddy Mars now rises just before 11:00 pm, so the best time to look for him is still in the pre-dawn hours. The red planet is now slowing his eastward progress among the stars of the constellation Virgo as he prepares for opposition in early April. He’s currently within five degrees of Virgo’s brightest star Spica, and he’ll stay close to the star for the next several weeks. Modest telescopes can now reveal considerable detail on his distant dusty surface for those observers willing to wait for moments of steady air in our atmosphere.
Saturn is just east of the meridian at 6:00 am EST. His golden glow may be spotted between the rising stars of Scorpius and the scattered stars of Libra. Despite his southerly declination he is a wonderful sight in almost any telescope thanks to the spectacular set of rings that surround his gaseous disc. The rings are tipped just over 20 degrees to our line of sight, so even in binoculars the planet has an unusual oval appearance.
You should be able to easily spot the dazzling planet Venus low in the southeast as morning twilight gathers. Venus reaches her greatest brilliancy for the year this week, so she should be visible despite the glow of the rising Sun. Venus will remain a fixture in the morning twilight sky until late October.
2016-40/3982/en_head.json.gz/13395
What is Marine Science?
A focus on marine plants is one option for someone studying marine science.
Marine Science includes the study of ocean currents.
Edited By: J.T. Gale
Marine science commonly is called oceanography. As these names may reveal, this branch of science deals with the study of oceans. Professionals in this field often are called marine scientists or oceanographers, but they also may take titles that refer to their specialties. The topics that are covered by marine science can widely vary, including such things as ocean currents, sea floor geology, and the chemical composition of ocean water.
Many people have only a vague understanding of marine science. One common misconception involves the use of titles such as marine scientist and oceanographer. To a layperson, these may sound very specific. In reality, these titles hardly provide any information about what a person in this field does.
Marine science is so broad that it would require a lot of space to outline every possible career path. Many of the same components that are studied on land also are studied in the water. Marine biology, marine chemistry, and marine physics are three of the disciplines that fall into the category of oceanography. Within each of these disciplines there are numerous sub-categories in which a professional is likely to specialize. For example, within marine biology, one person could focus on plants while another focuses on microscopic organisms.
In some cases, oceanographers have majored in some type of marine science program. More often than not, however, these professionals majored in more basic programs such as Biology or Earth Sciences. Then, somewhere along the way, they veered off and began to concentrate on oceanography.
People also tend to think that marine scientists carry out most of their duties in or on the water. This is a second misconception. A lot of the work done by such professionals typically is conducted in laboratories. Instead of wet suits and oxygen tanks, their gear commonly is composed of microscopes and computers.
It widely is believed that the oceans affect many components of the natural system of the Earth. For example, oceans have been linked to the global climate. Marine life also is responsible for supporting part of the human food chain. As this is the case, a common objective of marine science is to draw relevance between the oceans and other parts of nature.
A third misconception is that marine science is a novelty science, a perception that commonly results in funding problems. This vast area of science can play a vital role in environmental conservation. It also may be a large contributor in the search for solutions to environmental problems such as global warming.
lluviaporos
@irontoenail - There have been lots of examples where marine scientists have managed to make a big difference in the world though, through education and technology. If it weren't for marine scientists we would have much less understanding of climate patterns in general, like La Niña and El Niño, which are both influenced by ocean patterns. Marine science jobs are also responsible for the fact that we haven't quite managed to over-fish the oceans entirely yet. Ocean conservation has a long way to go but there are a lot of good people fighting the good fight.
irontoenail
@MrsPramm - Well, hopefully as marine science research becomes more critical to the functioning of society we will start paying more attention to what they are telling us. But considering the fact that they've been telling us to stop climate change or face the consequences for a while, I suspect we're going to learn the hard way.
MrsPramm
I took a geography course at university that was primarily run by an oceanographer and I came away from it with the impression that very little of what they say gets taken seriously by the general public. It might have been that the professor was a little bit cynical, but he had a lot of examples where people had been told things about how the ocean works and proceeded to ignore all advice. Seawalls were the example that I remember most clearly. Our professor told us at length what process should be followed if you want to build up a coastal area and stop erosion, but he said that more often than not people will just build a big wall and hope that will stop the water. That's, at best, a temporary measure though.
NASA Shuttle Program Head Resigns
Posted: Mon 11:42 AM, Apr 21, 2003 | Updated: Mon 11:42 AM, Apr 21, 2003
The man in charge of NASA's Space Shuttle Program, who was one of the agency's most recognized faces following the destruction of the Shuttle Columbia, will soon resign.
Ron Dittemore had planned to resign after Columbia completed its mission but postponed his departure because of the shuttle disaster and investigation.
The shuttle disintegrated 38 miles over Texas, killing all seven astronauts.
The Orlando Sentinel reported Saturday that Dittemore is expected to announce his resignation this week. A source says it was common knowledge at NASA that Dittemore was leaving to take a job in private industry.
Dittemore has been with NASA for 26 years and has headed the shuttle program since 1999.
The Orlando Sentinel said a search for Dittemore's replacement is under way. | 科技 |
Scarlet snake
Cemophora coccinea
Scarlet snake (Cemophora coccinea), small, burrowing, nocturnal member of the family Colubridae. It occurs in the United States from New Jersey to Florida and as far west as Texas. It is a burrower that is found in areas of friable and sandy soils. Scarlet snakes eat a variety of insects and small vertebrates, but lizard and snake eggs are preferred. They are egg layers.
Because of its red, black, white, and yellow rings, this harmless species is sometimes referred to as a false coral snake. It also somewhat resembles the scarlet king snake. All three species—the scarlet, scarlet king, and coral snake—occur in the same habitats over the same geographic area. Presumably, the scarlet snake and scarlet king snake are mimics of the coral snake.
Chesapeake Bay Field Office
Delaware Presents 2009 Wetland Warrior Award to CBFO Partners for Fish and Wildlife Coordinator
DNREC Secretary Collin O'Mara and Delaware Governor Jack Markell present the Wetland Warrior Award to Al Rizzo (center).
Delaware Governor Jack Markell and Delaware Department of Natural Resources and Environmental Control Secretary Collin O'Mara presented Al Rizzo, soil scientist and Partners for Fish and Wildlife Coordinator for Delaware and Maryland, with the 2009 Wetland Warrior Award. The award is presented to an individual or group in recognition of exemplary efforts that benefit wetlands through education and outreach, monitoring and assessment, or restoration and protection.
Al has improved the future for Delaware wetlands and the services that they provide to the citizens of Delaware by restoring thousands of acres of degraded and former wetlands. He generously shares his knowledge of wetlands by educating the public on the value of wetlands and training other scientists on innovative techniques.
Al has worked tirelessly to bring wetland restoration techniques into the 21st century by using methods that restore and improve areas to emulate natural wetlands. He did away with grading areas smooth like swimming pools, replacing that practice with a model of humps and bumps referred to as microtopography, and placed stumps and logs to mimic the natural ground of wetlands. These techniques provide habitat for a greater variety of insects, birds, amphibians, reptiles, and plants.
It’s hard to compile an exact figure for the acres of Delaware wetlands he has restored, because he not only initiated projects on his own but also lent his expertise to many more. One colleague estimated that 5,000 acres would be a conservative estimate.
Al is dedicated to sharing what he has learned with the public and other professionals. He is an active member of the Mid-Atlantic Hydric Soils Committee and is described as a "soil scientist of the highest caliber." He participates in trainings on understanding soils and performing restoration. He is always willing to visit proposed restoration sites with partners and provide suggestions based on his experience.
Al has increased not only the amount of wetlands in Delaware but also how well those wetlands function. He has made a significant improvement to Delaware that will last for generations to come.
Fusion is vital for our future
By Sabina Griffith
"I think everybody realizes that it would be desirable to have fusion as fast as possible," says Steven Cowley, CEO of the UKAEA and head of the Culham Centre for Fusion Energy (CCFE).

In June this year Steven Cowley, the CEO of the United Kingdom Atomic Energy Authority and head of the Culham Centre for Fusion Energy (CCFE), was appointed by Prime Minister David Cameron as a member of the Council for Science and Technology (see Newsline issue 183). In today's interview with the ITER Newsline, he talks about his new role and the importance of making fusion commercially viable.

Newsline: Steven, can you describe your role as a member of the Council for Science and Technology?

Cowley: The Council for Science and Technology advises the government on strategic issues and how the United Kingdom should position itself. It is made up largely of scientists and engineers, technologists from industry and academia in the UK. What is of great concern in the UK government currently is how to stimulate innovation. How do we produce the next generation of companies that will produce the innovations in our economy? How do we shape the future prosperity of the UK? My other role in life—running fusion in the UK—is very closely related to these two questions. Fusion is not about pure science ... it is about producing an energy source for the future, an energy source that will bring prosperity with it.

Do you think you were appointed because of your leading role in the national fusion program?

It's difficult to speculate why they asked me. I think the UK government realizes that fusion is now a very important part of the research portfolio. Fusion fits right into its vision that the future has to be supported by high technology, and that the ultimate high technology is energy. I think that I am in this role because the government wants sound advice on fusion, on nuclear, on energy ... and I have expertise in these areas.

Currently, there are many activities going on worldwide to address the question of how to accelerate fusion research. Is this a question that is being discussed in the UK as well?

I am actually about to head out to Princeton (US), where there is currently an international fusion road-mapping workshop going on ... and, yes, this issue is being discussed at government level here in the UK as well. I think everybody realizes that it would be desirable to have fusion as fast as possible. However, as we are proceeding forward towards fusion, we want to make sure that we don't go down any blind alleys. We want to go as fast as possible, but without stumbling. It would be a shame if fusion stumbled on the way rather than going straight for the goal. That is why ITER is so important. I think once the burning shots on ITER have been accomplished ... once ITER produces a gain of ten and a stable long-pulse shot ... we will know that fusion is absolutely possible. We will have done it. But what we don't know is how long it will take us to make fusion viable commercially. One of the questions everybody is asking right now is how to shorten the time after ITER to a demonstration reactor (DEMO). How quickly, after ITER is successful, can we push forward to an actual electricity-producing fusion power plant? The discussions at Princeton are part of a wider discussion on how fast the fusion community can do that. I think that the more desperate people get for energy, the more pressure we will receive.
The UK is very committed to having fusion as part of the energy portfolio in the second half of this century. For that, DEMO has to be operational in the 2040s. People think this is a long time, but it is not such a long time for development, as fusion is such a sophisticated technology. The UK believes that part of the future UK generating base will be fusion, and that this proportion will increase. By 2100, we are perhaps talking about a substantial portion of the electricity generating base in the UK ... and, I am sure, throughout Europe.

The Chinese government is dedicated to building a DEMO reactor and is now investing in the education and training of 2,000 fusion scientists and engineers. Does such news influence the debate in the UK?

I think that news like that does indeed influence the debate. On the other hand, 2011 is an exceptional year, with the world's major economies up against the wall. In many respects, looking over at China is less on the mind of politicians than the dangers of cutting research budgets in the UK. The government has maintained strong support for ITER; while many projects have had their budgets reduced, the fusion research budget was maintained.
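A note for readers on the benchmark in Cowley's answers: fusion gain, usually written Q, is the ratio of fusion power produced to the external heating power fed into the plasma. ITER's published design target of Q ≥ 10 corresponds to roughly 500 MW of fusion power from about 50 MW of heating input. A minimal sketch of that arithmetic (the helper function is illustrative, not project code; the 500/50 MW figures are ITER's stated targets):

```python
def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    """Fusion gain Q: fusion power out divided by external heating power in."""
    return p_fusion_mw / p_heating_mw

# ITER design target: ~500 MW of fusion power from ~50 MW of plasma heating.
q = fusion_gain(p_fusion_mw=500.0, p_heating_mw=50.0)
print(f"Q = {q:.0f}")  # Q = 10, the 'gain of ten' benchmark Cowley refers to
```

A sustained shot at Q = 10 would demonstrate a robustly burning plasma, which is the watershed Cowley describes.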
Atmospheric Sciences & Global Change Newsmakers
Rasch, Guenther Make Highly Cited Researchers 2014 List
Phil Rasch
Alex Guenther
Researchers at Pacific Northwest National Laboratory are impacting the atmospheric science community, with the science to prove it. Highly Cited Researchers found that two of PNNL's atmospheric scientists, Phil Rasch and Alex Guenther, are among the top cited researchers in average citations per paper in the category of geosciences. The ranking was compiled with InCites, a Thomson Reuters analytical tool that combed the company's database of indexed papers published from 2002 to 2012. Representing the top 1% in their field, only 250 researchers in each field were selected, based on total citations to their papers published during this period. Read more here: http://highlycited.com/info.htm For more, see Five PNNL researchers named among the world's most cited
For a searchable list, see: http://isihighlycited.com/
Sensing the deep ocean
Futuristic robots may be coming soon to an ocean near you. Sensorbots are spherical devices equipped with biogeochemical sensors that promise to open a new chapter in the notoriously challenging exploration of Earth's largest ecosystem -- the ocean.
The devices are being designed and developed in the laboratory of Professor Deirdre Meldrum, ASU Senior Scientist and Director of the Center for Biosignatures Discovery Automation at Arizona State University's Biodesign Institute.
Much of Meldrum's genomic research focuses on deep ocean environments and leverages her extensive technology development for human health and disease. In 2001, her group was awarded an $18 million grant for a National Institutes of Health Center of Excellence in Genomics Science, which led to the establishment of the Microscale Life Sciences Center (MLSC) -- currently headquartered at the Biodesign Institute in the Center for Biosignatures Discovery Automation. (The MLSC grant was subsequently renewed for an additional $18 million.)
Meldrum's Center brings together researchers in electrical, mechanical, chemical, and bioengineering, chemistry, computer science, materials science, laboratory medicine and microbiology, as well as personnel from the Fred Hutchinson Cancer Research Center, University of Washington and Brandeis University. Together, they work on developing microscale devices to analyze cells and their DNA, RNA, and proteins to understand and eventually diagnose or prevent diseases such as cancer and inflammation.
To accomplish this, Meldrum and her colleagues develop microscale modules to measure multiple parameters in living cells in real time in order to correlate cellular events with genomic information. As Meldrum explains, the Sensorbot project significantly expands the scope of oceanographic investigations carried out by Biodesign's Center for Biosignatures Discovery Automation:
"We are leveraging our automation, sensors, biotechnology, and systems expertise to develop unique robots that can be deployed by the hundreds, travel in formation, and communicate together for exploration and discovery. The Sensorbots will enable continuous spatiotemporal monitoring of key elements in the ocean and the ability to respond to events such as underwater earthquakes and hydrothermal vents. Such research is essential for a more thorough understanding of the multiple systems in the oceans -- microbes and other sea life, geology, and chemicals."
Assistant Research Professors Cody Youngbull and Joseph Chao are both integral members of the Sensorbot team and have spent years developing the technology. Much of this creative tinkering has taken place in the labs at Biodesign, situated in landlocked Arizona. But in the summer of 2011, Youngbull took the Sensorbots to the deep ocean, aboard the Thomas G. Thompson, a global-class research vessel operated by the University of Washington, as part of UNOLS (University-National Oceanographic Laboratory System) and used as part of the National Science Foundation's Ocean Observatories Initiative.
The ambitious Sensorbot project is utilizing the National Science Foundation's Ocean Observatories Initiative -- in particular, the Regional Scale Nodes (RSN) project, led by Professor John Delaney of the University of Washington. This far-flung endeavor involves the construction of a cabled underwater observatory in the NE Pacific Ocean, off the coasts of Oregon, Washington, and British Columbia on the Juan de Fuca tectonic plate. This area is home to many dramatic undersea features, including volcanoes and hydrothermal vents -- wellsprings of unique life forms.
The cabled observatory provides high bandwidth and power for real-time oceanographic observations and experiments. These include the study of mineral concentrations, gas compositions, biological blooms, and detailed analyses of extremophiles -- organisms flourishing in environments usually considered inhospitable to life. The cabled observatory with high power and bandwidth provides the Sensorbots with the ability to recharge their batteries and download their data, allowing immediate transmission via the internet and making the information available to scientists and educators anywhere in the world.
The current Sensorbots are fist-sized transparent robotic orbs, which communicate via brilliant blue flashes of light. The spheres house electronics and batteries, while their surfaces have 3 sensors for measuring pH, temperature or oxygen. Sensorbots report surrounding environmental conditions to the inner electronics that convert the signal into flashes of light, providing a sort of visual Morse code.
A high-speed camera situated on the seafloor picks up the signals and stores them for later decoding aboard the ship. As sensorbot technology develops, these orbs may blanket large areas of the ocean and transmit information regularly to a central data hub. Ultimately, Sensorbots will be capable of operating in semi-autonomous robotic swarms, moving under remote control, in a 3D geometric formation through precisely controlled volumes of seawater.
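The article never specifies the team's actual pulse protocol, so the "visual Morse code" can only be sketched in spirit: serialize a reading to bits, then turn each bit into a timed flash of the blue LED for the seafloor camera to decode. In this hypothetical illustration the packet layout, pulse timings, and function names are all assumptions, not the Sensorbot design:

```python
import struct

PULSE_ON = 0.05   # seconds the blue LED stays lit for a '1' (assumed timing)
PULSE_OFF = 0.05  # seconds of darkness for a '0'

def encode_reading(sensor_id: int, value: float) -> list[int]:
    """Pack a sensor ID and a float reading (e.g. pH) into a bit sequence."""
    payload = struct.pack(">Bf", sensor_id, value)  # 1 ID byte + 4-byte float
    bits = []
    for byte in payload:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    return bits

def to_flash_schedule(bits: list[int]) -> list[tuple[str, float]]:
    """Turn bits into (led_state, duration) pairs for the flasher."""
    return [("on" if b else "off", PULSE_ON if b else PULSE_OFF) for b in bits]

schedule = to_flash_schedule(encode_reading(sensor_id=1, value=7.8))  # pH 7.8
```

A real optical link would layer framing, clock recovery, and error checking on top of this, but the principle, bits rendered as timed flashes, is the same.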
Sensor swarms operating autonomously could function in complex, harsh, and remote environments. With appropriate microanalytical systems mounted on the Sensorbot platforms, these synthetic mariners could perform spatially and temporally indexed genomic analyses of microbial communities, as well as observing a broad variety of macro events.
As Youngbull explains, the Sensorbot technology has undergone several stages of development. "We've built 3 versions of Sensorbots over the years and now we're moving on to a fourth," he says. This latest iteration -- known as p3N -- will function independently and provide networking capabilities, allowing members of the swarm to communicate with each other. The optical signals transmitted by the Sensorbots will be linked, using so-called multi-hop networking technologies, not unlike those used for cell phone and other land-based wireless networks.
"Sensing webs are an exciting thing," Youngbull says, "because the scale of phenomena are vast in the oceans." Rather than delivering a very expensive robot to a single point in space and then serially moving it around, often missing dynamic phenomena, an array of inexpensive Sensorbots can cover a wide field, permitting real-time investigations of earthquakes, biological blooms, and other episodic phenomena.
"Networking will allow us to trail sensorbots like a string of pearls over great distances -- so we'll get all the benefits of optical communication speed and energy efficiency without the detriment of optical loss and attenuation in sea water," Youngbull says.
During the most recent research cruise, Youngbull helped position the Sensorbots on the seafloor, with the aid of ROPOS -- a Canadian remotely operated vehicle. The Sensorbots were deployed to a crushing depth of over 1500 meters in and around a hydrothermal vent field in the cauldera of an undersea volcano. Twenty Sensorbots were deployed over hundreds of square meters and monitored continuously for 3 days using a high speed underwater camera.
The new p3N sensorbots currently under construction will have inertial monitoring units, allowing them to sense when and where they have been moved. Eventually, the devices will have their own onboard propulsion systems, allowing them to be controlled independently from afar. The challenges of creating such systems are significant. Initially, sensorbots may be designed to automatically adjust their buoyancy to take advantage of the natural movements of ocean current.
The group will be continuing their Sensorbot experiments in a new undersea cabled environment belonging to the Monterey Bay Aquarium Research Institute (MBARI). While the team's focus will be on undersea microbial research, the concept of a multi-hop capable node, that functions to detect many different analytes in an inexpensive, compact format has other potential applications, including national security, epidemiology, and contaminant plume monitoring.
In addition to MBARI, the Center for Biosignatures Discovery Automation collaborates with other schools at ASU, including the School of Earth and Space Exploration and the Global Institute of Sustainability.
"Nobody brings it all together like we do, with the sensors, the embedded systems, the data transfer and communication," Youngbull says. Meldrum is very enthusiastic about the use of this technology for advancing science in previously inaccessible realms of the deep ocean: "Sensorbots will provide a continuous presence in the ocean over space and time, from nanometers to kilometers and nanoseconds to years, enabling us to discover and understand the complex biogeochemical systems of our oceans that play a key role in our quality of life."
Sea Beacons: In the laboratory, a Sensorbot flashes out its code. By means of these brilliant blue pulses of light, the spherical undersea robot relays information about its environment. The light code is captured by high-speed cameras, which record and transmit the data for later analysis.
Taking the plunge: An array of Sensorbots is escorted to the ocean floor by means of the Remotely Operated Platform for Ocean Science, or ROPOS, a Canadian submersible device.
Materials provided by Arizona State University.
Lithium-Ion Batteries for Less
Researchers show a low-cost route to making materials for advanced batteries in electric cars and hybrids.
by Kevin Bullis
A new way to make advanced lithium-ion battery materials addresses one of their chief remaining problems: cost. Arumugam Manthiram, a professor of materials engineering at the University of Texas at Austin, has demonstrated that a microwave-based method for making lithium iron phosphate takes less time and uses lower temperatures than conventional methods, which could translate into lower costs.
Nano power: An electron-microscope image of 40-nanometer-wide rod-shaped particles that make up a promising battery material.
Lithium iron phosphate is an alternative to the lithium cobalt oxide used in most lithium-ion batteries in laptop computers. It promises to be much cheaper because it uses iron rather than the much more expensive metal cobalt. Although it stores less energy than some other lithium-ion materials, lithium iron phosphate is safer and can be made in ways that allow the material to deliver large bursts of power, properties that make it particularly useful in hybrid vehicles.

Indeed, lithium iron phosphate has become one of the hottest new battery materials. For example, A123 Systems, a startup based in Watertown, MA, that has developed one form of the material, has raised more than $148 million and commercialized batteries for rechargeable power tools that can outperform conventional plug-in tools. The material is also one of the types being tested for a new electric car from General Motors. But it has proved difficult and expensive to manufacture lithium iron phosphate batteries, which cuts into potential cost savings over more conventional lithium-ion batteries.

Typically, the materials are made in a process that takes hours and requires temperatures as high as 700 °C. Manthiram's method involves mixing commercially available chemicals–lithium hydroxide, iron acetate, and phosphoric acid–in a solvent, and then subjecting this mixture to microwaves for five minutes, which heats the chemicals to about 300 °C. The process forms rod-shaped particles of lithium iron phosphate. The highest-performing particles are about 100 nanometers long and 25 nanometers wide. The small size is needed to allow lithium ions to move quickly in and out of the particles during charging and discharging of the battery.
To improve the performance of these materials, Manthiram coated the particles with an electrically conductive polymer, which was itself treated with small amounts of a type of sulfonic acid. The coated nanoparticles were then incorporated into a small battery cell for testing. At slow rates of discharge, the materials showed an impressive capacity: at 166 milliamp hours per gram, the materials came close to the theoretical capacity of lithium iron phosphate, which is 170 milliamp hours per gram. This capacity dropped off quickly at higher discharge rates in initial tests. But Manthiram says that the new versions of the material have shown better performance.

It's still too early to say how much the new approach will reduce costs in the manufacturing of lithium iron phosphate batteries. The method's low temperatures can reduce energy demands, and the fact that it is fast can lead to higher production from the same amount of equipment–both of which can make manufacturing more economical. But the cost of the conductive polymer and manufacturing equipment also needs to be figured in, and the process must be demonstrated at large scales. The process will also need to compete with other promising experimental manufacturing methods, says Stanley Whittingham, a professor of chemistry, materials science, and engineering at the State University of New York at Binghamton.

Manthiram has recently published advances for two other types of lithium-ion battery materials and is working with ActaCell, a startup based in Austin, TX, to commercialize the technology developed in his lab. The company, which last week announced that it has raised $5.58 million in venture funding, has already licensed some of Manthiram's technology, but it will not say which technology until next year.
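As a back-of-the-envelope check on the capacity figures above: lithium iron phosphate exchanges one electron (one lithium ion) per formula unit, so its theoretical capacity follows directly from Faraday's law, the Faraday constant divided by the molar mass, converted to milliamp-hours per gram. The constants below are standard tabulated values:

```python
F = 96485.0  # Faraday constant, coulombs per mole of electrons

# Molar mass of LiFePO4 from atomic weights: Li 6.94, Fe 55.85, P 30.97, O 16.00
molar_mass = 6.94 + 55.85 + 30.97 + 4 * 16.00   # about 157.8 g/mol
electrons_per_formula_unit = 1                   # one Li+ shuttled per LiFePO4

# Convert C/g to mAh/g by dividing by 3.6 (1 mAh = 3.6 coulombs).
capacity_mah_per_g = electrons_per_formula_unit * F / molar_mass / 3.6
print(f"{capacity_mah_per_g:.0f} mAh/g")  # ~170 mAh/g, matching the text
```

The 166 mAh/g measured at slow discharge therefore means the microwave-made material extracts nearly every available lithium ion.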
2016-40/3982/en_head.json.gz/13849 | SLAC Home
AboutSLAC Overview
Contact SLAC
ResearchAccelerator Research
Accelerators and Society
Astrophysics & Cosmology
Elementary Particle Physics
Materials, Chemistry & Energy Sciences
X-ray Science
Scientific Programs
FacilitiesFACET & Test Beam Facilities
LCLS
SSRL
NewsNews Center
CommunityEducational Programs
Coming to SLACVisitor Information
Site Entry Requirements
Foreign Nationals
Web People Employee Portal | Research Resources Web People SLAC Home
« News Feature Archive The Decades-long Search for the Higgs
By Lori Ann White
July 5, 2012
It was a little over two years ago that the Large Hadron Collider kicked off its search for the Higgs boson. But the hunt for the Higgs really began decades ago with the realization of a puzzle to be solved, one that involved more than just the Higgs.
An intriguing asymmetry
The quest started with symmetry, the aesthetically pleasing notion that something can be flipped and still look the same. It’s a matter of everyday experience that the forces of nature work the same way if left is swapped with right; scientists found this also held true, at the subatomic level, for swapping plus-charge for minus-charge, and even for reversing the flow of time. This principle also seemed to be supported by the behavior of at least three of the four major forces that govern the interactions of matter and energy.
In 1956, Tsung-Dao Lee of Columbia University and Chen-Ning Yang of Brookhaven National Laboratory published a paper questioning whether a particular form of symmetry, known as parity or mirror symmetry, held for the fourth force, the one governing the weak interactions that cause nuclear decay. And they suggested a way to find out.
Experimentalist Chien-Shiung Wu, a colleague of Lee's at Columbia, took up the challenge. She used the decay of Cobalt-60 to show that the weak interactions did indeed distinguish between particles spinning to the left and to the right.
This knowledge, combined with one more missing piece, would lead theorists to propose a new particle: the Higgs.
Where does mass come from?
In 1957, another clue came from a seemingly unrelated field. John Bardeen, Leon Cooper and Robert Schrieffer proposed a theory that explained superconductivity, which allows certain materials to conduct electricity with no resistance. But their BCS theory, named after the three inventors, also contained something valuable to particle physicists, a concept called spontaneous symmetry breaking. Superconductors contain pairs of electrons that permeate the metal and actually give mass to photons traveling through the material. Theorists suggested that this phenomenon could be used as a model to explain how elementary particles acquire mass.
In 1964, three sets of theorists published three separate papers in Physical Review Letters, a prestigious physics journal. The scientists were Peter Higgs; Robert Brout and Francois Englert; and Carl Hagen, Gerald Guralnik and Tom Kibble. Taken together, the papers showed that spontaneous symmetry breaking could indeed give particles mass without violating special relativity.
In 1967, Steven Weinberg and Abdus Salam put the pieces together. Working from an earlier proposal by Sheldon Glashow, they independently developed a theory of the weak interactions, known as GWS theory, that incorporated the mirror asymmetry and gave masses to all particles through a field that permeated all of space. This was the Higgs field. The theory was complex and not taken seriously for several years. However, in 1971 Gerard `t Hooft and Martinus Veltman solved the mathematical problems of the theory, and suddenly it became the leading explanation for the weak interactions.
Now it was time for the experimentalists to get to work. Their mission: to find a particle, the Higgs boson, that could exist only if this Higgs field does indeed span the universe, bestowing mass upon particles.
The hunt begins
Concrete descriptions of the Higgs and ideas of where to look for it began to appear in 1976. For example, SLAC physicist James Bjorken proposed looking for the Higgs in the decay products of the Z boson, which had been theorized but would not be discovered until 1983.
Einstein's best-known equation, E = mc², has profound implications for particle physics. It basically means that mass equals energy, but what it really means for particle physicists is that the greater the mass of a particle, the more energy required to create it and the bigger the machine needed to find it.
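To put rough numbers on that statement: creating a particle of rest mass m requires at least E = mc² of collision energy, which is why physicists quote masses directly in energy units such as GeV. Taking the approximately 125 GeV mass reported for the newly observed boson and converting to everyday units (the conversion factor is standard; the mass value is approximate):

```python
GEV_TO_JOULES = 1.602e-10   # 1 GeV = 1.602e-10 joules

boson_mass_gev = 125.0      # approximate mass reported for the new boson
energy_joules = boson_mass_gev * GEV_TO_JOULES
print(f"{energy_joules:.1e} J")  # ~2.0e-08 J
```

Twenty nanojoules is trivial in human terms; the hard part is concentrating that energy into a single proton-proton collision, which is what makes the machines so big.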
By the '80s, only the four heaviest particles remained to be found: the top quark and the W, Z and Higgs bosons. The Higgs was not the most massive of the four – that honor goes to the top quark – but it was the most elusive, and would take the most energetic collisions to ferret out. Particle colliders would not be up to the job for a long time. But they began sneaking up on their quarry with experiments that began to rule out various possible masses for the Higgs and narrow the realm where it might exist.
In 1987, the Cornell Electron Storage Ring made the first direct searches for the Higgs boson, excluding the possibility that it had a very low mass. In 1989, experiments at SLAC and CERN carried out precision measurements of the properties of the Z boson. These experiments bolstered the GWS theory of weak interactions and set more limits on the possible range of masses for the Higgs.
Then, in 1995, physicists at Fermilab’s Tevatron found the most massive quark, the top, leaving only the Higgs to complete the picture of the Standard Model.
During the 2000s, particle physics was dominated by a search for the Higgs using any means available, but without a collider that could reach the necessary energies, all glimpses of the Higgs remained just that – glimpses. In 2000, physicists at CERN's Large Electron-Positron Collider (LEP) searched unsuccessfully for the Higgs up to a mass of 114 GeV. Then LEP was shut down to make way for the Large Hadron Collider, which steers protons into head-on collisions at much higher energies than ever achieved before.
Throughout the 2000s, scientists at the Tevatron made heroic efforts to overcome their energy disadvantage with more data and better ways to look at it. By the time the LHC officially began its research program in 2010, the Tevatron had succeeded in narrowing the search, but not in discovering the Higgs itself. When the Tevatron shut down in 2011, scientists were left with massive amounts of data, and extensive analysis, announced earlier this week, offered a slightly closer glimpse of a still-distant Higgs.
In 2011, scientists at the two big LHC experiments, ATLAS and CMS, announced they were also closing in on the Higgs. Yesterday morning, they had another announcement to make: They have discovered a new boson – one that could, upon more study, prove to be the long-sought signature of the Higgs field.
The discovery of the Higgs would be the start of a new era in physics. The puzzle is much bigger than just one particle; dark matter and dark energy and the possibility of supersymmetry will still beckon searchers even after the Standard Model is complete. Since the Higgs field is connected to all the other puzzles, we will not be able to solve them until we know its true nature. Is it the blue of the sea or the blue of the sky? Is it garden or pathway or building or boat? And how does it truly connect to the rest of the puzzle?
The universe awaits.
With the discovery of what is in all probability the mass-bestowing Higgs boson, the family of fundamental particles that govern the behavior of matter and energy is now complete.
(Image by SLAC Infomedia Services)
Dead Sea Scrolls Now Online
By BEN FORER
It took 2,000 years, but the Dead Sea Scrolls have finally entered the digital age. For the first time some of the scrolls are available online thanks to a partnership between Google and Israel’s national museum.
Five of the most important scrolls can now be seen in high-resolution on the Internet. Users can zoom in and out, translate passages to English and access supplemental material.
“We hope one day to make all existing knowledge in historical archives and collections available to all, including putting additional Dead Sea Scroll documents online,” Yossi Matias, managing director of Google’s research and development center in Israel, told the Christian Science Monitor.
The scrolls were written from about 200 BC to 70 AD and, according to Jeffrey L. Rubenstein, professor of Talmud and Rabbinics at New York University, they offer an unrivaled look at the time after the biblical books were penned and before the Christian texts and documents of rabbinic Judaism were written.
“The Dead Sea scrolls help us fill in this two to three century gap to help us understand what religious developments took place,” said Rubenstein. “We see changes among different groups as they wrestle with powerful cultural and political forces. … These changes help us understand where monotheistic traditions in the west came from.”
“The Dead Sea Scrolls give us a new perspective about ancient life, society and thought,” said Adolfo D. Roitman, curator of the Dead Sea Scrolls, in a video produced by Google. “They promote interfaith dialogue. They promote understanding between human beings.”
Custodians of the scrolls had been criticized for only allowing select groups of scholars access to them.
The original scrolls are located in a specially designed vault in Jerusalem that requires multiple keys, a magnetic card and a secret code to open. They were found in caves near the Dead Sea starting in 1946.
The Associated Press contributed to this report.
SEO isn’t dead, black hat techniques are
June 22, 2014
Content marketing is the new buzzword among marketing types. While the push for quality content is a step in the right direction, many buy into the misconception that there’s not much else they can do to optimize their content for search engines such as Google because of fundamental changes that have been made to the way searches are carried out.
It’s true that Google’s Hummingbird update has been a complete re-write of the search algorithm, but when it comes to SEO, “if your brand was doing [it] the right way in the first place – using quality content and amplifying it through genuine channels – then your brand won’t have been drastically impacted by these changes,” says Lee Mancini, managing director of search optimized content marketing agency, Sekari.
According to him, all Google has done is “wipe out the cheap spamming tactics used by many SEO agencies in the past to deliver low quality content and low quality high volume links to a website to increase its rankings.” Google has constantly strived to improve its platform so that searches deliver more accurate results to websites with authentic content.
The Hummingbird update has been a natural evolution of the search engine based on how people Google today. With more users going mobile and shifting towards voice-activated search, the nature of their queries has changed. “They tend to ask longer questions, such as ‘how do I get there’, or ‘where can I find this’. Google now responds with far more pertinent answers that are not only drawn from websites with certain key phrases in them, but the search engine actually ‘thinks’ and connects knowledge to the answer.” Mancini explains that search focuses on finding meaning behind a question, so that when people search for ‘what’s the closest place to buy an iPhone 5,’ Google knows to look for a bricks and mortar store in Dubai and that ‘finding an iPhone’ may also require price comparison among multiple stores.
What impact has this had on how SEO is done? For one thing, it’s more important than ever. Only recently, Define Media’s Marshall Simmonds said that a review of 48 billion page views across 87 sites showed that search is still driving 41% of page views, compared to just 16% from social.
At the same time, simply using key words to build links doesn’t work anymore. Instead, “agencies need to be thinking about the topic and how topical silos of content can interconnect seamlessly together to provide a network of deep and valuable content,” says Mancini.
Guest blogs, which were all the rage in 2013, have also been devalued – a fact that was highlighted by Google’s Matt Cutts in January when he said “Stick a fork in it: guest blogging is done.” Regionally this has been even more problematic, according to Mancini, seeing as there is already a lack of content in Arabic and a lot of the material out there is in the form of blogs and forums. He explains that even though guest blogs have less value in terms of link building, this doesn’t mean they have no value. “If the story is good enough, that’s an important enough reason to still do it.”
What works in today’s content marketing world? People are increasingly looking for dynamic content and to be engaged in a conversation. “Brands have to become publishers in the sense and to look at their website as a more dynamic opportunity to deliver content on a regular basis,” explains Mancini. However, rather than creating content in silos and placing it on a blog or sending it out through an email, there has to be an integrated approach so that the brand emerges as an authority on a particular topic, which will help drive traffic back to the website. Social media can be useful in promoting content, but that should just be one aspect of a structured content outreach program.
In the MENA region, there are over 100 million searches and 50% of these queries are in Arabic, notes Mancini. “However, only one in four top 100 brands have Arabic content. This represents a lot of potential for brands that are prepared to invest in content and a structured approach to maximizing returns from that content.” There is also an ever-growing need to localize content even further and even in Arabic, there are differences in the markets in Saudi Arabia, Jordan and UAE.
Despite the evolution of how SEO is approached, brands continue to gain value by optimizing content to drive more traffic. Undoubtedly content plays a huge role in this new era of marketing, but it’s more than apparent that SEO is far from dead. | 科技 |
Values drop for solar power certificates
By Jose Martinez
Globe Correspondent | 01.17.13 | 12:00 AM
The push to add more solar power to the electrical mix in Massachusetts has been so successful that one of the key financial incentives from the state — bond-like certificates earned by generating energy with solar panels — has taken a hit in the marketplace.
Robert Scherer has generated four solar renewable energy certificates since installing solar panels on his Ashland home in October 2011, but so far has sold only one — for $204, far below the $500 to $600 he had been expecting.
“It is interesting that this market seems to be a victim of its own success,” Scherer said. “People have to have their eyes open when they come into the market.”
Solar installations like Scherer’s earn Solar Renewable Energy Certificates for every 1,000 kilowatt-hours of electricity they generate, and electric utilities are required by law to buy enough solar certificates to prove solar power generates a certain percentage of their retail sales.
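The earning rule is simple arithmetic: one certificate is minted for each 1,000 kilowatt-hours (one megawatt-hour) of solar generation, and revenue then depends on the prevailing trading price. A minimal sketch using the figures from Scherer's account above (the kilowatt-hour total is inferred from his four certificates; the function is illustrative):

```python
def srecs_earned(kwh_generated: float) -> int:
    """One SREC is minted per 1,000 kWh (1 MWh) of solar generation."""
    return int(kwh_generated // 1000)

# Scherer's four certificates imply at least 4,000 kWh generated since late 2011.
certs = srecs_earned(4000)
hoped_for = certs * 550      # midpoint of the $500-$600 per SREC he expected
one_sale = 204               # what his single completed sale actually fetched
print(certs, hoped_for, one_sale)  # 4 2200 204
```

The gap between those two prices is the oversupply problem the rest of the article describes.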
Selling the certificates to utilities is a way for people with solar installations to recoup some of the cost of installing the equipment. The certificates provide an incentive.
The problem, according to solar energy proponents, is that the number of solar installations and certificates have grown so much in recent years that the value of the certificates is greater than the amount that utilities are required to buy. The situation undercuts the value of individual certificates, and in a worst case, can make a certificate unmarketable.
The oversupply of certificates, and the resulting drop in trading prices prompted the state Department of Energy Resources to schedule its first auction of unsold certificates, to be held later this year.
The solar credit clearinghouse auction is unique to Massachusetts.
Prices are fixed at $285 per credit — or $300 minus a $15 fee — and bidders bid for blocks of certificates. Unsold certificates will be returned to sellers with a life span extended to three years. Certificates are generally good only for the year in which they were generated.
“The SREC auction has not been held before, since it is only applicable when there is an oversupply. Otherwise people find buyers,” said Dwayne Breger, director of the state agency’s Renewable & Alternative Energy Development Division. “We haven’t had an auction yet, but the [department] is prepared to hold an auction and an auctioneer has been hired to hold one.”
Breger said the state will open an account for unsold certificates to be deposited between May 16 and June 15, with the auction to be held in three rounds in July to coincide with the end of the trading season for the 2012 certificates. Anticipation of the inaugural solar credit clearinghouse auction has led to uncertainty in the market, according to Brad Bowery, chief executive officer at San Francisco-based SRECTrade Inc. “I think the nervousness is based on whether or not the auction clears. If it clears this year and the SRECs are sold at that fixed price, a lot of those concerns will go away,” Bowery said. “There are fears the SRECs will not clear and will be dumped back onto the market. Everyone has a different opinion on it. My opinion is we will know a whole lot more when it happens.”
The state’s requirement that utilities buy solar certificates grows with the supply of solar power, and the energy department sets the requirement at the end of August. For last fiscal year, utilities were required to purchase the equivalent of 81,559 megawatt-hours of solar power.
For this year, the power companies will be required to purchase certificates equivalent to 135,495 megawatt-hours, Breger said.
But SRECTrade’s Bowery said the state calculations are based on 2-year-old data, which leads to a cycle of oversupply and undersupply that will balance itself out over time. Meanwhile, he sees a market with some sellers holding out for the auction price, others willing to sell now at a discount, and buyers in no hurry to do anything until they have to.
“We are going to learn a lot more when they run that last-chance auction,” Bowery said.
The solar certificate trading market opened in Massachusetts in 2010 at a time when there was an undersupply of certificates, prompting the state to establish a ceiling for trading prices that kept certificates trading in the $500 range until last year, when suddenly there was an overabundance of commercial and residential solar projects generating electricity and earning certificates.
Massachusetts prices bottomed out at $175, Bowery said, but rebounded to $206 in most recent trading. He doubts the prices will rise much above $285 any time soon. The price drop caught Ned Ligon by surprise. The retired Latin teacher had planned to pay off the $27,000 solar energy system installed on his Wrentham home in 2010 within seven years, based on federal and state tax credits and rebates, and his quarterly projections of $500 certificate sales. The tax credits took care of the first $13,000, and for a while the solar certificates seemed to be paying off as Ligon had hoped, he said. He received checks for $470.58 and $497.55 in 2011, and two checks for $502.20 last year. However, the most recent check from his broker was for $204.60. “I was happy how it was going and how I was told it was going to go, but, whoa, the bottom dropped out,” Ligon said.
His rooftop array of 18 solar panels was installed by SunBug Solar of Somerville, where company vice president Ben Mayer said homeowners are given projections based on ranges of solar certificate prices that include what has been mistakenly seen as a state-established floor price of $285 — the fixed price for the clearinghouse auction that is ramping up for its debut in July.
“From the point of view of SunBug, our model is based on long-term stable growth,” Mayer said. “I love the wonderful things that have happened in Massachusetts in terms of solar adoption. I’m not so interested in explosive growth. I’m not interested in a boom-bust.”
A solar boom is exactly what Massachusetts has seen since 2007, with solar power installations growing from a total of just 4 megawatts to 194 megawatts of power today, according to the Massachusetts Clean Energy Center.
Mayer and Bowery both said the state is doing a good job keeping the Massachusetts market balanced at a time when other states, like New Jersey and Pennsylvania, have seen more dramatic swings in solar certificate pricing. New Jersey, the biggest certificate market in the country, has seen prices peak at $680 and drop as low as $70, while Pennsylvania has swung from $330 to just $10, Bowery said. Most recent trading has seen solar certificates fetching $90 in New Jersey while Pennsylvania prices are just $12. “I don’t think SREC fluctuation will be a long-term problem. We are experiencing a first-ness, for lack of a better term,” Mayer said. “I’m psyched for August, when we will all see how this plays out. It will stabilize things for long-term growth.”
As for Ligon, a self-described “spreadsheet nut,” he still considers himself to be ahead of the game. He has records of his electric bills dating back to January 1988, when his family moved into the 2,600 square-foot home. Most years, they paid about $700 for electricity, Ligon said, but he and his wife, Barbara, ended up with credits of $128 in 2011 and $138 last year. The empty-nesters donated the profits to their church, since by law they cannot get paid cash by the electric company for credits earned by generating excess power. “My records show we produced over 9,000 kilowatt-hours. That is nine SRECs. I got reimbursed for five of them. I have four more out there waiting to be cashed in,” Ligon said. “I guess I have to be patient and keep my faith in the broker that they are going to do what it takes to get us the best price. They have indicated they are going to be putting the SRECs into the auction.”
Jose Martinez can be reached at Martinezjose1@mac.com.
NOAA Scientists Looking At Link Between Flooding And Climate Change
CBS4's Shaun Boyd talks with scientist Klaus Wolter of NOAA (credit: CBS)
BOULDER, Colo. (CBS4) – A United Nations panel on climate change says the changes can be blamed on human activity.
Top scientists from around the world say people are mostly to blame for rising temperatures since 1951. Some of the best weather and climate scientists in the world are based at the National Oceanic and Atmospheric Administration in Boulder.
It’s ironic and convenient that they are located in Boulder, where the flood did some of its worst damage. They are analyzing all the data to determine whether this was truly a 100-year flood or an event brought on by climate change.
When the rain started Sept. 1 even the best scientific minds didn’t see what was coming.
“I thought maybe 2 inches, maybe 4 inches,” said Klaus Wolter, a CIRES scientist at NOAA.
Wolter lives 3 miles above Jamestown. He drove through the town just 3 hours before the flood hit.
“It’s very pleasant to fall asleep with the rain pattering on the roof and then we had this ugly awakening Thursday morning,” he said.
Every researcher is trying to figure out what happened and what role climate change may have played. It’s too early to know.
“This is the $64 billion question. We have certainly observed that in the last couple of years, not just in the U.S., but globally, we have had an increase in these events, but again the link to climate change is uncertain,” Wolter said.
What he does know is Colorado has seen the same weather pattern before. Weather data from 1938 — the year of another devastating flood — is very similar.
“Mother Nature has it up its sleeve. The more tricky question is, are we going to see more of that? I really think in my lifetime it could happen again,” he said.
Wolter said he hopes the next time it happens it’s winter and Colorado gets snow and not floods and mudslides. He said it looks good for an early ski season.
On Thursday Wolter received an award from the governor’s office for his research on drought. He said floods and droughts often follow each other.
Colorado Floods: How To Help
The recent floods are impacting families and communities throughout Colorado, so CBS4 has compiled a list of ways you can support the local communities affected by the floods.
Detroit everyman uses DIY moxie to turn his town into a solar mecca
on Oct 6, 2011
Dave Strenski, resident of Detroit exurb Ypsilanti, got it into his head that he would help the local food co-op reduce its bills by installing solar panels on its roof. And he didn't let his complete lack of experience with solar stand in the way. At this point, he's not only put solar on the roof of his co-op and four other buildings, he's also created his own system for monitoring its power output, and has turned his website into a hub for solar DIYers worldwide. | 科技 |
William Shutkin reviews Bronx Ecology and Tilting at Mills
By William Shutkin
on May 14, 2003
These are tough times for environmentalists, what with the Bush administration’s frontal assault on environmental policy, drastic funding cuts and layoffs in state environmental programs, and the aftermath of a war in Iraq fought, in the opinion of many, over our nation’s undying addiction to oil. It’s thus fitting, if somewhat disheartening, that along come two books whose central message is that it’s not easy being green, no matter what the circumstances.
Bronx Ecology, by Allen Hershkowitz. Island Press, 200 pages, 2002.
Allen Hershkowitz’s Bronx Ecology: Blueprint for a New Environmentalism and Lis Harris’s Tilting at Mills: Green Dreams, Dirty Dealings, and the Corporate Squeeze are companion accounts of what happens when an environmentalist, armed with missionary zeal and more than a dash of ego, meets the gritty political reality of New York’s ecologically devastated South Bronx. The environmentalist in question is Allen Hershkowitz, who, as a senior scientist with the Natural Resources Defense Council, spent the 1980s advocating for tougher laws to deal with the country’s mounting solid waste problems.
Then one day in 1992, a new solution dawned on him. He would build the Bronx Community Paper Company, a state-of-the-art paper-recycling mill, on an abandoned, polluted rail yard in the rough-and-tumble Mott Haven/Port Morris section of the Bronx. The facility would not only provide 600 permanent jobs in an area with unemployment rates as high as 75 percent, but would also be a model of “green” development, transforming a 30-acre brownfield site into a low-emission, high-efficiency recycling plant that would help deal with the 10,000 tons of waste paper produced every day by residents and businesses in New York City.
To sell his vision, Hershkowitz needed a community partner to provide credibility and help grease the necessary wheels. That entity was the Banana Kelly Community Improvement Association, a nonprofit developer in the South Bronx, which quickly embraced a partnership with Hershkowitz. With Banana Kelly on board, NRDC moved forward aggressively with the project, retaining Maya Lin, the famed architect of the Vietnam War Memorial, to design the facility and enlisting the support of then-President Clinton, who praised the project in his 1996 book, Between Hope and History.
A sketch of the never-built Bronx Community Paper Plant.
But if the story sounds too good to be true, that’s because it was. The South Bronx Clean Air Coalition, an ad hoc community group, challenged the paper-plant proposal, claiming that pollution from the facility would exacerbate the area’s already poor air quality, another entry in a long catalog of environmental injustices. The coalition had other plans for the site, including reviving the long-defunct rail yard as a bustling inter-modal transportation center.
In 1997, after several years of litigation, a New York appeals court ruled that NRDC and Banana Kelly could proceed with the project. The legal victory, however, was not sufficient to salvage the proposed plant. Mounting costs, fraying relationships among Banana Kelly, NRDC, and their backers, and other problems sank the endeavor in 2000. NRDC vowed never again to try to play the role of developer; Banana Kelly became the subject of intense media scrutiny as the shady financial dealings of its high-profile director, Yolanda Rivera, and her deputies came to light; and Hershkowitz's lofty vision lay in shambles.
Hershkowitz, ever the technocrat, is a better analyzer than storyteller, and thus his account of the debacle reads more like a sustainable-development handbook than a human-interest story. Bronx Ecology does a good job of spelling out the technical and public policy issues underlying the mill proposal, such as industrial ecology (IE). IE is a new model for development that, as its name implies, unites environmental and economic goals by advancing a set of design and production methods that mimic the reuse and replenishing functions of natural systems, resulting in the prevention — rather then the mere control — of pollution and waste. Concepts like IE can be complex, and Hershkowitz does the reader a service by exploring them in a way that is at once thorough and easily understood. Adding to the book’s readability is his own genuine and infectious enthusiasm for the more technical subject matter.
Tilting at Mills, by Lis Harris. Houghton Mifflin, 241 pages, 2003.
Harris, a Columbia writing professor and former New Yorker contributor, is a good storyteller — to a fault. She's too fawning in her appraisal of Hershkowitz and his NRDC colleagues (save for one, an African-American staffer whom Harris takes to task for her conflicting allegiances to NRDC, on the one hand, and the community groups, on the other). She lays the blame for the project's collapse entirely at the doorstep of community groups and the paper mill's financiers, portrayed as preternaturally inept and rapaciously greedy, respectively. But Harris is too facile in her analysis of community politics and the challenge of urban redevelopment; this story is not reducible to a simple morality tale. There are too many variables in play, from complex financial arrangements and long-standing strife between stakeholder groups, to arcane regulations and uncertain environmental risks.
What Hershkowitz’s and Harris’s accounts have in common is that they both fall short in helping the reader understand the larger meaning of the South Bronx tale, which is nothing less than the story of U.S. environmentalism at the beginning of the 21st century. Despite the movement’s proud tradition and many successes, American environmentalism needs a new vision, especially given today’s political climate. As the paper-mill story suggests, environmentalists have learned that they must expand their focus beyond their core, upper-middle class, white constituency and its concern for parks and wilderness to include people of color, the poor, and the inner city. At the same time, environmentalists are beginning to realize that they must do more than simply stop unwanted development, or point out when someone else is doing something wrong. As NRDC tried to do with the South Bronx effort, environmentalists should actively pursue strategies that deliver not only environmental benefits but also jobs and community dollars.
No small feat. As the paper-mill saga shows, trying to bring together communities and cultures, environmental restoration and jobs, is a tall order, even when the initial idea is a good one. For their part, environmentalists at present simply aren’t well adapted to working with diverse constituencies, much less playing the part of industrial developer. The result can be missteps, miscommunication, and mistrust of the kind that befell Hershkowitz and his partners. Most community residents, meanwhile, are too quick to respond to new development proposals with “not in my backyard” reflexes, with the result that even the best of proposals can fall prey to misguided claims or dissembling.
Undertaking a project as ambitious as the South Bronx paper mill might well have been quixotic, as the title of Harris’s book suggests. But transforming two centuries’ worth of industrial development, and the environmental movement bent on thwarting it, won’t happen overnight. Like natural systems, the process will be slow, incremental, and non-linear — a series of failures and mistakes and small successes over time. Such is the nature of evolution, and of social change, which starts with a vision and gets harder from there. But it can be done. It must be done. The future of the South Bronx, if not the planet, depends on it. | 科技 |
Facebook passes Exxon to become 4th most valuable company in the world
Facebook CEO Mark Zuckerberg speaks at Facebook in Menlo Park, Calif., Sunday, Sept. 27, 2015. (Image: Jeff Chiu/Associated Press)
By Seth Fiegerman | 2016-02-01 17:32:26 UTC
Tech is in. Oil is out.
Facebook overtook Exxon Mobil on Monday to become the fourth most valuable company in the world by market capitalization, pushing the oil and gas giant behind not one but four technology giants in the rankings. The social network's market value surged by tens of billions of dollars in recent days after posting a flawless earnings report last week showing continued gains in users and advertising revenue on smartphones and tablets. That same surge has pushed CEO and founder Mark Zuckerberg up in the rankings of the world's wealthiest people.

Exxon, once at the top of the list, has seen its fortunes decline in recent months due to a glut of oil in the market pushing prices down and forcing layoffs throughout the industry. It is now wrestling with Warren Buffett's Berkshire Hathaway to be one of the top five American companies by market capitalization.

As of Monday morning, Facebook had a market value of $323 billion, beating out Exxon Mobil's $316 billion market cap, according to data provided to Mashable by FactSet, a financial research firm.

(Chart: FB market cap data by YCharts)
The new top four, in order: Apple, Alphabet (Google), Microsoft and Facebook. Or FAAM for short, if we're looking for acronyms.
Market caps and stock prices are fickle and ever-changing, but it is more than just a leaderboard of vanity stats. It indicates how a broad swath of Wall Street investors feel about a company's future business growth potential. Facebook, written off by many investors immediately after going public for fear it would not be able to make money off of smaller screens, is now seen as the rare fast-growing technology company that bucks global headwinds and beats Wall Street estimates for revenue growth time and time again. It is increasingly being viewed by investors the way Apple once was — and its value is rising accordingly. Apple, on the other hand, suddenly seems mortal. Its 13-year winning streak is ending for now as iPhone sales slow down. And there is no other clear product in the pipeline to replace its sagging revenue growth. Even its $200 billion cash pile can't seem to convince investors of a bright future ahead.
For that reason, Apple is now on the cusp of losing its title as the world's most valuable company to... Google. Google, fresh with a new name and organizational structure, is seen as having numerous potential products in the pipeline that could fuel growth, a founder as CEO who is willing to make the bets to bring those to fruition, and a newly hired, forceful CFO to rein in expenses along the way to keep the company from going off the rails. If Google impresses Wall Street with its earnings report on Monday afternoon, it may just be the most valuable company in the world by the time you wake up on Tuesday.
Understanding personality for decision-making, longevity, and mental health
January 17, 2013

Extraversion does not just explain differences in how people act at social events. How extraverted you are may influence how the brain makes choices – specifically whether you choose an immediate or delayed reward, according to a new study. The work is part of a growing body of research on the vital role of understanding personality in society.
"Understanding how people differ from each other and how that affects various outcomes is something that we all do on an intuitive basis, but personality psychology attempts to bring scientific rigor to this process," says Colin DeYoung of the University of Minnesota, who worked on the new study. "Personality affects academic and job performance, social and political attitudes, the quality and stability of social relationships, physical health and mortality, and risk for mental disorder."
DeYoung is one of several researchers presenting new work in a special session today about personality psychology at a conference in New Orleans. "DeYoung's research in biology and neuroscience aids in the development of theories of personality that provide explanations for persistent patterns of behavior and experience," says David Funder of the University of California, Riverside, who is the new president of the Society for Personality and Social Psychology (SPSP). "The researchers presenting at this session represent just what personality psychology can achieve and its relevance for important social issues – from how personality affects health to guidance for the new DSM-5."
Personality to understand neural differences
In the new study, DeYoung and colleagues scanned people in an fMRI and asked them to choose between smaller immediate rewards or larger delayed rewards, for example $15 today versus $25 in three weeks. They then correlated their choices and associated brain activity to various personality traits.
They found that extraversion predicts neural activity in a region of the brain called the medial orbitofrontal cortex, which is involved in evaluating rewards. In the task, this region responded more strongly to the possibility of immediate rewards than to the possibility of delayed rewards. "This is a brain region where we have previously shown that extraversion predicts the size of the region, so our new study provides some converging evidence for the importance of sensitivity to reward as the basis of extraversion," DeYoung says.
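To see the structure of that choice concretely, consider how such intertemporal decisions are often modeled. The sketch below uses hyperbolic discounting, V = A / (1 + kD), a standard model in the delay-discounting literature; note that neither the model nor the discount rates are reported in the study itself, so both are illustrative assumptions:

```python
# Illustrative model of the task's choice: $15 today vs. $25 in 3 weeks.
# Hyperbolic discounting is an assumption for illustration, not the
# study's method: subjective value V = amount / (1 + k * delay_days).

def subjective_value(amount, delay_days, k):
    return amount / (1 + k * delay_days)

immediate_amount = 15.0
delayed_amount, delay_days = 25.0, 21  # three weeks

for k in (0.01, 0.05, 0.20):  # larger k = steeper preference for "now"
    v_now = subjective_value(immediate_amount, 0, k)
    v_later = subjective_value(delayed_amount, delay_days, k)
    choice = "take $15 now" if v_now > v_later else "wait for $25"
    print(f"k={k:.2f}: V(now)={v_now:.2f} vs V(later)={v_later:.2f} -> {choice}")
```

A shallow discounter (k = 0.01) waits for the larger reward, while steeper discounters take the immediate one; the study's contribution is linking individual differences of this kind to extraversion and reward-related brain activity.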
More broadly, DeYoung works on understanding "what makes people tick, by explaining the most important personality traits, what psychological processes those traits represent, and how those processes are generated by the brain," he says. "The brain is an incredibly complicated system, and I think it's impressive that neuroscience is making such great progress in understanding it. Linking brain function to personality is another step in understanding how the brain makes us who we are."
Personality to improve health
Researchers are also finding that personality influences health over time. In particular, new lifespan models that measure both personality and health early and late in life, and multiple times in between, are documenting that health is the result not only of genetics and environmental factors but also of changeable personality characteristics.
"Personality develops in childhood and is probably most malleable in childhood," says Sarah Hampson of the Oregon Research Institute. Childhood is when habits first become established, so understanding how differences in personality affect health could point toward positive behaviors that would help children later in life.
For example, in a new study soon to be published in Health Psychology, Hampson and colleagues found that children lower in conscientiousness – traits including being irresponsible and careless – had worse health 40 years later, including greater obesity and higher cholesterol. The study builds on past work showing that more conscientious children live longer.
The data come from more than 2,000 elementary school children in Hawaii who received personality assessments in the 1960s. Funded by the National Institute of Mental Health and the National Institute of Aging, researchers were able to complete medical and psychological examinations for 60% of the original group, who, as adults, agreed to further studies starting in 1998. They found that the children rated by their teachers as less conscientious had worse health status as adults, particularly for their cardiovascular and metabolic systems.
The work could point the way to childhood interventions, Hampson says. "Parents and schools shape personality, and this is our opportunity to support the development of conscientiousness – planfulness, ability to delay gratification, self-control." She adds: "Society depends on such pro-social, self-regulated behavior."
Personality to evaluate mental health care
In the mental health community, researchers have known for some time that personality can greatly influence how patients respond to particular treatments. But until recently, the guidebook for treating mental illnesses – the Diagnostic and Statistical Manual of Mental Disorders (DSM) – has not fully incorporated such personality data.
"The influence of personality psychology has increased as it offers tools and methods that are relevant to solving problems in psychiatric classification, such as ways of developing models of differences among people that are based on data as opposed to clinical speculation," says Robert Krueger of the University of Minnesota, who helped update the soon-to-be-published DSM-5.
"DSM-5 contains a model of personality traits that derives from work in personality psychology and recognizes that specific peoples' personalities can't easily be placed in categorical boxes," he explains. Using this model, a therapist can better tailor treatments for depression, for example, by distinguishing between a patient who is generally agreeable versus one who it typically at odds with other people. "The first person is likely to form a good working relationship with the therapist, whereas the second person is likely to be more challenging and require more effort by taking personality features into account alongside 'particular conditions,'" Krueger says.
The DSM-5 thus shows how personality psychology can be directly applied to mental health issues, Krueger says. "Indeed, DSM-5 may prove to be a watershed moment in the history of psychiatric classification because, more so than ever in the past, its construction was influenced by the methods and findings of personality psychology," Funder says.
Source: Society for Personality and Social Psychology
Android Security Issue Naively Puts Apple in Positive Light
by Michael Essany
Last night, VentureBeat reported on findings from mobile security firm Lookout that a suspicious Android mobile wallpaper app, which has been downloaded in excess of one million times, has been found to collect and ship personal data off to a "mysterious site in China." As a result, a mood is growing within the blogosphere of Apple fans that this sort of thing would be far less common in the Apple family of iDevices, and, further, that the recent concerns about iPhone and iPad security glitches were vastly overblown.
Not so fast.
As highlighted by Lookout Chief Executive Officer John Hering and Chief Technology Officer Kevin MaHaffey, no smartphone, mobile platform or operating system is immune to the growing security threats in the world of smartphones. "That means that apps that seem good but are really stealing your personal information are a big risk at a time when mobile apps are exploding on smartphones," said Hering at the Black Hat security conference in Las Vegas on Wednesday. “Even good apps can be modified to turn bad after a lot of people download it,” MaHaffey added. “Users absolutely have to pay attention to what they download. And developers have to be responsible about the data that they collect and how they use it.”
The app that caused all the trouble originated from "Jackeey Wallpaper," which was uploaded to the Android market for the ostensible purpose of enabling users to pretty-up their phones running the Google Android operating system. And it's not some cheesy app either. According to the VentureBeat report, the app delivered branded wallpapers from the likes of "My Little Pony" and "Star Wars." But the Lookout report found that this particular app "collects a user’s browsing history, text messages, your phone’s SIM card number, subscriber identification, and even your voice mail password. It sends the data to a web site that... is evidently owned by someone in Shenzhen, China." While even the staunchest Apple fans and critics should not wish for these security threats to manifest on any mobile platform, there is a sigh of relief (perhaps naively so) in the Apple community as a result of these findings, which hold that nearly half of all the Android apps analyzed used third party code, while less than one quarter of the studied iPhone apps did the same.
Hering said in a press conference afterward that he believes both Google and Apple are on top of policing their app stores, particularly when there are known malware problems with apps. But it’s unclear what happens when apps behave as the wallpaper apps do, where it’s not clear why they are doing what they are doing.
Lookout logged data from more than 100,000 free Android and iPhone apps for this particular project, one that aims to "analyze how apps behave."
Plasmonic nanostructures could prove a boon to solar cell technology

By Dario Borghino, September 15th, 2013

[Photo: Do "plasmonic nanostructures" hold the key to next-generation solar power? (Photo: Shutterstock/Pavelk)]

Researchers at the University of Pennsylvania have found a way to harvest energy from sunlight more efficiently, with the help of so-called plasmonic nanostructures. The new findings suggest that plasmonic components can enhance and direct optical scattering, creating a mechanism that is more efficient than the photoexcitation that drives solar cells. The development could therefore provide a real boost to solar cell efficiency and lead to faster optical communication.

When photons hit the surface of a solar cell, the energy they carry is absorbed by the atoms of a doped semiconductor. If the energy absorbed is higher than a set threshold, known as the energy gap, then electrons are set free and can be used to generate electricity.

Theoretically, the energy gap can be manipulated to maximize the number of electrons that will be set free by a photon; but setting this threshold isn't straightforward, because some photons carry more energy than others.

Photons in the infrared typically don't carry enough energy to knock electrons off a silicon atom. Red photons carry just enough energy to knock off a single electron, and photons in the blue spectrum and beyond also carry enough to knock off one electron, but the rest of the energy is wasted as heat. This large amount of wasted energy compromises solar cell efficiency.

[Image: Plasmonic nanostructures could significantly enhance the efficiency of solar cells (Image: University of Pennsylvania)]

Building on their previous work, Prof. Dawn Bonnell and colleagues have now demonstrated that there is another way to harvest energy from light – a method that has tested up to 10 times more efficient than conventional photoexcitation and that could greatly improve the efficiency of solar cells and optoelectronic devices that convert light signals into electricity.

The University of Pennsylvania researchers focused on plasmonic nanostructures, materials made from arrays of gold nanoparticles and light-sensitive molecules of porphyrin arranged in specific patterns.

When a photon hits these structures, it generates an electrical current that moves in a direction controlled by the size and the layout of the gold particles. By controlling and enhancing the way light scatters across them, these nanostructures can transduce light into electricity more efficiently than was previously possible. The freed-up electrons can then be extracted from the plasmons and used to power molecular electronic and optoelectronic devices.

Since their first results in 2010, the researchers led by Prof. Bonnell had suspected that their method could lead to significant increases in performance, but they couldn't prove it. Now, in this new study, they managed to do just that.

"In our measurements, compared to conventional photoexcitation, we saw increases of three to 10 times in the efficiency of our process," Bonnell says. "And we didn't even optimize the system. In principle you can envision huge increases in efficiency."

"Light impinges on an array of metal nanoparticles connected with optically active molecules, and the resulting current is detected across the array," Prof. Bonnell tells Gizmag. "This process can produce more electrons in principle. You can imagine building energy harvesting devices made of the nanoparticles and organic molecules, or you could envision putting the nanoparticles into a silicon solar cell."

The nanostructures can be optimized for specific applications by changing the size and spacing of the nanoparticles, which would alter the wavelength of light to which the plasmon responds, in the same way that multi-junction solar cells are built to absorb photons of different wavelengths more effectively.

Applications could include the more efficient transduction of optical signals (e.g. from fiber optics) into electrical signals and, of course, more efficient solar cell technology. "You could imagine having a paint on your laptop that acted like a solar cell to power it using only sunlight," Bonnell says.

Promising as they may sound, we must remember that these results are largely theoretical; through this study the researchers have shown that generating electricity using plasmonic nanostructures can be more efficient than by using standard photoexcitation, but there's no telling how soon devices exploiting this principle could reach mass production, or even what kind of actual efficiency gains they could bring.

The results were recently published in the journal ACS Nano.

Source: University of Pennsylvania
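To make the energy-gap arithmetic from the top of this piece concrete, here is a minimal sketch. The silicon band gap of roughly 1.1 eV is an assumed reference value, since the article itself quotes no numbers:

```python
# Photon energy vs. a semiconductor's energy gap.
# E = h*c / wavelength; in convenient units, h*c ~= 1239.84 eV*nm.
# The ~1.1 eV silicon band gap is an assumed reference value.

H_C_EV_NM = 1239.84
SILICON_GAP_EV = 1.1

for label, wavelength_nm in [("infrared", 1400), ("red", 650), ("blue", 450)]:
    energy_ev = H_C_EV_NM / wavelength_nm
    if energy_ev < SILICON_GAP_EV:
        verdict = "below the gap: no electron freed"
    else:
        wasted = energy_ev - SILICON_GAP_EV
        verdict = f"frees one electron, wasting ~{wasted:.2f} eV as heat"
    print(f"{label} ({wavelength_nm} nm): {energy_ev:.2f} eV -> {verdict}")
```

Running this shows the pattern the article describes: a 1400 nm infrared photon (~0.89 eV) falls below the gap, while a 450 nm blue photon (~2.76 eV) frees only one electron and sheds the rest as heat — the waste that plasmonic approaches aim to reduce.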
DOI proposes new development, protections in Arctic
Date: 14-Aug-12 Country: USA Author: Yereth Rosen
The Department of Interior on Monday proposed a mixture of new oil and gas development and environmental protections in a vast swathe of Arctic land.

The department said its preferred alternative for managing the National Petroleum Reserve-Alaska calls for about half of the Indiana-sized land unit to be opened to oil and gas leasing. Other areas important to polar bears, seals, migratory birds and other wildlife would be protected from development.

The proposed plan was welcomed by environmental activists, but drilling supporters said they were unhappy. "What we want to do is make sure that we don't mess it up," Interior Secretary Ken Salazar said at an Anchorage news conference.

The 11.8 million acres that would be available for leasing hold an estimated 549 million barrels of economically recoverable oil and 8.7 trillion cubic feet of economically recoverable natural gas, according to the Department of Interior.

It also allows for a pipeline to cross the reserve - even in designated protected areas - should commercial quantities of oil be discovered in offshore areas of the Chukchi Sea, Salazar said. Oil from the Chukchi would have to be transported overland to the Trans Alaska Pipeline System, he said. No pipeline route is selected, and details about a pipeline would be subject to future analysis, he said.

Selection of a preferred alternative comes nearly four months after the BLM issued a draft management plan for the petroleum reserve. The draft plan was the first document issued by any government agency to outline a management strategy for the entire 23 million acre reserve, Salazar said. A final plan is expected to be issued later this year, he said.

The reserve was established in 1923 by President Warren Harding. It was intended as a source of petroleum for the nation's military forces. Exploration efforts there date back to the 1940s, but there has never been any commercial production from the vast land unit. However, there have been recent oil discoveries in the northeastern section of the reserve, the area closest to existing oilfield infrastructure. ConocoPhillips and partner Anadarko Petroleum are planning development of a field called CD-5 that would provide the first-ever commercial production of oil from the reserve.

The preferred alternative does not specify a leasing schedule. However, at the direction of President Obama, the Bureau of Land Management last year launched a program of annual lease sales in the northeastern portion of the reserve, considered the most feasible for development in the near future. Last year's lease sale drew $3.6 million, much of that from ConocoPhillips, which has been the most active company in the reserve. The BLM plans another lease sale in November.

Environmentalists hailed the Interior's choice of a preferred management alternative. "The secretary's proposed action is an important step in the right direction for all Americans, including Alaska Natives, sportsmen, and other conservationists who want to balance energy exploration with wildlife protection in Alaska's spectacular western Arctic," said Ken Rait, director of Pew's Western Lands Initiative.

"Today, the Obama administration picked the most restrictive management plan possible," Senator Lisa Murkowski, an Alaska Republican, said in a statement. The plan would put "half of the petroleum reserve off limits," she said. "This decision denies U.S. taxpayers both revenue and jobs at a time when our nation faces record debt and chronic unemployment."

(Reporting by Yereth Rosen; Editing by Michael Urquhart)
Sleepy fruit flies provide clues to learning and memory
(Philadelphia, PA) - Researchers at the University of Pennsylvania School of Medicine have discovered that a brain region previously known for its role in learning and memory also serves as the location of sleep regulation in fruit flies. Through further examination of this brain structure, researchers hope to shed light on sleep regulation and its role in memory.
Despite its importance in everyday human function, very little is known about the regulation of sleep. In search of the underlying brain region responsible for sleep regulation, senior author Amita Sehgal, PhD, Professor of Neuroscience and a Howard Hughes Medical Institute (HHMI) Investigator, and colleagues turned their attention to the fruit fly.
"Fruit flies and humans share similar resting patterns," explains Sehgal. "Like humans, the sleeping states of fruit flies are characterized by periods of immobility over a twenty-four hour period, during which the fruit flies demonstrate reduced responsiveness to sensory stimuli."
By tinkering with the gene expression of multiple regions of the fruit fly brain, the research team was able to zero in on the adult mushroom body as the sleep center of the brain. They reported their findings in last week's issue of Nature.
To locate the brain region involved in sleep regulation, Sehgal manipulated the activity of an enzyme known as protein kinase A (PKA). Previous work in Sehgal's lab revealed that the higher the level of PKA activity, the lower the period of immobility, or sleep, in the fruit fly. By building upon this work, Sehgal and others set out to increase PKA activity in various regions of the brain and examine the subsequent sleeping patterns in the fruit flies. "Sleeping fruit flies" were defined as those that remained immobile for at least five minutes.
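As a rough illustration of that five-minute rule, here is a minimal sketch of how sleep might be scored from per-minute activity counts. The input format is hypothetical, not the lab's actual analysis pipeline:

```python
# Score fly "sleep" as immobility bouts lasting at least five minutes.
# Hypothetical input: one activity count per one-minute bin, where 0
# means the fly did not move during that minute.

def sleep_minutes(activity_per_min, threshold_min=5):
    """Total minutes spent in immobility bouts >= threshold_min long."""
    total = run = 0
    for count in activity_per_min:
        if count == 0:
            run += 1
        else:
            if run >= threshold_min:
                total += run
            run = 0
    if run >= threshold_min:  # close out a bout at the end of the trace
        total += run
    return total

# A 4-minute quiet spell (not sleep) followed by a 7-minute bout (sleep):
trace = [3, 0, 0, 0, 0, 2, 1, 0, 0, 0, 0, 0, 0, 0, 4]
print(sleep_minutes(trace))  # -> 7
```

Only the second quiet stretch counts as sleep under the rule: the four-minute spell falls below the threshold and is scored as mere rest.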
"From the beginning, we took the unbiased approach," explains Sehgal. "We targeted PKA activity to different areas of the fly brain to find out where PKA acts to regulate sleep."
Sehgal was able to selectively turn on PKA activity in a variety of brain locations, which promoted PKA expression in designated regions. Of the different regions targeted, only two regions, both present in the adult mushroom bodies, led to changes in sleeping patterns of fruit flies. The fly mushroom body has been likened to the human hippocampus. The changes in sleep caused by the increased PKA activity in the adult mushroom bodies highlighted this region as the sleep-regulating region of the fruit fly brain.
When PKA activity was expressed in one of the two distinct regions of the mushroom bodies, increased sleep occurred while expression in the other region decreased sleep in the flies. Thus, the adult mushroom bodies possess both sleep-promoting and sleep-inhibiting areas.
"Although people typically think of mushroom bodies as possessing similar functions to the human hippocampus, the site where long-term memories are made, our lab tends to think of the mushroom bodies functioning more like the thalamus, the relay station through which most sensory input to the brain is targeted," explains Sehgal. "Previous research links the thalamus to a role in human sleep." (There is no human structure that is anatomically similar to the adult mushroom bodies of fruit flies.)
Identifying the role of adult mushroom bodies in sleep may offer insight into how and why sleep is needed to assist in learning and memory consolidation. In mammals, sleep deprivation suppresses the performance of learned tasks, and sleep permits memory consolidation.
Distinct anatomical regions of adult mushroom bodies have been shown to be important for at least some forms of memory in fruit flies.
In a paper also published last week in Current Biology, Sehgal and colleagues showed that serotonin affects sleep in fruit flies by acting at the site of the adult mushroom bodies.
Sehgal's lab reduced the function of three types of serotonin receptors in the brains of fruit flies (5HT1A, 5HT1B, and 5HT2). The reduced 5HT1A receptor activity in the fruit flies led to fragmented and reduced overall sleep. In essence, the fruit flies tossed and turned in their sleep. But, the flies with reduced 5HT1B and 5HT2 receptor activity displayed no change in their sleeping pattern. Penn researchers were able to treat the fruit flies to a good night's sleep by administering serotonin to the adult mushroom bodies.
The finding that serotonin plays a role in increasing sleep in fruit flies offers hope for the future of therapeutics for sleep disorders. "Serotonin may also promote sleep in humans," suggests Sehgal. "This may explain why serotonin-increasing antidepressants increase sleep."
Future work by Sehgal's lab will attempt to look for a connection among sleep, serotonin, and learning, and memory, while looking deeper into the cellular and molecular activity that enables mushroom bodies to regulate sleep.
Coauthors of the Nature study are William J. Joiner and Amanda Crocker, both from Penn, and Benjamin H. White, from the National Institutes of Health. Coauthors of the Current Biology study are Quan Yuan and William J. Joiner, both from Penn. These studies were funded by the Howard Hughes Medical Institute, the National Sleep Foundation and by the National Institutes of Health.
This release and related images can also be seen at: www.uphs.upenn.edu/news.
PENN Medicine is a $2.9 billion enterprise dedicated to the related missions of medical education, biomedical research, and high-quality patient care. PENN Medicine consists of the University of Pennsylvania School of Medicine (founded in 1765 as the nation's first medical school) and the University of Pennsylvania Health System.
Penn's School of Medicine is ranked #2 in the nation for receipt of NIH research funds; and ranked #4 in the nation in U.S. News & World Report's most recent ranking of top research-oriented medical schools. Supporting 1,400 fulltime faculty and 700 students, the School of Medicine is recognized worldwide for its superior education and training of the next generation of physician-scientists and leaders of academic medicine.
The University of Pennsylvania Health System includes three hospitals [Hospital of the University of Pennsylvania, which is consistently ranked one of the nation's few "Honor Roll" hospitals by U.S. News & World Report; Pennsylvania Hospital, the nation's first hospital; and Penn Presbyterian Medical Center]; a faculty practice plan; a primary-care provider network; two multispecialty satellite facilities; and home care and hospice. | 科技 |
Complementary Chemistry and Matched Materials
Brookhaven Lab researchers use history's most successful matchmaker to pair up particles and create new materials with desired properties.
Photo courtesy of Brookhaven National Laboratory
DNA linkers allow different kinds of nanoparticles to self-assemble and form relatively large-scale nanocomposite arrays. This approach allows for mixing and matching components for the design of multifunctional materials.
DNA may be history's most successful matchmaker. And recently, researchers at the Office of Science's Brookhaven National Laboratory (Brookhaven Lab) coupled the complementary chemistry of DNA with some serious science savvy to create a new method for pairing up particles; a technique that may lead to the creation of many new materials with great potential.
DNA consists of four chemical bases which match up in pairs of A-T and G-C. The matches are complementary and quite specific – for instance, A only pairs with T, almost never C or G. The same is true for the others.
Brookhaven Lab researchers, led by physicist Oleg Gang in its Center for Functional Nanomaterials (CFN), used that precise pairing ability to match up materials in new and predictable ways. Namely, the team attached single strands of synthetic DNA to tiny particles (nanoparticles) of a few different substances – including gold with palladium, iron oxide and a couple of others – trying a variety of different pairings. Those DNA strands (linkers) could only pair up with their complements – for instance, a strand of A, G, G, T would only pair with a strand of T, C, C, A – which meant that the particles to which those strands were attached would also be precisely matched.
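The pairing rule doing the work here is simple enough to sketch. The toy example below follows the article's position-wise illustration (A, G, G, T pairing only with T, C, C, A) and ignores strand orientation, which real hybridization chemistry would add:

```python
# Watson-Crick pairing: A-T and G-C. Two DNA-tagged particles can bind
# only if their linker strands are exact complements of each other.

PAIR = {"A": "T", "T": "A", "G": "C", "C": "G"}

def complement(strand):
    return "".join(PAIR[base] for base in strand)

def linkers_match(a, b):
    """True if strand b is the position-wise complement of strand a."""
    return complement(a) == b

print(linkers_match("AGGT", "TCCA"))  # True  (the article's example)
print(linkers_match("AGGT", "TCCG"))  # False (one mismatched base)
```

Because a mismatch at even one base breaks the pairing, the linkers act like addresses: only particles carrying complementary strands find each other, which is what makes the assembly both specific and predictable.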
That technique allowed researchers to pair up even seemingly incompatible particles—for example, ones that might typically experience competing forces such as electrical or magnetic repulsion. The attractive force drawing complementary DNA strands together overcame the resistance, causing the particles to assemble themselves into large, three-dimensional lattices. This approach allowed researchers to build new materials with specificity and predictability, and altering the length of the DNA linkers also allowed researchers to control other properties of the new materials such as surface density.
As a consequence, the new technique might spare researchers some of the trial and error of a typical search for new materials. Even more importantly, as Dr. Gang said, "It offers routes for the fabrication of new materials with combined, enhanced, or even brand new functions."
For instance, researchers might use the method to develop new switches and sensors, which could be used in everything from chemical detectors to combustion engines. Scientists at Brookhaven Lab are already developing nanoparticles that could serve as better catalysts for hydrogen fuel vehicles and reduce the carbon monoxide emissions of more conventional conveyances. So the new technique could eventually lead to newer and even better catalysts, which might prove essential in other transportation technologies.
Ultimately, researchers at Brookhaven Lab – and those across the Office of Science – hope to solve the grand challenge of designing and then creating new forms of matter with precisely tailored properties. Will they succeed? Perhaps one day. Discovery and innovation is what they do: You might say it's in their DNA.
The Center for Functional Nanomaterials is one of five DOE Nanoscale Science Research Centers (NSRCs), national user facilities for interdisciplinary research at the nanoscale, supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE's Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, Sandia, and Los Alamos National Laboratories. For more information about the DOE NSRCs, please visit http://science.energy.gov.
The Department's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information please visit http://science.energy.gov/about. For more information about Brookhaven Lab, please go to: http://www.bnl.gov/world/.
Charles Rousseaux is a Senior Writer in the Office of Science, Charles.rousseaux@doe.science.gov.
Sponsored content is a special advertising section provided by IT vendors. It features educational content and interactive media aligned to the topics of this web site.
Should You Adopt a Cloud First Strategy?
by Larry Freeman, NetApp
The concept of a "Cloud First" strategy began in the U.S. federal government and has since spread into the commercial sector. Is it a viable option for your organization? Here are some success stories worth considering:

Cloud First was a mandate issued to all federal agencies by the chief information officer of the United States in December 2010. By 2012, a subsequent report to Congress showed that more than half of all federal agencies had adopted cloud computing for at least one application. The Cloud First mandate was a follow-on to an earlier government IT initiative: the Federal Data Center Consolidation Initiative (FDCCI). To date, FDCCI has tracked 1,000 federal data centers that have closed or are scheduled to close in 2014; many of these closures are a direct result of cloud-delivered applications.

Despite the apparent government success, Cloud First has its share of detractors, including those critical of the trouble-prone launch of HealthCare.gov. Detractors cite studies indicating that agencies don't seem to move to the cloud fast enough. They also tend to blame slow adoption on a lack of federal technical expertise in cloud deployments.

Cloud Success in the Government Sector

By now, there are many examples of productive cloud deployments within the federal government, including the following examples cited by MeriTalk:

- Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF): Cloud platforms don't have to be implemented at huge agencies to be productive. The ATF, with 4,700 employees, was able to save $1 million a year after it moved email to the cloud.
- U.S. Department of Agriculture (USDA): This department with more than 100,000 employees also moved email to a public cloud. At the same time, it ran an internal private cloud enhanced by other vendor solutions. In its first three years, this internal/external cloud construct saved the USDA $75 million. The organization anticipates eventually reaching $200 million in savings.
- The Recovery Accountability and Transparency Board (RATB): The RATB was created by the American Recovery and Reinvestment Act of 2009 to track stimulus spending. After migrating to Infrastructure as a Service on a public cloud, the RATB was able to save $854,800 in two years.
Cloud Success in the Commercial Sector

In the commercial sector, one company that made a bold move into the cloud is Veyance Technologies, a 6,000-person manufacturing firm based in Ohio. Typical for a company of its size, the Veyance IT group supported about 120 applications running a variety of operating systems on multiple servers in various hosted locations. According to CIO John Hill, Veyance wanted to move away from its siloed, multi-local architecture. The following diagram, while not specific to Veyance, represents a traditional IT infrastructure and illustrates how application and data management silos can develop.

(Diagram source: NetApp)
According to Hill, Veyance IT wanted to replace the silos to enable the company to operate as a "true global entity." Knowing that a traditional IT infrastructure would limit this ability, Hill embarked on his own type of Cloud First campaign. He began to move Veyance applications to Virtustream, a cloud-based service provider that specializes in business applications such as SAP®. He used every form of application migration available: virtual-to-virtual, physical-to-physical, physical-to-virtual, and even brute-force heterogeneous data migration. Dozens of vendors and nearly 100 workers accomplished this feat in less than seven months, on budget and with no unplanned downtime.

The full cloud migration delivered impressive results: a 30% reduction in hosting costs and reduced equipment, licensing and labor costs—all with a substantial capacity increase. More important, when the information silos were removed, Veyance was able to streamline and coordinate global operations and improve overall productivity and collaboration.

Guidance for Moving Forward in the Cloud

Such cloud use cases, in both the federal and commercial sectors, lend credence to embracing a Cloud First strategy. If you are contemplating a similar path to the cloud, here are some tips from Hill's experience at Veyance:

- Inventory is key: Hill stresses the need to "know your environment." This includes performing an extensive inventory to define dependencies or unknowns.
- Choose providers with care: Make sure your service provider is "enterprise-focused" and provides "robust support" for various enterprise cloud computing scenarios, according to Hill.
- Look for expertise in migration and colocation: Work with providers who can manage complex migrations and can colocate workloads that can't be moved to the cloud.
- Standardize: Veyance successfully standardized on server, storage and networking technologies that were both highly scalable and high-performance.

Cloud First is one approach to embracing the cloud. There are many other paths to cloud success. Whichever path your company takes, having the right partners is key. For more information about how NetApp and its partners helped Veyance succeed in its transition to cloud services, watch this short video or download the case study.
Contact: Board on Research Data and Information, Policy and Global Affairs Division, The National Academies of Sciences, Engineering, and Medicine, 500 Fifth Street, NW, Washington, DC 20001, USA. Email: gstrawn@nas.edu. Phone: (202) 334-2616.
U.S. – China Roundtable on Scientific Data Cooperation
STATEMENT OF TASK
U.S. National Committee for CODATA, Board on International Scientific Organizations, National Academy of Sciences, U.S., and Chinese National Committee for CODATA, International Cooperation Bureau, Chinese Academy of Sciences

The U.S. National Committee for CODATA and the Chinese National Committee for CODATA, under the auspices of their respective Academies of Sciences, will organize a U.S. – China Roundtable on Scientific Data Cooperation, which will convene a series of meetings over an initial four-year period pursuant to the following Statement of Task:
Provide a unique bilateral forum for government, academic, and private-sector stakeholders in the United States and China to discuss and address scientific data practices and policies, pursuant to a mutually agreed agenda.
Serve as a catalyst and coordinating body for bilateral cooperation on scientific data practices and policies at the Academy and national level in each country, with appropriate recognition and representation of other thematically related bilateral and international activities.
The four areas identified for framing the scope of discussion include: data policy, cyber infrastructure data applications, health and biomedical data, and environmental and geospatial data. The following types of possible cooperation initiatives in these four areas are suggested for discussion and potential implementation:
1) Exchange of information and identification of issues concerning scientific data activities, policies, and developments in intellectual property law and public information policies, including barriers to data exchange at the national and international levels, which may have implications for database development, access, sharing, and use.
2) Identification of scientific data and information resources that might be translated and made more widely available in our respective countries.
3) Identification of mutual high-priority databases in both countries that should have either mirror sites or subsets of their contents established in each other's country, and determination of how to implement that.
4) Promotion of opportunities for both senior and junior scientists and engineers to visit each other’s countries for various periods of time to learn about each other’s scientific database activities and to engage in cooperative research in select areas.
5) Promotion of opportunities for university students to visit and study in each other’s universities and research centers in areas within CODATA’s scope of activities.
6) Exploration of the possibility for joint projects in scientific database development, studies, or training, including topics such as common standards and interoperable systems and techniques; metadata management practices; clearinghouses and portals for data resources; and other topics by mutual agreement.
A steering committee for planning and overseeing the Roundtable will be formed by the collaborating organizations.
By California Academy of Sciences October 9, 2013
Click to share on Facebook (Opens in new window)Click to share on Twitter (Opens in new window)Click to share on Pinterest (Opens in new window)Click to share on LinkedIn (Opens in new window)Click to share on Google+ (Opens in new window)Click to email this to a friend (Opens in new window) PRINT
To respond to the Do Now, you can comment below or tweet your response. Be sure to begin your tweet with @KQEDEdspace and end it with #DoNowExtinct
For more info on how to use Twitter, click here.
If we are able to bring back extinct animals, does that mean we should proceed with de-extinction? Is it ethical? Should humans even be concerned with this?
In recent years, scientists have discovered carcasses of frozen woolly mammoths with intact tissues and preserved DNA. With this DNA, and that from other extinct animals, researchers are trying to actually clone the extinct animals and bring them back to life. Current technologies are on the verge of making this possible. So far, the closest we’ve come is to de-extincting the passenger pigeon, thanks to frozen DNA samples and the DNA of its closest relative, the band tailed pigeon.
Scientists are passionate about de-extinction and have been working hard on doing this research because they believe that bringing back animals that were killed off due to human impact would potentially help to right a historical wrong. Some feel this would somewhat undo the harm and give humanity a chance at redemption for being the main cause of these animals’ unfortunate permanent disappearance. Based on these possible future discoveries, this knowledge can also help us prevent future extinctions.
However, even though we may be capable of producing a viable specimen of an extinct animal, there are many other implications of the process. Some scientists oppose the idea of de-extinction, because they believe that it is a waste of time, money, and effort. The time factor in the cloning process is important because the procedure is based on trial and error.
There are many other political and ethical factors to be aware of if the de-extinction process is confirmed doable. A big issue is the environment. Bringing back the species with DNA samples is not that hard, but bringing back the exact environment and ecosystem it once lived in proves to be much more difficult. Another scary thought—what if the species we bring back turns out to be invasive? Scientists are also worried about the public, and how they will see this process. Upon seeing the wonders of species revival, will they deem it unimportant to preserve species that live on Earth today, as we can merely bring them back again in the future because of technology? Will people will no longer worry about wiping out plants and animals and saving the environment, due to the amazing newly found de-extinction processes? We hope not. Some scientists suggest that de-extinction would actually revive an awe in nature, and it could enrich conservation efforts. They believe that it would drive human interests in species loss as well as call more attention to species revival.
The American Museum of Natural History video: The Science Behind De-extinction
Fossils of dinosaurs, mammoths, and saber-toothed cats on display on the Museum’s fourth floor are impressive and imposing specimens of animals that once roamed the Earth, then vanished during mass extinctions at the end of the Cretaceous and Pleistocene eras. In the not-too-distant future, scientists expect that technological breakthroughs—and availability of genetic data from specimens of extinct species—will provide ways to revive vanished species. In this video, Museum Curator Ross MacPhee discusses the science and ethical considerations of “de-extinction.”
We encourage students to reply to other people’s tweets to foster more of a conversation. Also, if students tweet their personal opinions, ask them to support their ideas with links to interesting/credible articles online (adding a nice research component) or retweet other people’s ideas that they agree with, disagree with, or find amusing. We also value student-produced media linked to their tweets, like memes or more extensive blog posts, to represent their ideas. Of course, do as you can… and any contribution is most welcome.
SciShow video: Resurrection Biology: How to Bring Animals Back From Extinction
We’ve all seen the movies and heard the hype: but is it really possible to bring back animals that have gone extinct? If so, how? And how soon? And can I have a mammoth to ride around in my backyard? Hank explains the latest research into resurrection biology, and ponders questions that include not only “Can we?” and “How do we?” but also: “Should we?”
KQED Forum episode: Science Could Soon Bring Species Back to Life
Can, and should, we bring species back from extinction? Advances in biotechnology may enable us to revive the passenger pigeon, the great auk, and even the woolly mammoth — and help restore biodiversity and genetic diversity in the process. But critics say that de-extinction efforts distract from important conservation priorities like combating habitat destruction and saving existing species. We discuss the issue.
NPR radio segment: It’s Called ‘De-Extinction’ — It’s Like ‘Jurassic Park,’ Except It’s Real
Sorry to disappoint, but science writer Carl Zimmer says we’re not going to bring back dinosaurs. But, he says, “science has developed to the point where we can actually talk seriously about possibly bringing back more recently extinct species.” It’s called “de-extinction” — and it’s the subject of Zimmer’s article for National Geographic’s April issue.
KQED Do Now Science is a monthly activity in collaboration with California Academy of Sciences. The Science Do Now is posted every second Tuesday of the month.
This post was contributed by youth from the Spotlight team within The California Academy of Sciences’ Careers in Science Intern Program. CiS is a multi-year, year-round work-based youth development program for young people from groups typically under-represented in the sciences.
Comments (14)
Alvina Nguyen
If we were able to bring back extinct animals, that doesn’t necessarily mean that we should. Let’s say that we bring back the woolly mammoth. We wouldn’t know if it’s truly 100% a woolly mammoth. Why would we bring it back? For what purpose will it be able to serve us? I don’t think that we should bring back extinct animals. I don’t think humans should even be concerned with this, because what’s gone is... well, gone. I don’t think we would even be able to create the correct environment for the animal. So if we have all these extinct animals back, where would they go?
tellio
Your points are well taken. De-extincting animals is beset with problems, but if your argument is that we should not do something because it is difficult and expensive, then I think you must know that almost anything worth doing, like pure research, is difficult and expensive. Human genome? Particle accelerators? Health care legislation? All of those projects have been beset with even larger problems than de-extinction. I argue that if the problem has already been defined, with proponents and opponents, then the human will and imagination have already been engaged. Too late to argue from your point of view. Now… as to whether we should or not, I think that is still open to question.
Safaa Jamshed
I think that the scientists should not proceed with the de-extinction because I don’t think there is a purpose to bring them back. If we do, what would we do with them? Where would we keep them? We cannot provide the perfect environment for them. If we cannot take care of them and if there is no reason to bring them back, I think we should leave them alone.
-Miss Chen’s Student
Aabid Jamshed
De-extinction: the act of bringing an extinct species back to life through cloning. It sounds interesting, but is it moral? That would depend on how it would affect the ecosystem around it. For example, bringing back a predator that would eat a predator that kept the animal population in check could have serious consequences. Bringing back a species of extinct bird of prey could potentially help out with rodent issues in the area. In the end, it all boils down to how de-extinction will affect the environment.
Ms. Chen’s Student
Scientists should not use de-extinction as a way to right the wrongs done throughout history. Bringing back extinct species would disrupt the ecosystems they would be introduced to, killing off prey and other species and severely affecting the other organisms whose populations are controlled by the balance of predators and food sources. What if the revived species causes another one to become extinct? Would we have to continue trying to de-extinct species that aren’t compatible with other populations?
Mara Duran
My stance on this issue is that we shouldn’t bring back animals that have gone extinct or make new animals. Bringing back extinct species means we have to have the proper environment, and we need to know a lot about these animals. We cannot do this, due to the fact that we didn’t live around them and have never interacted with them. Redemption was used as an excuse to bring back animals from the dead, but I believe this is an unintelligent idea. Would we bring back Hitler from the dead just to apologize to the thousands of people he murdered? No. It’s a waste of time. What’s done is done. Also, we don’t know the animal’s behavior, and this could be a threat to us all. It wouldn’t be okay to do this because the animal could attack us and probably cause environmental issues. I also agree this is a waste of time and money.
Kevin Bettencourt
I don’t think we should, because what would we do with these extinct species? Also, couldn’t they bring back a disease that could wipe out ecosystems?
Luke Bird
What is de-extinction? It is bringing back a species that has gone extinct over the years. Bringing back a species that is small, and will fit right into our environment, would be okay. But a big species such as the woolly mammoth will not work. Where could these ginormous animals live? Would their hunting or eating endanger other species living on this Earth? As species leave this Earth, new ones occupy their space, and eat the food they once did. I believe certain animals have left this Earth for a good reason; we may not know why, but that is better than bringing them back and regretting our decision. We do not know what would happen if these species were brought back, and it is better not to know.
Pamela Solano
De-extinction sounds cool: having the ability to bring back animals that have been extinct for years. However, there is a lot to be cautious about, like whether they will be able to adapt to our environment. The environment they once lived in has drastically changed over the years because of global warming and the harm we have done to our earth. Also, would they be able to survive? Would they have enough food to live on? Bringing back such a large animal is a little questionable. Where would they migrate to? How do we know they won’t be a threat to our environment? Although there are many open questions, imagine how much research could be done. We could literally go back in time and study these animals: their behavior, culture, their abilities, their brains. It would be a great project to try. I would love to know how the animals of the past lived, what they did in their everyday lives, and how they communicated. I think we could take the risk and bring back some of the extinct animals from the past.
Jared Lee
De-extinction is a remarkable process made possible by modern technology, but I feel like it shouldn’t be done. Even some scientists believe that it is a great waste of time. If the process does work, who can tell what the side effects could be, who would take the blame, what could happen, or how we would control the beast? Too many people are at risk, and there are too many problems that could happen.
Jose Zamudio
I believe that de-extinction is not a good idea because, like the article said, “it is a waste of time, money, and effort.” It is not a good idea because by bringing back extinct species we would have to recreate their environment, and that is very hard. De-extinction is a big advance in science, but it is not a very good idea to do it.
@bchs12chem #DoNowBringSpeciesBacktoLife
Dodos for life….
De-extinction should not occur. As intriguing as it sounds to be able to see extinct animals roaming the earth, it is not worth it. The consequences of de-extinction are unpredictable, and that’s a scary thought.
The California Academy of Sciences is a leading scientific and cultural institution based in San Francisco. It is home to an aquarium, planetarium, natural history museum and research and education programs, which engage people of all ages and backgrounds on two of the most important topics of our time: life and its sustainability. Founded in 1853, the Academy’s mission is to explore, explain and sustain life. Visit www.calacademy.org for more information.
Shutdown Forces Antarctic Research Into 'Caretaker Status'
By Nell Greenfieldboyce
Oct 8, 2013
The Chalet (right) is the U.S. Antarctic Program's administration and operations center at McMurdo Station.
Reed Scherer / National Science Foundation
Originally published on October 9, 2013 3:44 am

Earlier this week we told you that scientists who do research in Antarctica have been on pins and needles, worried that the government shutdown would effectively cancel all of their planned field work this year. Well, those scientists just got the news they didn't want to hear.

Today, officials at the U.S. Antarctic Program posted a statement online saying they are moving to "caretaker" status at the three U.S. research stations, ships and other assets, and all research activities not essential to human safety and the preservation of property will be stopped.

The National Science Foundation (NSF) is responsible for managing and coordinating the U.S. Antarctic Program (USAP) on behalf of the nation. This includes providing support personnel and facilities and coordinating transportation and other logistics for scientific research. Due to the lapse in appropriation, funds for this support will be depleted on or about October 14, 2013. Without additional funding, NSF has directed its Antarctic support contractor to begin planning and implementing caretaker status for research stations, ships and other assets. The agency is required to take this step as a result of the absence of appropriation and the Antideficiency Act.

Under caretaker status, the USAP will be staffed at a minimal level to ensure human safety and preserve government property, including the three primary research stations, ships and associated research facilities. All field and research activities not essential to human safety and preservation of property will be suspended. As NSF moves to caretaker status, it will also develop the information needed to restore the 2013-14 austral summer research program to the maximum extent possible, once an appropriation materializes. It is important to note, however, that some activities cannot be restarted once seasonally dependent windows for research and operations have passed, the seasonal workforce is released, science activities are curtailed and operations are reduced. NSF remains committed to protecting the safety and health of its deployed personnel and to its stewardship of the USAP under these challenging circumstances.

Most research in this remote, icy continent at the bottom of the world takes place from October to February, when it's warmer and there's enough daylight. Scientists who go down there depend on things like housing and transportation provided by the U.S. Antarctic Program. It supports three research stations, including one at the South Pole, that are staffed year-round.

Update at 6:44 p.m. ET. 'Looking Pretty Bad':

"Wow, it's looking pretty bad right now. I was a lot more optimistic yesterday," John Priscu, a Montana State University biologist who has been to Antarctica about thirty times, told us after he heard the news. He says he was stunned by the announcement and was still trying to understand what this will mean both for his research and the entire field season down there. "I don't think anybody really knows," Priscu says.

"It's a thing that's never happened before," says Peter Doran, a professor of Earth sciences at the University of Illinois at Chicago, who was scheduled to go to Antarctica later this year. "I'm still hoping that something will happen middle-of-the-month in Congress that will turn this around." But he says it's a huge operation to fly people in and out of McMurdo Research Station, and plans can't be made and unmade quickly. "There's things that really they have to do.
There are people still at the South Pole Station that have been there all winter. They need to get visited to have supplies refreshed. They need fuel, all that kind of stuff," says Doran.

Research that could be affected includes biological studies of animals like penguins, as well as astrophysics and studies on the effects of climate change.

Copyright 2013 NPR. To see more, visit http://www.npr.org/.

Transcript:

RENEE MONTAGNE, HOST: This is the time of year when scientists make the long trek down to the bottom of the world to study Antarctica. They have just a few months to do their work before the icy continent sinks into its dark, frigid winter. These researchers are used to dealing with all kinds of hardships - extreme cold, fuel shortages - but this year they've been hit by something unprecedented. The U.S. Antarctic Program says it is stopping most research activities because of the partial government shutdown. NPR's Nell Greenfieldboyce is here to tell us what's going on. Good morning.

NELL GREENFIELDBOYCE, BYLINE: Good morning.

MONTAGNE: Now, how is the shutdown affecting this remote place?

GREENFIELDBOYCE: Well, as you can imagine, the fact that it's so remote means that it's very difficult to get there and move scientific equipment around. So all the logistics like housing and transportation for the scientists are handled by the U.S. Antarctic Program. It runs three research stations and things like ships. That whole program is paid for by the National Science Foundation, which is shut down. So there's a funding crunch. The government contractor for logistics in Antarctica is Lockheed Martin, and Lockheed Martin's support program there is going to run out of money soon, around October 14th. Yesterday, the National Science Foundation announced that they were putting research on hold, pulling people out of Antarctica and going into what's called caretaker status.

MONTAGNE: Well, I want to ask you about caretaker status. I mean, I'm not quite sure what that means. But pulling people out? That sounds really hard.

GREENFIELDBOYCE: Yeah. So caretaker status, that would mean just skeleton crews at the research stations, the bare minimum to keep things going. Everybody else comes back home. And that's a big deal because this is the time of year when activity normally is ramping up. I mean, there's advance teams that have been going to McMurdo Research Station since August getting everything ready for the scientists to arrive. And the research season only runs from October to February. That's basically the Antarctic summer. There are some things that have to get done. For example, there are people who have over-wintered at the South Pole. That station needs to be visited and supplied with things like fuel. But other than that, it's not clear what, if anything, will happen down there this year.

MONTAGNE: So how have the scientists been reacting?

GREENFIELDBOYCE: The ones I talk to are floored. I mean, they're stunned. One person who's worked in Antarctica for 30 years said that he's never seen anything like this and that it looked really bad. The National Science Foundation says that if the shutdown ends and it gets more funding, it will try to resume research. But doing work in Antarctica usually takes a ton of logistics and planning. I mean, you're moving around airplanes and helicopters and ice breakers. Scientists say once you turn all that off, you can't just flip a switch and bring it all back online.

MONTAGNE: Right.
GREENFIELDBOYCE: Some projects probably just won't be able to be restarted. Researchers worry that if they miss their narrow window to go, they're basically going to miss their one chance to see what's happening in Antarctica this year.

MONTAGNE: Well, just briefly give us some idea of what kind of research we are talking about here that might be missed.

GREENFIELDBOYCE: Oh, all kinds of things. I mean, they have telescopes down there for astrophysics. They're studying lakes under the ice sheets, looking for signs of life. They're tracking the effects of climate change. Other countries have programs down there too, like the United Kingdom and Russia, but scientists say the U.S. has the best and biggest Antarctic program in the world. And now it's on hold.

MONTAGNE: OK. So the shutdown hits the Antarctic. NPR's Nell Greenfieldboyce, thanks very much.

GREENFIELDBOYCE: Thank you.

(SOUNDBITE OF MUSIC)

MONTAGNE: This is NPR News.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.
Sikorsky Launches ‘Matrix’ Autonomous Flight Program
by Bill Carey
- August 12, 2013, 4:55 PM
The S-76 Sikorsky Autonomous Research Aircraft (Sara) performed its first test flight on July 26 in West Palm Beach, Fla. (Photo: Sikorsky)
Sikorsky Aircraft is developing a set of hardware and software capabilities to support autonomous flight of unmanned or optionally piloted vertical-lift aircraft. Executives said the goal is to deliver an order of magnitude improvement in the safety and reliability of unmanned aircraft, which experience losses at a rate of once every 1,000 flight hours.
The same Sikorsky Innovations group that developed and flew the X2 compound coaxial helicopter to a speed of 250 knots in September 2010 is behind the so-called “Matrix” program to develop what executives described as a higher level, second generation of autonomous flight capability. The program conducted its first test flight on July 26 of an S-76 fitted with fly-by-wire flight controls and multiple sensors for situational awareness. The S-76 Sikorsky Autonomous Research Aircraft (Sara), which is based at the company’s West Palm Beach, Fla. development flight center, will be joined later this year by a U.S. Army UH-60M Black Hawk helicopter with fly-by-wire flight control, serving as a second testbed. Sikorsky and the Army’s Aviation Applied Technology Directorate have a cooperative research and development agreement to test “unpiloted” autonomous cargo missions.
In a conference call with reporters, Igor Cherepinsky, Sikorsky autonomy chief engineer, said the company is developing software algorithms for “contingency management” and control that would enable a “mission operator” to manage an aircraft instead of a pilot. “They don’t really need to know that it’s a helicopter or a VTOL [vertical takeoff and landing] aircraft,” he said. “To them it’s a tool. What they see in front of them is a ‘mission-centric’ interface. How that gets executed is entirely up to the vehicle.”
Cherepinsky said the level of autonomy that Sikorsky seeks to deliver would represent a next-generation capability exceeding that of the Lockheed Martin/Kaman Aerospace K-Max and the Boeing A160 Hummingbird unmanned helicopters, both developed to carry external sling loads. Sikorsky is funding the Matrix technology demonstration through the end of next year. It will conduct the first unmanned test flight “within a year,” he said.
Executives said the program’s objective is to develop both a set of software applications and a “pallet” of hardware and software systems that could be “ported” or integrated on an existing aircraft or introduced with a new aircraft. An example of an application is the automated approach-to-rig capability Sikorsky already provides in the S-92 for the offshore oil industry. The Matrix technology could be adapted for another manufacturer’s aircraft and for commercial as well as military aircraft.
“What we’re developing is an architecture and a suite of software-based capabilities that are platform-agnostic,” said Teresa Carleton, Sikorsky vice president of mission system integration. “This capability can be inserted into any of our platforms, commercial or military, and we envision that it could be adopted by other platforms as well. It’s really aimed at bringing intelligence and increased reliability to whatever platform it executes on.”
View chart In the horoscope of the United States, Pluto, as it moves through Capricorn,
is opposing the Sun in the natal chart, reflecting major and irrevocable
changes on many levels. This aspect has occurred only once before in
American history, during the period of the Revolutionary War. Although it is
not intrinsically an aspect of war, it challenges the deepest definitions
of what constitutes nationhood, and raises many issues of autonomy and
the way in which the government is structured and how much authority
it may or may not exercise.The deeper issues underlying the Revolutionary War
concerned not only human rights, but also the autonomy of the individual
states comprising the nation, and these issues may once again rise
to the surface as new ways of defining the national identity are proposed.
The national Sun in the 7th house also places great emphasis on
of the bonds America forms with other nations, and here too Pluto's
transformative energy may require many changes and a re-evaluation
of the ways and reasons
why particular nations are sought as partners and others are excluded.
This could potentially be a time of great renewal and a re-establishment
of those high ideals on which the structure of the nation is based.
Pluto, as it completes its transit through Capricorn, will also
return to its
own natal place: in other words, the United States is experiencing
its Pluto return. This suggests the completion of a great cycle,
and a cementing
of the fundamental values on which the Constitution is built. There
may also be a serious reappraisal of issues concerned with the environment,
as Capricorn is an earthy sign, and the use or abuse of natural
resources may become a cause for not only profound concern but also
profound change
and a more enlightened attitude. The enormous resources available
to the United States are reflected by Pluto in the 2nd house of
chart, and it is possible that these will be approached with greater
respect and care than ever before. Whether or not you favour these
changes personally, it would seem that a time has arrived when
there is a great
new opportunity to affirm the values and ideals of the original
founding of the nation, applicable not only to government and to
foreign relations,
but also to the land itself and the resources inherent in it.
Liz Greene, 2005
(Data used: 4 July 1776, 4.50 pm LMT, Philadelphia,
PA; this
is the so-called "Sibly chart", which was originally published in 1787. It is
based, in part, on eyewitness accounts of the signing of the Declaration of Independence.
There is another chart commonly used for the USA, which gives Gemini on the Ascendant,
but there is no historical evidence to support this horoscope.) Astro-Databank | 科技 |
Scientists track solar explosion all the way from the Sun to Earth
Coronal mass ejections are many times larger than Earth and typically contain over a billion tons of matter.
Provided by the Royal Astronomical Society, United Kingdom
Published: Wednesday, April 14, 2010

(a) Image of the disc of the Sun in the light of the hydrogen-alpha spectral line a couple of minutes after the onset of the event, with model magnetic field lines superimposed. (b) A zoomed-in image of the active region concerned, again with a set of model magnetic field lines superimposed. (c) As in (b), but without the model superimposed, showing the detail of the event in the image.
Predictive Science, Inc./Solar Physics

April 14, 2010

An international group of solar and space scientists has built the most complete picture yet of the full impact of a large solar eruption, using instruments on the ground and in space to trace its journey from the Sun to Earth.

Coronal mass ejections (CMEs) are giant eruptions of the Sun's atmosphere from its surface that are ejected out into space. The eruptions are many times larger than Earth and typically contain over a billion tons of matter. CMEs travel away from the Sun at speeds of up to several million miles per hour, and they can impact comets, asteroids, and planets, including Earth.

Our planet is normally protected from CMEs by the terrestrial magnetic field, but the twisted magnetic fields carried by CMEs can break through this protective shield, causing particles to stream down over Earth's polar regions. They also can lead to displays of the northern and southern lights (aurora borealis and australis). But CMEs also can have less appealing consequences such as power outages on the ground, interference with communications, damage to Earth-orbiting satellites, as well as being a possible health risk to astronauts who happen to be conducting a "space walk" at the time an event interacts with Earth.

So scientists came together to study one event in great detail in an attempt to gain an enhanced understanding of CMEs and to gain insight into their prediction and, more importantly, when and how they may interact with and cause effects on and in the vicinity of Earth. After a painstaking analysis of the observations and measurements from all the different spacecraft and facilities on the ground, they have assembled an incredibly detailed picture.

They chose an eruption that lifted off the Sun May 13, 2005, and headed in our direction. As it approached our planet, it interacted with the solar wind, the material that is constantly flowing out from the Sun at relatively steady rates. This particular CME deflected some of the solar wind northward as it headed in the direction of Earth and was itself slowed as a result of the solar wind ahead of it.

The mass expelled in the event was not that different from many other solar eruptions, but its magnetic field was very intense, and, as such, this event caused the largest geomagnetic storm — rapid changes in the shape and strength of Earth's magnetic field — during 2005. At that time, solar activity was in decline from the maximum period between 2002 and 2004 to the recent minimum between 2008 and 2010.

At the start of the event, the outburst was thought to be a simple CME, but the unprecedented coverage revealed it to be extremely complex. The event was caused by multiple flare-type events near the solar surface that released magnetic energy and mass out into the solar wind in the form of the CME.

The material then traveled through interplanetary space toward Earth — in this phase it is described as an Interplanetary CME or ICME.
With the magnetic field frozen inside it in the form of a "flux rope," or "magnetic cloud" (MC), when the ICME reached our planet it began to compress Earth's magnetic field into a distance of about 24,000 miles (38,000 kilometers) — in comparison, the field on the sunward side would normally extend to 59,000 miles (95,000 km). The arrival of the CME also caused some minor effects on satellites and communications as well as wonderful auroral displays.

Mario Bisi of Aberystwyth University, United Kingdom, sees the new analysis as a key step forward in our understanding of the way solar eruptions develop and affect Earth. "We learned an enormous amount from the 2005 event. Even an apparently simple CME turned out to be incredibly complex," said Bisi. "And the intense reaction of Earth's magnetic field to a fast but not particularly powerful event was a surprise."

"We're now also much better prepared for future events and, if nothing else, know how to handle such a large amount of data," Bisi said. "All of this adds to our knowledge of the way CMEs originate, develop, and sometimes even have an impact on everyday life."