This is a follow-up to an earlier report about the DZero detector not finding the "bump" that was observed in the CDF data. The report out of DZero has now been published, and is covered in the APS's "Physics". I believe you can download a copy of the paper for free.
Zz.
Thursday, June 30, 2011
No Graininess Of Space So Far
New analysis of gamma rays from a celestial gamma ray burst has not found any graininess of space down to the 10^-48 meter scale!
Dr Laurent and colleagues searched for differences in the polarisation at different energies, but found none to the accuracy limits of the data. Some theories suggest that the quantum nature of space should manifest itself at the ‘Planck scale’: the minuscule 10^-35 of a metre, where a millimetre is 10^-3 m. However, Integral’s observations are about 10 000 times more accurate than any previous and show that any quantum graininess must be at a level of 10^-48 m or smaller.
“This is a very important result in fundamental physics and will rule out some string theories and quantum loop gravity theories,” says Dr Laurent.
Looks like this scale is even smaller than the Planck length scale. Hum...
Zz.
Wednesday, June 29, 2011
The Physics of Tibetan Singing Bowl
I've seen this on TV before in a documentary show, but I don't think I'd been intrigued by it till now.
This is a video of the so-called Tibetan singing bowl:
And here's a paper describing the physics of it:
The Tibetan singing bowl: an acoustics and fluid dynamics investigation
Abstract: We present the results of an experimental investigation of the acoustics and fluid dynamics of Tibetan singing bowls. Their acoustic behavior is rationalized in terms of the related dynamics of standing bells and wine glasses. Striking or rubbing a fluid-filled bowl excites wall vibrations, and concomitant waves at the fluid surface. Acoustic excitation of the bowl's natural vibrational modes allows for a controlled study in which the evolution of the surface waves with increasing forcing amplitude is detailed. Particular attention is given to rationalizing the observed criteria for the onset of edge-induced Faraday waves and droplet generation via surface fracture. Our study indicates that drops may be levitated on the fluid surface, induced to bounce on or skip across the vibrating fluid surface.
Zz.
Labels:
Acoustics,
Classical Physics,
Fluid Mechanics,
The Physics Of
Graphene Update
Who better to give you an update on graphene than a Nobel laureate in the field himself? So here's Andre Geim giving a talk on graphene and an update on the work from his group. You also get a brief introduction to the material and why it took condensed matter physics by storm several years ago.
Zz.
Tuesday, June 28, 2011
Interview With Steven Weinberg
The San Francisco Chronicle published a short interview with Steven Weinberg. In it, he talked about the LHC, the search for the Higgs, and string theory.
Lundborg: Is string theory still the best game in town?
Weinberg: It's the best game in town, but it may not be the right game.
Lundborg: You're disappointed that it hasn't actually led anywhere?
Weinberg: Since the 1980s string theory has not come up with a prediction of anything new that we could then verify in the laboratory in a way that could convince us that string theory is right.
Although there may be one fundamental string theory, it has many, many solutions. And finding the one that describes our world may be impossibly hard.
Zz.
Monday, June 27, 2011
Wildfire Threatens Los Alamos
Los Alamos National Lab is closed today due to a nearby wildfire. According to the lab's website:
"All laboratory facilities will be closed for all activities and nonessential employees are directed to remain off site," the statement said. "Employees are considered nonessential and should not report to work unless specifically directed by their line managers."
As I recall, this is not the first time the lab has been threatened by a fire. Several years ago, a wildfire may even have crossed onto the lab's grounds.
Zz.
Astronomy And A Potential New Cancer Treatment
One of the many complaints we often get from ignorant people is the question of why physics doesn't help find a cure for cancer. We get such complaints whenever the topic is funding for high energy physics, astronomy/astrophysics, etc., i.e. whenever people simply can't see how such funding has any direct impact on their lives.
Of course, anyone who knows anything about these fields can tell you that while there may not be any obvious and direct impact on our lives from the subject matter of these fields, the advancements and technology brought about by progress in these fields certainly have a direct and immediate impact on our lives. The advancement in particle accelerators alone is a huge selling point as a direct benefit of high energy physics.
Interestingly enough, here's another one, and it comes from the field of astronomy. Knowledge from astronomy may offer a way to treat cancer!
In studying how chemical elements emit and absorb radiation inside stars and around black holes, the astronomers discovered that heavy metals such as iron emit low-energy electrons when exposed to X-rays at specific energies.
Their discovery raises the possibility that implants made from certain heavy elements could enable doctors to obliterate tumors with low-energy electrons, while exposing healthy tissue to much less radiation than is possible today. Similar implants could enhance medical diagnostic imaging.
So, in addition to the advances made in detector physics that also benefit the public, here's another aspect of astronomy that could have a direct impact on our lives.
The moral of the story here is that one never knows how a particular piece of knowledge or field of study may contribute to our lives. When politicians belittle aspects of research that they do not understand, or whose significance they fail to appreciate, they can easily jeopardize something important. Things that appear esoteric have been shown to have a significant impact later on. One need only look at the history of quantum mechanics.
Zz.
Sunday, June 26, 2011
Time To Rethink Static Electricity?
You'd think that something that has been taught for ages, and is a standard part of any basic text on science or physics, would be fully known by now, wouldn't you? Yet, here it is, a paper on static electricity, published in Science, of all places!
A new paper[1] seems to contradict the common understanding of how static electricity on objects comes about.
Abstract: When dielectric materials are brought into contact and then separated, they develop static electricity. For centuries, it has been assumed that such contact charging derives from the spatially homogeneous material properties (along the material's surface) and that within a given pair of materials, one charges uniformly positively and the other negatively. We demonstrate that this picture of contact charging is incorrect. While each contact-electrified piece develops a net charge of either positive or negative polarity, each surface supports a random “mosaic” of oppositely charged regions of nanoscopic dimensions. These mosaics of surface charge have the same topological characteristics for different types of electrified dielectrics and accommodate significantly more charge per unit area than previously thought.
Wired has a review of this work.
For many of us, static electricity is one of the earliest encounters we have with electromagnetism, and it’s a staple of high school physics. Typically, it’s explained as a product of electrons transferred in one direction between unlike substances, like glass and wool, or a balloon and a cotton T-shirt (depending on whether the demo is in a high school class or a kids’ party). Different substances have a tendency to pick up either positive or negative charges, we’re often told, and the process doesn’t transfer a lot of charge, but it’s enough to cause a balloon to stick to the ceiling, or to give someone a shock on a cold, dry day.
Nearly all of that is wrong, according to a paper published in today’s issue of Science. Charges can be transferred between identical materials, all materials behave roughly the same, the charges are the product of chemical reactions, and each surface becomes a patchwork of positive and negative charges, which reach levels a thousand times higher than the surfaces’ average charge.
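To get a feel for that "thousand times" figure, here's a toy sketch of my own (an illustration, not the paper's model): treat the surface as a huge number of nanoscopic patches, each randomly charged positive or negative. The net charge then grows only as the square root of the number of patches, so the surface-averaged charge density comes out tiny compared with the local patch density.

```python
import numpy as np

# Toy "charge mosaic": a million nanoscopic patches, each carrying one
# unit of charge with a random sign. All numbers are illustrative.
rng = np.random.default_rng(0)
patches = rng.choice([-1.0, 1.0], size=1_000_000)

local = np.abs(patches).mean()      # typical |charge| per patch: 1.0
average = abs(patches.mean())       # net charge per patch: ~ 1/sqrt(N)

print(f"local patch density    : {local:.3f}")
print(f"surface-average density: {average:.2e}")
print(f"ratio local/average    : {local / average:.0f}")   # order 10^3 for N = 10^6
```

With a million patches the ratio comes out around a thousand, the same order as what the paper reports, so the "random mosaic" picture at least hangs together numerically.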
From what I've gathered, there is an actual transfer of material, not charges, from one object to another, when they touch or rub against each other.
So, what causes these charges to build up? It’s not, apparently, the transfer of electrons between the surfaces. Detailed spectroscopy of one of the polymers (PDMS) suggests that chemical reactions may be involved, as many oxidized derivatives of the polymer were detected. In addition, there is evidence that some material is transferred from one surface to another. Using separate pieces of fluorine- and silicon-containing polymers allowed the authors to show that signals consistent with the presence of fluorine were detected in the silicon sample after contact.
It will be interesting to see if this is the predominant cause of static electricity.
Zz.
[1] H.T. Baytekin et al., Science Online DOI: 10.1126/science.1201512 (Publication citation not available yet).
Friday, June 24, 2011
MINOS Reports Agreement With T2K
The latest press release out of Fermilab announces results from MINOS. They are consistent with the recent announcement out of T2K of muon neutrinos oscillating into electron neutrinos.
Fermilab experiment weighs in on neutrino mystery
Scientists of the MINOS experiment at the Department of Energy’s Fermi National Accelerator Laboratory announced today (June 24) the results from a search for a rare phenomenon, the transformation of muon neutrinos into electron neutrinos. The result is consistent with and significantly constrains a measurement reported 10 days ago by the Japanese T2K experiment, which announced an indication of this type of transformation.
The results of these two experiments could have implications for our understanding of the role that neutrinos may have played in the evolution of the universe. If muon neutrinos transform into electron neutrinos, neutrinos could be the reason that the big bang produced more matter than antimatter, leading to the universe as it exists today.
The Main Injector Neutrino Oscillation Search (MINOS) at Fermilab recorded a total of 62 electron neutrino-like events. If muon neutrinos do not transform into electron neutrinos, then MINOS should have seen only 49 events. The experiment should have seen 71 events if neutrinos transform as often as suggested by recent results from the Tokai-to-Kamioka (T2K) experiment in Japan. The two experiments use different methods and analysis techniques to look for this rare transformation.
To measure the transformation of muon neutrinos into other neutrinos, the MINOS experiment sends a muon neutrino beam 450 miles (735 kilometers) through the earth from the Main Injector accelerator at Fermilab to a 5,000-ton neutrino detector, located half a mile underground in the Soudan Underground Laboratory in northern Minnesota. The experiment uses two almost identical detectors: the detector at Fermilab is used to check the purity of the muon neutrino beam, and the detector at Soudan looks for electron and muon neutrinos. The neutrinos’ trip from Fermilab to Soudan takes about one four-hundredth of a second, giving the neutrinos enough time to change their identities.
For more than a decade, scientists have seen evidence that the three known types of neutrinos can morph into each other. Experiments have found that muon neutrinos disappear, with some of the best measurements provided by the MINOS experiment. Scientists think that a large fraction of these muon neutrinos transform into tau neutrinos, which so far have been very hard to detect, and they suspect that a tiny fraction transform into electron neutrinos.
The observation of electron neutrino-like events in the detector in Soudan allows MINOS scientists to extract information about a quantity called sin^2 2 theta-13 (pronounced sine squared two theta one three). If muon neutrinos don’t transform into electron neutrinos, this quantity is zero. The range allowed by the latest MINOS measurement overlaps with but is narrower than the T2K range. MINOS constrains this quantity to a range between 0 and 0.12, improving on results it obtained with smaller data sets in 2009 and 2010. The T2K range for sin^2 2 theta-13 is between 0.03 and 0.28.
“MINOS is expected to be more sensitive to the transformation with the amount of data that both experiments have,” said Fermilab physicist Robert Plunkett, co-spokesperson for the MINOS experiment. “It seems that nature has chosen a value for sin^2 2 theta-13 that likely is in the lower part of the T2K allowed range. More work and more data are really needed to confirm both these measurements.”
The MINOS measurement is the latest step in a worldwide effort to learn more about neutrinos. MINOS will continue to collect data until February 2012. The T2K experiment was interrupted in March when the severe earthquake in Japan damaged the muon neutrino source for T2K. Scientists expect to resume operations of the experiment at the end of the year. Three nuclear-reactor based neutrino experiments, which use different techniques to measure sin^2 2 theta-13, are in the process of starting up.
“Science usually proceeds in small steps rather than sudden, big discoveries, and this certainly has been true for neutrino research,” said Jenny Thomas from University College London, co-spokesperson for the MINOS experiment. “If the transformation from muon neutrinos to electron neutrinos occurs at a large enough rate, future experiments should find out whether nature has given us two light neutrinos and one heavy neutrino, or vice versa. This is really the next big thing in neutrino physics.”
The MINOS experiment involves more than 140 scientists, engineers, technical specialists and students from 30 institutions, including universities and national laboratories, in five countries: Brazil, Greece, Poland, the United Kingdom and the United States. Funding comes from: the Department of Energy Office of Science and the National Science Foundation in the U.S., the Science and Technology Facilities Council in the U.K; the University of Minnesota in the U.S.; the University of Athens in Greece; and Brazil's Foundation for Research Support of the State of São Paulo (FAPESP) and National Council of Scientific and Technological Development (CNPq).
Fermilab is a national laboratory supported by the Office of Science of the U.S. Department of Energy, operated under contract by Fermi Research Alliance, LLC.
For more information about MINOS and related experiments, visit the Fermilab neutrino website:
http://www.fnal.gov/pub/science/experiments/intensity/
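In case you are wondering what sin^2 2 theta-13 actually controls: in the standard three-flavor picture, the leading-order appearance probability is roughly P(nu_mu -> nu_e) ~ sin^2(theta_23) * sin^2(2 theta_13) * sin^2(1.27 dm^2 L/E). Here's a back-of-the-envelope sketch with representative numbers (my own illustration, not the MINOS analysis):

```python
import math

# Leading-order nu_mu -> nu_e appearance probability. All parameter
# values below are illustrative, not the MINOS fit results.
L_km = 735           # Fermilab-to-Soudan baseline, from the press release
E_GeV = 3.0          # assumed typical beam energy
dm2_eV2 = 2.4e-3     # representative atmospheric mass splitting, eV^2
sin2_2th13 = 0.10    # a value inside the MINOS allowed range (0 to 0.12)
sin2_th23 = 0.5      # maximal 2-3 mixing, a common assumption

prob = sin2_th23 * sin2_2th13 * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2
print(f"P(nu_mu -> nu_e) ~ {prob:.3f}")      # ~0.02 for these inputs

# Sanity check on the quoted travel time: 735 km at (essentially) light speed.
t = 735e3 / 3.0e8
print(f"travel time ~ {t * 1e3:.2f} ms, i.e. ~ 1/{1 / t:.0f} of a second")
```

A transformation probability of a couple of percent is why it takes years of beam time to pile up a few dozen candidate events.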
Zz.
Labels:
Elementary Particles,
Experiment,
Fermilab,
Neutrino
"It's A Simple Matter Of Physics!"
Damn right!
I read with a bit of amusement this report out of New Zealand. From what I understand, there was quite a debate over what counts as a "forward throw" in rugby, as stated in one of the "rules", I think.
Now, I can't quite understand what the brouhaha is all about. However, it appears that people are puzzled over how a person who is running forward and then makes a pass backward can end up with a ball that is still moving "forward". Note that all of these descriptions (forward, backward) implicitly refer to motion relative to the ground and to a particular direction.
What I gather from the report is that a lot of people never learned about the simple Galilean transformation.
"Basically it's got to be going backwards from the player, faster than the player is running forwards," said professor Matthew Collett, who teaches theoretical physics at the University of Auckland.
"So if the player throws the ball backwards and throws it slower than the speed he is running at, then relative to the ground, the pass will be going forward."
It really is simple physics! And who would have thought that Galilean transformation would be an issue in rugby?
Zz.
High Cost For Underground Physics Experiments
The cost for building a suite of underground physics experiments has come in, and it isn't pretty.
So DOE's Office of Science asked a committee led by Jay Marx of the California Institute of Technology in Pasadena to evaluate the costs and risks of different options for the three main experiments slated for Homestake. The first is a gigantic particle detector known as the Long Baseline Neutrino Experiment (LBNE) to snare particles called neutrinos fired from Fermi National Accelerator Laboratory (Fermilab), 1300 kilometers away in Batavia, Illinois. The second is a detector weighing several tons to spot particles of the mysterious dark matter whose gravity appears to bind the galaxies. The third is a detector weighing at least a ton that would search for a revolutionary form of radioactivity called neutrinoless double beta decay.
The cost of building LBNE and the facility at the Homestake Mine is estimated to be between $1.2 billion and $1.5 billion, while the costs for the dark matter and neutrinoless double beta decay experiments are estimated at $300 million and $400 million, respectively.
It's a daunting issue considering that the whole physics community in the US is expecting flat to lower budgets for the next several years. High energy physics is a tough sell in any given year, but the next few years will certainly be a tremendous challenge and could affect the future of high energy physics experiments in the US as the Tevatron retires.
Zz.
Wednesday, June 22, 2011
Doing Experiments In High Schools Not As Effective As Thought?
This is an article from the UK questioning the effectiveness of doing experiments (or practicals, as they call them) in secondary schools in the UK.
Many standard school science practicals purport to be experiments when they are nothing of the sort. What we are doing a lot of the time, for example when asking them to "investigate the factors that affect the resistance of a wire", is getting students to carry out practical work with the intention that they discover something which is already known. This approach was described as "intellectually dishonest" by Rosalind Driver in her important essay, The Fallacy of Induction. It is naive and pedagogically unsound to think that all we need to do as science teachers is provide children the opportunity to discover the laws of science for themselves. As Driver wrote, "explanations do not spring clearly or uniquely from data". Yet this approach to practical work persists, according to Professor Robin Millar, due to "the prevalence of the empiricist/inductive view of science... the belief that ideas will 'emerge' automatically from the event itself, if students work carefully enough". As Millar, who has carried out extensive research into what students learn from practical work, points out, "in practice this rarely happens".
Whenever we see an argument like this, we need to separate out two distinct issues here:
1. Is the ineffectiveness due to a fallacy in the principle itself, i.e. doing experiments/practicals at this level simply does not contribute to the understanding of the subject matter, or
2. Is it due to HOW the principle is implemented, i.e. practicals can be useful IF they are done properly.
The article does not make that distinction, and in fact, has pieces from both.
My personal view on this is that, at THAT level (i.e. secondary school, or high school in the US), lab work should be closer to "playing" than to trying to make students discover known physics laws and ideas. In my series of essays on Revamping the Undergraduate Laboratory, I laid out the central philosophy of such labs - having the student discover the relationship between two or more variables, rather than having them discover some physics concept. In other words, I consider the techniques of discovery, i.e. how we know certain things, to be more important than trying to rediscover Newton's laws.
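To make that concrete, take the classic "resistance of a wire" practical. Rather than telling students Ohm's law up front, have them measure V and I and find the relationship themselves, e.g. with a simple straight-line fit. A sketch of the kind of analysis I mean, with invented data:

```python
import numpy as np

# Invented V-I measurements for a length of wire, the sort of data a
# student practical produces. The point is to *discover* the linear
# relationship, not to confirm a law handed down in advance.
I = np.array([0.10, 0.20, 0.30, 0.40, 0.50])    # current, A
V = np.array([0.52, 1.03, 1.49, 2.05, 2.51])    # voltage, V

slope, intercept = np.polyfit(I, V, 1)          # least-squares straight line
print(f"V ~ {slope:.2f} * I + {intercept:.2f}") # the slope plays the role of R
```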
Maybe that's something that can also be done with students at this level. But then again, when you have practicals at the A-levels testing you on stuff like this, one can't simply throw out the syllabus and do whatever one pleases.
Zz.
Tuesday, June 21, 2011
Two Reviews On Unconventional Superconductivity
In recent weeks, two very good review articles appeared on the physics of superconductivity, focusing especially on so-called unconventional superconductivity. This includes the ruthenates, the cuprates, and the iron-based superconductors.
The first review was by Mike Norman, which appeared in a recent issue of Science (Science 332, 196 (2011)).
Abstract: During the past few decades, several new classes of superconductors have been discovered. Most of these do not appear to be related to traditional superconductors. As a consequence, it is felt by many that for these materials, superconductivity arises from a different source than the electron-ion interactions that are at the heart of conventional superconductivity. Developing a rigorous theory for any of these classes of materials has proven to be a difficult challenge, and will continue to be one of the major problems in physics in the decades to come.
The second review article was by G. R. Stewart on the iron-based superconductors. It is to appear in an upcoming issue of Rev. Mod. Phys.
Abstract: Kamihara and coworkers' report of superconductivity at Tc = 26 K in fluorine-doped LaFeAsO inspired a worldwide effort to understand the nature of the superconductivity in this new class of compounds. These iron pnictide and chalcogenide (FePn/Ch) superconductors have Fe electrons at the Fermi surface, plus an unusual Fermiology that can change rapidly with doping, which lead to normal and superconducting state properties very different from those in standard electron-phonon coupled 'conventional' superconductors. Clearly superconductivity and magnetism/magnetic fluctuations are intimately related in the FePn/Ch - and even coexist in some. Open questions, including the superconducting nodal structure in a number of compounds, abound and are often dependent on improved sample quality for their solution. With Tc values up to 56 K, the six distinct Fe-containing superconducting structures exhibit complex but often comparable behaviors. The search for correlations and explanations in this fascinating field of research would benefit from an organization of the large, seemingly disparate data set. This review attempts to provide an overview, using numerous references, with a focus on the materials and their superconductivity.
Lots of things to read. The article by Norman, especially, has a very compact summary of the development of superconductivity in general, and what is meant by "unconventional" superconductivity.
Zz.
Labels:
Condensed Matter Physics,
Review,
Superconductivity
Monday, June 20, 2011
Tossing A Leaky Bottle
I previously mentioned the "What Happens Next?" column in the journal Physics Education. I am a fan of that column because I love thinking about these "mundane" problems or puzzles. We dealt with bouncing grapes in soda last time.
This time, it is another good one, from the May 2011 issue. You have a regular plastic water bottle filled with water. You poke a hole close to the bottom of the bottle, and another hole close to the top. With both holes open, the water will flow out of the bottom hole. You can stop the flow by closing the hole on top. See the picture.
But what will happen if you toss the bottle of water up into the air?
I'll post the answer later, because I'm sure you might want to try this out yourself! :)
Zz.
Fermilab To Reduce Staff By 5%
With tight budgets foreseen in the coming years, and with the Tevatron to be shut down at the end of September, Fermilab management is seeking to reduce its staff by 5%.
Oddone said that the lab will move as many employees as possible to jobs on new experiments and projects, many of which already are well under way and in need of extra help. Still, he said, there will be a mismatch between the lab’s current work force and what is needed for the future programs.
According to Fermilab’s website, if fewer than 100 people volunteer, the laboratory may have to move to layoffs.
The mood that I'm getting from Fermilab employees right now is one of dejected disillusionment. Despite a list of experiments ongoing and planned at the lab, there is a sense of the end of an era with the impending shutdown of the Tevatron. Students and postdocs are trying to wind down their research work as soon as they can, and I'm not even sure whether any new students or postdocs can be recruited to continue the data analysis that will go on for years after the end of the Tevatron (aren't some people still looking at data from ZEUS years after its shutdown?).
One can only hope that the golden years of this laboratory are not already behind it.
Zz.
Sunday, June 19, 2011
Closure and Fire?
This article wouldn't have caught my eye if it weren't for my ability to remember things that are often utterly useless. The article reports a fire at one of the physics laboratories at Reading University in the UK. By itself this isn't significant news (well, it isn't because, hopefully, no one was injured). But then I remembered a report from a while back that the physics program at this very same university was about to be closed.
So I wonder if (i) the two incidents are related, (ii) the program is still running, (iii) this is simply what's left of the program after closure, or (iv) I am hallucinating.
Zz.
Saturday, June 18, 2011
New USPS Stamps Honor Physicist
A new set of United States Postal Service stamps honors several American scientists, including physicist and Nobel laureate Maria Goeppert Mayer.
Zz.
Friday, June 17, 2011
LHC Achieves 2011 Luminosity Milestone
A press release from CERN indicates that just today, the LHC has achieved its 2011 milestone for luminosity.
Today at around 10:50 CEST, the amount of data accumulated by LHC experiments ATLAS and CMS clicked over from 0.999 to 1 inverse femtobarn, signalling an important milestone in the experiments' quest for new physics. The number signifies a quantity physicists call integrated luminosity, which is a measure of the total number of collisions produced. One inverse femtobarn equates to around 70 million million collisions, and in 2010 it was the target set for the 2011 run. That it has been achieved just three months after the first beams of 2011 is testimony to how well the LHC is running.
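That "70 million million" is easy to check yourself. A quick sketch, assuming a round ~70 millibarn for the proton-proton inelastic cross section at these energies (my assumption, not part of the press release):

```python
# Number of events = cross section x integrated luminosity.
# 1 barn = 1e-24 cm^2, so 1 mb = 1e-27 cm^2 and 1 fb^-1 = 1e39 cm^-2.
sigma_mb = 70.0                 # assumed pp inelastic cross section, millibarn
sigma_cm2 = sigma_mb * 1e-27    # convert to cm^2
lumi_cm2 = 1e39                 # 1 inverse femtobarn, in cm^-2

n_collisions = sigma_cm2 * lumi_cm2
print(f"{n_collisions:.1e} collisions")   # 7.0e13, i.e. ~70 million million
```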
In a lot of cases, the LHC is performing better than expected at the current settings. So there are high hopes that it will produce amazing physics (with or without the Higgs), even before the long shutdown next year for the 14 TeV upgrade.
Zz.
A Philosophical Interlude
I have a love-hate relationship with the subject of philosophy. But during the last, oh, 10 years or so, the proportion has been shifting more towards "hate" than "love". I took two classes in philosophy as an undergraduate, both of them in philosophy of science, and enjoyed them tremendously (I got A's in both of them, if you had to ask). It helped that both classes were taught by a physicist who had decided to go into philosophy later in his career. So when he described certain physics principles, he knew them intimately.
But then my feelings towards the subject changed late in my graduate years, and continued to deteriorate as I became a physicist. The turn started during the infamous Sokal Hoax. I realized that there are those who consider themselves professional social scientists, philosophers, political scientists, etc., who actually do not care whether they understand something well enough to say something about it, AND publish it! I was just amazed at how the post-modernists could be such an influence and get away with bastardizing physics (and science in general) without being challenged. I suppose that was the impetus of the Sokal Hoax: to show that, really, the Emperor has no clothes.
When I read an article such as this one, it kind of reinforces my view of philosophy. The author, a high energy physics theorist, describes a workshop on science and philosophy at Wuppertal.
Recently, the Wuppertal group organized an international spring school on particle physics and philosophy, which I found very exciting and enjoyable. It included a mixture of lectures by physicists, philosophers and historians, as well as working groups where students debated topics like the "theory–ladenness of experiments" and the "reality of quarks". Everybody was very enthusiastic, and the talks and tutorials triggered plenty of discussion between lecturers and students.
Of course, the very same issue of what really exists and what doesn't comes up again and again whenever philosophy enters the discussion.
I mentioned this to Robert Harlander who was sitting next to me during the lecture. To my surprise Robert answered that he does not believe in the reality of atoms – or in the reality of anything, for that matter. We argued for a while and tried to place our beliefs into the philosophical categories at hand. I finally settled for "progression realist", not least because the alternatives of "instrumentalist" or "anti-realist" sounded too negative to me. Robert called himself an "anarchist" which gave me the impression that he did not take the reality discussion very seriously. In any case, one of the good things about philosophical labels is that there are arguments and counter-arguments for almost every point of view, so you can easily change your position when you get tired of it.
At some point, this gets very tiresome. If "no reality in anything" means that nothing "exists" (and that word has yet to be defined properly), then even ideas do not exist, since ideas are less "solid" than atoms. So the idea that "nothing exists" doesn't exist either! And for an atom that supposedly has no reality, its properties are certainly highly reproducible. How many times can you say that about something in philosophy?
In the end, the main theme I got out of this article is that the whole thing is a lot of fun and quite enjoyable. Sure, I've gone to a Cher concert and found it a lot of fun and quite enjoyable as well. But is this merely fun, or interesting, or is it also important? Maybe ornithology is fun and interesting to someone in that field, but for the birds, it is utterly insignificant. So yes, philosophy may be interesting, but is it important? I don't see it, and it certainly doesn't look that way from this article.
Zz.
Thursday, June 16, 2011
Single Photons Obey Light Speed Limit
A very interesting report on the latest effort to measure the speed of single photons[1]. Even in an anomalously dispersive material, where the group velocity can be "superluminal" as in the infamous NEC experiment, no part of the wave actually moves faster than c. This experiment confirms it.
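For the record, a "superluminal" group velocity is not by itself paradoxical. In a region of anomalous dispersion, dn/domega < 0, the group index n_g = n + omega*(dn/domega) can dip below 1, so v_g = c/n_g exceeds c without any information outrunning c. A toy sketch with made-up numbers:

```python
# Group velocity v_g = c / n_g, with group index n_g = n + omega * dn/domega.
# With a strong enough anomalous-dispersion slope (dn/domega < 0), n_g < 1
# and v_g > c, yet no part of the signal front ever outruns c.
c = 3.0e8               # speed of light, m/s
n = 1.0                 # phase index (illustrative)
omega = 2.4e15          # rad/s, roughly optical frequencies (illustrative)
dn_domega = -2.0e-16    # s, made-up anomalous-dispersion slope

n_g = n + omega * dn_domega
print(f"group index n_g = {n_g:.2f}, so v_g = {1 / n_g:.1f} c")
```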
Zz.
[1] S. Zhang et al., Phys. Rev. Lett. v.106, p.243602 (2011).
Wednesday, June 15, 2011
T2K Finds Evidence For Muon Neutrino Oscillation
It is unfortunate that the recent earthquake in Japan disrupted the T2K experiment, especially at J-PARC, or they would have had even stronger evidence for this. Still, based on what they already have, the collaboration has found strong evidence for muon neutrinos changing flavor into electron neutrinos.
Now, researchers at J-PARC have made a step towards measuring the final mixing angle – theta-13 – by measuring muon neutrinos oscillating into electron neutrinos. From January 2010 until March this year, the SuperKamiokande detector observed 121 neutrinos that clearly originated from the J-PARC neutrino beam. The background signal, which could mimic a signal from electron neutrinos that are present anyway, was estimated to be around 1.5 events. However, over 13 months, researchers at T2K, which has more than 500 researchers from 12 countries, spotted six events arising from muon neutrinos turning into electron neutrinos. The probability of observing, by chance, six events when only 1.5 are expected is 0.7%, or a little less than 1 in 100.
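That 0.7% is essentially a Poisson probability with systematic uncertainties folded in. The bare Poisson number is a one-liner to check; this naive estimate of mine ignores the uncertainty on the 1.5-event background, which is why it comes out somewhat smaller than the quoted figure:

```python
import math

# P(N >= 6) for a Poisson background with mean 1.5 events:
# 1 - sum_{k=0}^{5} exp(-mu) * mu^k / k!
mu, n_obs = 1.5, 6
p = 1.0 - sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n_obs))
print(f"P(N >= {n_obs} | mu = {mu}) = {p:.4f}")   # ~0.004; T2K's 0.7% folds in systematics
```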
Hopefully, they get back online at the end of the year as expected.
Zz.
Movie Warning Should Be At Every Physics Conference
Here's the thing. Physicists in general are thoughtful, courteous people. Yet, at times, some of them can be downright rude.
When you go to a movie nowadays, especially in the US, before the start of the main feature you will see a sign imploring you to either turn off your cell phones/electronic devices or put them into silent mode. I think we will have to do this at every conference, especially physics conferences and lectures. I lost count of how many times cell phones went off at the past TIPP conference, and in many cases, several went off during the same lecture/talk! One would think that after the first one went off, considering the disruption (and the embarrassment to the person involved), everyone would check whether they had forgotten to silence their own cell phone. But nooooooo! A few minutes later, another one went off, and then another! What gives??!!
Let's get this out of the way: IT IS UTTERLY INCONSIDERATE TO THE SPEAKER AND THOSE AROUND YOU WHEN YOUR CELL PHONES GO OFF IN THE MIDDLE OF THE TALK!
Please don't perpetuate the stereotypical image of scientists/engineers being socially inept! Be considerate and turn off those damn cell phones during a talk! If you are THAT important, then don't attend!
Of course, I do remember going to a seminar once when the speaker's cell phone went off in the middle of his presentation. When that happens, all you can do is shake your head and move on.
Zz.
Tuesday, June 14, 2011
"What Life Is Like As A Scientist In Congress"
Bill Foster is up for this talk at TIPP Conference.
He shows a video introduction of himself by Barack Obama from his last run for congressional office.
He shows his timeline: businessman, then physicist, then politician.
Shows his business beginnings, and his work towards his PhD. He mainly worked on CDF as a physicist. Shows a photo of the CDF central tracking chamber. He goes on to show several of the things he worked on while at Fermilab.
Why did he go into politics?
His excuse was that it is in his genes.
He describes life as a congressman - not glamorous. He shows pictures of his efficiency apartment while in Washington DC.
His pleasant surprise : "I actually do like people!"
"Our economy had many of the properties of an oscillating analog circuit"
"There are no controlled experiments in politics" - no way we can check what would happen without such-and-such being done.
He shows example of negative feedback loop in the housing market bubble.
It was a fascinating talk, especially on how politicians simply don't quite grasp quantitative versus qualitative information.
Zz.
"Detectors for Future Colliders"
Hitoshi Yamamoto from Tohoku University is talking about this topic in the plenary session of TIPP Conference (last day).
He talks about the LHC detector upgrades. The detectors are designed to work at 1e34/cm^2/s luminosity, but there are many issues for detectors under such conditions. He covers the ATLAS, CMS, ALICE, and LHCb upgrades planned for the various LHC shutdowns.
SuperB factories - KEKB has a luminosity of 2e34/cm^2/s. SuperB factories will have 40-50 times more luminosity. He describes the requirements for SuperB detectors.
Belle II upgrades are also mentioned.
ILC is next. Detector performance goals for ILC are described. The ILD detector has a B field of 3.5 Tesla, while the SiD detector has B field of 5 Tesla.
He then describes requirements for CLIC and Muon collider detectors. The 1.5 TeV version of the muon collider detector will have challenging beam background.
Zz.
Labels:
Conference,
Detector Physics,
Experiment,
High energy physics
Last Day At TIPP
Today is the final day of the TIPP Conference. The closing plenary session is rather interesting and eclectic. What I'm looking forward to is the talk by former Congressman and physicist Bill Foster. His talk is titled "Applications of Analog Circuit Design to Life as a Scientist in the US Congress". I'm snickering already, so this should be a good one.
Foster, as you would know if you have been following this blog, is planning to run for office again after his defeat in the last midterm election.
Zz.
Monday, June 13, 2011
"Gravitational Wave Detection"
More from the ongoing TIPP conference. Sam Waldman from MIT is presenting this topic during the morning plenary session.
The talk covers the effort to detect gravitational waves.
"Mass tells spacetime how to curve, and spacetime tells mass how to move" - Wheeler
Astrophysical sources of gravitational waves - neutron stars and black holes. He shows a movie of two compact stars orbiting each other, and also plays audio of the frequency change as the two stars spiral into each other.
Zz.
Labels:
Astrophysics,
Conference,
Detector Physics,
Relativity
Origin Of Life At CERN
A rather fascinating and unexpected topic for a workshop held at CERN recently.
On 20 May, a small group of biologists and chemists arrived at Cern for a workshop from the institution's experts on how to organise a disparate community of research groups all over the world into a single scientific force. While much of the research at Cern is focused on the beginnings of the Universe, the delegates also held a discussion on the beginnings of life.
Much of the research in the field is currently focused on so-called "autocatalytic sets". These are groups of molecules that undergo reactions where all molecules mutually catalyse each other -- speed up the rate at which the reaction takes place. In this way, the sets are self-sustaining. It's believed that protocells emerged from such a system, but there's a significant question mark over how likely it is for these sets to occur randomly.
There are plenty of opportunities to expand the horizons of a facility and organization such as CERN. This certainly is one way it can contribute to a field of study that needs the expertise and capabilities that CERN has.
Zz.
Sunday, June 12, 2011
Free Public Lecture on LHC Technology
Remember, if you're in downtown Chicago this afternoon, there is a free public lecture on the LHC and its technology at the Sheraton Hotel and Towers at 3:00 pm as part of the TIPP conference that is ongoing.
Here's your chance to hear about the amazing technology of the LHC machine, and you can ask the speaker about that electrical problem with the superconducting magnets that brought the giant to its knees a while ago.
Zz.
Saturday, June 11, 2011
"Improved PMTs for the Cherenkov Telescope Array"
A talk at the TIPP conference going on now in the Photon Detector session. This one is presented by Razmik Mirzoyan of Max Planck Institute. It's a fascinating look at what can already be achieved now for the CTA effort, and a hint of what might be possible in the future simply based on existing technology.
The core groups in the CTA are the MAGIC, HESS, and VERITAS collaborations. It is an initiative to build the next generation of large ground-based gamma ray detectors, with 10 times higher sensitivity. They want to study AGNs, black holes, gamma ray bursts, and galactic sources (pulsars, supernovae, etc.). The energy range is from 10 GeV to 100 TeV. They will try to answer the long-standing question of the origin of cosmic rays. Planning ~100 telescopes in two arrays (one in each hemisphere).
Three types of telescopes are planned - large: 23 m, midsize: 12 m, and small: 4-7 m diameter. They will use standard PMTs and maybe SiPMs.
PMTs are mainly from Hamamatsu and Electron Tubes, with QE peaking around 35-40% at around 400 nm. He selected the 1.5" PMTs and discussed at length the properties of the PMTs from the two companies. These are the ones being considered for the CTA, I presume, if not already in use.
Zz.
Friday, June 10, 2011
D0 Does Not See CDF Bump
As was hinted several times, it is now official that the other detector at the Tevatron, D0, does not see the bump that has been reported by CDF.
But today, researchers on the independent D0 experiment, also at Fermilab, announced that their data do not confirm the signal. "The result is not good for the CDF. We are not confirming the signal. We just see nothing," says Dmitri Denisov, spokesman for D0, which released its results online today.
Not only that, but ATLAS and CMS at the LHC also do not see the CDF bump, although the caveat so far is that there is just not enough data at those detectors to make a definitive statement one way or the other.
Still, without verification from D0 or another detector, there is very little to support the CDF result.
Zz.
Einstein + Confucius = No Exhibition
A rather strange disagreement between two major museums is causing the Einstein exhibition to bypass Shanghai.
According to the Associated Press, the show’s organizers from the Historical Museum of Bern were unhappy with plans of Shanghai’s Science and Technology Museum. The would-be hosts had apparently wanted to merge the Einstein show with a separate exhibit of comparable size about the great Chinese philosopher Confucius who lived more than 2000 years earlier.
Hum... I'm not sure what the issue is. To me, as long as one exhibit doesn't try to justify its existence or validity based on the other, I can't see any problem with having both run simultaneously.
Zz.
TIPP Conference, Day 2
I'm attending the TIPP 2011 conference. Great plenary session this morning: a very concise review of neutrino detectors and experiments, and of what experiments are in the pipeline for the future. Then there were talks on dark matter detection, both "direct" and "indirect".
The detection process itself is a major field of study, and another area where engineering and physics merge.
This promises to be a very long day of very interesting talks, especially in the afternoon when there are several parallel sessions.
Zz.
Thursday, June 09, 2011
Salford University Buys Joule House
Another historically significant building is now in the news. Joule House, where James Joule conducted his experiments, is now the property of Salford University. Presumably, it will at least be in safer hands for preservation.
The university said James Joule lived at the property from 1819 to 1854 and, through his experiments conducted in the basement of the building late at night, established the "mechanical equivalent of heat". The international unit of energy, the joule, was named after him.
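The "mechanical equivalent of heat" is easy to appreciate with a little arithmetic (toy numbers of my own, not Joule's actual apparatus): let a falling weight stir a kilogram of water and see how tiny the temperature rise is. No wonder he had to measure carefully, late at night in a quiet basement.

    # Joule's paddle-wheel idea in numbers (my own toy figures).
    g = 9.81                 # m/s^2
    M, h = 10.0, 2.0         # falling mass (kg) and drop height (m), assumed
    m_water, c = 1.0, 4186.0 # kg of water and its specific heat, J/(kg K)

    work = M * g * h             # mechanical energy converted to heat, in joules
    dT = work / (m_water * c)    # resulting temperature rise of the water
    print(f"work = {work:.0f} J  ->  dT = {dT*1000:.1f} mK")   # about 47 mK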
Come to think of it, I don't know much of anything about Joule himself. My knowledge of his life and times is practically non-existent.
Zz.
Wednesday, June 08, 2011
"Direct Measurement of the Quantum Wavefunction"
Whoa! We seem to have quite a rush of papers that try to directly measure really basic quantum properties. We earlier had the measurement of the average paths taken by photons in a double-slit setup, using the weak measurement protocol. Now comes work that uses the same weak measurement technique to make a direct measurement of the quantum wavefunction[1]!
Abstract: The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule. At present, the wavefunction is determined through tomographic methods which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method. We show that the concept is universal, being applicable to other degrees of freedom of the photon, such as polarization or frequency, and to other quantum systems—for example, electron spins, SQUIDs (superconducting quantum interference devices) and trapped ions. Consequently, this method gives the wavefunction a straightforward and general definition in terms of a specific set of experimental operations. We expect it to expand the range of quantum systems that can be characterized and to initiate new avenues in fundamental quantum theory.
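If I'm reading the method right, the trick can be summarized in one line. Here is my own sketch of it (my notation, inferred from the abstract, so take it with a grain of salt): weakly measure the projector onto the position x, then post-select on finding zero transverse momentum. The resulting "weak value" is

    \[
    \langle \pi_x \rangle_W
    = \frac{\langle p{=}0 \,|\, x \rangle \langle x \,|\, \psi \rangle}{\langle p{=}0 \,|\, \psi \rangle}
    \;\propto\; \psi(x),
    \]

since the overlap of the zero-momentum state with any position state is the same constant for every x. The real part of the weak value shows up as a shift in the pointer's position and the imaginary part as a shift in the pointer's momentum, so both components of psi(x) can be read off directly, up to a single overall normalization.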
Also read the News and Views article in the same issue of Nature that presents a review of this work.
Zz.
[1] J.S. Lundeen et al., Nature v.474, p.188 (2011).
Atom Watch? I'll Pass!
Would you want to wear an "atom watch" on your wrist?
The three-dimensional design represents each component of the atom; the nucleus of protons and neutrons, the orbiting electrons, the trajectory path of the orbiting electrons.
I guess it is way too much to expect something like this to actually be "accurate".
Long live the Rutherford atom! :)
Zz.
Tuesday, June 07, 2011
"Marvel of Technology: The LHC, Machine, And Experiments"
If you are in Chicago this coming Sunday, June 12, 2011, and you have a couple of hours to spare in the afternoon, why not hop on over to the free public lecture on this topic, given by Lyn Evans? It will be in Chicago Ballrooms 8, 9, and 10 at the Sheraton Towers.
The lecture is given as part of the Technology and Instrumentation in Particle Physics 2011 conference.
Zz.
Labels:
Detector Physics,
Experiment,
High energy physics,
LHC
M&I Curriculum Not As Effective For Intro Mechanics
OK, so now we are getting a bit of a conflicting result, even though the two studies are for different subject areas.
I reported earlier on a study of students taking classical E&M under two different curricula: the standard curriculum and the Matter & Interactions (M&I) curriculum. The effectiveness of the two curricula was tested using the Brief Electricity and Magnetism Assessment (BEMA), and the students under the M&I curriculum performed better on the assessment.
Now comes a similar study, but this time on an intro, calculus-based mechanics course. Again, the students took either the traditional curriculum or the M&I curriculum, and the assessment was done using the well-known Force Concept Inventory (FCI). Surprisingly, the students taking the traditional curriculum performed better on the FCI assessment. This, of course, is the opposite of the result obtained earlier with BEMA and the E&M courses.
The authors offered the following explanation for possible reasons why the M&I students didn't perform better on the FCI assessment:
The relatively poor performance of M&I students on the FCI might appear surprising given the sophistication of some of the mechanics problems addressed in the M&I course, for example, planetary motion, ball-and-spring models of solids, multi-particle systems, etc. From a physicist’s perspective, M&I students should be able to successfully solve the sorts of problems appearing on the FCI; yet, apparently they were unable to extend what they had learned, for example, in the context of the momentum principle, to questions on the FCI. Two interrelated factors are operating here: first, the context of learning; and second, the role of practice within that context. In general, students, especially at the introductory level in physics, are sufficiently challenged to learn what they have to learn and tend not to be very successful in generalizing their skills to novel situations with which they have had little practice [27, 28].
We believe that the differences in instruction, how much and how long students learn about particular mechanics concepts, had a direct effect on their performance on the FCI. The relative fraction of homework questions and lecture topics covering FCI force and motion concepts provides a connection to the time students devoted to learning particular concepts and the depth to which concepts are covered in their respective courses (i.e., time-on-task). It is well-accepted that increased time-on-task will generally improve learning gains on the topics for which more time is devoted [29, 30]. While an accurate measure of student time-on-task requires interviewing individual students, our results suggest that students of the traditional curriculum devoted more time to learning FCI force and motion concepts than students of M&I.
As we have shown, traditional students had much greater practice in the sorts of problems the FCI presents and their relative performance shows the importance of that practice. It is possible that additional exposure to FCI force and motion concepts would improve M&I students’ performance on the FCI. However, making changes to the curriculum in this manner requires instructors to reflect on the learning goals for their course. The M&I curriculum was not designed to improve performance on the FCI. As mentioned previously, the M&I curriculum includes significant changes to the content of the introductory course, not just pedagogy, and the goals of its content might not align with those of the traditional curriculum. The amount of time in a semester is finite and including additional practice on FCI force and motion concepts might require the instructor to leave out other M&I topics (e.g., elementary statistical mechanics) and/or tools (i.e., computation).
So one of the reasons given here is that the traditional-curriculum students tend to spend more time tackling the same types of problems that the FCI asks, so these students would see more familiar questions than the M&I students. Does that mean one might hazard a guess that the BEMA test, in turn, asks questions that the M&I students are more familiar with than the standard-curriculum students? I skimmed through this paper and didn't see any discussion of the earlier result, which is surprising since the two studies came out of almost the same group.
Zz.
Monday, June 06, 2011
Anti-hydrogen Trapped For 1000 Seconds
1000 seconds! That's almost an eternity in the field of elementary particle physics! :)
The Alpha collaboration continues to make progress with trapping antimatter atoms, and this is their latest success.
"We think we make our anti-hydrogen in excited states; in other words the positron is at a larger distance from the nucleus. It has more energy. That's not the state we want to study. It takes some fraction of a second for these atoms, once they're produced, to get to the ground state.
"If you hold them 1,000 seconds, you can be quite sure they're in the state you want to study; and this is the first time that anyone can make that claim."
A lot of studies can be done with such a system, so stay tuned. There is plenty more to come!
Zz.
Friday, June 03, 2011
"Observing the Average Trajectories of Single Photons in a Two-Slit Interferometer"
Wow! Astounding work reported in this week's Science[1].
Abstract: A consequence of the quantum mechanical uncertainty principle is that one may not discuss the path or “trajectory” that a quantum particle takes, because any measurement of position irrevocably disturbs the momentum, and vice versa. Using weak measurements, however, it is possible to operationally define a set of trajectories for an ensemble of quantum particles. We sent single photons emitted by a quantum dot through a double-slit interferometer and reconstructed these trajectories by performing a weak measurement of the photon momentum, postselected according to the result of a strong measurement of photon position in a series of planes. The results provide an observationally grounded description of the propagation of subensembles of quantum particles in a two-slit interferometer.
Also see the press release on this here.
The most astounding part is the path reconstruction shown in Fig. 3.
For the experimentally reconstructed trajectories for our double slit (Fig. 3), it is worth stressing that photons are not constrained to follow these precise trajectories; the exact trajectory of an individual quantum particle is not a well-defined concept. Rather, these trajectories represent the average behavior of the ensemble of photons when the weakly measured momentum in each plane is recorded contingent upon the final position at which a photon is observed. The trajectories resemble a hydrodynamic flow with a central line of symmetry clearly visible:
The authors note that the trajectories they observed reproduce those predicted by the Bohmian picture of QM.
Single-particle trajectories measured in this fashion reproduce those predicted by the Bohm–de Broglie interpretation of quantum mechanics (8), although the reconstruction is in no way dependent on a choice of interpretation.
If so, this might be the first experimental hint that there is something to this interpretation of QM.
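For anyone who wants to play with the idea, these "average trajectories" are just streamlines of the probability current, which is easy to compute for a toy double slit. Here is a minimal sketch of my own (two Gaussian "slits", made-up parameters, units with hbar = m = 1), not the paper's actual reconstruction:

    # Bohmian-style average trajectories for a toy double slit (my own
    # sketch, not the paper's analysis). The wavefunction is a sum of two
    # freely spreading Gaussians; the guidance velocity field is
    #   v(x, t) = (hbar/m) * Im( psi'(x, t) / psi(x, t) ),
    # and a trajectory is obtained by integrating dx/dt = v.
    import cmath

    hbar = m = 1.0
    d, sigma = 5.0, 1.0            # slit separation and slit width (assumed)

    def psi(x, t):
        s = 1 + 1j * hbar * t / (m * sigma**2)   # complex spreading factor
        g1 = cmath.exp(-(x - d/2)**2 / (2 * sigma**2 * s)) / cmath.sqrt(s)
        g2 = cmath.exp(-(x + d/2)**2 / (2 * sigma**2 * s)) / cmath.sqrt(s)
        return g1 + g2

    def velocity(x, t, h=1e-4):
        dpsi = (psi(x + h, t) - psi(x - h, t)) / (2 * h)  # numerical psi'
        return (hbar / m) * (dpsi / psi(x, t)).imag

    x, dt = 2.6, 0.01              # start near the upper slit
    for step in range(1001):
        if step % 250 == 0:
            print(f"t = {step*dt:5.2f}   x = {x:7.3f}")
        x += velocity(x, step * dt) * dt

Launch a fan of starting points instead of one and you get the hydrodynamic-flow picture of Fig. 3, with trajectories bending through the interference fringes without crossing the symmetry axis.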
Fascinating paper!
Zz.
[1] S. Kocsis et al., Science v.332, p.1170 (2011).
Thursday, June 02, 2011
The Physics of 3D Without Glasses
We all know about the various technological improvements in 3D movies over the years. No longer do we need those colored glasses to view 3D movies. But the improvement doesn't stop there: with the introduction of 3D displays on game units and computers that do not require any glasses at all, 3D display technology is trying to go even further.
That's why I thought this was a rather fascinating explanation of the physics of 3D without glasses. Don't miss it if you are curious about how these things work.
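The basic trick behind many glasses-free displays is a parallax barrier: alternating pixel columns are visible to the left and right eye through narrow slits a small distance in front of the screen, and the geometry is just similar triangles. Here's a back-of-the-envelope of my own (the numbers are assumptions, roughly a handheld screen, not taken from the linked explanation):

    # Parallax-barrier geometry from similar triangles (my own toy numbers).
    # Light from two adjacent pixel columns passes through one slit and must
    # diverge enough to reach two eyes a distance E apart at distance D:
    #   gap / p  ~  D / E      (for gap << D)
    p = 0.075      # pixel column pitch, mm (assumed)
    D = 300.0      # comfortable viewing distance, mm (assumed)
    E = 65.0       # typical human eye separation, mm

    gap = p * D / E                      # barrier sits this far from the pixels
    slit_pitch = 2 * p * D / (D + gap)   # slightly less than two pixel columns

    print(f"barrier-to-pixel gap: {gap:.3f} mm")      # ~0.35 mm
    print(f"barrier slit pitch:  {slit_pitch:.4f} mm")

Sub-millimeter tolerances, which is why these displays only work well from a narrow "sweet spot" of viewing positions.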
Zz.
CDF "Bump" Gets To Almost 5-Sigma
Looks like the "bump" in the data is getting more "definite".
A new report, based on the latest presentation of the data on that bump reported earlier, shows that the latest data set has pushed the statistical significance from 3-sigma to almost 5-sigma.
At the time, CDF was looking for slightly rare di-boson pairs – W bosons produced in association with another W or a Z boson. It noted a bump between 120 and 160 GeV/c^2 in the jets produced in the collisions with a statistical significance of about “three-sigma”, which meant that the result would not be considered valuable until a “five-sigma” statistical significance could be established. The new data, however, have established a significance that is officially “closer to five sigma” (unconfirmed sources suggest it is as close as 4.8) and that “it was not just a statistical fluctuation” and that it is now a “serious issue for CDF to understand this”, according to Punzi.
I suppose at this point it should be taken seriously. But D0 and the LHC have not offered any kind of support for such an observation. The data could show a 10-sigma bump for all we care; without corresponding verification, from D0 especially, it will be very difficult to be convinced of such an event, considering the amount of statistical processing that one has to do on the data.
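For those keeping score at home, here's what those sigmas mean as bare Gaussian tail probabilities (a quick snippet of mine; it ignores the look-elsewhere effect and any systematics, which is exactly why independent verification matters more than the number itself):

    # Convert "n sigma" into the one-sided Gaussian tail probability that a
    # pure background fluctuation looks at least this big.
    import math

    def p_value(n_sigma):
        return 0.5 * math.erfc(n_sigma / math.sqrt(2))

    for n in (3.0, 4.8, 5.0):
        print(f"{n} sigma  ->  p = {p_value(n):.2e}")
    # 3 sigma -> 1.35e-03, 4.8 sigma -> ~7.9e-07, 5 sigma -> 2.87e-07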
Zz.
Wednesday, June 01, 2011
Results From Gravity Probe B
I mentioned earlier the recent report out of Gravity Probe B that confirms one of the predictions of Einstein's General Relativity. In fact, a commenter questioned the validity of announcing the result to the public rather than "waiting" for the publication of the work. A moot complaint, considering that the news reports were based on a manuscript that had already been accepted for publication.
Well, here is the paper in full glory, published in Phys. Rev. Lett., no less! A commentary by Clifford Will includes a link for a FREE DOWNLOAD of the paper.
Enjoy!
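For fun, here's a back-of-the-envelope check of the geodetic precession GP-B was after (my own estimate from the textbook formula, not the paper's analysis):

    # Geodetic (de Sitter) precession for a circular orbit, my own estimate:
    #   Omega = (3/2) * (G*M)^(3/2) / (c^2 * r^(5/2))
    import math

    GM = 3.986004e14          # Earth's gravitational parameter, m^3/s^2
    c  = 2.99792458e8         # speed of light, m/s
    r  = (6371 + 642) * 1e3   # Earth radius + GP-B altitude, in meters

    omega = 1.5 * GM**1.5 / (c**2 * r**2.5)                 # rad/s
    mas_per_year = omega * 3.156e7 * (180/math.pi) * 3.6e6  # milliarcsec/yr
    print(f"geodetic precession ~ {mas_per_year:.0f} mas/yr")
    # ~6600 mas/yr, in the same ballpark as the value reported in the paper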
BTW, especially for those who often think that physicists only try to uphold the current understanding, read the last paragraph of Will's article:
Even though it is popular lore that Einstein was right (I even wrote a book on the subject), no such book is ever completely closed in science. As we have seen with the 1998 discovery that the universe is accelerating, measuring an effect contrary to established dogma can open the door to a whole new world of understanding, as well as of mystery. The precession of a gyroscope in the gravitation field of a rotating body had never been measured before GP-B. While the results support Einstein, this didn’t have to be the case. Physicists will never cease testing their basic theories, out of curiosity that new physics could exist beyond the “accepted” picture.
This is very much in line with what I said earlier. It is also the reason why, for many physicists, not finding the Higgs would be even more exciting than finding it.
So there!
Zz.