After hearing and assimilating this program, the listener will be better able to: increase his/her basic knowledge of important advances in medicine; identify a broad range of clinical research reported in the medical literature; synthesize research findings through one-on-one interviews with authors and editorialists; integrate new treatments reviewed in the summaries into current practice; and challenge oneself with thoughtful, clinically relevant questions. GUIDELINE WATCH: PREVENTING PRIMARY BREAST CANCER IN WOMEN AT RISK The United States Preventive Services Task Force has updated its recommendations on medications for reducing the risk for primary breast cancer in women. Details appear in the November 19, 2013 Annals of Internal Medicine ( http://dx.doi.org/10.7326/0003-4819-159-10-201311190-00718 ). This recommendation helps guide primary care clinicians on when to implement a risk-modification strategy for asymptomatic women age 35 or older. Selective estrogen-receptor modulators (namely, tamoxifen and raloxifene [trade name: Evista]) lower the incidence of invasive breast cancer by up to 9 cases per 1000 women during 5 years. Tamoxifen seems to be more effective and has been approved for women older than 35; raloxifene has been approved for postmenopausal women. Aromatase inhibitors have not been approved by the Food and Drug Administration and were not covered in this recommendation, but in a recent randomized trial, anastrozole was reported to be more effective than selective estrogen-receptor modulators in women at high risk ( www.jwatch.org/na33156 ). So what are the key points of these recommendations? Risk can be estimated by one of several tools, including the Breast Cancer Risk Assessment Tool ( www.cancer.gov/bcrisktool ). Clinicians should engage in shared decision making with women who are at excess risk for breast cancer about available medications that might reduce their risk. 
For women who have a high risk for breast cancer and a low risk for adverse medication effects, clinicians should offer to prescribe tamoxifen or raloxifene. Specifically, the task force believes that women with a 5-year risk for breast cancer of 3% or higher are more likely to benefit from than to be harmed by tamoxifen or raloxifene. Tamoxifen or raloxifene should not be prescribed for women who are not at excess risk. There are potential harms associated with therapy, namely: an excess risk for venous thromboembolic events (up to 7 per 1000 women during 5 years); this risk increases with age and is higher with tamoxifen than with raloxifene. Tamoxifen, but not raloxifene, is associated with an excess risk for endometrial cancer (4 additional cases per 1000 women who take tamoxifen). This update reaffirms the statement issued by the task force back in 2002, which recommends discussions with and treatment for women at a high risk for breast cancer and a low risk for medication side effects ( www.jwatch.org/jw200207190000001 ). Shared decision making will be key to helping patients determine whether chemoprevention of breast cancer is in their best interest. Most women are not at excess risk for breast cancer and, as noted by the authors, only a subset of at-risk women will benefit from starting therapy with tamoxifen or raloxifene. ARE DIETARY FODMAPs A CAUSE OF IRRITABLE BOWEL SYNDROME? The idea that dietary constituents called FODMAPs (an acronym for fermentable oligo-, di-, and monosaccharides and polyols) might be responsible for some cases of irritable bowel syndrome is gaining traction. FODMAPs are poorly absorbed, short-chain carbohydrates that include fructose, lactose, fructans (found in wheat), galactans, and polyol sweeteners. In a crossover study in the January 2014 issue of Gastroenterology ( http://dx.doi.org/10.1053/j.
gastro.2013.09.046 ), researchers in Australia randomized 30 patients who met the criteria for irritable bowel syndrome and 8 healthy controls to either a low-FODMAP diet (prepared by the researchers) or a so-called typical Australian diet for 3 weeks, followed by the opposite diet for another 3 weeks; the two diet periods were separated by a 3-week washout. The patients were blinded to the constituents of the two diets. At baseline, the average symptom score for the patients with irritable bowel syndrome was 36 (on a 100-point scale); the average scores decreased to 23 during the low-FODMAP period and increased to 45 during the typical-diet period, a highly significant difference. Regardless of the subtype of irritable bowel syndrome, the patients were more satisfied with stool consistency during the low-FODMAP diet. In the controls, symptom scores were low at baseline and did not change during either diet period. This is the first randomized trial to provide high-quality evidence that FODMAPs contribute to irritable bowel symptoms. One potential confounding dietary constituent is gluten, because a low-FODMAP diet (which eliminates wheat, rye, and barley because of their fructan content) is also low in gluten; but in a recent study by the same research group, FODMAPs, and not gluten, were likely responsible for gastrointestinal symptoms in patients without celiac disease but with perceived gluten sensitivity ( www.jwatch.org/na32277 ). Information on low-FODMAP diets ( www.med.monash.edu/cecs/gastro/fodmap ) is available from this research team's institution and from other sources ( http://fodmapliving.com/the-science/stanford-university-low-fodmap-diet ). Clinicians should consider recommending a low-FODMAP diet to patients with irritable bowel syndrome and abdominal bloating, flatus, and diarrhea. MORE EVIDENCE LINKS GUT MICROBIOME TO AUTISM The microbiome of the gut has been linked to several important diseases ( www.
jwatch.org/na33084 ), including autism. In a study in the December 19, 2013 issue of Cell ( http://dx.doi.org/10.1016/j.cell.2013.11.024 ), researchers present provocative evidence in favor of this hypothesis. The researchers studied a mouse model of autism: affected animals display behaviors similar to those of autistic people. These mice have a characteristically unusual microbiome and a compromised gut mucosal barrier, which leads to absorption and high blood levels of several gut bacterial metabolites. One of these metabolites is chemically related to a metabolite that has been reported to be elevated in the urine of children with autism ( http://dx.doi.org/10.3109/1354750x.2010.548010 ). Parenteral treatment of normal mice with another of these metabolites produced autistic behavior. Treating the autistic-model mice with probiotic Bacteroides fragilis reversed the abnormal microbiome, the high blood levels of the unique metabolites, and the behavioral abnormalities. If this mouse model of autism truly reflects a pathology similar to autism in people, these researchers might have identified yet another major human illness that is linked to the gut microbiome. Plus, the identification of two specific metabolites that induce autistic behavior could provide molecular targets for therapy. Finally, although it seems too good to be true, the suggestion that probiotic therapy might cure autism would surely be remarkable if it proves to be valid. PROGRESS TOWARD ALL-ORAL, INTERFERON-FREE THERAPY FOR HEPATITIS C VIRUS GENOTYPE 1 INFECTION Right now, clinicians still need interferon to treat patients with hepatitis C virus genotype 1 infections. But in two studies in the January 16, 2014 New England Journal of Medicine, researchers evaluated interferon-free regimens in patients with hepatitis C. 
In the first study ( http://dx.doi.org/10.1056/nejmoa1306218 ), researchers randomized 200 treatment-naïve or treatment-experienced patients (most with genotype 1 infections) to oral daclatasvir plus oral sofosbuvir, either with or without ribavirin; treatment lasted between 3 months and 6 months. Sustained viral response rates at 3 months posttreatment ranged from 90% to 100% (depending on patient subgroup and regimen). Recently, the Food and Drug Administration approved sofosbuvir (trade name: Sovaldi) for the treatment of chronic hepatitis C ( www.accessdata.fda.gov/drugsatfda_docs/appletter/2013/204671Orig1s000ltr.pdf ). In the second study ( http://dx.doi.org/10.1056/NEJMoa1306227 ), researchers followed nearly 600 treatment-naïve or treatment-experienced patients with genotype 1 hepatitis C virus infections who got various combinations of three oral drugs that are under development (namely, ABT-450, ABT-267, and ABT-333) plus ribavirin for 2 months to 6 months. Sustained viral response rates ranged from 80% to 100%, again depending on patient subgroup and regimen. These all-oral, direct-antiviral regimens for hepatitis C virus genotype 1-positive patients achieved high sustained virologic response rates with short durations of therapy. In both of these studies, resistance was rare, and side effects (like fatigue, headache, and nausea) were mild. Plus, none of the traditional predictors of negative response had a significant effect on treatment response. Soon, all-oral regimens for every patient with hepatitis C will be a reality. EARLY vs. DELAYED CHOLECYSTECTOMY FOR ACUTE CHOLECYSTITIS Years ago, cholecystectomy was often delayed for weeks in patients with acute cholecystitis; the rationale was that delayed surgery, after antibiotic therapy had reduced inflammation, would be easier and safer than early surgery. More recently, it has been suggested that early surgery is in fact preferable. 
Two studies in recent issues of the Annals of Surgery shed additional light on this issue. In the first study ( http://dx.doi.org/10.1097/SLA.0b013e3182a1599b ), researchers in Germany randomized 618 patients with acute cholecystitis either to immediate laparoscopic cholecystectomy (no more than 24 hours after hospital admission) or to antibiotic therapy and delayed surgery (between 7 days and 45 days after presentation). Patients who were critically ill and those with perforation or abscess were excluded. During 2.5 months of follow-up, the immediate-surgery group had significantly fewer complications and less overall morbidity and spent fewer days in the hospital. In the second study ( http://dx.doi.org/10.1097/SLA.0b013e3182a5cf36 ), researchers in Canada performed a population-based, retrospective analysis of all of the patients admitted to Ontario hospitals with acute cholecystitis between 2004 and 2011. Through propensity matching, the researchers compared 7000 patients who underwent early cholecystectomy (within a week after presentation) with 7000 patients who underwent later surgery (the median time to their surgeries was 2 months). At 6 months, major bile duct injury was significantly less common in the early-surgery group, and total days of hospitalization were fewer. As in the first study, critically ill patients were excluded. These studies confirm that most patients with acute cholecystitis should undergo early cholecystectomy. This approach provides more-rapid relief of their symptoms, minimizes their complications, and shortens their total hospital length of stay. UNCOMPLICATED ACUTE DIVERTICULITIS: INPATIENT OR OUTPATIENT TREATMENT? Frequently, clinicians treat patients who have acute diverticulitis with oral antibiotic therapy in outpatient settings. 
In a multicenter study in the January 2014 Annals of Surgery ( http://dx.doi.org/10.1097/SLA.0b013e3182965a11 ), researchers in Spain sought to confirm the safety of this practice in 130 patients who presented to emergency departments with acute, uncomplicated diverticulitis confirmed by computed tomography. Patients with abscesses or peritonitis and patients who were not able to tolerate oral intake were excluded. After getting an initial dose of intravenous antibiotics (either amoxicillin/clavulanate or ciprofloxacin plus metronidazole) in the emergency department, the patients were randomized either to outpatient oral antibiotic treatment or to inpatient treatment with additional IV antibiotics for 36 hours to 48 hours. During 2 months of follow-up, treatment failure (that is to say, hospitalization for persistent symptoms or complications) was seen in three outpatients and four inpatients. Evaluations of quality of life in the two groups were similar at 2 weeks and 2 months. This study confirms the safety of what has already become common practice: the outpatient treatment of patients with acute uncomplicated diverticulitis. Even though all of the patients in this study got a single initial dose of intravenous antibiotics, these results can be extrapolated to fully oral outpatient antibiotic regimens. IS COLONOSCOPY NECESSARY AFTER ACUTE UNCOMPLICATED DIVERTICULITIS? Many clinicians believe that all patients with acute diverticulitis should undergo colonoscopy to rule out cancer after the episode has subsided. To see whether evidence supports this practice, researchers performed a meta-analysis of 11 studies in which nearly 2000 patients with radiologically confirmed diverticulitis underwent colonoscopy, most within several months after their episodes of diverticulitis. Findings appear in the February 2014 Annals of Surgery ( http://dx.doi.org/10.1097/SLA.0000000000000294 ). Colorectal cancer was diagnosed in 1% of the patients overall. 
Eight studies classified diverticulitis cases as complicated or uncomplicated; of the 1600 patients in this subset of studies, 80 had complicated diverticulitis. The incidence of colorectal cancer was 0.3% in the uncomplicated cases, but almost 8% in the complicated ones. The incidence of colorectal cancer in patients with uncomplicated diverticulitis is very low and similar to the incidence in the background asymptomatic population. So, the researchers reasonably conclude that colonoscopy is not necessary after radiologically proven uncomplicated diverticulitis (unless the patient is otherwise due for colorectal cancer screening). In contrast, patients who have complications of diverticulitis (like refractory symptoms or abscess formation) should undergo colonoscopy. ARE MULTIVITAMIN AND MINERAL SUPPLEMENTS USEFUL? Do vitamin and mineral supplements really prevent disease? In two clinical trials and a meta-analysis in the December 17, 2013 Annals of Internal Medicine, researchers evaluated the efficacy of dietary supplements. In the first study ( http://annals.org/article.aspx?articleid=1789248 ), researchers evaluated whether oral multivitamins prevent adverse cardiovascular events in patients with histories of myocardial infarction. Nearly 2000 patients with an average age of 65 were randomized either to a 28-component, high-dose multivitamin and mineral supplement or to placebo. Only half of the patients adhered to the study preparations for at least 3 years. After a median follow-up of almost 5 years, the incidences of recurrent adverse cardiovascular events were similar in the two groups (at about 30%). In a subgroup analysis of patients who did not take statins at baseline, event rates were lower in the supplement group than in the placebo group. There was no evidence of harm with vitamin use. In the second study ( http://annals.org/article.aspx?articleid=1789250 ), researchers from the Physicians' Health Study II ( www. 
jwatch.org/jw201211150000001 ) examined the effects of multivitamin supplementation on cognitive function later in life. Male doctors 65 or older were randomized to either multivitamin or placebo supplements every day for an average of nearly 9 years; there was no difference in change in cognitive function, as measured by five different tests of cognition. Finally, in an analysis for the United States Preventive Services Task Force ( http://annals.org/article.aspx?articleid=1767855 ), researchers conducted a systematic review of studies that involved vitamin and mineral supplements for the primary prevention of cardiovascular disease, cancer, or all-cause mortality among healthy patients. There was no consistent evidence that suggested a benefit from supplements. But the findings were limited by the small number of fair- and good-quality studies available for analysis of supplements other than β-carotene or vitamin E. For vitamin E, the researchers found good evidence of a null effect, whereas β-carotene was strongly associated with excess lung cancers and death among the patients whose risk for lung cancer was high. In a strongly worded editorial ( http://annals.org/article.aspx?articleid=1789253 ), writers summarize the findings of these three studies by stating, "Most supplements do not prevent chronic disease or death, their use is not justified, and they should be avoided." They do leave open the potential for a small benefit or harm in certain patient subgroups. RESPONSE TO ORAL VITAMIN D SUPPLEMENTATION IN OBESE ADULTS On average, obese patients have lower serum vitamin D levels and need higher doses of supplemental vitamin D to correct deficiency than do patients who are not obese. To examine dose-response effects, researchers randomized 60 adults with a high body-mass index (between 30 kg/m2 and 58 kg/m2) to 1000 IU, 5000 IU, or 10,000 IU of oral vitamin D3 every day for 5 months. 
The study was conducted during winter months in Nebraska (when skin synthesis of vitamin D is minimal). At baseline, the average serum hydroxyvitamin D level was 23 ng/mL. Details appear in the December 2013 Journal of Clinical Endocrinology and Metabolism ( http://dx.doi.org/10.1210/jc.2012-4103 ). There was a dose-response effect. Average serum hydroxyvitamin D levels increased by 12 ng/mL in the 1000 IU group; by 28 ng/mL in the 5000 IU group; and by 48 ng/mL in the 10,000 IU group. But the patients varied widely in individual responses: the ranges of increase in serum vitamin D in the three dosing groups were 2 ng/mL to 39 ng/mL in the 1000 IU group; 13 ng/mL to 46 ng/mL in the 5000 IU group; and 16 ng/mL to 83 ng/mL in the 10,000 IU group. The incremental response to a given vitamin D dose varied inversely with body-mass index, but vitamin D dose was more important than body-mass index in predicting the response to supplementation. This study provides information on the response to vitamin D supplementation in patients with obesity. The wide range of individual responses might reflect genetic variability in binding proteins and in vitamin D hydroxylation. In comparing these results to past findings in patients without obesity, the researchers estimate that the response to a given vitamin D dose is roughly 30% lower in obese than in nonobese patients, presumably because vitamin D is diluted in body tissue mass. VITAMIN E SUPPLEMENTATION FOR MILD-TO-MODERATE ALZHEIMER DISEASE Back in 1997, it was shown that in patients with moderately severe Alzheimer disease, high-dose vitamin E supplementation (2000 IU/day) conferred a modest benefit that was reflected mainly in fewer patients needing institutionalization and a slower decline in performing activities of daily living ( www.jwatch.org/wh199705010000008 ). 
Now, in a placebo-controlled study in the January 1, 2014 issue of JAMA ( http://dx.doi.org/10.1001/jama.2013.282834 ), researchers randomized 600 United States veterans (nearly all were men) with mild-to-moderate Alzheimer disease to 2000 IU/day of vitamin E, memantine (trade name: Namenda), both, or neither. All of the patients were taking an acetylcholinesterase inhibitor at the time of enrollment. The primary outcome was decline in a 78-point activities-of-daily-living score (the average baseline score was 57). During an average follow-up of more than 2 years, the decline was significantly smaller in the vitamin E-only group than in the placebo group; there was no significant benefit with combination therapy or memantine alone. For secondary outcomes like cognitive decline and psychological and behavioral symptoms, none of the groups benefited, compared with placebo. The effect of vitamin E in this study was not dramatic, and why only vitamin E alone (but not vitamin E plus memantine) conferred benefit is not clear. Even so, we now have two studies in which vitamin E outperformed placebo in patients with Alzheimer disease, without apparent harm. So offering this treatment seems reasonable as long as patients and families understand that the potential benefit is modest. Finally, although a meta-analysis from 2005 showed increased mortality with vitamin E supplementation at doses of 400 IU/day or higher ( http://annals.org/article.aspx?articleid=718049 ), no excess mortality was seen in three studies of 2000 IU/day for patients with chronic neurological disorders: this study, the Alzheimer study from 1997, and a Parkinson disease study from 1998 ( http://dx.doi.org/10.1002/ana.410430309 ). ENDURING EFFECT OF COGNITIVE TRAINING IN OLDER ADULTS The so-called ACTIVE study was started in 1998 to see whether cognitive training could improve outcomes in nearly 3000 older patients without significant cognitive deficits at baseline. 
The patients, who were older than 65, were randomized to undergo memory training, reasoning training, speed-of-processing training, or no intervention. Training consisted of 10 hour-long small-group training sessions during a 5- to 6-week period; the patients were periodically reevaluated during 10 years of follow-up. At 10 years, those who underwent reasoning and speed-of-processing training had persistent, statistically significant, small-to-moderate improvements in their respective domains compared with the controls; in contrast, those in the memory-training group experienced no long-term improvement in memory. All three of the treatment groups reported significantly less difficulty with activities of daily living than did the control group. Details appear in the January 2014 Journal of the American Geriatrics Society ( http://dx.doi.org/10.1111/jgs.12607 ). Tai chi, a mind-body activity, is another increasingly popular approach to address both physical and cognitive decline in older patients. Researchers conducted a meta-analysis ( http://dx.doi.org/10.1111/jgs.12611 ) of 11 randomized trials of tai chi in which cognition was evaluated and found that tai chi (compared with control interventions or no intervention) improved cognition moderately in both normal and impaired older patients. The duration of these studies was generally short (roughly a year or less). The enduring effect of the reasoning and speed-of-processing interventions in the ACTIVE study is somewhat surprising, but the results are encouraging for proponents of cognitive training in older patients. Although nearly half of the patients in this trial died, withdrew, or were lost to follow-up by 10 years, attrition was similar in the treatment and the control groups. As for tai chi, its apparent cognitive and physical benefits with no obvious adverse effects make it an attractive approach. 
GUIDELINE WATCH: TREATING ANEMIA IN PATIENTS WITH HEART DISEASE In a clinical practice guideline on treating anemia in patients with heart disease (targeted for primary care physicians and cardiologists), the American College of Physicians recommends that transfusions and erythropoiesis-stimulating agents be restricted to patients with more-severe anemia. Details appear in the December 3, 2013 Annals of Internal Medicine ( http://annals.org/article.aspx?articleid=1784292 ). We know that treatment of anemia in patients with heart disease includes erythropoiesis-stimulating agents, red blood cell transfusions, and iron replacement, but we do not know whether these treatments improve patient outcomes. This new guideline is based on a systematic review of evidence on the benefits and harms of these treatments in patients with congestive heart failure or coronary heart disease ( http://annals.org/article.aspx?articleid=1784290 ). There is only low-quality evidence on the value of treatment with red blood cell transfusions. No short-term mortality benefit was found for liberal red blood cell transfusion vs. more restrictive transfusion (a hemoglobin trigger level of greater than 10 g/dL vs. between 8 g/dL and 9 g/dL) in medical and surgical patients with anemia and heart disease. The aggressive treatment of anemia with red blood cell transfusions does not benefit, and might harm, patients with acute coronary syndrome or myocardial infarction or those who are undergoing percutaneous coronary interventions. There is moderate- to high-quality evidence on the value of treatment with erythropoiesis-stimulating agents. Among anemic patients with stable congestive heart failure, using erythropoiesis-stimulating agents does not lower all-cause mortality or the risk for adverse cardiovascular events, but might be associated with harms, like venous thromboembolism. So what are the recommendations? 
Clinicians are mildly advised to use a restrictive red blood cell transfusion strategy (with a trigger hemoglobin threshold of between 7 g/dL and 8 g/dL) in hospitalized patients with anemia and coronary heart disease. Clinicians are strongly advised not to use erythropoiesis-stimulating agents in patients with mild-to-moderate anemia and congestive heart failure or coronary heart disease. Often, patients with heart disease have anemia, and treating them aggressively seems like the best choice on an intuitive level. But this clinical practice guideline emphasizes that more (for example, red blood cells) is not always better and clarifies how and when clinicians should intervene. DIETARY FIBER INTAKE IS ASSOCIATED INVERSELY WITH CV RISKS High dietary fiber intake is associated with a lower risk for coronary heart disease. But it is not clear which types of fiber are protective, and whether there is a dose-response effect. In a meta-analysis on the website of the British Medical Journal ( http://dx.doi.org/10.1136/bmj.f6879 ), researchers examined 20 observational cohort studies to evaluate the associations between intakes of various dietary fiber types and the risks for first coronary heart disease and cardiovascular disease events. Every 7-g daily increase in total dietary fiber intake (like the amount found in about a cup of bran flakes or raw green peas or 2 apples) was significantly associated with a lower risk for coronary heart disease and cardiovascular disease events. The findings were similar for each of the various types of fiber (namely, soluble, insoluble, vegetable, fruit, and cereal), although the lowered risk just missed statistical significance for some subgroups. In this study, higher dietary fiber intakes from various sources were associated with lower risks for coronary heart disease and cardiovascular disease in a dose-response pattern. 
In terms of recommended total dietary fiber intakes, women should aim for 25 g/day and men should aim for 38 g/day. Finally, although these observational studies adjusted for confounding variables to some extent, the possibility of confounding remains (for example, high dietary fiber intake is likely associated with other healthy behaviors). ADDING BUPROPION TO VARENICLINE DOES NOT IMPROVE LONG-TERM SMOKING ABSTINENCE Theoretically, treating tobacco dependence with a combination of smoking-cessation drugs could result in higher abstinence rates, compared with monotherapy. To evaluate this possibility, researchers randomized 500 patients with an average age of 42 who smoked at least 10 cigarettes/day for at least 6 months (the average smoking duration was 23 years) to varenicline (trade name: Chantix; titrated to a maximum dose of 1 mg twice/day) plus either bupropion SR (trade name: Wellbutrin SR; titrated to a maximum dose of 150 mg twice/day) or placebo. None of the patients had serious medical or psychiatric illnesses, and all of them got regular behavioral counseling during the yearlong study. About two thirds of the patients in each group completed the program. Findings appear in the January 8, 2014 issue of JAMA ( http://dx.doi.org/10.1001/jama.2013.283185 ). Compared with the monotherapy patients, the combination-therapy patients had significantly higher prolonged abstinence rates at 3 months and 6 months, but not at a year; among those who smoked 20 or more cigarettes/day, however, the prolonged abstinence rate was significantly higher at a year. The combination-therapy patients were also significantly more likely than were the monotherapy patients to experience anxiety and depressive symptoms. Combination therapy with varenicline and bupropion SR for tobacco dependence did not provide a significant long-term benefit overall, although it might be appropriate for heavy smokers. 
The psychiatric symptoms reported in this study have to be factored into the harm-benefit equation, although past studies have shown no excess of psychiatric symptoms with either bupropion or varenicline ( www.jwatch.org/na32576 and www.jwatch.org/na32331 ). As is true for most studies of smoking cessation, the dropout rate was high (about 40%). PROGRAM LINKED TO PROFESSIONAL SOCCER TEAMS HELPS MEN LOSE WEIGHT Men are poorly represented in commercial weight-loss programs and clinical trials of weight-loss interventions. In a study on the website of The Lancet ( http://dx.doi.org/10.1016/S0140-6736(13)62420-4 ), researchers in Scotland enrolled 750 overweight and obese men between the ages of 35 and 65 into a weight-loss and healthy-living program. The patients' body-mass indices were 28 kg/m2 or higher, and they were recruited through Scottish professional football (soccer) league clubs to participate in a regimen of dietary advice and physical activity, with context, content, and delivery style specifically tailored to men. The program consisted of 12 weekly sessions conducted by team-employed community coaches in home-team stadiums using team-branded materials and peer-support activities that encouraged male banter; this was followed by a 9-month weight-loss maintenance phase that involved e-mails and group reunions at team clubs. The patients were randomized to get the intervention immediately (the intervention group) or after waiting a year (the comparison group). At baseline assessment, participants in the comparison group received information about their weight and BMI, an advice booklet, and information about the program from coaches. At 3 months and 12 months after participants were randomized to one of the groups, average weight loss was significantly greater in the intervention group than in the comparison group. 
Other differences between the two groups at a year significantly favored the intervention patients, namely, waist circumference; percent body fat; blood pressure; and self-reported physical activity, dietary behaviors, psychological health, and physical health-related quality of life. Interventions designed to change health-related behaviors in specific populations might be more effective if they are delivered by trusted figures in supportive settings outside of traditional health care environments. Similar outside-the-box programs tailored to other groups and behaviors merit rigorous investigation. MEDICAID EXPANSION IN OREGON LED TO MORE SHORT-TERM ED USE Back in 2008, Oregon expanded its Medicaid program to people with income below the federal poverty level and less than US$2000 in assets. Because funds were not available to cover all of the residents who qualified, a lottery was conducted to select 25,000 people to get Medicaid coverage. In a study in the January 17, 2014 issue of Science ( http://dx.doi.org/10.1126/science.1246183 ), researchers explored the effects of Medicaid coverage. A year and a half after the lottery, the use of emergency departments among those who got Medicaid was 40% greater than among those who did not. In subgroup analyses, the difference in use was statistically significant among the patients deemed to require immediate evaluation in primary care settings (not in emergency departments) and among those who did not need immediate evaluation at all. For the subgroups with truly emergent conditions, the difference was marginally significant or not significant. Essentially, this Medicaid lottery system created a randomized trial. Past studies of this same trial reported that outpatient visits were more frequent in the covered group, but that hospital admissions were the same in the two groups ( www.jwatch.org/jw201305090000001 ). 
Multiple measures have suggested improvements, and no measures have suggested reductions, in the quality of care. This study accords with economic theory in suggesting that eliminating financial risk encouraged increased care-seeking. But why did these patients go to the emergency department and not to a primary care office? Despite the earlier finding that coverage increased outpatient use, many of these newly insured patients probably had not yet established relationships with primary care clinicians. If so, the excess use of emergency departments will attenuate with time. MORE EVIDENCE THAT MENISCAL TEARS MIGHT NOT REQUIRE SURGERY It has been shown that patients with coexisting meniscal tears and osteoarthritis who were treated with physical therapy alone or with arthroscopic repair followed by physical therapy have similar outcomes ( www.jwatch.org/jw201303280000003 ). In a study in the December 26, 2013 New England Journal of Medicine ( http://dx.doi.org/10.1056/NEJMoa1305189 ), researchers in Finland randomized 150 patients between the ages of 35 and 65 with knee pain consistent with nontraumatic meniscal tears and no osteoarthritis to either arthroscopic partial meniscectomy or sham arthroscopy. The patients were followed for 1 year. All of the patients underwent the same postoperative care, including a graduated exercise program. During follow-up, both groups showed marked improvements in knee pain-related scores after exercise and a year after surgery, and there were no significant differences between the two groups. Of the seven patients who underwent additional surgeries because of persistent symptoms, two were in the meniscectomy group and five were in the sham group, but this difference was not statistically significant. This study adds to growing evidence that patients with nontraumatic meniscal tears (with or without osteoarthritis) probably do not need arthroscopic surgery, at least not initially. 
Time and physical therapy might yield similar outcomes with much less expense. NOVEL APPROACH TO OBSTRUCTIVE SLEEP APNEA Because many patients with obstructive sleep apnea do not tolerate continuous positive airway pressure or mandibular advancement appliances, alternatives are needed. One novel approach is an upper-airway stimulation device, in which an electrode is surgically placed on the hypoglossal nerve: When the nerve is stimulated, the tongue protrudes and airway patency is improved. Another sensing lead, implanted into the chest wall, allows synchronization between the respiratory cycle and hypoglossal nerve stimulation during sleep (for NEJM subscribers: http://www.nejm.org/action/showImage?doi=10.1056%2FNEJMoa1308659&iid=f01 ). In a study in the January 9, 2014 New England Journal of Medicine ( http://dx.doi.org/10.1056/NEJMoa1308659 ), researchers tested this device in 130 patients with moderate-to-severe obstructive sleep apnea (with a median apnea-hypopnea index of 29 events/hour) who were not able to adhere to continuous positive airway pressure. After a year with the device, the median apnea-hypopnea index fell to 9 events/hour, and median Epworth Sleepiness Scale scores significantly decreased. Eighty patients responded (meaning that their apnea-hypopnea index decreased by at least half, to fewer than 20 events/hour), and 50 responders were randomized to have their devices turned on or off during 1 week. Apnea-hypopnea indices reverted back to baseline in the device-off group, but not in the device-on group. Two patients needed surgical repositioning of the stimulator, and 40% reported discomfort associated with stimulation. This device improved obstructive sleep apnea in most (although certainly not all) of the patients in this study. But many exclusion criteria limit the study's generalizability; for example, patients with a body-mass index greater than 32 kg/m² were excluded. 
Because this approach involves an invasive procedure, long-term efficacy and safety are key issues needing longer follow-up. A Food and Drug Administration review of the device begins this month. This study was industry-supported. A 25-MINUTE DELAY IN SCHOOL START TIME MIGHT HELP TEENS Chronic sleep deficit in adolescents has been associated with crashes related to drowsy driving; obesity; cardiovascular disease; and impaired mood, attention, memory, and executive function. Inadequate sleep has also been associated with worse academic performance and less motivation to learn. Early start times in schools might contribute to sleep deprivation. By studying a modest change to school start time in a coeducational boarding high school, researchers eliminated environmental variables that could affect sleep (like morning and evening routines, sleeping environments, and after-school employment). Details appear in the January 2014 Journal of Developmental and Behavioral Pediatrics ( http://dx.doi.org/10.1097/DBP.0000000000000018 ). Before and during the winter semester, 200 adolescents (with an average age of 16) completed the standardized school sleep habits survey. During the intervention (when the usual school start time of 8:00 am was experimentally delayed 25 minutes), average school-night bedtimes did not significantly change, but average school-night sleep duration significantly increased (by half an hour). The percentage of students who received 8 or more hours of sleep increased from 20% to 40%. Daytime sleepiness, depressive symptoms, and caffeine use were significantly lower. Among 70 students who completed the survey after the earlier start time was reinstated, sleep duration returned to baseline. Typically, studies that linked longer sleep time with later school start times used a change of an hour or more. The current results support the benefit of a more modest delay to improve sleep duration, daytime mood/sleepiness, and caffeine use. 
The researchers did not evaluate academic achievement. TEEN DRIVERS SHOULD NOT MULTITASK AT THE WHEEL Everyone knows that a teenager with a new driver's license and a mobile phone is a car accident waiting to happen. But is the main problem a lack of driving experience or a tendency toward distracting behaviors? Are specific activities particularly problematic? To find out, researchers installed data-collection devices (namely, cameras, accelerometers, forward radar, global positioning systems, and lane trackers) in cars belonging to 40 newly licensed teen drivers who were 16 years old and 100 experienced drivers between the ages of 18 and 72. Details appear in the January 2, 2014 New England Journal of Medicine ( http://dx.doi.org/10.1056/NEJMsa1204142 ). During a year and a half of observation, there were 30 crashes and 140 near-crashes (for which the drivers were at least partially responsible) among the newly licensed drivers; during a year, there were 40 crashes and almost 500 near-crashes among the experienced drivers. An analysis of video footage confirmed that, among the new teen drivers, reaching for a mobile phone, texting, dialing a mobile phone, reaching for something else in the car, looking at something on the roadside, and eating were all associated with a significantly higher risk for crashes or near-crashes. Among the experienced drivers, none of these activities, except for dialing a mobile phone, was risky (with the caveat that texting time was not measured in this group). In both groups, talking on the phone, drinking a beverage, and adjusting the car radio, heater, or other device were not associated with an elevated risk. Although multitasking increased among the new drivers as time went on, their overall average rates of risky activities were no higher than those documented among the experienced drivers. These results validate standard advice: New drivers should concentrate on their driving and keep their eyes on the road. 
With experience, drivers can multitask a little more safely, although dialing a mobile phone remains risky, and, presumably, texting does as well. Meanwhile, the rates of near-crashes and actual crashes in this study are a sobering commentary on overall driving safety in the United States. CAUTION USING IGRAs FOR LATENT TB IN U.S. HEALTHCARE WORKERS Healthcare workers are an important target group for latent tuberculosis infection testing. Until recently, the only assay available was the tuberculin skin test, which is limited by low sensitivity, a subjective endpoint requiring two clinical visits, and cross-reactivity in patients who have been vaccinated with bacille Calmette-Guérin or have nontuberculous mycobacterial infections. Interferon-γ release assays do not share these problems. In a cross-sectional study in the January 1, 2014 American Journal of Respiratory and Critical Care Medicine ( http://dx.doi.org/10.1164/rccm.201302-0365OC ), researchers examined the performance of interferon-γ release assays among more than 2000 healthcare workers who were undergoing serial testing for latent TB infection; they compared the results of the tuberculin skin test with two commercially available interferon-γ release assays, the QuantiFERON-TB Gold In-Tube and the T-SPOT.TB. Among the participants with no past positive tests or treatments for TB, positive baseline tests were significantly less likely with the tuberculin skin test than with the interferon-γ release assays; test conversion was also significantly less likely with the tuberculin skin test than with the interferon-γ release assays. There was no association between reported TB exposure and conversion on any test. Finally, among the participants who had repeat interferon-γ release assay testing 7 days to 21 days after tuberculin skin testing, about 10% showed boosted responses on the interferon-γ release assays. 
The researchers conclude that most newly positive interferon-γ release assays at baseline and follow-up were falsely positive and that the rates of false-positives were much higher for interferon-γ release assays than for tuberculin skin testing. As they note, these findings suggest that the specificity of interferon-γ release assays in healthcare workers at low risk for new tuberculosis infections is lower than previously reported. Interferon-γ release assays should be used with caution in this group.