r170303a_FOMC | united states | 2017-03-03 | From Adding Accommodation to Scaling It Back | yellen | is_gov: 1
I am pleased to join you today to discuss the U.S. economy and the Federal Reserve's monetary policy. I strongly believe that my colleagues and I should explain, as clearly as we can, both the reasons for our decisions and the fundamental principles that underlie our strategy. Today I will review the conduct of monetary policy during the nearly 10 years since the onset of the financial crisis. Although the Federal Reserve's policy strategy for systematically pursuing its congressionally mandated goals of maximum employment and price stability has not changed during this period, the Federal Open Market Committee (FOMC) has made significant tactical adjustments along the way. I will spend most of my time today discussing the rationale for the adjustments the Committee has made since 2014, a year that I see as a turning point, when the FOMC began to transition from providing increasing amounts of accommodation to gradually scaling it back. The process of scaling back accommodation has so far proceeded at a slower pace than most FOMC participants anticipated in 2014. Both unexpected economic developments and deeper reevaluations of structural trends affecting the U.S. and global economies prompted us to reassess our views on the outlook and associated risks and, consequently, the appropriate stance of monetary policy, both in the near term and the longer run. Looking ahead, we continue to expect the evolution of the economy to warrant further gradual increases in the target range for the federal funds rate. However, given how close we are to meeting our statutory goals, and in the absence of new developments that might materially worsen the economic outlook, the process of scaling back accommodation likely will not be as slow as it was in 2015 and 2016. I should note that I will discuss the process of scaling back accommodation mostly from the perspective of our interest rate decisions, which my FOMC colleagues and I see as our primary tool for actively adjusting the stance of monetary policy when our actions are not constrained by the zero lower bound on short-term interest rates. In our monetary policy deliberations, the FOMC always faces two fundamental questions: First, how do we assess the current stance of monetary policy? Second, what are the strategic and tactical considerations that underpin our decisions about the appropriate stance of monetary policy going forward? These questions are difficult because the interactions between monetary policy and the economy are complex. Policy affects the economy through many different channels, and, in turn, many factors influence the appropriate course of policy. Gauging the current stance of monetary policy requires arriving at a judgment of what would constitute a neutral policy stance at a given time. A useful concept in this regard is the neutral "real" federal funds rate, defined as the level of the federal funds rate that, when adjusted for inflation, is neither expansionary nor contractionary when the economy is operating near its potential. In effect, a "neutral" policy stance is one where monetary policy neither has its foot on the brake nor is pressing down on the accelerator. Although the concept of the neutral real federal funds rate is exceptionally useful in assessing policy, it is difficult in practical terms to know with precision where that rate stands. As a result, and as I described in a recent speech, my colleagues and I consider a wide range of information when assessing that rate. 
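The stance assessment described above reduces to simple arithmetic: subtract inflation from the nominal federal funds rate to get the real rate, then compare that with an estimate of the neutral real rate. The following is a minimal sketch of that comparison; the numerical inputs and the 0.25-point threshold are illustrative placeholders chosen to be consistent with the figures cited later in these remarks (a real funds rate near minus 1 percent and a neutral rate near zero), not official estimates.

```python
# Minimal sketch of the policy-stance arithmetic described in the remarks above.
# All numbers and the +/-0.25 pp threshold are illustrative, not official estimates.

def real_rate(nominal_rate: float, inflation: float) -> float:
    """Real federal funds rate = nominal rate minus inflation (percentage points)."""
    return nominal_rate - inflation

def policy_stance(nominal_rate: float, inflation: float, neutral_real_rate: float) -> str:
    """Compare the real funds rate with an estimated neutral real rate."""
    gap = real_rate(nominal_rate, inflation) - neutral_real_rate
    if gap < -0.25:
        return f"accommodative (gap {gap:+.2f} pp)"
    if gap > 0.25:
        return f"restrictive (gap {gap:+.2f} pp)"
    return f"roughly neutral (gap {gap:+.2f} pp)"

# Illustrative inputs: nominal funds rate ~0.75%, inflation ~1.75%, neutral real rate ~0%.
print(policy_stance(nominal_rate=0.75, inflation=1.75, neutral_real_rate=0.0))
# -> accommodative (gap -1.00 pp)
```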
As I will discuss, our assessments of the neutral rate have significantly shifted down over the past few years. In the Committee's most recent projections last December, most FOMC participants assessed the longer-run value of the neutral real federal funds rate to be in the vicinity of 1 percent. This level is quite low by historical standards, reflecting, in part, slow productivity growth and an aging population not only in the United States, but also in many advanced economies. Moreover, the current value of the neutral real federal funds rate appears to be even lower than this longer-run value because of several additional headwinds to the U.S. economy in the aftermath of the financial crisis, such as subdued economic growth abroad and perhaps a lingering sense of caution on the part of households and businesses in the wake of the trauma of the Great Recession. It is difficult to say just how low the current neutral rate is because assessments of the effect of post-recession headwinds on the current level of the neutral real rate are subject to a great deal of uncertainty. Some recent estimates of the current value of the neutral real federal funds rate stand close to zero percent. With the actual value of the real federal funds rate currently near minus 1 percent, a near-zero estimate of the neutral real rate means that the stance of monetary policy remains moderately accommodative, an assessment that is consistent with the fact that employment has been growing at a pace--around 180,000 net new jobs per month--that is notably above the pace estimated to be consistent with the longer-run trend in labor force growth. As I will explain, this policy stance seems appropriate given that the underlying trend in inflation appears to be still running somewhat below 2 percent. But as that gap closes, with labor market conditions now in the vicinity of our maximum employment objective, the Committee considers it appropriate to move toward a neutral policy stance. My colleagues and I generally anticipate that the neutral real federal funds rate will rise to its longer-run level over the next few years. This expectation partly underlies our view that gradual increases in the federal funds rate will likely be appropriate in the months and years ahead: Those increases would keep the economy from significantly overheating, thereby sustaining the expansion and maintaining price stability. I will now examine the strategic and tactical considerations that go into FOMC deliberations by discussing past monetary policy decisions in the context of our mandate from the Congress to pursue maximum employment and price stability. The FOMC's monetary policy strategy is based on three basic principles. First, our monetary policy must be goal driven. We must take care to ensure that our decisions over time are consistent with our commitment to achieve the Federal Reserve's congressionally mandated goals of maximum employment and price stability, and that the public understands and has confidence in that commitment. Second, our monetary policy must be forward looking because our decisions tend to influence economic activity and inflation with a substantial lag. Among other things, this implies looking through short-term and transitory developments and focusing on the medium-term outlook--roughly two or three years out--when making policy decisions. Third, our monetary policy must be risk sensitive. 
Because the outlook is uncertain, we must assess appropriate policy with an eye toward the risk that our expectations about the economy turn out to be significantly wrong. We have followed this basic strategy for decades and, in 2012, the FOMC formalized it in its Statement on Longer-Run Goals and Monetary Policy Strategy. The Committee has reaffirmed this commitment annually. But the challenges brought about by the financial crisis, and the very deep recession and painfully slow recovery that followed, compelled us to adjust our tactics for carrying out our policy strategy. In particular, once the Committee had cut the federal funds rate to near zero in late 2008, it became necessary to deploy new tools to supply the considerable monetary accommodation required by the extremely weak state of the job market and persistently low inflation. Those tools--especially our large-scale securities purchases and increasingly explicit forward guidance pertaining to the likely future path of the federal funds rate--enabled the Federal Reserve to provide necessary additional support to the U.S. economy by pushing down longer-term interest rates and easing financial conditions more generally. Much has been written and said already about the provision of additional accommodation between 2008 and 2014, when the FOMC completed its latest round of large-scale securities purchases, so I will turn now to our policy stance since 2014, when the FOMC's main focus started to shift from providing additional accommodation to scaling it back. By late 2013, the FOMC concluded that the economy had made sufficient progress, and the outlook was sufficiently favorable, that it should reduce the pace of its large-scale securities purchases. But we reiterated that these purchases would continue until the outlook for the labor market had improved substantially. The U.S. economy made notable progress toward the FOMC's statutory goals during 2014, with the unemployment rate dropping to close to 6 percent by mid-year--well below its Great Recession peak of 10 percent--and other measures of labor market conditions also showing improvement: Payroll gains were solid; job openings had risen significantly; and the number of workers voluntarily quitting their jobs--a sign of confidence in the labor market--was rising back toward pre-crisis levels. We were also seeing progress on achieving our price stability goal: Total inflation, as measured by changes in the headline personal consumption expenditures (PCE) price index, had moved up by mid-2014 after hovering around 1 percent in the fall of 2013. Inflation seemed to be moving toward the FOMC's 2 percent objective, a level that the FOMC judges to be consistent with price stability because it is low enough that it does not need to figure prominently into people's and businesses' economic decisions but high enough to serve as a buffer against deflation and provide greater scope for monetary policy to address economic weakness. The progress seen during 2014 indicated to the FOMC that it was no longer necessary to provide increasing amounts of support to the U.S. economy by continuing to add to the Federal Reserve's holdings of longer-term securities. Accordingly, the Committee continued to reduce the pace of asset purchases over the course of the year, ending its purchases in October. That step, however, did not mark an immediate shift toward tighter monetary policy because we also indicated then that we did not expect to raise interest rates for a considerable time after the end of our securities purchases. 
Moreover, as the Committee explained in a set of "normalization principles" issued that September, the intention was to maintain the overall size of the Federal Reserve's balance sheet at an elevated level until sometime after the FOMC had begun to raise its target for the federal funds rate. We decided that maintaining a highly accommodative stance of monetary policy remained appropriate because, while the U.S. economy was stronger and closer to meeting our statutory goals, we saw significant room for improvement. In particular, the unemployment rate still stood above our assessment of its longer-run normal level--that is, the unemployment rate that we expect to prevail when the economy is operating at maximum employment--and inflation remained below the 2 percent objective. Because my colleagues and I expected that labor market conditions would continue to improve and that inflation would move back to 2 percent over the medium term, we anticipated that the time was approaching when the economy would be strong enough that we should start to scale back our support. Indeed, in the FOMC's June 2014 projections, a majority of participants saw a higher federal funds rate as appropriate in the next calendar year. In contrast, only two participants in December 2013 thought that it would be appropriate to start raising that rate in the next calendar year. In 2015, the unemployment rate fell significantly faster than we generally had anticipated in 2014. However, a series of unanticipated global developments beginning in the second half of 2014--including a prolonged decline in oil prices, a sizable appreciation of the dollar, and financial market turbulence emanating from abroad--ended up having adverse implications for the outlook for inflation and economic activity in the United States, prompting the FOMC to remove monetary policy accommodation at a slower pace than we had anticipated in mid-2014. U.S. gross domestic product (GDP) growth generally surprised to the downside in 2015, reflecting, in part, weak economic activity abroad, the earlier appreciation of the dollar, and the effect of falling oil prices on business fixed investment. This unanticipated slowing in the pace of the economic recovery caused us to worry about the sustainability of ongoing improvements in employment and, thus, of likely progress toward our maximum employment goal. Our worry was reinforced by our assessment that, with the federal funds rate still near zero, there would likely be only limited scope for us to respond by lowering short-term rates if the weakening in economic activity turned out to be persistent. In contrast, if the weakening proved transitory and the economy instead began to overheat, threatening to push inflation to an undesirably high level, the FOMC would have ample scope to respond through tighter monetary policy. Inflation also was lower than expected, with headline PCE prices rising less than 1 percent over the course of 2015, instead of around 1-3/4 percent as we had anticipated in June 2014. Much of this shortfall reflected the effects of falling oil prices and the appreciation of the dollar. My colleagues and I typically look through the effects on inflation of fluctuations in oil prices and the dollar because these effects tend to be transitory. However, we became concerned in 2015 about the risk that part of the decline in inflation could prove to be longer lasting, especially given that inflation had already been running below our 2 percent objective for quite some time. 
These various considerations, along with our reassessment of longer-run economic conditions--which I will discuss shortly--explain why the Committee ended up raising the target range for the federal funds rate only 1/4 percentage point in 2015, substantially less than the full percentage point increase suggested by the median projection of FOMC participants reported in June 2014. The year 2016 also brought some unexpected economic developments that led us to proceed cautiously. During the first half of the year, mixed readings on the job market, along with additional disappointing data on real GDP growth, suggested again that progress toward the achievement of our maximum employment goal could be slowing markedly. Meanwhile, inflation hovered just below 1 percent as dollar appreciation continued to exert downward pressure on import prices, and financial market turbulence emanating from abroad--associated with concerns about the Chinese economy and the Brexit referendum--posed new risks to U.S. economic activity and inflation. Moreover, even as payroll gains turned solid again in the second half of 2016, the unemployment rate remained relatively flat, suggesting that perhaps there was more room for improvement in the job market than we had previously thought. Those unanticipated developments were part of the reason why the Committee again opted to proceed more slowly in removing accommodation than had been anticipated at the start of the year. We ended up increasing the target range for the federal funds rate by only 1/4 percentage point over the course of 2016, rather than the full percentage point increase suggested by our projections at the start of the year. The slower-than-anticipated increase in our federal funds rate target in 2015 and 2016 reflected more than just the inflation, job market, and foreign developments I mentioned. During that period, the FOMC and most private forecasters generally lowered their assessments of the longer-run neutral level of the real federal funds rate. Indeed, at our October 2015 meeting, the FOMC had a comprehensive discussion of neutral real interest rates and was impressed by the breadth of evidence suggesting that those rates had declined both here and abroad, and that the decline had been going on for some time. In response to this growing evidence, the median assessment by FOMC participants of the longer-run level of the real federal funds rate fell from 1-3/4 percent in June 2014 to about 1 percent in December 2016. These reassessments reflected, in part, the persistence of surprisingly sluggish productivity growth--both in the United States and abroad--and suggested that fewer federal funds rate increases would be necessary than previously thought to scale back accommodation. Partly in response to persistently slow wage growth, FOMC participants and private forecasters have in recent years lowered their estimates of the normal longer-run rate of unemployment. The median projection of FOMC participants of the longer-run level of the unemployment rate fell from about 5-1/4 percent in June 2014 to approximately 4-3/4 percent in December 2016. Other things being equal, a lower longer-run level of the unemployment rate suggests that the economy has greater scope to create jobs without generating too much inflation. Thus, the downward revisions to FOMC participants' views on the unemployment rate over the longer run contributed to our assessment that monetary policy could stay accommodative longer than we had anticipated in 2014. The U.S. 
economy has exhibited remarkable resilience in the face of adverse shocks in recent years, and economic developments since mid-2016 have reinforced the Committee's confidence that the economy is on track to achieve our statutory goals. Job gains have remained quite solid, and the unemployment rate, at 4.8 percent in January, is now in line with the median of FOMC participants' estimates of its longer-run normal level. On the whole, the prospects for further moderate economic growth look encouraging, particularly as risks emanating from abroad appear to have receded somewhat. The Committee currently assesses that the risks to the outlook are roughly balanced. Moreover, after remaining disappointingly low through mid-2016, inflation moved up during the second half of 2016, mainly because of the diminishing effects of the earlier declines in energy prices and import prices. More recently, higher energy prices appear to have temporarily boosted inflation, with the total PCE price index rising nearly 2 percent in the 12 months ending in January. Core PCE inflation--which excludes volatile energy and food prices and, therefore, tends to be a better indicator of future inflation--has been running near 1-3/4 percent. Market-based measures of inflation compensation have moved up, on net, in recent months, although they remain low. With the job market strengthening and inflation rising toward our target, the median assessment of FOMC participants as of last December was that a cumulative 3/4 percentage point increase in the target range for the federal funds rate would likely be appropriate over the course of this year. In light of current economic conditions, such an increase would be consistent with the Committee's expectation that it will raise the target range for the federal funds rate at a gradual pace and would bring the real federal funds rate close to some estimates of its current neutral level. However, partly because my colleagues and I expect the neutral real federal funds rate to rise somewhat over the longer run, we projected additional gradual rate hikes in 2018 and 2019. Our individual projections for the appropriate path for the federal funds rate reflect economic forecasts that generally envision that economic activity will expand at a moderate pace in coming years, labor market conditions will strengthen somewhat further, and inflation will be at or near 2 percent over the medium term. In short, we currently judge that it will be appropriate to gradually increase the federal funds rate if the economic data continue to come in about as we expect. Indeed, at our meeting later this month, the Committee will evaluate whether employment and inflation are continuing to evolve in line with our expectations, in which case a further adjustment of the federal funds rate would likely be appropriate. Nonetheless, as we have said many times--and as my discussion today demonstrates--monetary policy cannot be and is not on a preset course. As in 2015 and 2016, the Committee stands ready to adjust its assessment of the appropriate path for monetary policy if unanticipated developments materially change the economic outlook. The U.S. economy has shown great improvement and is close to meeting our congressionally mandated goals of maximum employment and price stability, but we of course recognize that important challenges remain. 
For instance, as we noted in our latest Monetary Policy Report to the Congress, the ongoing expansion has been the slowest since World War II, with real GDP growth averaging only about 2 percent per year. This subdued pace reflects, in part, slower growth in the labor force in recent years--compared with much of the post-World War II period--and disappointing productivity growth both in the United States and abroad. Our report also noted that, despite a notable pickup in 2015, real incomes for the median family were still a bit lower than they were prior to the Great Recession, and the gains during this economic recovery have been skewed toward the top of the income distribution, as has been the case for quite some time. Families at the 10th percentile of the income distribution earned about 4 percent less in 2015 than they did in 2007, whereas families at the 90th percentile earned about 4 percent more. In addition, the economic circumstances of blacks and Hispanics, while improved since the depths of the recession, remain worse, on average, than those of whites or Asians. These unwelcome developments unfortunately reflect structural challenges that lie substantially beyond the reach of monetary policy. Monetary policy cannot, for instance, generate technological breakthroughs or affect demographic factors that would boost real GDP growth over the longer run or address the root causes of income inequality. And monetary policy cannot improve the productivity of American workers. Fiscal and regulatory policies--which are of course the responsibility of the Administration and the Congress--are best suited to address such adverse structural trends. To conclude, we at the Federal Reserve must remain squarely focused on our congressionally mandated goals. The economy has essentially met the employment portion of our mandate and inflation is moving closer to our 2 percent objective. This outcome suggests that our goal-focused, outlook-dependent approach to scaling back accommodation over the past couple of years has served the U.S. economy well. This same approach will continue to drive our policy decisions in the months and years ahead. With that in mind, our policy aims to support continued growth of the American economy in pursuit of our congressionally mandated objectives. We do that, as I have noted, with an eye always on the risks. To that end, we realize that waiting too long to scale back some of our support could potentially require us to raise rates rapidly sometime down the road, which in turn could risk disrupting financial markets and pushing the economy into recession. Having said that, I currently see no evidence that the Federal Reserve has fallen behind the curve, and I therefore continue to have confidence in our judgment that a gradual removal of accommodation is likely to be appropriate. However, as I have noted, unless unanticipated developments adversely affect the economic outlook, the process of scaling back accommodation likely will not be as slow as it was during the past couple of years.
r170323a_FOMC | united states | 2017-03-23 | Welcoming Remarks | yellen | is_gov: 1
I would like to welcome all of you and thank you for joining us to discuss a set of topics of considerable importance to our country. This is the Federal Reserve's 10th biennial community development research conference, dedicated as always to issues of significance to people and communities around the country. The conference is cosponsored by, and includes substantive contributions from, the community development offices of all 12 Federal Reserve Banks as well as the Board of Governors. That united effort and level of commitment reflects how consequential we consider these issues to be. This conference is intended to present and highlight rigorous research that I am confident will inform how you think about your own work, whether from the perspective of policymaking, community development practice, or research. Our last conference, two years ago, explored various aspects of economic mobility, largely among adults. This year, we gather to discuss "The Economic Futures of Kids and Communities," and, in part, I see this topic as an extension of that earlier conversation about mobility. Today and tomorrow, we focus on research about the foundation or building blocks for economic success that are laid even before young people enter the workforce and assume responsibility for their own finances. We will hear from leading experts on a range of issues related to how children, youths, and young adults are shaped in ways that may ultimately affect their ability later to productively contribute to the economy and manage their finances. We can learn from what the data and analysis tell us, and our hope is that making use of this information will lead to more effective programs and policies and thus better outcomes. Considerable evidence shows that growing up poor makes it harder to succeed as an adult, and new research by the Fed likewise shows the strong connection between the typical experiences of poverty in childhood and economic challenges later as an adult. The data come from the Board's latest Survey of Household Economics and Decisionmaking (SHED), which will be published later this spring. In the most recent survey, we asked some of the younger respondents--aged 25 to 39--to think about their childhoods. We asked those young adults whether, during their childhoods, they found themselves worrying about having enough food to eat, having a stable caregiver, or about their personal safety. About 10 percent said they regularly worried about one or more of these concerns, and an additional 19 percent said they sometimes worried about them. We were then able to compare responses about their experiences in childhood to what these young adults told us about their current circumstances. Some pretty clear patterns emerged. Of those young adults who regularly had one or more of these childhood concerns growing up, more than one-half say that they are currently facing challenges in getting by financially. By comparison, just over one-fourth of those who said they never, or only rarely, worried about these concerns as children now experience this level of financial challenge. Young adults who regularly or sometimes worried when they were children about their care, safety, or having enough to eat are also less likely to be employed, less likely to have consistent income month-to-month, and less likely to be able to pay all of their current monthly bills in full, compared with those who never or rarely worried about these concerns as children. 
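The comparison described in these remarks is, in essence, a cross-tabulation: group survey respondents by whether they worried about food, caregiving, or safety as children, then compute the share in each group reporting current financial difficulty. The sketch below uses a handful of made-up records purely to show the shape of that calculation; it is not the SHED data, and the field names are invented for illustration.

```python
# Illustrative cross-tabulation of the kind described above.
# The records below are invented for demonstration; they are not SHED data.
from collections import defaultdict

respondents = [
    # (worried_as_child: "regularly"/"sometimes"/"rarely_never", struggling_now: bool)
    ("regularly", True), ("regularly", True), ("regularly", False),
    ("sometimes", True), ("sometimes", False),
    ("rarely_never", False), ("rarely_never", True), ("rarely_never", False),
]

totals = defaultdict(int)
struggling = defaultdict(int)
for worry, struggling_now in respondents:
    group = "worried" if worry in ("regularly", "sometimes") else "did_not_worry"
    totals[group] += 1
    struggling[group] += struggling_now  # bool counts as 0 or 1

for group in ("worried", "did_not_worry"):
    share = struggling[group] / totals[group]
    print(f"{group}: {share:.0%} currently facing financial challenges")
```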
Broadly speaking, children who grow up in insecure circumstances, those often experienced in poverty, seem disproportionately likely to experience financial insecurity as adults. This conference is about understanding what kinds of environments and resources can best help children meet with economic success after they reach adulthood. There has been a lot of discussion in the aftermath of the Great Recession about how best to connect people with steady jobs. But research presented over the next two days makes a compelling case that there is a need to also think longer term about how to prepare people for success in the labor market. In fact, this research underscores the value of starting young to develop basic work habits and skills, like literacy, numeracy, and interpersonal and organizational skills. These habits and skills help prepare people for work, help them enter the labor market sooner, meet with more success over time, and be in a position to develop the more specialized skills and obtain the academic credentials that are strongly correlated with higher and steadier earnings. Indeed, a growing body of economic and education literature has focused on the relative efficiency of addressing workforce development challenges through investments in early childhood development and education compared with interventions later in life. I believe that data, evidence, and research can help policymakers and practitioners think more clearly about the implications for improving economic and life outcomes for everyone. To this end, the speakers at this conference will focus on three broad issues. I would like to briefly mention each, highlighting some of the questions that I believe can be informed by the research that will be presented here. First, this morning's panel will address early childhood development and education. In recent years, medicine and social science have revealed more than we have ever known before about which factors and experiences in childhood can make a difference later in life. However, many questions demand further attention. A fundamental one is how positive developmental outcomes can be promoted among those who were not born into families with socioeconomic advantages. While we do know there are advantages to good-quality early childhood education, we should strive to better understand what kind of returns on investment this education provides and how to maximize these returns. The answers to these questions may influence thinking about how programs and interventions meant to assist kids and their families should be structured for maximum effectiveness to help put kids on the road to economic success. Second, researchers have explored the effects of neighborhoods and community conditions on the development of young people. Some presenters at this conference will share their understanding of how physical surroundings influence personal development. For instance, how do the form and quality of community institutions such as schools, community centers, and libraries play a role? What other kinds of community characteristics--such as public safety, transportation, and environmental quality--might help or hinder general education and financial skill development? A particularly important question is how kids' home environments affect them in ways that matter for their future economic success. It is also critically important to ask, what kinds of interventions have proven track records, and are these programs scalable? 
Third, and finally, other presenters will explore issues around skill development of youths and young adults, workforce outcomes, and the implications for the broader economy. They will ask how we understand which formative experiences most affect the ability of young people to successfully move to the next chapter in their lives, whether that means college, a job, or other paths such as self-employment. What role does a range of programs--starting with early childhood education all the way through youth vocational or apprenticeship training--play in affecting job readiness? How effective are different approaches, and what are the returns on investment? We should also pay attention to how well young people form the sorts of "soft skills"--things like teamwork, communication, and the ability to handle conflict--that are so valued by employers. And, for young people whose paths become difficult, such as those who get caught up in the juvenile justice system, what effect do such experiences have on their futures as workers and consumers, and what are the most promising approaches to foster a course correction? I hope that the data and other evidence presented today and tomorrow are of use to you in your work. Community development professionals attending this conference may consider how the design and implementation of their programs may be improved. Policymakers may look more closely at how kids are affected--purposefully or unintentionally--by public policies. And researchers may encounter ideas that spark new work that can shed further light on these important topics. I think it is important that we better understand these issues, and I applaud you for taking the time to be here to share your knowledge and to learn. Our young people are the future, and we all want them to have the support they need for successful and fulfilling lives. As a central banker, I recognize the benefits to the broader economy when more people are better prepared for work and for managing their finances. In short, ensuring that all of our kids have "strong foundations" will help build a similarly strong foundation for the U.S. economy.
r170328a_FOMC | united states | 2017-03-28 | America's Central Bank: The History and Structure of the Federal Reserve | powell | is_gov: 1
I am delighted to have this opportunity to speak at West Virginia University. Thanks to Brian Cushing for inviting me here today. Gathered in this part of West Virginia, we are located in the Fifth Federal Reserve District, which stretches down from here to South Carolina and east to the Atlantic Ocean. The Federal Reserve Act divided the country into 12 of these Districts, each with its own Federal Reserve Bank. Together, the Board of Governors in Washington and the 12 Reserve Banks are the key elements of the Federal Reserve System. Today I will discuss how the Federal Reserve came to have this unique structure. The Fed's organization reflects a long-standing desire in American history to ensure that power over our nation's monetary policy and financial system is not concentrated in a few hands, whether in Washington or in high finance or in any single group or constituency. Rather, Americans have long desired that decisions about these matters be influenced by a diverse set of voices from all parts of the country and the economy. The structure of the Federal Reserve was designed to achieve this broad representation and promote a stronger financial system to build resiliency against the sort of periodic financial crises that had repeatedly damaged the country in the 19th and early 20th centuries. This structure was forged from compromise; the result of that compromise was a vitally needed central bank whose decisions take into account a broad range of perspectives. The question of how to structure our nation's financial system arose in the early years of the republic. In 1791, Congress created an institution known as the Bank of the United States, often considered a forerunner of the Federal Reserve. The Bank was created in part to assist the federal government in its financial transactions, a typical responsibility of central banks at that time. It was also designed to help America's financial system meet the needs of a growing economy--the same purpose behind the founding of the Federal Reserve more than 100 years later. The most famous proponent of the Bank was Alexander Hamilton, who has recently achieved the central banker's dream of being the subject of a hit Broadway musical (figure 2). Congress gave the Bank of the United States unique powers--its notes were accepted for making payments to the federal government and it was the only bank able to branch across state lines (figure 3). The Bank could affect the ebb and flow of credit around the country. People in different regions of the country came to have distinct views about the Bank. Borrowers in the western areas--in those times, the West meant places like Ohio--desired cheap and abundant loans but were also wary of lenders. These borrowers grew opposed to the power of the Bank in the credit market. Northern business interests favored the Bank's contribution to the country's industrial development, but at times disagreed with actions taken by the Bank to constrain credit. Southern agriculturalists viewed the Bank with suspicion but supported its occasional actions to constrain credit to non-agricultural businesses. The Bank's private ownership, intended to give it independence from government control, was a source of unpopularity. Ultimately, these disagreements undermined the Bank's political support. After 20 years, Congress chose not to renew the Bank's charter. 
A second Bank of the United States met a similar fate in 1836, when President Andrew Jackson vetoed a bill to extend its life. These two short-lived experiments illustrate a theme in American history--of Americans from different regions holding distinct views about the structure and development of the financial system. People in the newer western parts of the country saw themselves as starved of access to credit and viewed higher interest rates in their areas as reflecting the scarcity of funds. Regional interest rate differentials persisted until around the time of World War I and helped shape the attitudes of Americans living in western areas toward the nation's financial system. These regional differences gave rise to a major political movement in the latter part of the 19th century, as western farm borrowers increasingly demanded a reform of the U.S. monetary system. Their chief complaints included the high interest rates they faced as well as the burdens placed on them by deflation that increased the real value of their debts. Indeed, the economy experienced 1 to 2 percent deflation annually in the years leading up to the 1890s. The country's currency was linked to gold, and deflation reflected the growing scarcity of gold relative to the amount of economic activity. The "free silver" movement grew in response to these economic forces. Its most famous advocate, William Jennings Bryan, the Democratic presidential nominee in 1896, sought an increase in the money supply--by the coining of silver in addition to gold--as a solution to reversing this deflation (figure 5). By the beginning of the 20th century, the debate about monetary policy and the nation's financial system had been going on for over a century. Increasingly, the shortcomings of the existing system were causing too much harm to ignore. Like a drumbeat, the country experienced one serious financial crisis after another, with major crises in 1839, 1857, 1873, 1893, and finally in 1907. These panics paralyzed the financial system and led to deep and extended contractions in the economy. These episodes exposed the weakness of our 19th century financial system, which repeatedly failed to supply the money and credit needed to meet the economy's demands. The financial system came under severe stress when the demand for liquidity surged. A financial system strained in such a manner is like dry kindling in danger of being exposed to a spark. That spark could come from losses at a well-known bank, from a disappointing harvest, or from mere rumors. In response, depositors or other investors would seek the return of their funds, which would force financial institutions to sell assets quickly to generate the necessary cash (figure 6). That liquidation could lead banks to cut credit and force borrowers to repay debt sooner than expected. Simply put, the monetary system did not meet the country's needs. It was a system in crisis, boiling over repeatedly, harming the country. Central banks are designed in part to help the financial system meet occasional liquidity strains. When demands for liquidity rise, central banks can respond by increasing the supply of money and thus adding liquidity to the system. Central banks have a particularly important role in avoiding or mitigating extreme demands for liquidity during financial crises. They do this by making loans to solvent financial institutions so they can meet their liquidity demands and avoid forced sales of their assets. 
These ideas about central banks' lending role were developed over the course of the 19th century but not yet implemented in the United States, which at the time remained without a central bank. By the beginning of the 20th century, the United States was behind the game. The final catalyst leading to the creation of the Federal Reserve was the severe Panic of 1907, which caused inflation-adjusted gross national product to decline by 12 percent, more than two times the decline recorded during the Great Recession of 2007 to 2009. After the panic ended, there was a broad sense that reform was needed, although consensus on the exact nature of that reform was elusive. Some called for an institution similar in structure to the Bank of England at the time, with centralized power, owned and operated by the banking system. Some wanted control to be lodged with the federal government in Washington instead. Others proposed that power be distributed to regional bodies with no central or coordinating board. Still others resisted any sort of central bank. This debate reflected the many and diverse interests in the United States--farmers, laborers, businessmen, small-town bankers, big-city bankers, technocrats, populists, and more--that experienced different conditions across a large geographic expanse. The resulting institution was a compromise, created by the Federal Reserve Act in 1913. It was not structured to be entirely private in its ownership and operation. It was also not structured to have a single headquarters in Washington or New York with branches across the country, a structure that was proposed but failed to attract enough political support. Instead, a more federated system was created, establishing a Board in Washington and 12 Reserve Banks spread across the country. The Board was the part of the System intended to be most directly accountable to the public (figure 7). The Board is an independent agency within the federal government, and members of the Board--now called Governors--are appointed by the President and confirmed by the Senate. Governors serve 14-year terms that expire at 2-year intervals and are not linked to election cycles. The Federal Reserve Board is charged with general oversight of the Reserve Banks. The Reserve Banks combine both public and private elements in their makeup and are expected to operate with the public interest in mind. Commercial banks that are members of the Federal Reserve System are required to purchase stock in their District's Reserve Bank. These shares are nontransferable and yield only limited powers and benefits. Dividends are set by federal law. The commercial bank shareholders elect two-thirds of the directors that oversee the Reserve Banks; the Board in Washington appoints the remaining one-third. Only three bankers can serve on a Reserve Bank's board of directors, and only one of those can be from a large commercial bank in the District. The remaining six directors represent the interests of the public. The Federal Reserve System benefits enormously from the insights and support of the boards of directors of the Reserve Banks and their Branches. Directors include prominent private-sector leaders who represent a wide and growing diversity of backgrounds and views about the economy. The federated structure of the Federal Reserve System earned the endorsement of even the populist hero of the late 19th and early 20th centuries, William Jennings Bryan. The compromise created an institution that could address the shortcomings of the American financial system while assuring that control of the Federal Reserve would be shared widely. 
The structure was different from those of the first and second Banks of the United States, and from those of foreign central banks at the time. Congressman Carter Glass, who worked to win passage of the Federal Reserve Act in Congress, called the Federal Reserve's uniquely American design "an adventure in constructive finance." In the System's early years, the decentralized structure gave the Reserve Banks considerable scope to make independent decisions that applied to their own Districts, which made it difficult to effect a coordinated national policy. For example, one Bank's purchases of securities could be offset by another Bank's sale, given that the market for securities was national in scope. As a result, the Reserve Banks created a committee to coordinate these "open market operations." But in these years, the Reserve Banks were not bound by that committee's decisions and could derail any attempt at coordinated action. This decentralization was thought by some to have undermined the Federal Reserve's response to the Great Depression. With that experience in mind, the 1935 Banking Act modified the distribution of power within the Federal Reserve System, giving the Board of Governors 7 of the 12 seats on the Federal Open Market Committee (FOMC). The other 5 seats are held by the Reserve Banks. The Federal Reserve Bank of New York has a permanent seat, and the other Reserve Banks share the remaining 4 seats on a rotating basis. While FOMC members are free to dissent from the majority decision about open market operations, the Reserve Banks are nevertheless required to adhere to that decision in conducting open market operations. The structure set out in 1935 has been essentially unchanged to this day and has served the country well. As intended by the framers, the federal nature of the system has ensured a diversity of views and promotes a healthy debate over policy. My strong view is that this institutionalized diversity of thinking is a strength of our System. In my experience, the best outcomes are reached when opposing viewpoints are clearly and strongly presented before decisions are made. Members of the Board of Governors and Presidents of the Reserve Banks arrive at their own independent viewpoints about the economy and the appropriate path for monetary policy. Congress has assigned the FOMC the task of achieving stable prices and maximum employment; however, policymakers may disagree on the best way to achieve those goals. The System's structure encourages exploration of a diverse range of views and promotes a healthy policy debate. For example, each Reserve Bank has an independent research department, with its own external publications. In addition, while the members of the Board tend to focus on developments in the nation as a whole, the Reserve Bank Presidents bring specialized information about their regional economies to the FOMC discussion. Before each FOMC meeting, Reserve Bank Presidents consult with their staff of economists as well as their boards of directors, business contacts in their Districts, and market experts to develop their independent views of appropriate monetary policy. The FOMC works to achieve a consensus policy by blending inputs from the members of the Board of Governors and from the Reserve Bank Presidents under the leadership of its Chair. By tradition, the Chair of the Board has been chosen as the Chair of the FOMC and has had a central role in setting the agenda for the FOMC and developing consensus among the Committee's members. 
In addition, the Chair is the most visible public face of the Federal Reserve System. The Fed is accountable to Congress and the public for its activities and decisions. Historically, the activities of central banks were shrouded in mystery. Montagu Norman, the famously secretive Governor of the Bank of England from 1920 to 1944, reportedly made it a rule never to explain or excuse his decisions. In the modern era, all that has changed, as central banks have come to see transparency both as a requirement of democratic accountability and as a way of supporting the efficacy of their policies. Over recent decades the Fed has significantly augmented its public communications, as have other major central banks. The Chair testifies before Congress twice each year about the U.S. economy and the FOMC's monetary policy in pursuit of its statutory goals of stable prices and maximum employment, and the Board submits a Monetary Policy Report to the Congress to accompany that testimony. The Chair also holds press conferences after four FOMC meetings each year. The FOMC releases statements after its meetings that explain the economic outlook and the rationale for its policy decision. Detailed minutes of the Committee's meetings are published three weeks later. FOMC participants also submit quarterly macroeconomic projections that are published in the Summary of Economic Projections. In 2012, the FOMC issued a Statement on Longer-Run Goals and Monetary Policy Strategy. This statement discusses the Committee's interpretation of its statutory goals of maximum employment and price stability; it indicates that the Committee judges inflation of 2 percent, as measured by the annual change in the price index for personal consumption expenditures, to be most consistent over the longer run with the Federal Reserve's statutory mandate. Transcripts of FOMC meetings are released to the public after a delay of about five years. Governors and Reserve Bank Presidents add to the Federal Reserve's transparency with frequent public speeches and other communications. I believe that support for the Federal Reserve as a public institution is sustained by the public expression of our diverse views. These communications with Congress and the public are critical parts of the Federal Reserve's institutional accountability and transparency, and are essential complements to its independence. It is important that Federal Reserve officials regularly demonstrate that the Fed has been appropriately pursuing its mandated goals. Transparency can also make monetary policy more effective by helping to guide the public's expectations and clarify the Committee's policy intentions. In recent years, the governance of the Federal Reserve System has continued to evolve. The Dodd-Frank Act, for example, provided that directors representing financial institutions--the class A directors, of which there are three on each Reserve Bank board--may not participate in the appointment of Reserve Bank presidents and first vice presidents. The Federal Reserve Board has long had policies preventing Reserve Bank directors from participating in supervisory matters or in determining the appointment of any Reserve Bank officer whose primary duties involve supervisory matters. These directors continue to provide highly valuable information about developments in their markets, and take part fully in other roles with the other six directors. Another aspect of governance involves the better representation of women and minorities in the Federal Reserve System. Indeed, while I have focused my remarks on the history of geographical diversity in the Federal Reserve System, we also strive to have diversity in gender and race both at the Board and at the Reserve Banks. In recent years, the Reserve Banks' boards of directors have made significant progress along these lines. 
Women now account for 34 percent of the directors, up from 24 percent five years ago. In addition, minorities now account for 29 percent of directors, up from 19 percent five years ago. The long history of political discourse in the United States helps explain the Federal Reserve's unique structure, in which the Board of Governors in Washington and the 12 regional Reserve Banks share power over monetary policy (as shown in figure 1). Throughout our history, Americans have questioned the structure and even, at times, the need for a central bank. Current discussions of Fed reforms echo these past debates. But it is important to understand that history in both advanced and emerging economies across the world has consistently demonstrated the need for a central bank, and both the existence and the structure of the Federal Reserve are products of that historical experience. Our structure is fundamentally a compromise, shaped by American history stretching back to the first Bank of the United States and, later, by the lessons of the Great Depression. It is designed to deliver the United States a vitally needed central bank in a country that has had a long-standing aversion to centralized power over monetary and financial affairs. It preserves diverse regional voices while ensuring that policy can be implemented through a cooperative consensus. The balance between national and regional interests is critical to the spirit of the original compromise that created the Federal Reserve, and to its democratic legitimacy. The structure achieves a practical balance that should not be changed lightly, as it continues to serve the country well.
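The voting arrangement described in these remarks (seven Board seats, a permanent seat for the Federal Reserve Bank of New York, and four seats shared on a rotating basis among the other eleven Reserve Banks) can be summarized in a short schematic. The sketch below illustrates only that seat count; the rotation order shown is an invented placeholder, not the actual groupings and schedule used by the FOMC.

```python
# Simplified schematic of FOMC voting seats as described in the remarks above:
# 7 Board Governors + the New York Fed president + 4 presidents rotating annually
# among the other 11 Reserve Banks. The rotation order below is illustrative only.

BOARD_SEATS = 7
RESERVE_BANKS = [
    "New York", "Boston", "Philadelphia", "Cleveland", "Richmond", "Atlanta",
    "Chicago", "St. Louis", "Minneapolis", "Kansas City", "Dallas", "San Francisco",
]

def voting_banks(year: int) -> list[str]:
    """Reserve Banks holding FOMC voting seats in a given year (illustrative rotation)."""
    rotating = [b for b in RESERVE_BANKS if b != "New York"]
    offset = (year * 4) % len(rotating)
    picks = [rotating[(offset + i) % len(rotating)] for i in range(4)]
    return ["New York"] + picks

def fomc_voters(year: int) -> dict[str, object]:
    """Summarize the 12 voting seats: 7 Governor seats plus 5 Reserve Bank presidents."""
    banks = voting_banks(year)
    return {
        "board_governor_seats": BOARD_SEATS,
        "reserve_bank_voters": banks,
        "total_voting_seats": BOARD_SEATS + len(banks),
    }

print(fomc_voters(2017))
```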
r170328b_FOMC | united states | 2017-03-28 | Addressing Workforce Development Challenges in Low-Income Communities | yellen | is_gov: 1
Thank you for this opportunity to be part of the National Community Reinvestment Coalition's annual conference. It is a pleasure to address a group of organizations committed to improving the lives of low- and moderate-income Americans and strengthening communities. I am especially pleased to be with you in 2017, which marks the 40th anniversary of the Community Reinvestment Act (CRA). As you know, the CRA requires banks to help meet the credit needs of the communities they are chartered to serve, including low- and moderate-income neighborhoods. Since its enactment, the CRA has helped channel capital into communities and, in the process, supported innovative and effective approaches to community development. We at the Federal Reserve take our CRA responsibilities seriously. We evaluate the CRA performance of the state-chartered banks we supervise and make the ratings and written evaluations public. We currently are improving our examination procedures and examiner training. We also work with our fellow bank regulators to continually improve our implementation of the law. And, as many of you know, we recently revised our interagency guidance to clarify how various community development activities are considered in assessing CRA performance, among them workforce development, which is my topic today. Workforce development is a bit of a catchall phrase encompassing different types of initiatives that help prepare people for jobs by providing them with training, placement assistance, and other support. Organizations dedicated to providing workforce development are interested not just in helping people secure any jobs, but jobs that pay well, provide benefits, offer opportunities for advancement, and are less likely to be eliminated during economic downturns. Significant job market changes in recent years, brought about by global competition and technological advances--and the new and shifting skills that these changes demand--make workforce development more important than ever before. As community development practitioners, you know that good-paying, stable jobs are not only important to workers and their families, but also are the foundation of strong neighborhoods. As part of our CRA responsibilities, the Federal Reserve, together with the Federal Deposit Insurance Corporation and the Office of the Comptroller of the Currency, has made it clear that banks will receive CRA recognition for lending to, investing in, and providing services to workforce development initiatives. In fact, two Federal Reserve Banks recently published a framework describing how workforce development initiatives can fit within a bank's broader CRA strategy. While the job market for the United States as a whole has improved markedly since the depths of the financial crisis, the persistently higher unemployment rates in lower-income and minority communities show why workforce development is so essential. For instance, unemployment rates averaged 13 percent in low- and moderate-income communities from 2011 through 2015, compared with 7.3 percent in higher-income communities. The challenges for workers in minority communities are even greater. The average unemployment rate across all census tracts where minorities made up a majority of the population averaged 14.3 percent from 2011 through 2015. Also, a much smaller share of the prime working-age population in these areas is employed--67.7 percent during this period, which is nearly 9 percentage points lower than in communities with smaller minority populations. 
These elevated unemployment rates and depressed employment-to-population ratios underscore the strong need for effective workforce development options for these communities. Probably the most important workforce development strategy is improving the quality of general education. The rapid rise in U.S. education levels in the 20th century, facilitated by the growing availability of high school education in the first part of the century and the rapid expansion of public universities after World War II, contributed enormously to the broad-based economic gains associated with that period. Unfortunately, for a wide variety of reasons that are beyond the scope of this talk, education levels have historically lagged in low- and moderate-income communities, particularly communities of color. Between 2011 and 2015, the average proportion of adults in low- and moderate-income communities who had dropped out of high school was 23.5 percent, which is more than double the 10.9 percent rate in higher-income communities during that period. When students from low- and moderate-income families do complete high school, they are less likely to pursue a college degree. And when they do attend college, those students are less likely to graduate. Among the reasons for the disparity are a lack of money to pay for college and the burden of family responsibilities. As a result, the proportion of adults in low- and moderate-income communities from 2011 through 2015 who had a four-year college degree or graduate degree was about half the share in higher-income communities--17.6 percent versus 34 percent. This educational disparity matters because, among many reasons, people with less education experience both higher unemployment and lower average earnings. In December 2016, for example, the unemployment rate for people aged 25 and older with a bachelor's or higher degree was only 2.6 percent. However, the rate was nearly double, 5.1 percent, for workers in this age group with only a high school diploma and about triple, 7.9 percent, for those without a high school degree. While high school graduates earn somewhat more than people who did not finish high school, the big payoff comes with a four-year college or advanced degree. The advantages of higher education, evidenced by lower unemployment rates and higher earnings, are clear across the spectrum but are greater for non-Hispanic whites than for blacks and Hispanics. Improving educational levels in low- and moderate-income communities is a long-term task. At least partially because of a lack of early childhood education and the sometimes lower quality of schools, children in these neighborhoods score substantially lower on standardized tests and drop out of high school at higher rates. Thus, a starting point is to improve access to quality education in early childhood and improve the quality of primary and secondary schooling. We must also recognize and address the barriers faced by low- and moderate-income students trying to attain higher levels of education--barriers not typically faced by their more well-off peers. First, these students often do not have friends or family who have achieved higher educational levels, which matters because students whose parents did not attend college are much less likely to pursue a college degree themselves. Even so, these students can benefit from high school counselors or other mentors who can assist them in choosing schools that provide the financial and counseling support that will help them complete their degrees. 
And they can benefit from help in picking majors and degrees that lead to higher earnings. Second, lower-income students often pursue their education while working to support themselves and family members. Educational programs that help students balance these competing responsibilities go a long way to improving completion rates. In recognition of this fact, the U.S. Department of Education is offering competitive grants to postsecondary institutions to support or establish campus-based childcare programs primarily serving the needs of low-income students. I want to reiterate that addressing the particular barriers standing in the way of lower-income students attending college and earning a degree requires a long-term strategy. However, not every student wants a two- or four-year degree or will have the financial wherewithal to pursue and complete a postsecondary degree. To support programs aimed at students not bound for college, in 2015, the Federal Reserve Banks of Philadelphia, Cleveland, and Atlanta identified occupations with above-average wages for workers without a bachelor's degree in the country's 100 largest metropolitan areas. The Reserve Banks made this information available to workforce development providers online and in workshops. Educational programs and training that lead to better paying and more steady work are crucial for people without college degrees, particularly lower- income workers. In the rest of my remarks, I would like to highlight some examples of successful initiatives that illustrate key features of effective workforce development. As you know, within our community affairs offices, the Federal Reserve devotes considerable effort to studying and promoting effective workforce development, so I am able to draw on the knowledge and experience of our staff. In discussing the examples, I would like to make five points. First, it is crucial for younger workers to establish a solid connection to employment early in their work lives. The Federal Reserve's 2013 Survey of Young Workers found that 18- to 30-year-olds with early work experience were more likely to emerge from the recent recession with a permanent job. Other studies have found that students who worked 20 hours per week in their senior year of high school earned higher wages later in life than those who did not, and that summer youth employment programs improved participants' attitudes toward their communities, raised their academic aspirations, and boosted their job readiness skills. The findings point to some of the reasons that the Boston Fed decided to lend technical support to a pilot program called Pocket Change, which aims to reduce unemployment among low-income 18- to 24-year- olds in Somerville, Massachusetts, through internships, training in job skills, and reinforcement of important soft skills such as punctuality and effective communication. In its first two years, the initiative trained 53 low-income young people and placed 20 of them in jobs. The results of this modest program indicate the promise of efforts that focus on first-time work experiences, and Somerville is now seeking to expand the initiative. My second observation is that career and technical education (CTE) programs, which have seen a revival in recent years, have considerable potential. 
For some time, vocational education had fallen out of favor or was in decline in the United States, as it was associated with the deleterious practice of "tracking," which denied less advantaged students the opportunity for the best education. But more recently, CTE has been refined and has made a comeback as an effective way to help non-college-bound workers gain valuable skills and obtain a foothold in a labor market that increasingly requires technical proficiency. These programs teach the skills needed to pursue careers in fields such as construction, manufacturing, health care, information technology, hospitality, and financial services. This point brings me to my third observation, which is that effective CTE programs and other workforce development initiatives are able to match education and training to good-paying jobs when they actively engage employers in the training process. WorkAdvance, a regional sector-specific program, is a good example. It delivers an array of aligned services to meet local business needs and provide jobs for unemployed and low-income adults in multiple cities. An evaluation of the program last year in Tulsa, New York City, and northeast Ohio found it was especially effective because it offered training for in-demand skills and industry-recognized certifications, and it focused on jobs that have clear paths for advancement. A fourth observation is that apprenticeships, which are more common in other countries, could play a larger role for low- and moderate-income individuals in our country as part of broader career and technical education efforts. For instance, a state-run program in South Carolina, Apprenticeship Carolina, helps employers develop apprenticeships at no cost to them. Businesses receive a $1,000 annual tax credit per apprenticeship, and the program assists them with information and technical needs, paperwork, and the integration of classroom learning at local technical colleges. The program has led to sizable job gains at a modest cost to the state. Similarly, Washington State's registered apprenticeship programs have contributed to substantial long-term increases in employment rates and hourly wages. My fifth and final observation is that promoting entrepreneurship could play a greater role in workforce development. Entrepreneurship is a fundamental strength of the American economy, and owning your own business or working for yourself can offer income, a means of building wealth, and, sometimes, greater flexibility for balancing job and family commitments. Yet we see less self-employment in low- and moderate-income communities. Moreover, when businesses are owned by minorities, they are less likely to have paid employees. These findings speak to the opportunities that could be realized by helping people start their own businesses and then helping them grow their businesses. Programs that equip people with the management skills and knowledge they need to start and operate a successful small business can help. Relevant and effective training can reduce the failure rate of businesses by helping owners make better decisions and avoid costly mistakes. These programs are especially critical in low-income and rural communities where other resources to support small business development may be scarce. As part of their community affairs work, several Federal Reserve Banks support small business development. For example, one Reserve Bank organized a small business protection and education series last year in partnership with the Brooklyn Chamber of Commerce and a local development corporation. 
Participants learned about capital resources available to small businesses, online credit alternatives for small businesses, and the risks of handling large volumes of cash. In another example, the Federal Reserve Bank of Kansas City developed a guide to help rural and smaller metropolitan communities promote conditions favorable to growing local businesses rather than relying only on efforts to attract large companies. Of course, in addition to training, small business owners need financing. But, as many of you know, factors such as lack of a credit history or a poor credit history and limited collateral--for example, home equity--make it difficult for the owners of small enterprises to access traditional business credit. This situation is true for many minority, women, and low-income borrowers. Nontraditional lenders, including more than 1,000 Community Development Financial Institutions around the country, help fill the gap. To conclude, while the economy overall is recovering and the job market has improved substantially since the recession, pockets of persistently high unemployment, as well as other challenges, remain. Fortunately, programs such as the ones I have highlighted today can help address these challenges in more targeted ways than the Federal Reserve is equipped to do through monetary policy. I want you to know that we applaud your work, and we thank you for doing all that you do to serve the needs of lower-income communities across the country. Whether you work to provide affordable housing, homeownership counseling, small business credit and technical support, or workforce development, I hope you know that you have a partner in the Federal Reserve. In the ways we can, with the different tools we each have, our aim is the same: to make the economy work for the benefit of all Americans. This goal is of utmost importance, and I am glad to work alongside you in striving to achieve it.
r170404a_FOMC
united states
2017-04-04T00:00:00
Departing Thoughts
tarullo
0
Tomorrow is my last day at the Federal Reserve. So in this, my final official speech, it seems appropriate to offer a broad perspective on how financial regulation changed after the crisis. In a moment, I shall offer a few thoughts along these lines. Then I am going to address in some detail the capital requirements we have put in place, including our stress testing program. Eight years at the Federal Reserve has only reinforced my belief that strong capital requirements are central to a safe and stable financial system. It is important for the public to understand why this is so, especially at a moment when there is so much talk of changes to financial regulation. To understand the regulatory changes made in response to the 2007 to 2009 financial crisis, it is useful to recall the circumstances with which regulators and legislators were confronted. First, of course, was the sheer magnitude of the impact on the economy, which suffered its worst recession since the Great Depression. Second was the dramatic freezing up of many parts of the financial market, risking successive waves of fire sales that would send asset values plummeting anew. Third was the rapid deterioration of financial firms. Hundreds of smaller banks eventually failed. Bear Stearns, Merrill Lynch, Wachovia, and Countrywide were all close to failure when they were acquired by other financial firms with one or more forms of government support or assistance. American International Group was rescued directly by the Federal Reserve. Lehman Brothers did fail, which set off the most acute phase of the crisis. The impact of Lehman's bankruptcy seemed to confirm fears that failure of the largest financial firms risked the complete implosion of the financial system. This, of course, is the too-big-to-fail problem: government officials may feel compelled to save private financial firms with public (that is, taxpayer) capital. Meanwhile, financing markets had nearly frozen up. Hence the extraordinary government actions that followed. Public capital was injected into all of the nation's largest remaining banking firms following congressional enactment of the Troubled Asset Relief Program (TARP). The Treasury and the Federal Reserve provided financing and backstops, respectively, for money market funds and various forms of securitized assets. The Federal Deposit Insurance Corporation extended its guarantees to bank deposits and the senior debt of banks. These and other measures ultimately proved successful in placing a floor under the downward spiral of the financial system. But it was against the backdrop of the need for massive taxpayer-backed assistance--to firms and to markets more generally--that Congress and financial regulators developed responses to the woefully inadequate capital levels of prudentially regulated firms; the systemic consequences of stress at previously non-prudentially regulated firms such as the free-standing investment banks; the widespread failures of risk management within these firms; the parallel failures in supervision of these firms; and the fragility of a financial system that had become characterized by large amounts of runnable short-term funding. The first and, to my mind, still the most important element of regulatory strengthening was to increase the amount of capital held by banks to ensure they remained viable financial intermediaries that could finance economic activity. In fact, this effort began as part of the emergency stabilization efforts in early 2009, when we conducted a stress test of the 19 largest banking firms. 
Where we determined a firm did not have enough capital, we required that it either raise equity in the public markets or take some of the remaining government TARP capital. The quick action in assessing the firms, recapitalizing them where needed, and sharing the results of the stress tests with the public stands as one of the turning points in the crisis. From there, we pursued a strategy of gradually strengthening ongoing capital requirements. With a few exceptions, the approach we took from the fall of 2009 onward allowed the banks to use retained earnings to build their capital. We also began development of the first quantitative liquidity regulations to be used in prudential regulation by the U.S. banking agencies. This initiative was, of course, a direct response to the liquidity squeezes encountered during the crisis itself. The capital and liquidity efforts were well underway by the time Congress passed the Dodd-Frank Act. And Congress was, of course, aware of these efforts. So it is perhaps not surprising that the provisions of the Dodd-Frank Act relating to capital set some important qualitative standards for capital regulation rather than addressing capital levels as such. A law as long and wide-ranging as the Dodd-Frank Act cannot be reduced to a single key premise or concern, excepting its general focus on financial stability and systemic risk. However, with respect to the too-big-to-fail problem, I do think it fair to say--on the basis of both the text itself and its legislative history--that a pivotal choice was to make tighter the prudential regulation of the practices and activities of large banking organizations the presumptive approach to taming too-big-to-fail problems. The alternative, much discussed at the time and since, would have been a structural approach. One such approach could have been something like the old Glass-Steagall Act separation of commercial banking from investment banking, which prohibited rather than simply regulated certain activities in different types of firms. Another structural approach would have been outright size limitation resulting in the breakup of some of the largest financial firms. The Dodd-Frank Act does give regulators authority to require divestitures by firms posing risks to financial stability. But these authorities, which contain only the most general of standards, seem intended to be used only if the panoply of other measures in the legislation have failed to contain the too-big-to-fail problem. Thus, at least in the first instance, the Dodd-Frank Act forgoes structural solutions, which might have been cleaner conceptually, but perhaps much more complicated as a practical matter. Instead, it imposes a host of restrictions and requirements. So we have counterparty credit limits, risk retention requirements, incentive compensation constraints, resolution planning requirements, and others. These new statutory measures were meant to, and do, coexist with the capital and liquidity requirements put in place by the banking agencies under their pre-existing authority, as enhanced by the Dodd-Frank Act. An important corollary of this basic approach was that the Dodd-Frank Act requires that many of these regulations be progressively more stringent as applied to firms of greater systemic importance. 
From this perspective, then, it is not surprising that the Dodd-Frank Act implementation has been a major undertaking, that banks (and sometimes supervisors) feel overwhelmed by the breadth of the resulting compliance effort, and that there is some overlap among some of the regulations. This outcome was, in effect, the price of the largest banks not being subject to a direct structural solution such as breakup. None of this is to say that the Dodd-Frank Act got the mix of restrictions just right. To the contrary, it would have been surprising for such a major piece of legislation passed in the immediate aftermath of the crisis to have done so. Usually, a law like the Dodd-Frank Act would have been followed some months later by another law denominated as containing technical corrections, but also usually containing some substantive changes deemed warranted by analysis and experience. But partisan divisions prevented this from happening. And the novelty of many of the forms of regulations adopted by financial regulators, either in implementing the Dodd-Frank Act or under existing authorities, almost assures that some recalibration and reconsiderations will be warranted on the basis of experience. So there are clearly some changes that can be made without endangering financial stability. Foremost among these are the various bank size thresholds established in the Dodd- Frank Act or in agency regulations for the application of stricter prudential requirements. For example, as I have said for several years now, we have found that the $50 billion in assets threshold established in the Dodd-Frank Act for banks to be "systemically important," and thus subject to a range of stricter regulations, was set too low. Similarly, the $10 billion asset threshold for banks to conduct their own required stress tests seems too low. And the fact that community banks are subject at all to some of the Dodd-Frank Act rules seems unnecessary to protect safety and soundness, and quite burdensome on the very limited compliance capabilities of these small banks. Beyond the thresholds issue, though, are there statutory provisions or regulations whose substance could be adjusted to better match economic or compliance costs with financial stability benefits? Again, it would be very surprising if this did not prove to be the case over time. It would also be surprising if we did not find areas in which rules needed to be strengthened in order to achieve financial stability goals, particularly as financial markets change. Generally, I think it is a bit early to judge the balance of costs and benefits of many of the new rules. Some are not yet fully implemented. Firms are still in the process of adjusting to the new rules. And it is still somewhat difficult to determine, for example, what should be considered "normal" levels of liquidity or lending, insofar as the pre-crisis period was one in which high levels of both lending and liquidity proved unsustainable. Moreover, given the healthy increases in lending over the last several years and the record levels of commercial bank profits recorded in 2016, it would seem a substantial overreach to claim that the new regulatory system is broadly hamstringing either the banking industry or the economy. But there are areas where I think the case for change has become fairly strong. The Volcker rule is one. 
During the debates on what became the Dodd-Frank Act, former Chairman Paul Volcker offered a fairly straightforward proposal: no insured depository institution or affiliate thereof should be permitted to engage in proprietary trading. It seemed then, and seems now, like an idea that could contribute to the safety and soundness of large financial firms. However, several years of experience have convinced me that there is merit in the contention of many firms that, as it has been drafted and implemented, the Volcker rule is too complicated. Achieving compliance under the current approach would consume too many supervisory, as well as bank, resources relative to the implementation and oversight of other prudential standards. And although the evidence is still more anecdotal than systematic, it may be having a deleterious effect on market making, particularly for some less liquid issues. There are three problems--two in the statute and one in the regulatory approach--that I think are related. The first statutory problem is that five different agencies are involved. While the statute does not require a single regulation agreed upon by all five, it understandably calls for coordination and consistency in rulemaking and implementation. The joint or parallel rulemaking among multiple agencies required in various parts of the Dodd-Frank Act has advantages and disadvantages that differ across subject matter. Here, though, the disadvantages seem to dominate. Because almost any effort to distinguish market making from proprietary trading, for example, is impossible to sensibly reduce to a formula or precise rule across all traded instruments, there is ongoing and substantial need for context-specific, data-heavy judgment. Efforts to achieve consistency in treatment across agencies have been both time- consuming and, at times, unsuccessful. The second problem is that the approach taken in the regulation in pursuit of consistency was one that essentially contemplated an inquiry into the intent of the bankers making trades to determine, for example, whether the trades were legitimate market making. The agencies knew this approach would be complicated when we adopted it, but it seemed the best way to achieve consistency, at least over time. I think the hope was that, as the application of the rule and understanding of the metrics resulting from it evolved, it would become easier to use objective data to infer subjective intent. This hasn't happened, though. I think we just need to recognize this fact and try something else. Had there been an obviously better approach, we would have taken it five years ago. My suspicion is that it lies in reviewing and monitoring the trading limits established on all trading desks. As contemplated in the statute, capital requirements might also be used as a complementary tool, such as by requiring progressively higher amounts of capital as trading inventories age--a pretty good indicator that market making is morphing into proprietary trading. Whether a consistent approach can be developed while five agencies continue to be involved is not clear, but it is certainly worth trying. The third problem, also in the statute, is that the Volcker rule applies to a much broader group of banks than is necessary to achieve its purpose. As I have said before, the concerns underlying the Volcker rule are simply not an issue at community banks. Many regional banks have few or no trading assets of any sort, so proprietary trading is obviously not a concern there either. 
While the regulatory agencies have tried to tailor the rules so as to avoid burdening these banks, even the process of figuring out that the rules do not constrain them is a compliance cost that should be eliminated. One approach would be to exempt all banks with less than $10 billion in assets and other banks that report less than some nominal amount of trading assets. Let me now turn to capital. The history of financial regulation over the last several decades is in many respects defined by an increasing emphasis on capital requirements and, specifically, higher minimum ratios based on a more rigorous definition of what constitutes loss- absorbing capital. This tendency can be explained by the fact that capital is a particularly supple prudential tool. As activity and affiliation restrictions on banks have been loosened, and as the integration of traditional lending with capital markets has created new financial products at a rapid pace, capital requirements can provide a buffer against losses from any activities. No single measure of capital is sufficient to ensure an adequate buffer however. Simple leverage ratios are a good check on banks becoming too debt dependent, but they encourage more risk-taking, insofar as they impose the same capital charge for every asset, no matter how risky. Standardized risk-based capital ratios implement the intuitively appealing notion that a bank's capital should depend on the riskiness of its assets. But the grouping of individual loans and securities into necessarily broad categories of risk weights (e.g., residential mortgages) can be arbitraged. And a firm holding lots of assets that look very low-risk in normal times can be vulnerable if its total leverage is high during stress periods. Models-based capital requirements can better distinguish among risks to some degree, and they can be made more forward-looking than static leverage or risk-based ratios. But, to the extent that banks' internal models are used, it is difficult to monitor whether banks are intentionally or unintentionally running models that understate their risks. And, of course, they are subject to the usual limitations of models that are based only on past experience and correlations. In the post-crisis period, we have continued the previous U.S. practice of using complementary leverage ratio and standardized risk-weighted capital requirements, though at higher levels and with more reliance on common equity as the preferred measure of true capital strength on a going concern basis. We have added a stress test, now based on a supervisory model. We, along with some other jurisdictions that are home to banks of global systemic importance (G-SIBs), have also applied surcharges to the leverage and risk-weighted requirements of such banks. The rationale for this feature of our capital regime is that the potential negative externalities caused by the stress or failure of a G-SIB warrant a higher level of capital. Graduated capital surcharges have the additional policy benefit of providing these firms with an incentive to reduce the size and scope of their activities so as to present lesser risk to the financial system. The U.S. banking agencies based post-crisis capital requirements on historical loss experiences so as to require banks to have a capital buffer that could absorb losses associated with a significant economic downturn and remain viable financial intermediaries in the eyes of customers, counterparties, and financial markets. 
But our researchers, like those at some other official-sector entities, have been using more formal economic analysis to estimate the level of capital requirements that best balances the benefits associated with reduced risk of financial crisis with the costs of banks funding with capital rather than debt. A recent study by three Federal Reserve Board researchers concludes that the tier 1 capital requirement that best achieves this balance is somewhere in the range of 13 percent to 26 percent, depending on reasonable choices made on some key assumptions. By this assessment, current requirements for the largest U.S. firms are toward the lower end of this range, even when one takes account of the de facto capital buffers imposed on most firms in connection with the stress test. This assessment, when added to our original historically based approach and the methodology used in developing the capital surcharges, suggests strongly that a reduction in risk-based capital requirements for the U.S. G-SIBs would be ill-advised. In fact, one might conclude that a modest increase in these requirements--putting us a bit further from the bottom of the range--might be indicated. This conclusion is strengthened by the finding that, as bank capital levels fall below the lower end of ranges of the optimal trade-off, the chance of a financial crisis increases significantly, whereas no disproportionate increase in the cost of bank capital occurs as capital levels rise within this range. In other words, in trying to avoid a future financial crisis, it is wise to err somewhat toward the higher end of the range of possible required capital levels for this group of firms. On the other hand, it seems reasonably apparent that the increased granularity of the standardized risk-weighted capital requirements put in place after the crisis, while necessary to deal with the range of risks in larger banks, is unduly complicated for community banks. It's not that these requirements have increased appreciably the amount of capital community banks hold, but more that the complexity of compliance and reporting imposes costs that are disproportionately much greater for these banks, given that they have much smaller balance sheets over which to amortize the associated costs. For this reason, I believe we should be moving toward a much simpler capital regime for community banks. The federal banking agencies have already taken some steps in this direction, and they can take a few more. But it may be helpful to amend the law so as to make clear that the agencies would have the flexibility to create a simple capital regime applicable only to community banks. There has been much discussion of late of the leverage ratio requirement, from multiple perspectives. There are proposals to make a higher leverage ratio requirement either mandatory or optional for banks, which would then be relieved of risk-weighted capital requirements and many other prudential regulations. There are also those who have questioned the relative cost-benefit tradeoff of the "enhanced supplementary leverage ratio," the 2 percent surcharge applicable to the eight U.S. G-SIBs. Increasing the current 4 percent or 5 percent leverage ratio requirement to, say, 10 percent would certainly yield a very well-capitalized set of banks based on the current balance sheets of large banks. But one needs to look at the dynamic effects of such a requirement. 
Since a higher leverage ratio would also make banks less profitable, and with the constraints of risk- based capital and liquidity requirements lifted, they would be strongly incentivized to change the composition of their balance sheets dramatically, shedding safer and more liquid assets like Treasuries in exchange for riskier but higher-yielding assets. After all, with a leverage ratio as the only significant constraint, the regulatory cost of holding a short-term Treasury bill is identical to that of a junk bond. It is this very limitation of a leverage ratio that led to the creation of a complementary risk-based capital requirement in the 1980s. To truly assure the safety and soundness of the financial system, a leverage ratio serving as the sole or dominant form of prudential regulation would probably have to be set considerably higher, at a level where the impact on financial intermediation could be quite substantial. As to the impact of the 2 percent enhanced supplementary leverage ratio, our experience leads me to believe that it may be worth changing to account for the quite different business operations of the G-SIBs, particularly those in the custody business. The complementarity of risk-based capital requirements and leverage ratios suggests that there should be some proportionality between the two. This is, of course, the current situation with respect to the standards applicable to non-systemic banks, with the leverage ratio requirement being sensibly set somewhat below the risk based requirement. However, with the additional standards applicable only to the eight systemically important firms, we have a sliding scale of risk-based surcharges but an across-the-board 2 percent leverage ratio surcharge. In practical terms, the asymmetry is most significant for the two banks that are dominantly custodial and transactional in nature, rather than lending and trading firms. These banks have had the lowest risk-based surcharges of the eight G-SIBs--currently 1- 1/2 percent--but their leverage surcharge is 2 percent. This is especially problematic for their operations, since they prudently reinvest custody customer deposits into safe and liquid assets. I think it would be sensible for the banking agencies to consider altering the enhanced supplementary leverage ratio requirement so that it would be set with an eye toward the risk- based surcharge. One, but certainly not the only, way to do this would be for the enhanced supplemental leverage ratio to be 1 percent for the firms with a 1 percent to 1- 1/2 percent risk- based surcharge, 1- 1/2 percent for those with a 2 percent or 2- 1/2 percent risk-based surcharge, and 2 percent for those at 3 percent or above. An alternative approach to mitigating the distortionary effects of the leverage ratio requirements is to exclude certain "riskless" assets from the denominator. Some central bankers around the world have been arguing to exclude central bank reserves from the leverage ratio denominator, on the ground that they are "safe" and that including them may make monetary policy harder to execute in a period of unusually large central bank balance sheets. But this would defeat the whole purpose of a leverage ratio, which is to place a cap on total leverage, no matter what the assets on the other side of the balance sheet may be. Cash holdings, for example, are not excluded. This proposal would also create a classic slippery slope risk, which was illustrated during a discussion in which I participated last year. 
When a central banker raised the idea of excluding reserves, a finance ministry official mused aloud that perhaps sovereign debt should also be excluded. Raising the minimum ratios in leverage and risk-based capital standards, requiring that qualifying regulatory capital be truly loss absorbing, and setting higher requirements for the most systemically important banks have been important steps toward the goal of a well-capitalized, and thus safer, financial system. But the stress testing system begun during the crisis, and continually refined since, has been the key innovation in capital regulation and supervision and makes those other measures more effective. The success of the 2009 stress test in restoring confidence in the financial system during the crisis encouraged Congress to make stress testing a required and regular feature of large-firm prudential regulation. As the term suggests, stress tests evaluate the capacity of banks to absorb losses that may be associated with major economic adversity and remain not only technically solvent, but also viable financial intermediaries. They are explicitly forward-looking, in that they involve creating unlikely but plausibly severe economic scenarios and then modeling the likely impact of those scenarios on bank assets and earnings. The Federal Reserve has tied the results of stress tests into capital regulation by requiring that bank capital distributions be consistent with maintaining viability in the event the severe scenario were to materialize. That is, dividends and share repurchases cannot bring the bank's capital level below the sum of minimum capital requirements and the amount of losses that could be sustained in the stress event. By looking at the impact of such scenarios on the considerable part of the financial industry accounted for by the larger bank holding companies subject to the requirement, the Federal Reserve's approach gives insight into how substantial economic or financial shocks would affect the financial system and the real economy. One virtue of stress testing is that it allows a forward-looking assessment of potential losses that is customized to the portfolios and business models of each bank, while still being consistent across the banks. The Federal Reserve uses independent supervisory models to estimate losses and revenues under stress, both to achieve more comparability across the results for different banks and to preclude any temptation for banks to game their own models. This linkage of stress testing to bank capital requirements has been a good way for regulators to regularize exercise of their broad statutory discretion to set individual capital requirements on a bank-by-bank basis. Banks subject to the supervisory stress tests have generally found it to be their binding capital constraint. This is as it should be, insofar as stress testing is meant to help set capital requirements for when they will most be needed--that is, in a serious economic downturn. From the first stress test performed in the winter of 2009, the Federal Reserve has publicly disclosed progressively more information about its supervisory model, the scenarios, and the results. During the crisis, disclosure was intended to help restore confidence in the banking system. Our continuation and expansion of disclosure helps market participants, analysts, academics, and the public better evaluate both the condition of the banks and the rigor of supervisory oversight. 
It thus serves the dual purpose of market discipline and government agency accountability. To serve its important financial stability purpose, stress testing must never become static. As the financial system evolves, with the creation of new products and new correlations among asset price movements, the supervisory model must account for these changes. And as salient risks to the financial system arise, the scenarios must test for these new risks. Apart from the inherent need for adaptation, though, there is one respect in which the Federal Reserve's stress testing program is incomplete and other respects in which it is still in transition from a crisis and post-crisis measure to a permanent and central feature of prudential oversight. The significant way in which the stress testing program is incomplete is that it has only limited features with which to assess the condition of participating banks from a macroprudential perspective. For example, it generally does not directly take account of second-round effects of stress on the financial system, such as the possible fire sale of assets by financial firms in need of capital or funding, which can further depress asset values of other firms below levels resulting from the initial economic or financial shock. These effects are harder to model but very important for a stress test designed to achieve financial stability objectives. The Federal Reserve has begun a research program to try to develop, over the next few years, sound macroprudential elements to incorporate into the stress test alongside some of the countercyclical features that have already been added. The transition of stress testing from crisis program to a permanent feature of prudential oversight is unfinished in both Federal Reserve regulations and supervisory practice. The de facto capital requirements produced by the stress test have not been fully integrated into, and reconciled with, other applicable capital rules. Thus, for example, our stress testing program assumes that a firm will continue to make its planned capital distributions during a stress period even though the regulatory capital rules now include a capital conservation buffer to limit such distributions. As to supervision--because the failure of a firm to meet Federal Reserve expectations with respect to its capital risk-management and planning processes can lead to a "qualitative" objection to its capital distribution plans--firms (and, at times, perhaps supervisors) have placed more emphasis on these matters than on other issues raised in the supervisory process throughout the year. It is probably worth noting at this juncture that one of the features of the stress testing program that some banks have found most troubling is that it culminates in the annual announcements of whether the Federal Reserve objects to each participating bank's capital plan-- an event that still garners considerable investor and public attention. The potential for embarrassing, public objections to their plans has been disconcerting to some banks, which pointed out that--by design--they were not given the supervisory model for calculating post- stress minimum capital levels and that they might not be able to predict when supervisory concerns with some aspects of their capital planning processes would ripen into a public objection. It is certainly the case that this feature of our stress testing program was intended to, and has, focused the minds of banks' senior management on their capital positions and capital planning processes. 
Motivating management with the stress test was appropriate in a time when capital needed to be built up and when serious shortcomings of pre-crisis risk management at many large U.S. banks needed to be remedied. To be honest, I was stunned in my first few months at the Federal Reserve to find out that many of these banks were unable to aggregate their total exposure to particular counterparties across the many parts of the bank in anything like a reasonable time. Some firms did not have ready access to basic information about the location and value of collateral that they held. As recently as a couple of years ago, we were still seeing some significant problems with data and modeling reliability in banks' internal risk-management processes. Still, the question was always how long we would need this highly focused set of annual determinations. Several years ago we took a first step to reduce the potential for a quantitative objection by giving any bank whose planned distributions would have brought it below the post-stress minimum capital requirements a short time in which to adjust its plan. Last fall, I gave a speech in which I previewed the Board's additional thinking on this subject, following the Board's year-long review of the stress testing program. One point was our intention to remove the "qualitative" part of the annual stress testing exercise for participating banks with less than $250 billion in assets. We have since done just that, in recognition of the fact that these firms had generally met the supervisory expectations for capital planning and risk management put in place after the crisis. In that speech I also indicated that the Board of Governors was considering a significant revision to our stress testing program that would both integrate it into other applicable capital requirements and begin to reduce the amount of attention directed at the annual announcement of stress test results. The proposal for what our staff has called a "stress capital buffer" (SCB) would simplify our capital regime by replacing the existing 2.5 percent fixed capital conservation buffer applicable to all banks with a buffer requirement equal to the maximum decline in a firm's common equity ratio under the severely adverse scenario of the stress test. This change would, of course, apply only to the roughly 30 banks that participate in the supervisory stress test. This buffer would be recalculated after every year's stress test. Then, through the succeeding year, a bank would have to observe the constraints on capital distributions written into our point-in-time capital requirements if its capital ratio fell below the sum of our minimum capital requirement and the applicable stress capital buffer. Because the capital surcharge on the eight G-SIBs already exists as a part of our regular capital rules, the stress capital buffer approach would effectively add the surcharge to our estimates of the amount of capital needed under stress. The surcharges were put in place because the material distress or failure of a G-SIB would have an adverse impact on the financial system as a whole that is far greater than the impact on the financial system of the distress or failure of a non-G-SIB. Accordingly, G-SIBs should face capital surcharges that compel internalization of those external costs. 
Because the difference in the external costs of the distress or failure of a G-SIB as compared to a non-G-SIB is likely to be at least as high during times of macroeconomic and financial market stress as during ordinary times, there is no reason why the G-SIB surcharge should not be a part of the post-stress capital regime. A complementary point is that the extra buffer required by the G-SIB surcharge reflects the fact that even the best- conceived annual stress scenarios cannot capture all tail risks in the financial system. The SCB proposal would thus raise somewhat the capital requirements of the eight G- SIBs. This outcome is consistent with analysis of the costs and benefits of capital requirements that I discussed earlier, as well as the rationale for surcharges. It is also consistent with the intuition, itself having some analytic backing, that because Congress decided against fundamental structural measures to deal with the too-big-to-fail problem, we should err somewhat on the side of higher capital requirements for these firms. Indeed, there are some academics and others who continue to make a case for even higher capital requirements. The inclusion of the surcharges would allow the Federal Reserve to relax some of the conservative assumptions currently made in the stress test without lowering the overall post- stress capital requirements for G-SIBs. While conservative assumptions were appropriate coming out of the financial crisis in the early days of the stress test, the SCB and its inclusion of the surcharges would offer an opportunity to update these assumptions without reducing the overall capital requirements for G-SIBs. At the same time, relaxing these assumptions would result in a modest decline in the effective capital requirements of the non-G-SIB participating banks when, as I hope and expect, the Board of Governors moves forward with a rulemaking implementing the SCB idea. Adoption of the SCB should remove a bit more of the drama originally associated with the annual announcement of the stress test results. But some would remain, particularly given the possibility of a qualitative objection, even where the supervisory model shows that the firm would have enough capital to remain a viable intermediary in the event something like the severely adverse scenario came to pass. Although the largest firms, unlike those with less than $250 billion in assets, are not yet generally meeting all supervisory expectations around stress testing and capital planning, they have each made substantial progress since 2009. With a few exceptions, the issues observed during recent Comprehensive Capital Analysis and Review (CCAR) cycles are less fundamental than those we were seeing even a few years ago. So I think the time may be coming when the qualitative objection in CCAR should be phased out, and the supervisory examination work around stress testing and capital planning completely moved into the normal, year-round supervisory process, even for the G-SIBs. Coupled with adoption of the SCB, and the changes in modeling and assumptions associated with that proposal, the elimination of the qualitative objection process would integrate the process and substance of stress testing into the rest of the Federal Reserve's prudential oversight activities. In doing so, it should alleviate the apprehension of banks that they may be subject to objections to their capital plans that are both very public and hard to fully anticipate. 
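To make the arithmetic behind the stress capital buffer concrete, a minimal illustrative sketch follows. The 4.5 percent common equity tier 1 (CET1) minimum reflects the generally applicable requirement; the assumed stress decline and G-SIB surcharge figures are hypothetical and chosen only for illustration.

\[
\begin{aligned}
\text{SCB} &= \text{CET1 ratio at start} - \text{lowest projected CET1 ratio under the severely adverse scenario} \\
           &= 9.5\% - 6.3\% = 3.2\% \quad (\text{hypothetical}),\\[4pt]
\text{distribution constraint:}\quad \text{CET1 ratio} &\ge \underbrace{4.5\%}_{\text{minimum}} + \underbrace{3.2\%}_{\text{SCB}} + \underbrace{2.0\%}_{\text{G-SIB surcharge, hypothetical}} = 9.7\%.
\end{aligned}
\]

Under this sketch, a G-SIB planning dividends or repurchases that would leave its CET1 ratio below 9.7 percent would face the payout limitations described above, with the buffer recalculated after each year's stress test.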
The SCB itself would continue the Federal Reserve's efforts to tier prudential requirements even among larger banks, with the G-SIBs having somewhat higher capital requirements commensurate with the damage their failure would inflict on the broader economy, and the regional banks subject to modestly lower requirements than those that effectively apply at present. Having just described some good directions for the evolving stress testing regime, let me comment on what I regard as some ill-advised ideas circulating in current policy discussions. One is to detach the stress test from any limitations on capital distributions. This would, in effect, make the stress test simply an informational exercise for supervisors and markets and would, accordingly, presumably be treated less seriously by all concerned. Were we to do so, the very virtues of the stress test that I recounted earlier would be lost, as we would return to using only general, backward-looking risk weights. Of course, it would also reduce capital requirements for the largest banks, which may be one of the motivations for the idea. I have heard two arguments for this idea. One is that Congress did not require that the stress test be used to limit capital distributions. The other is that it is somehow an unacceptable encroachment on the prerogatives of bank boards of directors to limit their discretion to declare dividends or authorize stock repurchases. While Congress did not explicitly call for stress tests to be used to assure adequate capital levels in larger banks, it did call for increasingly stringent capital measures as the systemic importance of banks increased. Section 165 of the Dodd-Frank Act, which contains the stress testing requirement, is singularly focused on achieving financial stability objectives. Moreover, as I noted earlier, Congress in the 1980s gave the federal banking agencies authority--within their discretion--to set capital requirements individually for specific banks. Again, as noted earlier, using stress tests to do so is not only wholly within that discretion. It is a more regularized way of doing so than an ad hoc judgment on a bank-by-bank basis. The argument that bank boards should not be constrained in making capital distributions amounts to an argument that there should be no capital regulation, since even traditional capital regulations limited boards from, say, declaring a dividend that would take the bank below minimum capital levels. And those who make this argument seem to have forgotten that some banks continued to pay dividends in 2007 and 2008 even as their situations became increasingly precarious. Another unwise idea would be to give the supervisory model to the banks. Some have argued that it is only fair to do so, because otherwise banks cannot know exactly what their capital requirements will be. For example, if a bank doesn't know with precision what capital charge will effectively be applied to a certain class of home equity loans, it will be handicapped in deciding how many such loans to offer, and on what terms. In fact, observation of the stress test results over time has given the banks--as well as analysts and other outside parties--a reasonable idea of the loss functions and other elements of the supervisory model. And the Federal Reserve has increased over time the amount of information it discloses about its stress test models. But there are very good reasons not to publish the model itself. In the first place, remember why this exercise is called a stress test . 
This is not a case of using a model to set a regulation that stands on its own as a constraint, and then testing to see if there is compliance with the rule. There, the model is essentially the reasoning by which the regulation was set. Even in a case where the test is independent of the regulatory end, risks exist, as we saw in the Volkswagen case, where the company is said to have designed its cars to pass the required emissions test but not to actually achieve the regulatory goal of reduced emissions. In the financial area, the dangers of disclosure are much greater. We are trying to evaluate what may happen to a bank's assets under stress. If a bank has the model, it will be able to optimize its balance sheet for the day on which the stress test is to apply by shifting into assets for which relatively lower loss functions apply. But it can then shift those assets back over succeeding days or weeks. Thus, the test will give a misleading picture of the actual vulnerabilities of the firm. In this and other ways, banks would use the models to guide changes in their behavior that do not change the risk they pose to financial stability, but do change the measured results of the stress test. Regulators and academics have long recognized that this type of behavior by banks, known as regulatory capital arbitrage, has been a persistent threat to financial stability. Additionally, giving the firms the model will likely encourage increased correlations in asset holdings among the larger banks--a trend that increases systemic risk, since everyone will be exposed should those asset classes suffer reversals. Releasing the computer code used in the model projections would repeat a serious error made a quarter century ago. In 1992, Congress established revised capital standards for the Federal National Mortgage Association (Fannie Mae) and the Federal Home Loan Mortgage Corporation (Freddie Mac), the centerpiece of which was a stress test. However, for reasons that foreshadowed many of the arguments adduced today, all the details of the model were made public and any changes went through the standard notice and comment process. As a result, the government-sponsored enterprises (GSEs) and the public clearly understood the model. With the model in the hands of the GSEs, even a scenario of the severity of the 2006 to 2008 experience produced only mild losses for them. Of course, this result stands in stark contrast to the actual losses, which were sufficient to drive them into conservatorship in September 2008. In short, we should recognize that what might appear to be a reasonable transparency measure in publishing the models will in fact result in less protection for the financial system. Thus, if the model were to be published, I would suggest that the minimum required capital levels would need to be materially increased in order to take account of the dynamics I just described. There are a couple of ways to respond to bank concerns without courting these dangers. One would be to add some granularity in the definition of asset categories subject to a specific loss function. At times, some banks have felt that the breadth of certain categories of assets used by the supervisory model means there is a good bit of divergence in the risks associated with assets within the same category. The other would be for the Federal Reserve to publish a set of hypothetical portfolios with the model-implied losses on these portfolios. To that end, staff have been working on "control portfolio" level disclosures. 
These would permit a fairly accurate inference of the expected losses on any given set of assets. At the same time, they would not permit participants to game the models by scrutinizing them for the precise points where they were weakest. Much as I would have liked to touch upon important topics such as the need for credible resolution mechanisms for large banks and for adaptable regulatory processes to respond to new forms of shadow banking, I needed to be selective in drafting this speech. I concentrated on capital regulation because it is the single most important element of prudential financial regulation. The new features of G-SIB surcharges and stress testing help guard against a severe new financial crisis and contain the too-big-to-fail problem. As proposals for regulatory change swirl about, it is crucial that the strong capital regime be maintained, especially as it applies to the most systemically important banks. Neither regulators nor legislators should agree to changes that would effectively weaken that regime, whether directly or indirectly. It would be tragic if the lessons of the financial crisis were forgotten so quickly.
r170405a_FOMC
united states
2017-04-05T00:00:00
Welcoming Remarks
powell
1
Thank you, Donna. Good morning and welcome to the Federal Reserve. We are honored to have you here today as we host the biennial Interagency Minority Depository Institution (MDI) Conference. My colleagues from the Federal Reserve Bank of San Francisco and I are especially honored to be hosting you in Los Angeles. As you probably know, all of the previous Interagency MDI conferences have been held on the East Coast, mainly in Washington, D.C. However, because the largest concentration of minority banks is located here in Southern California, it seemed natural to bring this conference west. The Federal Reserve seeks to support MDIs in a number of ways, including our Partnership for Progress, our program for outreach and technical assistance to MDIs. The Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation share our goal of preserving and promoting MDIs because you are critical institutions to the communities you serve and the larger U.S. economy. And I note that Congress has also recognized your importance, mandating our respective agencies to help support MDIs. From the perspective of someone who sits on the Federal Open Market Committee, I see many ways that the Federal Reserve can not only support MDIs but is itself also supported by them, and I would like to talk about four of these ways today. First, half of our monetary policy mandate is maximum sustainable employment. That means that we need to be aware of employment trends across all communities in America, not just the top-line averages, since unemployment rates vary significantly across races and geographies. For the first time, last year, we put into our Monetary Policy Report to Congress a section that detailed how post-recession economic gains have been distributed across races. You, as MDIs, are committed to understanding and serving these diverse communities. I know that, for example, your small business loans to minority business owners make a difference in the employment rates of minority communities. I thank you for that work, and we will continue to work closely with you to better understand the employment dynamics of underserved and minority communities. Second, the Fed is unique as a research institution. We have many economists on staff and therefore have the ability to engage in wide-ranging research that may be useful to your firms and communities. Specific to MDIs, we commissioned two new research papers for this conference to better understand trends in the MDI banking field. In addition, we have two new research papers on MDIs out of the Chicago Fed, one that explores MDI primary markets, and one that looks at MDI small business lending. Tomorrow you'll have an opportunity to hear about and discuss this new research, which will be finalized later this year. Third, we have a great deal of expertise in community banks, which I know most of you are. Of the 829 state member banks that the Federal Reserve directly supervises, 97 percent are community banks. Therefore, we spend a good deal of time thinking about the issues facing community banks and how to help them be competitive in today's economy. I recognize that as MDIs you share many of the same issues as other community banks, and also some issues that are unique to your sector. We want to work with you to better understand those issues and to help you, where possible, to better serve your communities. Fourth, and last, the Fed has a unique Community Development function that seeks to mobilize ideas, networks, and approaches that address a wide range of community and economic development challenges. 
One thing that makes our Community Development function unique is that we have deep geographic coverage at the 12 Reserve Banks and their Branch locations. Last year, we combined the resources of our Supervision and Regulation division with those of our Community Development department to staff our Partnership for Progress. By bringing in Community Development, we brought in a new perspective, one that has an explicit focus on low- and moderate-income communities. We know that you serve many of these same individuals and communities, and we are asking our Community Development staff around the country to reach out to you to gather your perspectives on the communities you serve to identify emerging issues of which we should be aware. In closing, your institutions are important to the American economy and our understanding of that economy. Therefore, on behalf of the Federal Reserve, I'd like to once again thank you for the work you do in your communities and welcome you to Los Angeles.
r170417a_FOMC
united states
2017-04-17T00:00:00
Monetary Policy Expectations and Surprises
fischer
0
I will address the topic of central bank communications, with a particular emphasis on those times when financial markets and the central bank have different expectations about what a central bank decision will be. Such situations lead to surprises and often to market volatility. Of course, not all surprises are equal. For one, communications that shift or solidify expectations that are diffuse or not strongly held are less likely to be disruptive than communications that run counter to strongly held market beliefs. Further, there are worse things than surprises. The central bank must provide its views regarding the likely evolution of monetary policy, even when this view is not shared by market participants. A concern for surprising the market should not be a constraint on following or communicating the appropriate path of monetary policy. That said, there are good reasons to avoid unintended surprises in the conduct of policy. Why should central banks avoid surprising financial markets? In recent decades, it has been increasingly acknowledged that monetary policy implementation relies importantly on the management of market expectations. In theory, clarity about the central bank's reaction function--that is, how the central bank adjusts the stance of monetary policy in response to changing economic conditions--allows the market to alter financial conditions smoothly. This typically helps meet the bank's policy targets, with the result that the markets are working in alignment with the policymaker's goals. Under this theory, repeated market surprises that raise questions about the central bank's reaction function could threaten to disrupt the relationship between the central bank and the markets, making the central bank's job more difficult in the future. How can the Fed avoid surprising markets? Clear communication of the Federal Open Market Committee's (FOMC's) views on the economic outlook and the likely evolution of policy is essential in managing the market's expectations. The Committee has a number of communication outlets, including the policy statement, the meeting minutes, the Chair's news conferences, and the quarterly Summary of Economic Projections (SEP). The SEP has been useful in providing information on policymakers' assessments of the potential growth rate of the economy and r*, the equilibrium real interest rate, both of which help guide the market's expectations of the eventual path of policy. However, avoiding unintended market reactions has not always been easy. The example that immediately comes to mind is the taper tantrum of mid-2013. To recap, over the course of May and June in 2013, the yield on 10-year Treasury securities increased almost 1 percentage point amid increased market discussion of the eventual tapering of Fed asset purchases and some key communications on the topic (figure 1). In particular, the 10-year yield rose about 10 basis points after then Chairman Bernanke discussed tapering in public for the first time during the question-and-answer session of his Joint Economic Committee testimony on May 22, commenting that the FOMC could reduce the pace of purchases "in the next few meetings" if it saw continued improvement in the labor market that it was confident would be sustained. Yields rose even more sharply after the June FOMC meeting, when, during his postmeeting press conference, Chairman Bernanke noted that if the economy evolved as expected, the FOMC anticipated reducing the pace of purchases in the latter part of 2013 and halting purchases altogether by the middle of 2014.
Information gathering is an important part of managing market expectations--for the simple reason that you do not know if you are going to surprise the market unless you have a good estimate of what the market is expecting. A remarkable feature of the taper tantrum is that it was a surprise that should not have been a surprise, at least from the perspective of the information the FOMC had at the time. In assessing market expectations for policy, the FOMC reviews a variety of market indicators and also draws heavily on the Federal Reserve Bank of New York's Survey of Primary Dealers, whose respondents are the market makers in government securities and the New York Fed's trading counterparties. This survey, conducted about one week prior to each FOMC meeting, gauges primary dealers' expectations about the economy, monetary policy, and financial market developments. In the June 2013 primary dealer survey, the median expectation was for tapering to start in December 2013, with purchases ending in June 2014, a path not significantly different from that laid out by Chairman Bernanke in his postmeeting press conference. Thus, one could view Chairman Bernanke's remarks during his June 2013 press conference as consistent with "market expectations." Why did markets react so sharply to the apparent confirmation of the median expectation? One simple possibility is that the median expectation of the primary dealers was not reflective of the median expectation of a wider range of market participants. Respondents to the primary dealer survey are more likely to be Fed watchers and therefore more likely in tune with Fed thinking than the average market participant. For example, as seen in figure 2, a comparison of the June 2013 primary dealer survey with the contemporaneous Blue Chip Economic Indicators survey, which draws from a wider sample of forecasters, reveals that Blue Chip respondents were more likely to expect a later start of tapering and thus more likely to have been surprised by Chairman Bernanke's communications. Jeremy Stein gave an insightful speech in May 2014 addressing how diversity in market expectations could have contributed to the taper tantrum. Jeremy pointed out that it is unhelpful to view the "market" as a single individual, a theme that has been explored by Hyun Song Shin of the Bank for International Settlements. Rather, the market is a collection of agents that can have widely divergent but perhaps strongly held beliefs at the individual level. Jeremy attributes the taper tantrum to the existence of highly leveraged quantitative easing optimists--in other words, individuals who expected the Federal Reserve to continue to accumulate assets much longer than the median expectation and who put little weight on the median market expectation. Once Chairman Bernanke affirmed the median expectation, these optimists had to quickly unwind their trades, with consequent sharp movements in asset prices. Where does that leave us? The problem, to quote Jeremy at length, "is that in some circumstances there are very real limits to what even the most careful and deliberate communications strategy can do to temper market volatility. This is just the nature of the beast when dealing with speculative markets, and to suggest otherwise--to suggest that, say, 'good communication' alone can engineer a completely smooth exit from a period of extraordinary policy accommodation--is to create an unrealistic expectation." Jeremy was speaking about ending the accumulation of assets on the Fed's balance sheet.
As reported in the minutes for the March 2017 meeting, the FOMC is now discussing a different inflection point, the phasing out of reinvestment and the shrinking of the balance sheet. Question: How concerned should we be about a repeat of the taper tantrum as we move through this new inflection point? We should start answering such a question by recognizing that there is always a chance of some market volatility. Nonetheless, we need to take into account that the New York Fed's Open Market Desk enhanced its information-gathering efforts along two important dimensions after, and in part as a response to, the experience of the taper tantrum. First, in 2014, the Desk augmented its Survey of Primary Dealers with a Survey of Market Participants, going some way toward addressing concerns that primary dealers alone were not providing sufficient coverage of market beliefs. Second, more recently, questions have been added to the surveys to identify uncertainty about reinvestment policy for each individual survey respondent and not just the dispersion of beliefs about the expected change across respondents. Starting with the market participant survey, as I noted earlier, one informational constraint that complicated the Fed's understanding of market dynamics around the taper tantrum was the possible divergence of beliefs between the primary dealers, who were surveyed, and the market at large. The differences between the expectations of the primary dealers and those of the panel for the Blue Chip Economic Indicators, shown in figure 2, provide some support for the view that the primary dealers' views may well have differed from those of a wider range of market participants, but it would have been preferable to have a poll of market participants rather than forecasters. Not long after the taper tantrum, in January 2014, the Desk began its separate Survey of Market Participants. The survey panel currently consists of 30 so-called buy-side firms, including hedge funds and asset managers. Turning now to measures of individual uncertainty, in the April 2013 primary dealer survey, just prior to the taper tantrum, dealers were mostly questioned on their point estimates regarding the timing and conditions under which tapering would commence. Respondents were asked to provide their expectation for the monthly pace of asset purchases after each of several upcoming policy meetings. They were also asked to provide point estimates, or estimates of single particular values, for the quarter and year during which they expected asset purchases in Treasury and agency mortgage-backed securities to be completed. While these questions did provide some notion of the variation in beliefs across respondents, they did not provide much information on how strongly these beliefs were held by the individual respondents, nor on the extent to which their individual beliefs might have been reflected in the size of their market positions and, in particular, the amount of leverage underlying those positions. In contrast, the most recent primary dealer and market participant surveys, conducted prior to the March 2017 FOMC meeting, asked survey participants to indicate their view of their own uncertainty over several different aspects of policy. For example, in addition to their point estimates, participants were asked to indicate the percent chance they assigned to the federal funds rate being at various levels when the FOMC first announces a change to the reinvestment policy.
They were also asked to assign probabilities to different dates for the first announced change in reinvestment policy. Why is this information important? To go back to Jeremy Stein's argument about the taper tantrum, Jeremy pointed out that market participants' expectations for tapering varied widely, but he conjectured that some of the participants were very certain in their expectations and that it was primarily their reaction that fueled the taper tantrum. When the surveys reported only point estimates, we had a measure of dispersion across market participants, but we were in the dark on how firmly held these beliefs were. By asking participants to provide a distribution of outcomes, we also obtained a measure of how certain they are of a particular outcome. To highlight some results from the March 2017 surveys, as shown in figure 3, the primary dealers' median projection for the level of the target federal funds rate when the FOMC first announces a change in its reinvestment policy was reported to be 1.63 percent. The 25th percentile of the distribution across respondents was 1.38 percent, and the 75th percentile was 1.88 percent, suggesting a fairly tight range around the median expectation. The reported range was even tighter for the market participants, around a median projection of 1.63 percent. However, it would be a mistake to infer from the narrowness of these ranges a firmness in expectations. As shown in figure 4, when respondents of each survey were asked to indicate the percent chance assigned to different fed funds target levels when the change in policy is announced, the average of all of their reported distributions was wide and flat. The primary dealer survey places roughly equal weight on rates between 1.26 and 2.00 percent. In the underlying nonpublic data for the individual responses, the reported distributions were somewhat narrower but still reflected significant uncertainty, with no primary dealer placing more than 50 percent probability on any particular target range. Like the dealers, the market participants also report wide individual distributions of beliefs. Likewise, when primary dealers were asked about the timing of the announced change in reinvestment policy, the average of their responses was a relatively flat distribution of possible dates, with almost equal probability on the announcement occurring in the fourth quarter of 2017, the first two quarters of 2018, or the second half of 2018 (figure 5). Again, the individual distributions were narrower but still showed a significant amount of uncertainty. Highlighting the usefulness of also surveying market participants, expectations in the market survey are distinctly shifted toward an early announcement date relative to the expectations of the primary dealers. The surveys reveal that while beliefs are dispersed across participants, importantly, individual survey participants are also significantly uncertain--in other words, any given participant does not appear to have firmly decided on the likely path of policy. The general point is that while we often measure and report differences in views across individuals, the uncertainty that individuals feel internally is also relevant. Recent survey results that show that market participants assign a positive probability to a wide range of outcomes also suggest that the factors that exacerbated the taper tantrum--dispersed but firmly held beliefs--may be less pronounced in current circumstances than they were at the time of the taper tantrum.
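As a stylized numerical illustration of that distinction, consider the short Python sketch below; the figures in it are invented for exposition and are not the survey responses. The point is simply that point estimates can cluster tightly across respondents even while each respondent's own probability distribution over outcomes remains wide.

import numpy as np

# Invented point estimates (percent) from five hypothetical respondents:
# tightly clustered, so cross-respondent percentiles suggest agreement.
point_estimates = np.array([1.50, 1.63, 1.63, 1.63, 1.75])
print(np.percentile(point_estimates, [25, 50, 75]))

# Invented probability distributions over four target-rate buckets
# (one row per respondent); each row sums to one.
individual_probs = np.array([
    [0.30, 0.40, 0.20, 0.10],
    [0.20, 0.35, 0.30, 0.15],
    [0.25, 0.40, 0.25, 0.10],
    [0.15, 0.35, 0.35, 0.15],
    [0.20, 0.30, 0.30, 0.20],
])

# No respondent puts more than half the probability on any single bucket,
# so beliefs are individually uncertain even though point estimates agree ...
print(individual_probs.max(axis=1))

# ... and the distribution averaged across respondents is wide and flat.
print(individual_probs.mean(axis=0))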
The market reaction to the release of the minutes of the March 2017 FOMC meeting supports this interpretation of the interaction of uncertainty and Fed policy communications. The minutes reported that, "provided that the economy continued to perform about as expected, most participants anticipated that gradual increases in the federal funds rate would continue and judged that a change to the Committee's reinvestment policy would likely be appropriate later this year." As was shown in figure 5, in the March 2017 surveys, respondents placed the most weight, 71 percent for the primary dealers and 57 percent for the market participants, on an announced change in reinvestment policy not occurring until 2018 at the earliest. Presumably, the April survey will reveal a shift in these distributions. It is noteworthy, however, that even though the statement in the minutes of the March FOMC meeting regarding Committee members' expectations for announcing changes in the reinvestment policy was not aligned with market expectations, there was only a muted market reaction. Perhaps in part, that is because the market participant survey actually revealed a considerable amount of weight, though not the majority, on an announcement occurring this year. Or it is also possible that the diffuse expectations on timing prior to the release of the minutes were a factor in tamping down market volatility as market participants adjust their expectations. My tentative conclusion from market responses to the limited amount of discussion of the process of reducing the size of our balance sheet that has taken place so far is that we appear less likely to face major market disturbances now than we did in the case of the taper tantrum. But, of course, as we continue to discuss and eventually implement policies to reduce our balance sheet, we will have to continue to monitor market developments and expectations carefully. I would like to conclude by briefly discussing two issues. First, a question: Can the Fed be too predictable? And, second, I will add a short comment on the SEP, the quarterly Summary of Economic Projections of the participants in the FOMC. With regard to whether the Fed can be too predictable, it is hard to argue that predictability in our reaction to economic data could be anything but positive. To reference the beginning of my talk, clarity about the Fed's reaction function allows markets to anticipate Fed actions and smoothly adjust along with the path of policy. But there is a circumstance where it might be reasonable to argue that the Fed could be too predictable--in particular, if the path of policy is not appropriately responsive to the incoming economic data and the implications for the economic outlook. Standard monetary policy rules suggest that the policy rate should respond to the level of economic variables such as the output gap and the inflation rate. As unexpected shocks hit the economy, the target level of the federal funds rate should adjust in response to those shocks as the FOMC adjusts the stance of policy to achieve its objectives. Indeed, it is these unexpected economic shocks that give rise to the range of uncertainty around the median federal funds rate projection of FOMC participants, represented through fan charts, which was recently incorporated into the SEP. 
The Federal Reserve could be too predictable if this type of fundamental uncertainty about the economy does not show through to uncertainty about the monetary policy path, which could imply that the Federal Reserve was not being sufficiently responsive to incoming data bearing on the economic outlook. Let me conclude with a few words on the SEP results as portrayed in the dot plots. The SEP is a highly useful vehicle for providing information to market participants and others for whom Fed actions are important. But we need to remind ourselves that the SEP data for an individual show that person's judgment of the appropriate path of future fed funds rates and the corresponding paths of other variables for which the SEP includes forecasts. Thus, one may say that the SEP shows the basis from which each participant in the FOMC discussion is likely to start. But the task of moving from that information to an interest rate decision is not simple and requires a great deal of analysis and back-and-forth among FOMC participants at each meeting.
r170419a_FOMC
united states
2017-04-19T00:00:00
International Effects of Recent Policy Tightening
fischer
0
I appreciate your invitation to participate in this afternoon's panel discussion. In my remarks, I will discuss how U.S. monetary policy actions affect our foreign trading partners, with particular focus on how foreign economies have responded to the Federal Reserve's recent policy tightening. Extensive empirical research on spillovers--including by Federal Reserve staff--suggests that spillovers from the policy actions of major central banks occur through several important channels. While the exchange rate is a key channel of transmission and gets a great deal of attention in the public debate about monetary spillovers, it is not the only channel. U.S. monetary policy also affects foreign economies by influencing U.S. domestic demand and by affecting global financial conditions. My reading of the evidence is that the Fed's highly accommodative monetary policy during the Global Financial Crisis and its aftermath probably raised foreign gross domestic product (GDP). While U.S. monetary easing caused the dollar to depreciate, which reduced foreign GDP by shifting demand toward cheaper U.S. goods, foreign economies benefited from a stronger expansion in U.S. domestic demand. Moreover, U.S. monetary easing also stimulated foreign GDP by depressing foreign bond yields and raising the prices of risky assets. Of course, there were considerable differences in how foreign economies were affected by the Fed's policies. Because the advanced foreign economies (AFEs) also experienced slow growth after the financial crisis, their central banks adopted similar policies. By contrast, the Fed's accommodative policies put further upward pressure on asset prices and currencies in some emerging market economies (EMEs) that were already experiencing rapid output growth. Thus, EME central banks had to navigate between tightening policy more--and hurting exports through a bigger exchange rate appreciation--and maintaining an accommodative stance closer to the Fed's, but with a higher risk of overheating. These tradeoffs faced by EME central banks underscore some of the challenges posed by monetary policy divergence with the United States--a tradeoff with which I am personally very familiar. Monetary policy divergence remains a familiar theme today, but the focus has obviously shifted to the consequences of tighter U.S. monetary policy for the global economy. Policy divergence is an ongoing concern given that most AFEs and many EMEs have continued to pursue highly accommodative monetary policies that remain appropriate in light of their weaker cyclical positions and subdued levels of underlying inflation. Many observers point to the "taper tantrum" in 2013 as illustrating how monetary tightening by the Federal Reserve can potentially have strong contractionary effects on foreign financial conditions. Subsequently, the expectation that a steadily improving U.S. labor market would call for tighter U.S. monetary policy--and hence imply greater monetary divergence with our trading partners--helped drive a sharp appreciation of the dollar between the middle of 2014 and the end of last year that was accompanied by capital outflows from many EMEs. Against this backdrop and the concerns it raises, the reaction in financial markets to the FOMC's decisions to increase the target range for the federal funds rate following its December 2016 and March 2017 meetings--by a cumulative total of 50 basis points--seems benign. The yields on risky foreign bonds, especially in EMEs, have continued to decline to below historical norms, and global stock prices have risen.
The dollar has depreciated since mid-December, especially against EME currencies, and the EMEs have experienced capital inflows. In my view, this favorable reaction partly reflects a view by market participants that the rate hikes are a signal of the FOMC's confidence in the underlying prospects for the U.S. economy that in turn has increased confidence in the global outlook: A strong U.S. economy is a major plus for the global economy. But the main reason for the positive market reaction is that foreign output expansions appear more entrenched, and downside risks to those economies noticeably smaller than in recent years. In Europe, unemployment has fallen steadily; inflation and inflation expectations are moving toward central bank targets; and, while Brexit entails many unknowns, so far it has not resulted in significant financial market disruptions. China's economy also appears to be on a more solid footing, which has helped stabilize the renminbi as well as support growth in other EMEs. The IMF staff has taken these developments into account in the April 2017 World Economic Outlook (WEO) and forecasts that world GDP growth will be noticeably higher over the next two years than in 2016--a slight upward revision relative to the previous forecast. There may well even be some chance that foreign economies kick into gear enough that U.S. and foreign business conditions become reasonably well aligned, as occurred during the U.S. monetary tightening cycles that began in 1999 and in 2004. In both of those episodes, U.S. exports grew substantially against the backdrop of a brisk expansion in foreign activity and a stable or even slightly depreciating dollar. Of course, it is hard to predict whether foreign economies will continue to strengthen so that the global economy will move more in sync--as I hope--or whether a substantial gap will remain between the business cycle positions of the United States and our foreign trading partners. However, even if monetary policy divergence remains substantial, there is good reason to think that spillovers to foreign economies will be manageable. First, I expect that the Fed's removal of accommodation will be driven by a continued expansion of the U.S. economy; thus, foreign economies are likely to benefit from the developments that induce the FOMC to tighten. Second, most foreign central banks should be able to mitigate an undesirable tightening of their own financial conditions through appropriate policy actions. An important lesson of the taper tantrum was that effective communication and actions by major central banks, including the European Central Bank and the Bank of England, were helpful in quickly pushing bond yields down to levels that these central banks regarded as appropriate to their economic situation. Third, many EMEs have markedly improved fundamentals--including smaller current account deficits and more anchored inflation expectations--that should allow them to better withstand the effects of U.S. tightening, though some vulnerabilities remain. Finally, I expect that U.S. policy normalization will be gradual under likely scenarios for the evolution of output and inflation. A gradual and ongoing removal of accommodation seems likely both to maximize the prospects of a continued expansion in the U.S. economy and to mitigate the risk of undesirable spillovers abroad.
r170420a_FOMC
united states
2017-04-20T00:00:00
Brief Remarks
powell
1
Thank you for inviting me to speak here today. I will begin by looking back at the global financial crisis and the great recession, which were arriving on the horizon at about this time 10 years ago. For the United States and many other countries, this would turn out to be the most painful economic crisis since the Great Depression. The fact that we had a severe recession but not another depression is a tribute to the aggressive response of those who were in a position to act at that time. In the event, the financial system avoided collapse but incurred severe damage and proved incapable, for a time, of performing its key functions. That was true of the largest investment and commercial banks, several of which either failed or required taxpayer support to survive. It was also true of the many pieces of the financial market infrastructure whose structural weaknesses contributed to the crisis, such as the triparty repurchase agreement (repo) market, the over-the-counter derivative market, and prime money market funds. The financial turmoil caused heavy damage to the real economy. Payroll employment declined by almost 9 million; over 7 million people lost their homes. Many young people entered a terrible job market; research shows that this may adversely affect their careers for many years. Many experienced workers who lost their jobs may suffer permanently lower income prospects. The nation faced two big tasks after the crisis. We had to get the economy growing again so people could get back to work and rebuild their financial lives. And we had to return the financial system to good health and address the many structural weaknesses that had become apparent. Today, the first of those tasks is well along. We have gone eight years without a subsequent recession--one of the longest recoveries on record. Employment is now almost 7 million jobs higher than its pre-crisis peak, with all of the net gains coming from the private sector. And with unemployment at 4.5 percent, we are at or close to full employment. But all is not well. Although job growth has been strong, gross domestic product has increased only about 2 percent annually since the crisis, held down by the weakest sustained period of labor productivity growth since World War II. Labor productivity--the increase in output per hour--has increased only 1/2 percent per year since 2011, about a quarter of its post-war average. The productivity slowdown has profound implications for our national well-being. This slowdown is a worldwide phenomenon, so it is likely that there are global forces at work. The slowdown has been associated with weak investment and a decline in output gains from technological innovation. We need a national focus on increasing the sustainable growth rate of our economy. That means investing in our workforce to give them the skills and aptitudes they need to compete in the global economy. It means policies that reward work, and policies that support investment and research. For the most part, these policies are not in the purview of the Federal Reserve. What about the second goal? As with the economy, we have made great progress toward our goals. Today, our financial system is without a doubt far stronger than it was before the crisis. The largest financial institutions now hold much higher levels of higher-quality capital. They hold higher levels of liquidity as well and are much less reliant on runnable short-term funding. 
They are subject to rigorous, forward-looking capital stress tests that recognize the dynamic nature of financial risks. And they have submitted several rounds of resolution plans that are helping to ensure that they could be safely reorganized should all these other safeguards prove insufficient. Our financial market infrastructures are also much stronger. The amount of intraday credit extended in the triparty repo market has been drastically reduced. Last year, the Securities and Exchange Commission implemented reforms that address weaknesses in the structure of prime money funds. And about 75 percent of interest rate and credit default swaps are now centrally cleared, which allows for greater transparency and more consistent risk management. While the move to central clearing has made the system safer, we need to make sure that the central counterparties have the resources and risk-management practices to withstand plausible but severe shocks. Many of the statutory provisions and regulations put in place to effect these changes were novel; it is not likely that we would have gotten everything exactly right on the first attempt. This is a good time to step back and ask what changes have worked and where adjustments should be made. Indeed, along with the other financial regulatory agencies, the Federal Reserve is contributing to just such an exercise by the Treasury Department. As I share some of my views on these issues, I should emphasize that I speak for myself and not for my Board colleagues or for the new colleagues who will soon join us. A few themes can guide us in this next phase. First, after years of raising capital and liquidity standards, and of stress tests and living wills, our financial system is much stronger now. We should protect these core reforms and avoid a return to the highly vulnerable system that existed before the crisis. Second, in too many cases new regulation has been inappropriately applied to small and medium-sized institutions. We need to go back and broadly raise thresholds of applicability and look for other ways to reduce burden on smaller firms. Third, the new rule book is excessively complex. We need to look for ways to simplify the rules so that they support our goals but also improve the efficiency of regulation. For example, we need to allow boards of directors and management to spend a smaller portion of their time on technical compliance exercises and more time focusing on the activities that support sustainable economic growth. Fourth, we need to continue to strive to provide an appropriate level of transparency to supervised firms and the public regarding our expectations. Some aspects of the new regulatory program are proving unnecessarily burdensome and should be better tailored to meet our objectives. Some provisions may not be needed at all given the broad scope of what we have put in place. I support adjustments designed to enhance the efficiency and effectiveness of regulation without sacrificing safety and soundness or undermining macroprudential goals. One example where some adjustments are warranted is our supervisory relationship with the boards of directors of banking firms. After the crisis, there was a broad increase in supervisory expectations for these boards. But it is important to acknowledge that the board's role is one of oversight, not management. We need to ensure that directors are not distracted from conducting their key functions by an overly detailed checklist of supervisory process requirements. 
Rather, boards of directors need to be able to focus on setting the overall strategic direction of the firm, while overseeing and holding senior management accountable for operating the business profitably, but also safely, soundly, and in compliance with applicable laws. We are currently reassessing whether our supervisory expectations for boards need to change to ensure that these principles, and not an ever-increasing checklist, are the basis of our supervisory work related to boards. I am sure that there are other areas where laws, regulations, and supervisory practices could be adjusted in a way that preserves the gains in safety and soundness but helps financial institutions devote as much of their resources as possible to supporting economic growth. I look forward to our discussion.
r170428a_FOMC
united states
2017-04-28T00:00:00
Where Do Banks Fit in the Fintech Stack?
brainard
0
We can learn a lot from the evolution of smartphones as we try to envisage where the fintech ecosystem--and banks' role within it--might be heading in the future. Smartphones have ushered in an age when different companies can easily work with each other's products to seamlessly provide services to consumers. Today I want to reflect on what we might learn from that model about the increasingly interconnected world of financial services. On the 10th anniversary of the iPhone, a Wired.com article revealed that even Steve Jobs hadn't predicted the smartphone's potential as a platform. Apple was just trying to design an iPod that made phone calls. Today, the average American spends five hours a day on their phone, unlocking it an average of 80 times daily. Even the Supreme Court has noted that smartphones are now "such a pervasive and insistent part of daily life that the proverbial visitor from Mars might conclude they were an important feature of human anatomy." Of course, we aren't using these appendages primarily to make phone calls. Instead, we mainly use our smartphones to access applications (apps). In June of last year, Apple announced that over 2 million apps were available on its App Store. For the most part, these apps were not created or even envisaged by Apple. These apps have been downloaded 130 billion times, generating over $50 billion in revenue for third-party developers. The iPhone is a key platform on which that app ecosystem operates. How did that happen? Apple essentially made the smartphone a toolkit for third-party developers to experiment, innovate, build, and scale new apps. It did so by investing heavily in developing open application programming interfaces (APIs) that provided third-party developers clear instructions and open access to the iPhone platform. This strategy enabled those outside developers to build new applications that delivered Apple's customers additional value by taking advantage of the existing functionality of the iPhone. Specifically, this open architecture makes available to outside developers clear instructions that enable them to use the iPhone's various sensors, processors, displays, and other interfaces in combination with their own code to develop new products. On top of that, a robust secondary layer of developers use the APIs of other developers in their technology stacks to quickly assemble new business models. Take ride-sharing services, for instance. They have built multibillion-dollar businesses that are, in large part, dependent on combinations of APIs from different companies. They may use Google Maps' APIs for location services, Stripe or Braintree's APIs for payments, Twilio's APIs for text messaging, and Amazon Web Services' or IBM's APIs for computing power. All of these products, and more, work seamlessly together in real time to provide products that are so ubiquitous that we now use them as verbs for how we navigate the world. We "Uber" to the store or "Snapchat" a friend. There is every reason to expect financial services to make a similar transition to an increasingly interconnected digital world. By now, we've all heard estimates of the thousands of fintech companies that have launched in the past few years and the billions of investment dollars that are flooding into this sector. But for all of the talk of "disruption," I want to underscore an important point: More often than not, there is a banking organization somewhere in the fintech stack. 
Just as third-party app developers rely on smartphone sensors, processors, and interfaces, fintech developers need banks somewhere in the stack for such things as: (a) access to consumer deposits or related account data, (b) access to payment systems, or (c) credit origination, among other functions. For instance, account comparison services rely on access to data from consumers' bank accounts. Savings and investment apps analyze transactions data from bank accounts to understand how to optimize performance and manage the funds consumers hold in those accounts. Digital wallets draw funds from payment cards or bank accounts. Marketplace loans most often depend on loan origination by a bank partner. And payment innovations often "settle up" over legacy payment rails, like the automated clearinghouse system. In short, the software stacks of almost all fintech apps point to a bank at one layer or another. So as fintech companies and banks are catching up to the interconnected world, the various players are sorting out how best to do the connecting. Much of the work so far has been focused on the technical challenges, which are notable. Most banks' core systems are amalgams of computing mainframes built decades ago before the Internet or cloud computing were widely available and, in many cases, stitched together over the course of mergers and consolidations. It takes a lot of investment to securely convert that infrastructure to platforms that can operate in real-time with ready access for Internet-native third-party developers. But important policy, regulatory, and legal questions also demand attention. And that is where the smartphone analogy loses its power. On balance, bank activities are much more highly regulated than smartphones. Those regulations enable consumers to trust their banks to secure their funds and maintain the integrity of their transactions. While "run fast and break things" may be a popular mantra in the technology field, it is ill suited to an arena where a serious breach could undermine confidence in the payments system. Indeed, some of the key underpinnings of consumer protection and safety and soundness in the banking world--that consumers should be exceptionally careful in granting account access, that in certain conditions banks could be presumed to bear liability for unauthorized charges, and that banks can be held responsible for ensuring that service providers and vendors do right by their customers--sit uneasily alongside the requisites of openness, connectivity, and data access that enable today's app ecosystem. For instance, before entering an outsourcing arrangement, a bank is expected to consider whether the service provider's internal processes or systems (or even human error at the outside party) could expose the bank and its customers to potential losses or expose the bank's customers to fraud and the bank to litigation; whether the service provider complies with applicable laws and regulation; and whether poor performance by that outside party could materially harm the bank's public reputation. The smartphone app ecosystem developed without the regulations or associated guardrails pertaining to institutions that people trust to hold their life savings. For instance, when Pokemon Go was first launched, its creator, Niantic, used an outdated Google API to verify consumer identities. This created confusion about whether millions of consumers had unwittingly granted Niantic full access to their e-mails, contact lists, and calendars.
Even so, that episode did not stand in the way of Pokemon Go subsequently being downloaded a half billion times. In contrast, these kinds of mistakes in the banking sector could raise grave concerns about consumer data privacy and security and the integrity of consumer transactions data. That's why banks are expected to conduct extensive risk assessments and due diligence of their service providers, extending even to operations and internal controls, among other requirements. While that helps ensure a safe and sound banking system, that also makes it more challenging for both the banks and fintech companies to harness safely the interconnectivity that has powered other parts of the digital world. Because of the high stakes, fintech firms, banks, data aggregators, consumer groups, and regulators are all still figuring out how best to do the connecting. There are a few alternative approaches in operation today, with various advantages and drawbacks. A number of large banks have developed or are in the process of developing interfaces to allow outside developers access to their platforms under controlled conditions. Similar to Apple opening the APIs of its phones and operating systems, these financial companies are working to provide APIs to outside developers, who can then build new products on the banks' platforms. It is worth highlighting that platform APIs generally vary in their degree of openness, even in the smartphone world. If a developer wants to use a Google Maps API to embed a map in her application, she first must create a developer account with Google, agreeing to Google's terms and conditions. This means she will have entered a contract with the owner of the API, and the terms and conditions may differ depending on how sensitive the particular API is. Google may require only a minimum amount of information for a developer that wants to use an API to display a map. Google may, however, require more information about a developer that wants to use a different API to monitor the history of a consumer's physical locations over the previous week. And in some cases, the competitive interests of Google and a third-party app developer may diverge over time, such that the original terms of access are no longer acceptable. The fact that it is possible and indeed relatively common for the API provider--the platform--to require specific controls and protections over the use of that API raises complicated issues when imported to the banking world. As banks have considered how to facilitate connectivity, the considerations include not only technical issues and the associated investment, but also the important legal questions associated with operating in a highly regulated sector. The banks' terms of access may be determined in third-party service provider agreements that may offer different degrees of access. These may affect not only what types of protections and vetting are appropriate for different types of access over consumers' funds and data held at a bank in order to enable the bank to fulfill its obligations for data security and other consumer protections, but also the competitive position of the bank relative to third-party developers. There is a second broad type of approach in which many banks have entered into agreements with specialized companies that essentially act as middlemen, frequently described as "data aggregators." These banks may lack the budgets and expertise to create their own open APIs or may not see that as a key element in their business strategies.
Data aggregators collect consumer financial account data from banks, on the one hand, and then provide access to that data to fintech developers, on the other hand. Data aggregators organize the data they collect from banks and other data sources and then offer their own suite of open APIs to outside developers. By partnering with data aggregators, banks can open their systems to thousands of developers, without having to invest in creating and maintaining their own open APIs. This also allows fintech developers to build their products around the APIs of two or three data aggregators, rather than 15,000 different banks and other data sources. And, if agreements between data aggregators and banks are structured as data aggregators performing outsourced services to banks, the bank should be able to conduct the appropriate due diligence of its vendors, whose services to those banks may be subject to examination by safety and soundness regulators. Some banks have opted for a more "closed" approach to fintech developers by entering into individual agreements with specific technology providers or data aggregators. These agreements often impose specific requirements rather than simply facilitating structured data feeds. These banks negotiate for greater control over their systems by limiting who is accessing their data--often to a specific third party's suite of products. Likewise, many banks use these agreements to limit what types of data will be shared. For instance, banks may share information about the balances in consumers' accounts but decline to share information about fees or other pricing. While recognizing the legitimate need for vetting of third parties for purposes of the banks fulfilling their responsibilities, including for data privacy and security, some consumer groups have suggested that the standards for vetting should be commonly agreed to and transparent to ensure that banks do not restrict access for competitive reasons and that consumers should be able to decide what data to make available to third-party fintech applications. A third set of banks may be unable or unwilling to provide permissioned access, for reasons ranging from fears about increased competition to concerns about the cost and complexity of ensuring compliance with underlying laws and regulations. At the very least, banks may have reasonable concerns about being able to see, if not control, which third-party developers will have access to the banking data that is provided by the data aggregators. Accordingly, even banks that have previously provided structured data feeds to data aggregators may decide to limit or block access. In such cases, however, data aggregators can still move forward to collect consumer data for use by fintech developers without the permission or even potentially without the knowledge of the bank. Instead, data aggregators and fintech developers directly ask consumers to give them their online banking logins and passwords. Then, in a process commonly called "screen scraping," data aggregators log onto banks' online consumer websites, as if they were the actual consumers, and extract information. Some banks report that as much as 20 to 40 percent of online banking logins are attributable to data aggregators. They even assert that they have trouble distinguishing whether a computer system that is logging in multiple times a day is a consumer, a data aggregator, or a cyber attack.
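To make the contrast between permissioned interfaces and screen scraping concrete, the following is a minimal illustrative sketch in Python. Every endpoint, field, and function name in it is hypothetical and invented for this example; it is not a description of any actual bank's or aggregator's interface, only of what each model asks the consumer to hand over and what the bank can see and control.

import requests

# Hypothetical open-API model: the consumer authorizes the bank to issue a
# scoped, revocable token; the third party never sees the banking password,
# and the bank decides exactly which data the token can reach.
def fetch_balances_via_api(access_token: str) -> dict:
    response = requests.get(
        "https://api.examplebank.test/v1/accounts/balances",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # only the fields the bank chose to expose

# Hypothetical screen-scraping model: the consumer hands over online-banking
# credentials; the aggregator logs in as if it were the consumer and parses
# whatever the consumer-facing website shows, so the bank cannot easily tell
# the aggregator's logins apart from the consumer's own (or from an attack).
def fetch_balances_via_scraping(username: str, password: str) -> str:
    session = requests.Session()
    session.post(
        "https://online.examplebank.test/login",  # hypothetical login page
        data={"user": username, "pass": password},
        timeout=10,
    )
    page = session.get("https://online.examplebank.test/accounts", timeout=10)
    return page.text  # raw HTML to be parsed for balances, fees, and so on

In the first pattern, the bank issues and can revoke a token scoped to particular data; in the second, the consumer's full credentials circulate outside the bank, which is the feature that draws the data-security complaints discussed above.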
For community banks with limited resources, the necessary investments in API technology and in negotiating and overseeing data-sharing agreements with data aggregators and third-party providers may be beyond their reach, especially as they usually rely on service providers for their core technology. Some fintech firms argue that screen scraping--which has drawn the most complaints about data security--may be the most effective tool for the customers of small community banks to access the financial apps they prefer--and thereby necessary to remain competitive until more effective broader industry solutions are developed. Clearly, getting these connectivity questions right, including the need to manage the consumer protection risks, is critically important. It could make the difference between a world in which the fintech wave helps community banks become the platforms of the future, on the one hand, and, on the other hand, a world in which fintech instead further widens the gulf between community banks and the largest banks. The different approaches to integrating banks into the fintech stack represent different risks and tradeoffs. Connectivity solutions that require intermediaries such as data aggregators and rely on screen scraping potentially create repositories of consumer credentials for hackers to target. Banks argue that if such a repository is breached, thousands of banks could be impacted. Further complicating things, because screen scrapers operate without contractual relationships with the banks from which they pull information, banks have little leverage or ability to vet the security of the screen scrapers' systems and methods or their overall risk. In these circumstances, some commentators have noted that if a data aggregator or third-party developer is breached, it may not be clear who would bear responsibility for any losses--the bank, the data aggregator, the fintech developer, or the consumer. Some third-party developers have included terms and conditions that specifically limit their liability to consumers. It is not clear the extent to which many consumers understand the risks involved with sharing their banking credentials, the more limited liability accepted by many third-party developers relative to their bank or credit card issuer, and the fact that the third-party developers may in turn provide those credentials to others in some instances. On the other side of the debate, fintech companies are concerned that banks could use their control over consumer data access in the context of bilateral contracts with data aggregators to leverage their position in order to impede competition elsewhere in the stack. This argument about access and competition echoes similar concerns in the smartphone arena. Further, third-party developers argue that open standards for data access can help banks meet consumers' expectations for mobile banking by providing access to the fintech apps that best serve their needs. The relatively open architecture of the iPhone platform means that Apple profits from outside developers' products without having to design or invest in them directly. For instance, Apple didn't include a home-grown mapping app during the first few years of the iPhone. Instead, it relied on Google to provide that important function for its smartphones before trying to build its own mapping tool--a process that took a number of iterations before getting it right. Open platform strategies may mean that banks can essentially outsource product development to fintech firms.
This could be a boon--particularly for small community banks that would not have to worry about developing the best consumer interface, mobile app, digital wallet, or lending product. The bank would only have to worry about getting the connections to an open API right and then reap the benefits of the innovation by third parties. As regulators, we have a responsibility to ensure that the institutions subject to our supervision are operated safely and soundly and that they comply with applicable statutes and regulations. More broadly, we have a strong interest in permitting socially beneficial innovations to flourish, while ensuring the risks that they may present are appropriately managed, consistent with the legal requirements. We do not want to unnecessarily restrict innovations that can benefit consumers and small businesses through expanded access to financial services or greater efficiency, convenience, and reduced transaction costs. Nor do we want to drive these activities away from regulated banks and toward less governed spaces in the financial system. Regulators in the United Kingdom and continental Europe have recently outlined new approaches to facilitate connectivity in financial services, while attempting to mitigate the associated risks. In the United Kingdom, the Competition and Markets Authority (CMA) has issued a package of mandates aimed at increasing competition for consumer and small business current accounts. This year nine of the country's largest banks were required to create open APIs to share nonsensitive, non-consumer-specific information, like pricing, fees, terms, and conditions as well as branch and automated teller machine locations. This initial limited sharing of information has started communication and collaboration across the industry on areas like data standards and organizational governance, which will facilitate work on more contentious questions. Before March 2018, the CMA is scheduled to enforce a broader package of reforms, including mandating that the nine banks create APIs that allow third-party banks and nonbanks to access consumer accounts for reading transaction data and payment initiation. In the European Union, beginning in 2018, member states will be required to start implementing the revised Payment Services Directive, known as PSD2. Among other elements, PSD2 created licensing regimes for third parties that access bank accounts for purposes of initiating payment orders or consolidating information with consumers' consent. The directive mandates that banks allow these licensed third parties to access their consumer accounts (with consumer permission) without premising such access on contractual agreements with the banks. Indeed, PSD2 requires that credit institutions not block or hinder access to payment accounts and that licensed third parties have access to credit institutions' payment accounts services in an objective, nondiscriminatory, and proportionate manner. When credit institutions do reject access, they are required to provide the relevant authorities detailed reasoning for the rejection. The directive attempts to mitigate the attendant data-security and consumer-protection risks with a number of measures that, by and large, are not readily available policy options in the United States. Importantly, third parties that access bank accounts will be subject to licensing and registration requirements, as well as associated capital and insurance requirements. Moreover, the directive envisions that electronic payments will be authorized by two-factor authentication--for example "something you know" and "something you are."
The United States is likely to address these issues in a different way, at least initially, given that regulatory authorities are more broadly distributed and the relevant statutory language differs. The Consumer Financial Protection Bureau issued a Request for Information last fall to explore issues surrounding consumers' granting access to account information to third parties. Of course, safety and soundness regulation--and with it, concerns about data security, cyber security, and vendor risk management--is distributed among a number of regulators. For instance, there may be value in examining the vendor risk management guidance so that it facilitates banks connecting more securely and efficiently with the fintech apps that consumers prefer. Similarly, it could be useful to periodically assess whether and how authority under the Bank Service Company Act might pertain to developments in the fast evolving fintech sector. In addition, the private sector is continuing to actively experiment with a variety of different approaches to the connectivity question and may itself move toward one or more widely accepted standards. Accordingly, efforts to craft approaches that enhance connectivity while mitigating the associated risks will likely benefit from the engagement of multiple agencies, along with input from the private sector and other stakeholders. The Office of the Comptroller of the Currency (OCC), which is responsible for administering national bank charters, has announced that it is exploring offering "special purpose national bank charters" to fintech companies. As envisioned by the OCC, obtaining a special purpose charter would have the practical effect of allowing certain fintech companies (companies that make loans, make payments, or accept deposits) to potentially bypass the need for connecting to a bank for certain purposes in favor of becoming licensed as banks themselves. The OCC's proposal raises interpretive and policy issues for the Federal Reserve regarding whether charter recipients would become Federal Reserve members or have access to Federal Reserve accounts and services, such as direct access to payment systems. If the OCC proposal is finalized, the Federal Reserve would have to closely analyze these issues with respect to any fintech firms that express an interest in moving forward with an application. When Apple launched the iPhone in 2007, who could have predicted that it would net billions from a game like Pokemon Go, which involved no investment, development, or advertising on Apple's part beyond opening its platform to developers? It is still too early to have any confidence that we know which fintech innovations will prove to be the most long-lasting or widely adopted. By the same token, the fintech industry is still figuring out the fundamental questions of the best ways to make the necessary connections to the banking platforms to facilitate consumers' ability to better monitor and manage their financial lives, while providing the level of data security and protection they have come to rely on from their banks. Change is surely coming, as financial products and services move onto interconnected platforms. As the sector evolves, it's important that all parties involved pay close attention not only to the technical questions, but to the requisite regulatory, policy, and legal considerations to ensure continued trust and confidence in the financial system.
r170505b_FOMC
united states
2017-05-05T00:00:00
Committee Decisions and Monetary Policy Rules
fischer
0
It is a pleasure to be at the Hoover Institution again. I was privileged to be affiliated with Hoover in the past. In addition, many of the researchers and practitioners with whom I have discussed monetary policy over the years have had affiliations with the Hoover Institution--including several people here today. It is a pleasure also to have been invited to speak at this Hoover Institution Monetary Policy Conference, for the Hoover conference series provides a valuable forum for policymakers and researchers to engage in dialogue about important monetary policy issues facing the United States and other countries. Today I will offer some observations on monetary policy rules and their place in decisionmaking by the Federal Open Market Committee (FOMC). I have two messages. First, policymakers should consult the prescriptions of policy rules, but--almost needless to say--they should avoid applying them mechanically. Second, policymaking committees have strengths that policy rules lack. In particular, committees are an efficient means of aggregating a wide variety of information and perspectives. Since May 2014, I have considered monetary policy rules from the vantage point of a member of the FOMC. But my interest in them began many years ago and was reflected in some of my earliest publications. At that time, the literature on monetary policy rules, especially in the United States, remained predominantly concerned with the money stock or total bank reserves rather than the short-term interest rate. Seen with the benefit of hindsight, that emphasis probably derived from three sources: First, the quantity theory of money emphasized the link between the quantity of money and inflation; second, that research was carried out when monetarism was gaining credibility in the profession; and, third, there was a concern that interest rate rules might lead to price-level indeterminacy--an issue disposed of by Bennett McCallum and others. Subsequently, John Taylor's research, especially his celebrated 1993 paper, was a catalyst in changing the focus toward rules for the short-term interest rate. Taylor's work thus helped shift the terms of the discussion in favor of rules for the instrument that central banks prefer to use. His 1993 study also highlighted the practical relevance of monetary policy rules, as he showed that a particular simple rule--the rule that now bears his name--provided a good approximation to the behavior of the federal funds rate during the early Greenspan years. The research literature on monetary policy rules has experienced a major revival since Taylor's seminal paper and has concentrated on rules for the short-term interest rate. Consideration of interest rate rules has also, as I will discuss, come to have a prominent role in FOMC discussions, with the Taylor rule being one benchmark that we regularly consult. But--building on recent remarks I made elsewhere--I will also indicate why policymakers might have good reasons for deviating from these rule benchmarks and why, in pursuing the objectives of monetary policy, they could appropriately behave in ways that are not very well characterized by simple monetary policy rules. In particular, I will point to reasons why the FOMC's discussions might lead to decisions that depart--temporarily or permanently--from the prescriptions of baseline monetary policy rules. Some perspective on the status of policy rules in FOMC discussions is provided by considering what has changed over the past 20 years.
Donald Kohn, at a landmark conference organized by John Taylor in January 1998, described the role played by monetary policy rules in the FOMC briefing process. His account noted that Federal Reserve staff members presented FOMC participants with prescriptions from several policy rules, including the Taylor (1993) rule. This description remains true today. Publicly available Bluebooks and Tealbooks of successive years demonstrate that the coverage of policy rules in the briefing material provided by the Board staff expanded considerably in the years after Kohn spoke. Kohn noted that policy rule prescriptions served two functions: as a "benchmark for the stance of policy" and "to structure thinking about the implications of incoming information for the direction of policy action." These two functions continue to be important: Policy rule prescriptions provide a useful starting point for FOMC deliberations and a convenient way of organizing alternative arguments about the appropriate policy decision. Policy rule prescriptions, particularly prescriptions that are obtained from a dynamic model simulation, also help policymakers take to heart a key message of the literature on policy rules--namely, that monetary policy decisions should concern the appropriate path for the policy instrument and not merely the current setting of that instrument. Kohn also observed, however, that "in truth, only a few members look at this or similar information regularly, and the number does not seem to be growing." That state of affairs has probably changed in the two decades since Kohn wrote. It is clear from transcripts in the public record that rule prescriptions have frequently been cited at FOMC meetings. The prominence that interest rate rules have achieved in Federal Reserve policymakers' analysis of monetary policy was underscored by Chair Yellen in her speech at Stanford University earlier this year. Further, as is clear from Taylor's econometric derivation of his 1993 rule, actual monetary policy decisions may--and probably should--exhibit systematic patterns that can be described as a rule. In fact, as I have already noted, one attraction of the 1993 Taylor rule was that it described U.S. monetary policy patterns well over a certain period, one that was associated with a reasonable degree of economic stability. Nevertheless, central bankers who are aware of the merits of the arguments for policy rules have on occasion deviated substantially from the prescriptions of standard policy rules. Further, while the implications of different monetary rules are described in the Tealbook and typically referred to in the presentations by several FOMC participants, the overall discussion in FOMC meetings is not generally cast in terms of how it relates to one version or another of the Taylor or any other rule. The other set of rules mentioned frequently in FOMC discussions are Wicksellian, for there is often a discussion of r *, which in some formulations of the Taylor rule is also the constant term. The period since 2008 bears testimony to central bankers' willingness to depart from the prescriptions of a pre-specified rule. In the wake of the financial crisis, policymakers found it necessary to follow a more accommodative monetary policy that was appropriate for the new economic conditions. In addition, structural changes in the U.S. economy have apparently lowered the value of the interest rate--that is, r *--consistent with neutral policy. Such structural changes were not anticipated in advance. 
Of course, once a structural change has occurred and been ascertained by policymakers, they will know what rules would likely have performed well in the face of that change. For this reason, policymakers might change their judgment about what monetary policy rules constitute reasonable benchmarks, or, over time, they might develop a procedure for revising the monetary rule. But a frequently revised rule does not really qualify as a rule in the sense that we currently use the term. Consequently, when considering the relationship between monetary policy decisions and monetary policy rules, we can expect two regularities to hold. First, actual monetary policy will sometimes appropriately depart from the prescriptions of benchmark rules even when those benchmarks describe past decisions well. Second, in their use of rules, policymakers will from time to time change their assessment of what rule they regard as the appropriate benchmark. Both regularities have been amply observed in recent years, but they were also present 20 years ago, as reflected in Kohn's remark that policymakers "do not see their past actions as a very firm guide to current or future policy." Or, as a teacher of mine at the London School of Economics, Richard Sayers, put it much earlier, "There is no code of eternal rules. . . . We have central banks for the very reason that there are no such rules." As I will now elaborate, I believe the fact that monetary policy is made by committees in most economies is important in understanding both of these regularities. Monetary policy decisions in the United States and elsewhere typically arise from a discussion and vote of a committee. In principle, a monetary policy committee could decide to follow a rule. But a decision of this kind is unlikely to occur in practice. Committee discussions bring into policymaking features that a rule lacks. A committee-based decision process is, I suggest, likely to produce policy decisions that depart from the prescriptions of benchmark rules. A policy rule prescription is more consistent with a single perspective on the economy than with the pooling of multiple perspectives that is associated with a committee policymaking process. Roger Lowenstein's book details how the founding of the Federal Reserve involved reconciling a large number of interests in its design. In a similar vein, the modern FOMC framework involves participation by 12 Reserve Bank presidents, each of whom represents a different district of the country. The FOMC framework also balances centralized and decentralized decisionmaking by having most of the permanent voting members--specifically, the seven members of the Board of Governors--based in Washington. All of the FOMC participants have common goals--maximum employment and price stability--that are given by the Federal Reserve's statutory mandate. They have also agreed, for pursuing that mandate, on the Statement on Longer-Run Goals and Monetary Policy Strategy. But while they have this common ground, each FOMC participant brings to the table his or her own perspective or view of the world. Part of their role in meetings is to articulate that perspective and perhaps persuade their colleagues to revise their own perspectives--or vice versa. A member of a committee may well have valuable economic information not known by his or her colleagues until he or she relays it. This point has been brought home to me by Reserve Bank presidents' accounts of recent economic developments in their Districts. These narratives shed light on the real-world developments that lie behind the recorded economic data.
They also help shape my interpretation of what part of incoming data may be an important signal and what part may reflect transitory factors or mismeasurement. The information underlying a policy decision is, therefore, crucially shaped by a committee system. Committees can aggregate a large volume of diverse information about current and expected future economic conditions. The information includes anecdotes and impressions gleaned from business and other contacts, which can provide insights that are not recorded in current data releases. In practice, it is likely that the information obtained and processed by the Committee will leave the FOMC less inclined to follow a benchmark rule. For example, the Committee's discussions might point up factors that have not yet affected real economic activity and inflation. Such factors would not lead to an immediate change in the prescription for the federal funds rate obtained from a rule like the Taylor rule, as this prescription is a function of current values of the output gap and inflation. The Committee might nevertheless wish to adjust the federal funds rate immediately because the newly unearthed factors are likely to affect output and inflation in coming months. In addition, and as I have suggested, policymakers might also encounter unexpected or unusual events, or both, or they might perceive changes in the structure of the economy. A committee process is conducive to assessing the appropriate policy response to these developments. A case in point is the decline, as I mentioned, in estimates of the neutral interest rate. The concept of the neutral interest rate is a way of summarizing the various forces, many of them unobservable, that shift the relationship between monetary policy and economic activity. Bringing to the table diverse perspectives is a pragmatic way of confronting such deep sources of uncertainty and deciding how to deal with them. A committee discussion can flesh out the factors behind changes in the neutral rate, and a committee would likely be able to identify such changes more promptly than would a statistical exercise, because of the wider set of information from around the country that the committee is able to process. The decisionmaking environment that I have described involves more flexibility for FOMC members than they would have if they simply followed a policy rule. But transparency and accountability must figure heavily in this more flexible environment. The FOMC's policy communications include its postmeeting statement, the minutes of its meetings, the Chair's quarterly press conference, the Chair's semiannual monetary policy testimony to the Congress, and other public remarks by individual FOMC members. In this framework, policymakers articulate the reasoning behind each decision and, in particular, explain how the policy decision contributes to the achievement of the Committee's statutory mandate. There remains a deeper question about committee decisionmaking: Why have almost all countries decided that monetary policy decisions should be made by a committee rather than by a rule? One answer is that laws in most countries are passed by institutions in which committee deliberation is the norm. Of course, we then have to ask why that has become a norm in almost all democracies. The answer is that opinions--even on monetary policy--differ among experts, while the economy is in a constant process of change. 
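To make the benchmark discussed above concrete, the following is a minimal sketch of a rule prescription, assuming the textbook Taylor (1993) formulation with coefficients of 0.5 and with 2 percent used for both the inflation objective and the longer-run real rate; the function and its default parameters are an illustration, not the Board staff's Tealbook implementation.

```python
def taylor_1993_prescription(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Nominal federal funds rate (percent) prescribed by the textbook Taylor (1993) rule.

    All arguments are in percent: current inflation, the current output gap,
    the assumed neutral real rate r_star, and the inflation objective pi_star.
    """
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap

# Example: 1.8 percent inflation and a -0.5 percent output gap imply 3.45 percent.
print(taylor_1993_prescription(1.8, -0.5))
```

Because the prescription depends only on current inflation and the current output gap, information about forces that have not yet shown up in those two variables leaves it unchanged, which is the point made above; and because the prescription moves one-for-one with r_star, a lower estimate of the neutral rate lowers the benchmark directly.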
Because opinions differ among experts, democracies tend to prefer committees in which decisions are made by discussion among the experts--and, in many cases, other representatives of the public--who discuss, try to persuade each other, and must at the end of their deliberations reach a decision. But those decisions have to be explained to the public and to other parts of the government--and hence the appropriate emphasis on transparency and accountability. That is the democratic way of making decisions when opinions differ, as they often do in the monetary field. I have been a governor of two central banks and, even as the sole monetary policy decisionmaker in the Bank of Israel, would sometimes find that my initial view on the next decision changed as a result of discussions with the informal advisory committee with whom I consulted at that time. Those discussions, which recognize human frailty in analyzing a situation and the need to act despite considerable uncertainties, are the reason why committee decisionmaking is, on average, preferable to the use of a rule. Emphasis on a single rule as the basis for monetary policy implies that the truth has been found, despite the record over time of major shifts in monetary policy--from the gold standard, to the Bretton Woods fixed but changeable exchange rate rule, to Keynesian approaches, to monetary targeting, to the modern frameworks of inflation targeting and the dual mandate of the Fed, and more. We should not make our monetary policy decisions based on that assumption. Rather, we need our policymakers to be continually on the lookout for structural changes in the economy and for disturbances to the economy that come from hitherto unexpected sources. Let me now sum up. The prescriptions of monetary policy rules play a prominent role in the FOMC's monetary policy deliberations. And this is as it should be, in view of the usefulness of rules as a starting point for policy discussion and the fact that comparison with a benchmark rule provides a useful means of articulating one's own preferred policy action. But, for the reasons I have outlined, adherence to a simple policy rule is not the most appropriate means of achieving macroeconomic goals--and there are very good reasons why monetary policy decisions are typically made in committees whose structure allows them to assess the varying conditions of different regions and economic sectors, as well as to reflect different beliefs about the working of the economy.
r170505a_FOMC
united states
2017-05-05T00:00:00
So We All Can Succeed: 125 Years of Women's Participation in the Economy
yellen
1
Thank you, and let me say what an honor it is, as an alumna of this great university, to be here today and part of this important occasion. As we celebrate the 125th anniversary of women being admitted to Brown, it seems appropriate to reflect on the progress that women have achieved in the intervening years. Since 1891, women have made tremendous strides in their ability to pursue their dreams of education and meaningful work and to support themselves and their families. In pursuing these goals, women have helped improve working conditions for all workers and have been a major factor in America's prosperity over the past century and a quarter. Despite this progress, evidence suggests that many women remain unable to achieve their goals. The gap in earnings between women and men, although smaller than it was years ago, is still significant; women continue to be underrepresented in certain industries and occupations; and too many women struggle to combine aspirations for work and family. Further advancement has been hampered by barriers to equal opportunity and workplace rules and norms that fail to support a reasonable work-life balance. If these obstacles persist, we will squander the potential of many of our citizens and incur a substantial loss to the productive capacity of our economy at a time when the aging of the population and weak productivity growth are already weighing on economic growth. To enliven the history I will present today, I will include the experiences of women graduates of this institution, in most cases in their own words, as related in oral histories preserved by Brown. Among these alumnae, I am proud to say, is a member of my own family who was an early graduate of Pembroke, Elizabeth Stafford Hirschfelder of the Class of 1923. Her career and achievements as a mathematician embody both the opportunities that opened for Pembroke graduates in the decades after she left here and the limitations many women faced and the compromises she, like so many others, was forced to make. From the time that Brown began to accept women and into the 1920s, most women in the United States did not work outside the home, and those who did were primarily young and unmarried. In that era, just 20 percent of all women were "gainful workers," as the Census Bureau then categorized labor force participation outside the home, and only 5 percent of those married were categorized as such. Of course, these statistics somewhat understate the contributions of married women to the economy beyond housekeeping and childrearing, since women's work in the home often included work in family businesses and the home production of goods, such as agricultural products, for sale. Also, the aggregate statistics obscure the differential experience of women by race. African American women were about twice as likely to participate in the labor force as were white women at the time, largely because they were more likely to remain in the labor force after marriage. What was true for women in general was also true of the early graduates of what was then called the Women's College, the large majority of whom got married, raised families, and did not pursue careers. The fact that many women left work upon marriage reflected cultural norms, the nature of the work available to them, and legal strictures. The occupational choices of those young women who did work were severely circumscribed. Most women lacked significant education--only 54 percent of girls aged 5 to 19 were enrolled in school in 1890. 
And women with little education mostly toiled as piece workers in factories or as domestic workers, jobs that were dirty and often unsafe. Educated women, like those who attended Brown's Women's College, were scarce. Fewer than 2 percent of all 18- to 24-year-olds were enrolled in an institution of higher education, and just one-third of those were women. Such women did not have to perform manual labor, but their choices were likewise constrained. Edna McDonald was a graduate of the Class of 1919, and in her oral history, she summed up the opportunities for her and her classmates: "Let's be frank," she said. "What choices did women have? Teaching. You could teach. You could be a lab technician. Or you could go into office work and be a secretary. Those were the only real choices." Margery Chittenden Leonard graduated from Pembroke in 1929 and went on to earn a J.D. as the only woman in her class at Boston University--after two others withdrew. And with that law degree, her first job was as a secretary, and she continued to struggle to find work as a lawyer. In her oral history, Doris Madeline Hopkins, a 1928 graduate, talked about the opportunity that she had to work, but also about being told she had to leave her job once she got married. Indeed, at the time, marriage bars were widespread. There were notable exceptions, such as, of course, Mary Emma Woolley, a Brown graduate who went on to serve as the president of Mount Holyoke College, and Ethel Robinson, the first black woman to graduate from Brown, who taught English at Howard University. Helen Butts, from the Class of 1928, taught natural sciences at Smith and later zoology at Wellesley, the beginning of a long and productive career as a biological researcher. Another exception was Betty Stafford, the aunt of my husband, George. She grew up in Providence, earned bachelor's and master's degrees at Brown in mathematics and then rather adventurously headed west, teaching at two universities in Texas in the 1920s before completing her Ph.D. and then teaching at the University of Wisconsin. Despite the widespread sentiment against women, particularly married women, working outside the home and with the limited opportunities available to them, women did enter the labor force in greater numbers over this period, with participation rates reaching nearly 50 percent for single women by 1930 and nearly 12 percent for married women. This rise suggests that while the incentive, and in many cases the imperative, remained for women to drop out of the labor market at marriage when they could rely on their husband's income, mores were changing. Indeed, these years overlapped with the so-called first wave of the women's movement, when women came together to agitate for change on a variety of social issues, including suffrage and temperance, and which culminated in the ratification of the 19th amendment in 1920 guaranteeing women the right to vote. Between the 1930s and mid-1970s, women's participation in the economy continued to rise, with the gains primarily owing to an increase in work among married women. By 1970, 50 percent of single women and 40 percent of married women were participating in the labor force. Several factors contributed to this rise. First, with the advent of mass high school education, graduation rates rose substantially. At the same time, new technologies contributed to an increased demand for clerical workers, and these jobs were increasingly taken on by women. 
Moreover, because these jobs tended to be cleaner and safer, the stigma attached to work for a married woman diminished. And while there were still marriage bars that forced women out of the labor force, these formal barriers were gradually removed over the period following World War II. Another innovation was the introduction in the late 1940s of part-time schedules, which combined with the proliferation of modern appliances to make it more feasible for married women to work outside the home. Over the decades from 1930 to 1970, increasing opportunities also arose for highly educated women, such as the graduates of what was by then called Pembroke College, to work in professions. That said, early in that period, most women still expected to have short careers, and women were still largely viewed as secondary earners whose husbands' careers came first. Thus, while it was becoming more common for women such as Betty Stafford to teach at colleges and universities, their career prospects were not the same as those for men. After earning her Ph.D. at Wisconsin, Betty married a fellow student and over the next decade coauthored with him five important papers and a well-regarded reference work. But, while her husband progressed from instructor to professor at Wisconsin, Betty worked as an instructor on an ad hoc basis. During World War II, while he worked for the government in Washington and New York, Betty stayed in Madison, teaching math to servicemen. When he took a job teaching in California after the war, they divorced, and it was only then that she was given a position as assistant professor. As time progressed, attitudes about women working and their employment prospects did change. As women gained experience in the labor force, they increasingly saw that they could balance work and family. A new model of the two-income family emerged. Some women began to attend college and graduate school with the expectation of working, whether or not they planned to marry and have families, as did Rita Schorr-Germain, an immigrant who survived Auschwitz, graduated from Pembroke in 1953, and went on to teach European history while her husband also had a successful academic career. In her oral history, Rita says she was encouraged by many Brown professors and never considered the possibility that her gender would stand in the way of an academic career, a shift in outlook that was becoming increasingly common in the 1950s. As did most women's colleges at the time, Pembroke continued to produce nurses, schoolteachers, and social workers, and many women who worked only until they married and had children. But, from the late 1950s on, it also increasingly graduated writers, doctors, lawyers, diplomats, physicians, psychotherapists, and archeologists, and, in 1959, the first female faculty member of Brown University. Among those women fortunate to attend Pembroke in this era of dramatic change was me. I enrolled at Brown fully planning to attend graduate school and have a career, as did many of my classmates in the Class of 1967. By the 1970s, a dramatic change in women's work lives was under way. In the period after World War II, many women had not expected that they would spend as much of their adult lives working as turned out to be the case.
By contrast, in the 1970s young women more commonly expected that they would spend a substantial portion of their lives in the labor force, and they prepared for it, increasing their educational attainment and taking courses and college majors that better equipped them for careers as opposed to just jobs. In surveys of young people about their expectations of their futures, young women during this era increasingly placed an emphasis on career success. Graber Slusky of the Class of 1971 said in her oral history that she chose Pembroke for Brown's excellence in chemistry and physics, because she was already planning the career she went on to have as a researcher. Perhaps unsurprisingly, this is also the period in which many all-male colleges admitted women or combined their women's and men's undergraduate schools, as Brown did when it merged Pembroke and Brown College in 1971. These changes in attitudes and expectations were supported by other changes under way in society. Workplace protections were enhanced through the passage of the Pregnancy Discrimination Act in 1978 and the recognition of sexual harassment in the workplace. Access to birth control increased, which allowed married couples greater control over the size of their families and young women the ability to delay marriage and to plan children around their educational and work choices. And in 1974, women gained, for the first time, the right to apply for credit in their own name without a male co-signer. By the early 1990s, the labor force participation rate of prime working-age women--those between the ages of 25 and 54--reached just over 74 percent, compared with roughly 93 percent for prime working-age men. By then, the share of women going into the traditional fields of teaching, nursing, social work, and clerical work declined, and more women were becoming doctors, lawyers, managers, and, yes, professors. As women increased their education and joined industries and occupations formerly dominated by men, the gap in earnings between women and men began to close significantly. Looking back, the story of the past 125 years is one of slow but steady progress toward women's full participation in the economy and the fulfillment of their career goals. Unfortunately, the success of women has often been seen as coming at the expense of men. Indeed, regularly in the late 19th and 20th centuries there were calls to protect men from women's entry into the labor force. The early female graduates of Brown faced such attitudes from fellow students and even from faculty. Ruth Pederson, a member of the Class of 1919, said some professors did not want to teach women and prohibited women from taking their classes. Margery Leonard remembered one Boston University professor who urged her to drop out of law school. When she refused, this professor punished her by forcing her to recite the details of rape and seduction cases before her jeering, stomping classmates. And it wasn't only men who had this attitude. Among the women who were fighting for better labor standards early in the 20th century, many were heavily influenced by elite cultural standards that viewed a woman's place as in the home and argued that men should be paid a "family wage" that would allow them to support their family singlehandedly--a standard that many working-class families could not afford. Moreover, many of the labor protections promoted to protect women were often based on theories about women's weaker nature, and these protections served to circumscribe their work. 
During the Great Depression, limiting women's role in the workforce was considered a way to address the high rates of unemployment, although the experience of those years showed the importance of women in supporting their families financially. Similarly, women who had successfully worked during World War II, either as part of the war effort or to support their families while their husbands were fighting, often were pushed out of their jobs to make room for returning soldiers. After the war and then single, my relative, Betty Stafford, remained an assistant professor at the University of Wisconsin despite an enviable body of research. She married another Wisconsin professor, an eminent chemist, and collaborated with him on his research. But, in 1954, she gave up her assistant professorship, she said, to be able to accompany her husband on his frequent international travels. Betty later moved with her husband to California, and after his death, she endowed a graduate fellowship in the sciences and a prize in theoretical chemistry. Although Betty's accomplishments were considerable, against the backdrop of increasing opportunity for women over her lifetime I believe that Betty Stafford Hirschfelder was denied opportunities and greater success simply because she was a woman. Despite the fears of some that women entering into the workforce would crowd out men, the evidence shows that the rise in women's participation has contributed to widespread improvements in the safety and productivity of our workplaces, to the health of families, and to the macroeconomic success that our country has enjoyed over the past 125 years. In the first decades of the 20th century, the struggle to improve the working conditions of young women drawn into factories was a pillar of the overall movement toward improved labor standards. Women's demands for safer factories, humane workweeks, and higher pay, which were often pursued through organizing and striking, contributed substantially to the social upheaval and public debate of that period that eventually led to the passage of stronger labor standards. These efforts also produced generations of women who went on to be leaders in the broader labor movement and in the broader movements for equality. The rise in female labor force participation was an early focus of empirical researchers and helped establish the fields of statistics and labor economics in their modern incarnations. Carroll Wright, the first commissioner of what is now known as the Bureau of Labor Statistics and who established the high standards for data collection and analysis for which the bureau is known, devoted his agency's fourth annual report, for the year 1888, to working women in large cities. Moreover, the issues surrounding women's work, such as the minimum wage, pay equity, and maximum workweeks, were topics of great interest to early practitioners of labor economics. It is often said that we should welcome women's presence in the workplace because it allows us to capitalize on the talents of our entire population, and this is certainly true. But it is also good business. A number of studies on how groups perform indicate that workforces that vary on dimensions such as gender, race, and ethnicity produce better decisionmaking processes and better outcomes. Evidence also suggests that women's work has positive spillovers to their family lives and to the success of their children, which in turn benefits all of society.
It is a well-established finding in the literature on development that maternal education and work are positively associated with better health and educational outcomes for children. A recent meta-study also suggests that children in the United States with working mothers do as well if not better in school, both academically and behaviorally, than children with mothers that stay home full time. This effect is particularly strong for families that have fewer social and economic resources, including single-parent families. As time goes on, girls with working mothers are more likely to be employed and hold supervisory positions, and they earn somewhat more. In addition, sons raised in families with working mothers assume greater childcare responsibilities as adults than sons whose mothers did not work. This is not to say that children do not need attention from both parents to develop into academically successful and socially well-adjusted adults--they certainly do. Also, as I will discuss, women are making choices that reflect their desire to balance work and family. These findings bear on the question of how best to support women's work through public policies aimed at helping women and men better manage work and family. From a macroeconomic perspective, women's incorporation into the economy contributed importantly to the rapid rise in economic output and well-being over the 20th century. Between 1948 and 1990, the rise in female participation contributed about 1/2 percentage point per year to the potential growth rate of real gross domestic product. And this estimate does not take into account the effect of the increases in women's education and work experience that also occurred over that period and boosted their productivity. In addition, since 1979, women have accounted for a majority of the rise in real household income. In dollar terms, the gains were greatest for households in the top third of the earnings distribution, but without the increase in women's earnings, families in the bottom and middle thirds of the distribution would have experienced declines. I have argued thus far that we, as a country, have reaped great benefits from the increasing role that women have played in the economy. But evidence suggests that barriers to women's continued progress remain. The participation rate for prime working-age women peaked in the late 1990s and currently stands at about 75 percent. Of course, women, particularly those with lower levels of education, have been affected by the same economic forces that have been pushing down participation among men, including technical change and globalization. However, women's participation plateaued at a level well below that of prime working-age men, which stands at over 88 percent. While some married women choose not to work, the size of this disparity should lead us to examine the extent to which structural problems, such as a lack of equal opportunity and challenges to combining work and family, are holding back women's advancement. As I mentioned earlier, the gap in earnings between men and women has narrowed substantially, but progress has slowed lately, and women working full time still earn about 17 percent less than men, on average, each week. Even when we compare men and women in the same or similar occupations who appear nearly identical in background and experience, a gap of about 10 percent typically remains.
As such, we cannot rule out that gender-related impediments hold back women, including outright discrimination, attitudes that reduce women's success in the workplace, and an absence of mentors. Recent research has shown that although women now enter professional schools in numbers nearly equal to men, they are still substantially less likely to reach the highest echelons of their professions. For instance, 47 percent of students at top-50 law schools are female, and women obtain 40 percent of M.B.A.'s from top programs. Nonetheless, women are still poorly represented among corporate CEOs, as partners in top law firms, and as executives in finance. Even in my own field of economics, women constitute only about one-third of Ph.D. recipients, a number that has barely budged in two decades. This lack of success in climbing the professional ladder would seem to explain why the wage gap actually remains largest for those at the top of the earnings distribution. One of the primary factors contributing to the failure of these highly skilled women to reach the tops of their professions and earn equal pay is that top jobs in fields such as law and business require longer workweeks and penalize taking time off. This would have a disproportionately large effect on women, who continue to bear the lion's share of domestic and child-rearing responsibilities. Within academia, the short timeframe in which assistant professors have to prove themselves good candidates for tenure by publishing typically overlaps with the period in which many women contemplate starting a family, forcing difficult trade-offs. Employers may require the long hours and short absences for good reasons--for instance, the work may involve relationships with clients or accumulating a significant amount of knowledge about a deal or case in a condensed period of time. If it is costly for employees to share information and split the work, then there would be a high premium, in the form of compensation, for those who can work the long hours. Workplaces where the income of employees depends on the effort of co-workers, such as law partnerships, also have an incentive to require long workweeks. But however sensible such arrangements may be from a business perspective, it can be difficult for women to meet the demands in these fields once they have children. The very fact that these types of jobs require such long hours likely discourages some women--as well as men--from pursuing these career tracks. Advances in technology have facilitated greater work-sharing and flexibility in scheduling, and there are further opportunities in this direction. Economic models also suggest that while it can be difficult for any one employer to move to a model with shorter hours, if many firms were to change their model, they and their workers could all be better off. Of course, most women are not employed in fields that require such long hours or that impose such severe penalties for taking time off. But the difficulty of balancing work and family is a widespread problem. In fact, the recent trend in many occupations is to demand complete scheduling flexibility, which can result in too few hours of work for those with family demands and can make it difficult to schedule childcare. Policies that encourage companies to provide some predictability in schedules, cross-train workers to perform different tasks, or require a minimum guaranteed number of hours in exchange for flexibility could improve the lives of workers holding such jobs.
Another problem is that in most states, childcare is affordable for fewer than half of all families. And just 5 percent of workers with wages in the bottom quarter of the wage distribution have jobs that provide them with paid family leave. This circumstance puts many women in the position of having to choose between caring for a sick family member and keeping their jobs. In this context, it is useful to compare the workforce experiences of American women to those in other advanced economies. In 1990, the labor force participation rate in the United States of prime working-age women, 74 percent, was higher than in all but a few industrialized nations. But in the intervening years, while the participation rate of U.S. women was roughly stable, elsewhere it increased steadily, and by 2010 the United States fell to 17th place out of 22 advanced economies with respect to female labor force participation. A number of studies have examined the role of various public policies in explaining patterns in female labor force participation across countries. These studies find that policy differences--in particular, the expansion of paid leave following childbirth, steps to improve the availability and affordability of childcare, and increased availability of part-time work--go a long way toward explaining the divergence between advanced economies. Evidence suggests that if the United States had policies in place such as those employed in many European countries, female labor force participation could be as high as 82 percent. However, these policies entail tradeoffs. Women in other advanced economies are more likely than women in the United States to be employed part time, which could reflect a greater ease in arranging flexible schedules and more time with family, but it also comes with costs, including a wage penalty and fewer opportunities for training and advancement. Such findings raise the question of whether the policies enacted overseas in recent years have had the unintended consequence of making it more expensive for employers there to hire women into full-time jobs with opportunities for advancement, as women are more likely to be eligible for and to make use of such benefits. This possibility should inform our own thinking about policies to make it easier for women and men to combine their family and career aspirations. For instance, improving access to affordable and good quality childcare would appear to fit the bill, as it has been shown to support full-time employment. Recently, there also seems to be some momentum for providing families with paid leave at the time of childbirth. The experience in Europe suggests picking policies that do not narrowly target childbirth, but instead can be used to meet a variety of the health and caregiving responsibilities. The United States faces a number of longer-term economic challenges, including the aging of the population and the low growth rate of productivity. One recent study estimates that increasing the female participation rate to that of men would raise our gross domestic product by 5 percent. And, as I have argued, our workplaces and families, as well as women themselves, would benefit from continued progress. However, a number of factors, which I have only had a chance to touch upon, appear to be holding women back, including the difficulty women currently have in trying to combine their careers with other aspects of their lives, including caregiving. 
In looking to solutions, we should consider improvements to work environments and policies that benefit not only women, but all workers. Pursuing such a strategy would be in keeping with the story of the rise in women's involvement in the workforce, which, as I have described here, has contributed not only to their own well-being but more broadly to the welfare and prosperity of our country. It is fitting to recall the words of Malala Yousafzai, the advocate for girls' and women's education, who said, "We cannot all succeed when half of us are held back." Brown University has played its own role by admitting women 125 years ago, by educating many thousands of women over the decades, and by continuing to be a place that equips men and women with the means to make our nation and the world a better place.
r170522a_FOMC
united states
2017-05-22T00:00:00
Why Opportunity and Inclusion Matter to America’s Economic Strength
brainard
0
I want to thank Neel Kashkari for launching the Opportunity and Inclusive Growth Institute and for inviting me to join the deliberations of this distinguished group today. This new Institute is another great example of how individual Reserve Banks are taking the initiative in illuminating key dimensions of our work and shaping the agenda on these issues. While it has long been understood that opportunity is central to the strength of America's social fabric, it is now increasingly clear that opportunity and inclusion are central to the strength of America's economy. I will touch on the key ways that opportunity and inclusion matter for policymaking at the Federal Reserve, ranging from our dual-mandate goal of maximum employment to our monitoring of household financial health to our engagement in low- and moderate-income communities all over the country. I will focus on how our work intersects with the groundbreaking work of the accomplished group of researchers assembled here. In the original design of the Federal Reserve, it was recognized that the American economy is not monolithic; that is why the Congress created our system of 12 Federal Reserve Districts. We are present in communities all across America through our Reserve Banks and Branches and their boards and advisory councils. This local presence, by design, gives us valuable perspectives on how Americans are experiencing the economy in different communities around the country and critical insights about the varied challenges that lie beneath the aggregate numbers. In turn, our local engagement helps stakeholders in these communities partner to improve opportunity and inclusive growth. Inclusion is an enduring goal of public policy that is embodied in our maximum-employment mandate. The Employment Act of 1946 charges the federal government with creating "conditions under which there will be afforded useful employment for those able, willing, and seeking to work, and to promote maximum employment, production, and purchasing power." Maximum employment is inherently an inclusive goal. In 1977, the Congress amended the Federal Reserve Act to make achieving maximum employment an explicit objective of monetary policy, along with stable prices. In fulfilling its dual mandate, the Federal Open Market Committee (FOMC) has set a target of 2 percent for inflation but does not have a similarly fixed numerical goal for maximum employment. That is because the level of maximum employment depends on "nonmonetary factors that affect the structure and dynamics of the labor market," which can change in important ways over time. The recognition that maximum employment evolves over time to reflect changes in the economic landscape serves us well. It puts the onus on members of the FOMC to analyze the changing features of the labor market and develop a nuanced understanding of the different margins of slack. This approach to maximum employment has allowed the FOMC to navigate the current recovery in a way that has likely brought more people back into productive employment than might have been the case with a fixed, aggregate unemployment-rate target based on pre-crisis norms, in effect, achieving more inclusive growth. This is especially true at a time when the traditional Phillips curve relationship between unemployment and inflation is extremely flat for reasons we do not fully understand. When we disaggregate the economy-wide labor market statistics, we often find significant and persistent racial disparities.
For many decades, the unemployment rate of African Americans has been nearly double the national unemployment rate, with little indication that the relative difference is narrowing or that it can be fully accounted for by education or sectoral mix; the unemployment rate for Hispanics also has consistently been higher than the national unemployment rate. Similarly, during the Great Recession, the unemployment rates of African Americans and Hispanics rose more sharply and rapidly than for workers as a whole. Even though the unemployment rates of these groups are back around their pre-recession levels, they remain higher than the national average. We can also see persistent disparities by gender, such as the well- known wage premium earned by men relative to women with similar experience and expertise. With its focus on inclusive growth, this Institute could give us important insights on how far the overall economy is from full employment, as well as the barriers that could be limiting the economy's potential, by studying labor market outcomes of men and women of different racial and ethnic backgrounds in more depth. Research on the drivers of disparities in labor market outcomes can also help the Federal Reserve better assess potential tradeoffs in monetary policy. In meetings with community groups, we often hear from advocates who point to the stark discrepancy they see between the economy's aggregate U-3 unemployment rate, which many forecasters estimate to be at or approaching full employment, and the much higher rates of unemployment among the people in their neighborhoods. For instance, Rod Adams, a neighborhood advocate here in Minneapolis, noted the unemployment rate for African Americans locally was still almost 9 percent late last summer and observed that "if the labor market were truly healthy, people in my community would all be able to find full- time jobs at decent wages." While the policy tools available to the Federal Reserve are not well suited to addressing the barriers that contribute to persistent disparities in the labor market outcomes of different groups, understanding these barriers and efforts to address them is vital in assessing maximum employment as well as potential growth. The Federal Reserve's community development work is invaluable in supporting our efforts to understand and improve the labor market experiences of different groups. For instance, during the Great Recession, workforce development organizations in Atlanta found themselves overwhelmed by the sharp rise in unemployment, which highlighted the need for a better connected and stronger network of job training and placement services. I recently spent time with these organizations, along with community members and some of our Atlanta staff, who have been working on the only comprehensive directory for workforce development services. Just as there is a connection between maximum employment and inclusive growth, so, too, there is an important connection between potential output and opportunity. If there are large disparities in opportunity based on geography or race or gender, such that households' enterprise, exertion, and investments are not rewarded commensurately, then families and small businesses will invest less in the future and potential growth will fall short. 
Indeed, one worrisome trend is the decline in the labor force participation of prime-age workers with less education, a trend that has been going on for decades among men and that has more recently begun to be mirrored in the participation rate of women. Understanding this growing detachment from work is important to improving both opportunity and potential growth. In my visits around the country, I have heard from residents and community organizations about the challenging barriers standing between the many workers seeking jobs and the many jobs seeking workers. The local barriers separating jobs from job seekers can be as concrete as the physical isolation created by major traffic arteries or poorly designed transit systems. In a number of Districts, our staff have been actively engaged with businesses, transit authorities, and community groups in efforts around "equitable transit-oriented development" so that public transit systems are designed to enhance access for low- and moderate-income residents. Inclusion and opportunity also figure prominently in our work on financial resilience. While the resilience of the financial system has long been central to Federal Reserve policy, in recent years we have come to more fully appreciate that a resilient financial system rests on the foundations of financially resilient households and businesses. The ability to manage the ups and downs in family income and expenses without hardship and the ability to make sound investments for the future are both crucial to financial health. Yet we see from the latest edition of the Federal Reserve's Survey of Household Economics and Decisionmaking (SHED) that many American households with high school degrees or less report that they are struggling financially. Financial diaries research provides insights into the large amount of time and effort these families with thin financial buffers must devote to managing their volatile cash flows. A seemingly modest mismatch between income and expenses can threaten to send the finances of some families into a downward spiral from which it can be expensive and difficult to recover. The results of the 2016 SHED show that nearly one-fourth of all households are unable to pay their current month's bills in full, nearly one-third would rely on borrowing or selling something to cover a $400 emergency expense, and one in eight would not be able to cover a $400 emergency expense by any means. Over half of households lack savings to cover three months' expenses if they lost their main source of income. This finding corroborates the evidence found in the financial diaries of low- to moderate-income families that show it is all too common for households to have no short-term savings to cover emergencies. According to the Survey of Consumer Finances, on average from 1989 to 2013, about 80 percent of households in the bottom quintile of the income distribution had less than $3,000 adjusted for inflation in liquid assets (cash, checking, or savings accounts). Even among households in the middle quintile of income, about half do not meet this threshold for liquid assets. In addition, the financial crisis demonstrated that household financial imbalances can have important consequences for overall financial stability in extreme circumstances. The rapid and widespread rise in poorly underwritten mortgage debt prior to the Great Recession is widely viewed as a key contributor to the financial crisis. This suggests the potential value of better understanding the specific patterns in household finances that would give an early warning of a crisis.
In carrying out our responsibilities to monitor and safeguard the stability of the financial system, although much of the work has focused on marketwide risks, core financial institutions, and macro-level shocks, we are also developing a more granular understanding of the distribution and strength of household balance sheets. Progress on this frontier is being aided by greater access to timely, account-level, and geographically specific data on consumer credit, mortgages, and spending, although more research in this area would be valuable. Slower income growth, as well as substantial volatility in income, has raised the financial stress faced by low- and moderate-income families and may be limiting absolute mobility across generations. Over time, the "American Dream" that each generation can expect to be better off than their parents' generation has gone from being widespread to increasingly out of reach for much of the population. Researchers have found that the reduction in economic mobility has been driven primarily by a more unequal distribution of economic growth, with slower overall gross domestic product (GDP) growth a secondary factor. Many households had been contending with volatile incomes even before the large negative shocks of the Great Recession and the increase in contingent work arrangements. Unpredictable income and dangerously low emergency savings raise the strain on households and, over time, have pushed them to rely on other means, such as borrowing and government transfers, to try to meet their spending needs. Education and homeownership have long been key paths to opportunity, but the Great Recession has raised some important questions about asset building strategies. The sharp decline in house prices and the substantial rise in student loan debt have made it clear that investments in homeownership and education are not without risk, and the payoff can vary depending on the circumstances. Homeownership for many has been a way to turn a regular expense into an asset-building investment in the future, which is especially important given the wide and persistent disparities in wealth by race and ethnicity. But the experience of the past decade suggests that owning a home can, in some circumstances, exacerbate financial difficulties for vulnerable families in a downturn. The lesson that even a moderate decline in house prices can erase home equity applies broadly, along with the importance of sound underwriting and servicing, but the painful consequences in the recession were greater among minority and low-income homeowners. The fact, discussed earlier, that African American and Hispanic homeowners are more likely to lose their jobs in a recession and are also more likely to live in neighborhoods with concentrated job loss led to even larger house price declines and more foreclosures among these households. Indeed, there are many low-income neighborhoods in which many homeowners remain "underwater" on their mortgages even today. Community development organizations are putting this more nuanced view of asset building into practice and thereby increasing opportunities for individuals to make smart investments in their future. Better Family Life, a community group I visited in North St. Louis, provides would-be homebuyers with education and counseling on how to manage the costs of homeownership, tools to navigate real estate markets, and information on lending.
There is ample research demonstrating that housing counseling makes a notable improvement in the likelihood that asset building through homeownership will pay off for first-time buyers in low- to moderate-income communities. Similarly, under the right circumstances, education can be a critical investment in the future and a path to opportunity, leading to higher wages and improved financial outcomes. Over the past several decades, the earnings premium for those with a college degree relative to those with a high school education has risen substantially, making higher education, on average, even more valuable. Nonetheless, even though education is a sound investment for most students, the benefits can vary with the quality and type of education received. The SHED finds that fewer than 40 percent of nongraduates or graduates from for-profit institutions say their education was "worth the cost," compared with two-thirds of graduates from public or nonprofit institutions. The downsides from such low-return education are compounded for those who took out student loans, in some cases leaving them worse off than before. As an indication of this problem, nearly three-fourths of recent borrowers who attended for-profit schools failed to make progress on paying off their student loans in the first few years, and almost half were in default within five years. Investments in education that do not pay off can set these individuals back on asset building as well as on other life goals they may have. To advance more inclusive growth and opportunity, it is essential to help people, especially first-time and nontraditional college students, access smarter educational investments with more reliable and better returns. The connection between the conditions in a community and individual opportunity has been demonstrated in powerful research that many of you have pioneered, and we see this connection every day in our work in communities around the country. The neighborhood where a family lives can have profound implications for their economic opportunities and their children's prospects. Families living in neighborhoods with high concentrations of poverty and low economic or demographic diversity are more likely to experience a range of negative outcomes, including exposure to crime and violence, physical and mental health problems, and weak academic performance. Skilled workers who live far from potential employers or accessible transportation networks have more difficulty finding and keeping jobs. These effects of geography on opportunity can stretch from one generation to the next. Raj Chetty and his collaborators have shown that upward mobility varies immensely across the country and even within a single metro area. Taken together, this research underscores the urgency of understanding how we can make communities work better for all their members. Since communities play a central role in determining opportunity, policy to promote inclusion often focuses on improving local conditions. With our presence in communities around the country and our efforts under the Community Reinvestment Act, the Federal Reserve is a source of high-quality research and region-specific expertise as well as a trusted convener and catalyst on community development approaches for lenders, community groups, and local and regional governments. One important area of focus is housing, which connects families concretely to place and can be a source of strength or fragility. 
Last year, I met with Milwaukee community development groups and residents in one of the more racially segregated residential markets in the country. They highlighted the challenges facing the highly insecure rental population in Milwaukee, which were brought alive by Matthew Desmond's careful research. Other communities across the nation face similar challenges. In the recently released SHED, we found that among renters who had recently moved, 12 percent of African Americans, 16 percent of Hispanics, and 8 percent of whites had moved because of eviction or the threat of eviction. The barriers to safe and affordable housing often take on a different form in rural areas, where ownership of manufactured housing is often coupled with insecure land ownership. The geographic footprint of the 12 Federal Reserve Districts gives us a valuable presence in rural America as well as in towns and cities of all sizes and economic fortunes. Near El Paso, our team has developed important analysis of housing challenges in the colonias neighborhoods, where the lack of basic infrastructure and costly financing of warranty deeds pose special hurdles for local families. We have also seen successful models of providing affordable and safe housing when community development financial institutions, along with banks and local residents, work together. On a recent visit in El Paso, I saw the value of these approaches, as a single mother with significant health challenges received the keys to a new home in a stable community, after many long years. While the densely wooded hills and hollers of Eastern Kentucky are a sharp contrast to the desert and floodplain expanses of the Southwest, the keys to affordable housing in a healthy community can bring just as great an improvement in opportunity. These successes would not be possible without the ingenuity and collaboration of community development financial institutions, local officials, banks, and community members. As I have witnessed, whether it be for a retiree in Helena, Arkansas; a single mom in El Paso, Texas; or a dad on disability in Emlyn, Kentucky, the keys to affordable housing in a stable community can unlock opportunity for future generations. In some parts of the country, rural residents and small businesses also face increasing challenges in accessing financial services as small community banks close and larger banks close branches in low-population areas. Consequently, as I learned from the mayors of Itta Bena and Moorhead, Mississippi, some rural residents, small businesses, and even municipalities have to drive long distances to reach a bank. Community development financial institutions such as Hope Credit Union and Southern Bancorp are acquiring bank branches earmarked for closing in order to maintain financial services for some rural communities. Although both pockets of opportunity and of persistent poverty are found in large metro and rural areas alike, a greater share of the new jobs and business establishments created in the recovery following the Great Recession have been in larger metro areas than was the case in previous recoveries. In countless communities, especially in rural towns and small to midsize cities, we have seen how a deep setback can leave a profound and long-lasting mark. These experiences challenge common assumptions about the ability of the economy to recover from an economic setback. 
This could be the legacy of concentrated reliance on an industry that experiences decline due to trade or technology, or the byproduct of a lack of connectivity, whether by highways or broadband. Technological change, globalization, and other shifts in demand and costs are not new to the U.S. economy, but there are troubling signs that less diversified or more isolated localities have diminished ability to recover. And there is increasing evidence that such concentrated economic shocks can also lead to severe labor market stress, as well as broader consequences for health and mortality. Over the past 30 years, the convergence in income across regions of the country has slowed dramatically. Even so, some localities fare better than others in establishing new paths to opportunity and inclusive growth, and their successes provide actionable lessons. The Boston Fed's Working Cities Challenge undertook an in-depth study of 25 medium-sized cities nationwide that had experienced a post-industrial decline and identified 10 that experienced an economic resurgence. The critical determinant of success was the ability of leaders in those cities to collaborate across sectors around a long-term vision for revitalization. To encourage such collaboration in other cities, the Boston Fed facilitated competitions that reward effective public-private collaboration in developing plans to reach community-wide goals. For example, Holyoke, Massachusetts, proposed a plan to simplify the city's permitting and licensing systems in order to increase the presence of Latino-owned businesses. On economic revitalization, as in other areas of community development, effective solutions start with the community setting its own goals, are powered by broad collaboration, and rely on evidence to drive results. We all have our work cut out for us in helping to understand the state of opportunity and inclusion for different groups and communities across our country, and ensuring that policy is informed by those important insights. At the Federal Reserve, we will continue to navigate the recovery to ensure we reach and sustain our long-term goals of maximum employment and price stability. We will remain attentive to the financial health of vulnerable households. And we will remain committed to helping illuminate the specific challenges faced by low- and moderate-income communities around the country and to supporting banks and other financial institutions as they partner in strengthening these communities. In all of these efforts, our work will be greatly strengthened by the cutting-edge research and policy insights of the outstanding group gathered here tonight.
r170530a_FOMC
united states
2017-05-30T00:00:00
Navigating the Different Signals from Inflation and Unemployment
brainard
0
For the first time in many years, we are seeing signs of synchronized economic expansions at home and abroad, and the balance of risks globally has become more positive. Recent data suggest that the underlying momentum of the domestic expansion remains solid. While U.S. consumption was weak in the first quarter of 2017, the data so far are consistent with a rebound in the current quarter. Moreover, financial conditions remain supportive of continued economic expansion despite some recent volatility. The ongoing progress in bringing Americans back into productive employment is especially heartening. With continued strength in the labor market, economic activity regaining momentum, and a brighter outlook abroad, it would be appropriate soon to see the federal funds rate moving closer to its neutral level. If the economy evolves in line with expectations, normalization of the federal funds rate is likely to be well under way before too long, setting the stage for a gradual and predictable running off of the balance sheet. Even so, I see some tension between signs that the economy is in the neighborhood of full employment and indications that the tentative progress we had seen on inflation may be slowing. If the tension between the progress on employment and the lack of progress on inflation persists, it may lead me to reassess the expected path of the federal funds rate in the future, although it is premature to make that call today. Let me start by reviewing the conflicting readings we are getting from the labor market and from inflation. The labor market has continued to strengthen. Payroll growth has averaged 175,000 over the past three months, more than sufficient to absorb new entrants into the labor market. Although earlier in the recovery it appeared that the U-3 unemployment rate was running ahead of broader indicators of slack, more recently it has been encouraging to see other margins of slack being drawn down. The labor force participation rate has held stable, against what many believe to be a downward trend based on demographics, and the employment-to-population ratio has reached a new post-recession high. Moreover, the share of employees who work part time for economic reasons has recently moved down close to its pre-crisis level, after a long period of remaining at elevated levels. The most commonly used U-3 measure of the unemployment rate moved down to 4.4 percent in April. This happens to be the cyclical low reached in 2006-07, although unemployment was at or below this level much of the time from the middle of 1998 to the middle of 2001. Relative to recent decades, the unemployment rate is now quite low. In fact, some have voiced concerns that the economy has proven unable to sustain its expansion when the unemployment rate has fallen below these levels. With that in mind, it is worth asking whether we should be worried that history will inevitably repeat itself. The truth is, we cannot know for sure. Although rising inflation often sounded the death knell of economic expansions in earlier decades, inflation expectations have been well anchored and rising inflation has presented less of a risk in the most recent business cycles. From 1998 to 2001, for instance, core personal consumption expenditures (PCE) inflation never exceeded 2 percent on a four-quarter basis. Core PCE inflation did reach as high as 2.4 percent in the period from 2006 to 2007, but, at the time, this higher inflation was viewed as reflecting the pass-through of a significant run-up in energy and non-energy import prices. 
Today, there is little indication of an outbreak of inflation--rather, the latest data on inflation have been lower than expected. If anything, the puzzle today is why inflation appears to be slowing at a time when most forecasters place the economy at or near full employment. Even wage inflation, which is most tightly connected to labor market slack, shows little sign of heating up by most measures. Overall, wages are increasing a bit more rapidly than they were a few years ago, but the latest data on wages do not show much progress over the past year. Average hourly earnings rose only 2-1/2 percent in the 12 months through April, the same as a year earlier. Similarly, the employment cost index was up only 2-1/4 percent in the 12 months through March. While that is up from a year earlier, it is lower than two years ago. The Atlanta Fed's Wage Growth Tracker tells a similar story: Upward movement in wage gains was observed until about a year or so ago, but there has been little acceleration recently. Turning to overall inflation, earlier this year, reports indicated that the Federal Reserve's preferred gauge--the price index for personal consumption expenditures, a measure of consumer price inflation on a national accounts basis--had, on a 12-month change basis, risen close to the FOMC's objective, but the latest figures have edged down somewhat as the rebound in energy prices has abated. I tend to place greater weight on the core measure of inflation, which abstracts from the transitory movements in energy prices and is a better predictor of future inflation. In the April report, the core measure--that is, excluding food and energy prices--had increased only 1.5 percent on a 12-month change basis. That reading marks a considerable shortfall from the Committee's 2 percent objective. And there does not seem to have been any progress over the past year or so: Core PCE inflation is about the same over the past 12 months as over the preceding period. Although the past two monthly readings of core inflation have been held down in part by idiosyncratic factors, including upgrades to cell-phone plans, the apparent lack of progress in moving core inflation back to 2 percent is a source of concern. Traditionally, economists assessed that as labor market slack diminished and the economy approached full employment, upward pressure on inflation would result, as captured in the statistical relationship known as the Phillips curve. But I am not confident we can count on the Phillips curve to restore inflation to target in today's economy. Since 2012, inflation has tended to change relatively little--both absolutely and relative to earlier decades--as the unemployment rate has fallen considerably. At a time when the unemployment rate has fallen from 8.2 percent to 4.4 percent, core inflation has undershot our 2 percent target for 58 straight months. In other words, the Phillips curve appears to be flatter today than it was previously. This is also true in a number of advanced foreign economies, where declines in unemployment rates to low levels have failed to generate significant upward pressures on inflation. With the Phillips curve appearing to be a less reliable guidepost than it has been in the past, the anchoring role of inflation expectations remains critically important. Here, recent developments are mixed. The May reading of the University of Michigan Surveys of Consumers' measure of longer-term inflation expectations remained near its all-time low, while the New York Fed's measure of three-year inflation expectations edged up in its latest reading to the highest level in more than a year. 
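To put the flatter Phillips curve described above in rough quantitative terms, here is a minimal sketch, not drawn from the speech: it evaluates a simple linear Phillips curve for two assumed slope values, with the expectations anchor and the natural rate of unemployment also assumed for illustration.

```python
# Illustrative only: a stylized linear Phillips curve of the form
#   inflation = expected_inflation + slope * (u_natural - u).
# All parameter values below are assumptions for the example, not estimates.

def phillips_inflation(expected_inflation, slope, u_natural, u):
    """Inflation implied by a simple linear Phillips curve (percent)."""
    return expected_inflation + slope * (u_natural - u)

expected_inflation = 1.8   # percent, assumed anchor for inflation expectations
u_natural = 4.7            # percent, assumed natural rate of unemployment
u = 4.4                    # percent, roughly the U-3 rate cited above

for slope in (0.5, 0.1):   # a "steeper" versus a "flatter" assumed slope
    pi = phillips_inflation(expected_inflation, slope, u_natural, u)
    print(f"slope {slope}: implied inflation {pi:.2f} percent")
```

With the flatter assumed slope, a 0.3 percentage point unemployment gap adds only a few hundredths of a percentage point to inflation, which is one way to read the persistence of below-target core inflation noted above.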
Market-based measures of inflation compensation, although improved relative to their lows in the middle of last year, are still below the levels that prevailed earlier in the recovery. Attaining the Committee's symmetric target for inflation on a sustainable basis is especially important in the current environment, with the neutral real interest rate at historically lower levels, in order to ensure conventional policy has room to respond to unexpected adverse developments. Underlying fundamentals, such as import prices and diminishing slack, should lead inflation to resume moving closer to its goal. Nonetheless, currently I see more signs that progress on inflation is slowing than of a breakout of inflation to the upside, as might be the case with a nonlinearity in the relationship between inflation and unemployment when unemployment is very low. But as noted earlier, a breakout in inflation also was not a primary concern following the past two times the unemployment rate dropped as low as it is now, in 1998 and 2006, when recessions followed within two or three years. One notable feature of both episodes was that they were preceded by sharply elevated financial imbalances. In the late 1990s, equity prices had reached very high levels, according to common measures of stock market valuations. And the period from 2006 to 2007 coincided with a house price bubble, along with extreme leverage at a number of large financial institutions and widespread use of exotic financial products. Broadly speaking, financial conditions today appear to be more balanced: In most markets, house prices seem fairly well aligned with rents. Large banks are much better capitalized than before the crisis and appear to be managing their risk exposures and liquidity much more carefully. While today's equity market valuations appear somewhat elevated, they do not seem to be near the dizzying heights reached in 1999 and 2000. Moreover, for a variety of reasons, importantly including critical financial reforms as well as changes in risk appetite, leverage and maturity transformation are at much lower levels than they were before the crisis. One area that merits ongoing vigilance is corporate indebtedness, which remains at a high level and where investor appetite still seems strong. Another area of concern is auto lending--particularly in the subprime segment--where underwriting appears to have become quite lax last year and, consequently, delinquency rates indicate more borrowers struggling to keep up with their payments. Eight years into the recovery, it is important to recognize that financial conditions can change rapidly and bear special vigilance. Nonetheless, risks to the U.S. financial system do not appear to be flashing red in the way they did in the run-up to previous downturns. It is also possible that the natural rate of unemployment has moved lower or that the unemployment rate still may be overstating the strength of the labor market. While it is encouraging that the share of employees who work part time for economic reasons has continued to move down, there may well be slack remaining along this margin. And another key measure--the prime-age employment-to-population ratio--remains more than 1 percentage point below pre-crisis levels, and further improvement there would be welcome. Looking at economic activity more broadly, although first-quarter gross domestic product (GDP) was soft, the data so far suggest a rebound in the second quarter. 
The weak Q1 reading follows a recurring pattern in recent years, with the first quarter of the year often weaker than subsequent quarters. Moreover, below the top-line number, there were some encouraging signs of strength: Residential construction posted a double-digit increase and contributed 1/2 percentage point to first-quarter GDP growth. Drilling for oil and natural gas is rebounding sharply, and nonresidential construction contributed 3/4 percentage point to first-quarter GDP growth. Business spending on equipment and intangibles, which fell slightly in 2016, rebounded to a 7 percent annualized increase in the first quarter and contributed another 3/4 percentage point to the overall increase. A key reason overall GDP was so weak last quarter was consumer spending, which rose only 0.6 percent at an annual rate. Nonetheless, there are good reasons to think that the first-quarter weakness in consumer spending will not persist. Household incomes should continue rising with the continued strengthening in employment and wages, home prices should be contributing through improved household balance sheets, and consumer sentiment remains upbeat. Recent changes in financial conditions have, overall, been supportive of further gains in the real economy. The S&P 500 index is up almost 8 percent since the start of the year. At the same time, a broad measure of the exchange value of the dollar is down about 4 percent so far this year, which should help boost net exports. After moving up sharply late last year, long-term interest rates have moved down somewhat so far this year. In addition, the balance of risks has shifted over the past two quarters, with a number of downside risks receding and some upside risks emerging. In particular, the latest international economic data have suggested waning downside risks from abroad, while continued labor market strength and the prospect for fiscal stimulus in the United States present a possible upside risk to domestic demand. Importantly, we are seeing synchronized global growth for the first time in many years. Growth forecasts for both advanced and emerging market economies are being marked up, breaking a pattern of repeated downward revisions from 2013 to 2016. Recent political developments significantly enhanced the prospects for policy continuity in the euro area, and there has been continued growth in euro-area employment and economic activity. While Italy continues to face political, economic, and financial risks, recent developments augur well for the resilience of the broader euro area. China's first-quarter growth came in above 7 percent at an annual rate, although there appears to have been some moderation since then, and capital outflows slowed notably. China's economy bears watching in the medium term, especially given financial-sector risks and elevated debt levels. Although Mexico's growth may moderate this year, both the Mexican equity market and the exchange rate have strengthened, along with confidence, following sharp falls late last year. Along with the favorable shift in foreign risks, recent announcements on fiscal policy suggest some upside risk to U.S. aggregate demand. The Administration has proposed deep tax cuts, which, if implemented, could amount to about 2 percentage points of GDP in the first few years according to independent estimates. Most estimates suggest that the supply-side effects of these policies would be fairly small, so, if enacted, the net effect could well be a boost to U.S. 
aggregate demand at a time when the economy could be at full employment. Nonetheless, there is considerable uncertainty about the magnitude and timing of any policy changes. There is also important uncertainty about the deliberations over the debt limit, which are likely to garner increasing attention in the early fall and will factor into my considerations of risks to the outlook. On balance, when assessing economic activity and its likely evolution, it would be reasonable to conclude that further removal of accommodation will likely be appropriate soon. As I noted earlier, the unemployment rate is now at 4.4 percent, and we are seeing improvement in other measures of labor market slack, such as participation and the share of those working part time for economic reasons. There are good reasons to believe that the improvement in real economic activity will continue: Financial conditions remain supportive. Indicators of sentiment remain positive. The balance of risks at home has shifted favorably, downside risks from abroad are lower than they have been in several years, and we are seeing synchronous global growth. The time for a change in balance sheet policy is coming into clearer view as normalization of the federal funds rate approaches the range that can be considered "well under way." If the outlook and the expected federal funds rate path evolve in line with the median projection of FOMC participants reported in the March Summary of Economic Projections (SEP), the federal funds rate will soon be roughly halfway to its expected long-run equilibrium value. I shared my framework for thinking about the change in balance sheet policy in early March, and today I will elaborate on the approach that seems most appropriate to achieving our goals. The criterion that normalization of the federal funds rate be well under way was the one the Committee adopted in its December 2015 decision to continue to reinvest principal payments. In my view, that "well under way" standard has served an important purpose. With asymmetry in the scope for conventional monetary policy to respond to shocks, maintaining reinvestments provided an important benefit by enabling the federal funds rate to rise more quickly than would have been possible with a shrinking balance sheet, and to reach sooner a level from which it can be cut if conditions deteriorate. This approach has ensured that our most proven tool, the federal funds rate, will have reached a level at which it can be cut if needed to buffer adverse shocks, thus helping to guard against the asymmetric risks associated with the effective lower bound. With the federal funds rate projected to reach, later this year, a level roughly midway to the Committee's projection of its long-run value, I would consider it reasonable to assess that this threshold will have been attained before too long. As we shrink the size of our balance sheet, the public's holdings of Treasury securities will rise, and that will tend to boost longer-term interest rates. In particular, most studies conclude that increases in central bank holdings of longer-maturity assets chiefly affect interest rates by reducing the quantity of longer-term securities held by the public and putting downward pressure on the term premium--that is, the difference between the yields on longer-dated assets and the path of expected short-term interest rates over the holding period. By some estimates, the effect is modestly above 90 basis points currently. 
Thus, balance sheet normalization should be associated with higher term premiums, which in turn, other things held equal, should be associated with higher long-term Treasury yields. Most studies find that higher Treasury yields also affect yields and prices of other securities: increasing interest rates faced by private-sector borrowers, making dollar-denominated assets more attractive, which tends to boost the exchange value of the dollar, and making fixed-income assets more attractive relative to stocks, tending to depress share prices. Together, these channels contribute to a tightening in financial conditions. These effects are, of course, in many respects, similar to the effects of increases in short-term interest rates. Thus, away from the zero lower bound, the two tools are, to a large extent, substitutes for one another. As a result, the FOMC will be in the unfamiliar posture of having two tools available for adjusting monetary policy. It is, therefore, important to clarify how they will be used in relation to each other. While, under most circumstances, the two tools are largely substitutes for one another in terms of their effects on the economy, the federal funds rate is the tool with which we have the most experience. And using two tools at once could easily foster confusion. Thus, in my view, predictability, precision, and clarity of communications all argue in favor of focusing policy on the federal funds rate as the single active tool. In this framework, the balance sheet essentially would remain subordinate to the federal funds rate. Under the subordinated balance sheet approach, once the change in reinvestment policy is triggered, the balance sheet would essentially be set on autopilot to shrink passively until it reaches a neutral level, expanding in line with the demand for currency thereafter. I favor an approach that would gradually and predictably increase the maximum amount of securities the market will be required to absorb each month, while avoiding spikes. Thus, in an abundance of caution, I prefer to cap monthly redemptions at a pace that gradually increases over a fixed period. In addition, I would be inclined to follow a similar approach in managing the reduction of the holdings of Treasury securities and mortgage-backed securities (MBS), calibrated according to their particular characteristics. The Committee's policy normalization principles have made clear that the Federal Reserve will, in the longer run, "hold no more securities than necessary to implement monetary policy efficiently and effectively." Over time, the gradual reduction in our balance sheet should result in a gradual decline in reserves to a longer-run level that is well below today's level but likely somewhat higher than in the pre-crisis regime. It is difficult to know in advance with any precision how low reserves can be allowed to drop. That minimum level will depend on the structural demand for reserves and the short-term variability in the demand for and supply of reserves. During the process of balance sheet normalization, I favor an approach of monitoring money markets carefully to gauge the appropriate longer-run level of reserves consistent with efficient and effective policy implementation. Finally, while subordination of the balance sheet to the federal funds rate should be our baseline policy, in my view, there may be circumstances when we may need to rely on the balance sheet more actively. 
During the period when the balance sheet is running down, if the economy encounters significant adverse shocks, it may be appropriate to resume the reinvestment of principal payments in order to preserve conventional policy space. In recent quarters, the balance of risks has become more favorable, the global outlook has brightened, and financial conditions have eased on net. With the labor market continuing to strengthen, and GDP growth expected to rebound in the second quarter, it likely will be appropriate soon to adjust the federal funds rate. And if the economy evolves in line with the SEP median path, the federal funds rate will likely approach the point at which normalization can be considered well under way before too long, when it will be appropriate to adjust balance sheet policy. I support an approach that retains the federal funds rate as the primary tool for adjusting monetary policy, sets the balance sheet to shrink in a gradual and predictable way for both Treasury securities and MBS, and avoids spikes in redemptions. While that remains my baseline expectation, I will be watching carefully for any signs that progress toward our inflation objective is slowing. With a low neutral real rate, achieving our symmetric inflation target is more important than ever in order to preserve some room for conventional policy to buffer adverse developments in the economy. If the soft inflation data persist, that would be concerning and, ultimately, could lead me to reassess the appropriate path of policy.
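As a rough, purely illustrative sketch of the kind of gradual, capped redemption schedule favored in the remarks above, the following simulates monthly runoff subject to a cap that steps up over a fixed period before holding steady; the portfolio size, maturity profile, and cap schedule are hypothetical assumptions rather than figures from the speech.

```python
# Hypothetical sketch of capped balance sheet runoff.
# Each month, maturing principal is redeemed only up to a cap that rises
# in steps; any maturities above the cap are reinvested.

def simulate_runoff(portfolio, monthly_maturities, initial_cap, cap_step,
                    step_every, max_cap):
    """Return the portfolio path under a gradually increasing redemption cap.

    All dollar figures are in billions and purely illustrative.
    """
    path = [portfolio]
    cap = initial_cap
    for month, maturing in enumerate(monthly_maturities, start=1):
        redeemed = min(maturing, cap)   # run off only up to the cap
        portfolio -= redeemed           # the excess is reinvested
        path.append(portfolio)
        if month % step_every == 0:     # raise the cap every few months
            cap = min(cap + cap_step, max_cap)
    return path

# Assumed inputs: a $2,500 billion portfolio, $30 billion maturing each month,
# a cap starting at $10 billion and rising by $10 billion each quarter to $50 billion.
maturities = [30] * 24
path = simulate_runoff(2500, maturities, initial_cap=10, cap_step=10,
                       step_every=3, max_cap=50)
print(f"Portfolio after 12 months: ${path[12]:.0f} billion")
print(f"Portfolio after 24 months: ${path[24]:.0f} billion")
```

The point of the cap is visible in the mechanics: when maturities exceed the cap, the excess is reinvested, so redemptions never spike even if a large volume of securities happens to mature in a given month.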
r170601a_FOMC
united states
2017-06-01T00:00:00
Thoughts on the Normalization of Monetary Policy
powell
1
Thank you for the opportunity to speak here at the Economic Club of New York. Today I will discuss the ongoing progress of our economy and the prospects for returning both the federal funds rate and the size of the Fed's balance sheet to more normal levels. As always, the views I express here are mine and not necessarily those of the Federal Reserve System. The Federal Reserve is committed to fulfilling our statutory mandate of stable prices and maximum employment. To begin with the labor market, many indicators suggest that the economy is close to full employment. In April, the unemployment rate was 4.4 percent, a level not reached since May 2007 and below most current estimates of the natural rate of unemployment (figure 1). Estimates of the natural rate are inherently uncertain, but other labor market measures are also near their pre-crisis levels, including a broader measure of labor market underutilization that includes those who would like to work but have not recently looked for a job and those working part time who want full-time work. The labor force participation rate, which had declined sharply after the crisis, has now been roughly stable for 3-1/2 years, which represents an improvement against its estimated downward trend (figure 2). Participation is now close to estimates of its trend level. Wage data have gradually moved up, consistent with a tightening labor market. Although average hourly earnings are rising only about 2.5 percent per year, slower than before the crisis, much of that downshift may reflect the slowdown in productivity growth we have experienced. For example, over the past three years, unit labor costs--that is, nominal wages adjusted for increases in productivity--have been generally rising a bit faster than prices. Turning to inflation, the FOMC interprets price stability to mean inflation of 2 percent as measured by the price index for personal consumption expenditures (PCE). This objective is symmetric, so the Committee would be concerned if inflation were to run persistently above or below this target. Inflation has run below 2 percent for most of the period since the financial crisis, reflecting generally soft economic conditions as well as transitory factors such as the earlier declines in energy prices. But over the past two years, inflation has moved gradually closer to our objective. Prices rose 1.6 percent over the 12 months ending in April, compared with only 0.2 percent two years earlier (figure 3). But much of that movement reflects price changes in the often-volatile energy and food components of the index. Core inflation, which excludes food and energy prices, has proven historically to be a better indicator of where overall inflation is heading, although it, too, can be affected by transitory factors such as import prices. Core inflation was 1.5 percent for the 12 months through April. This measure has also risen since 2015, although its gradual increase appears to have paused because of weak inflation readings for March and April. Some of the recent weakness can be explained by transitory factors. And there are good reasons to expect that inflation will resume its gradual rise. Incoming spending data have been relatively strong, and the labor market should continue to tighten, exerting some upward pressure on wages and prices. Nonetheless, it is important that the Committee assess incoming inflation data carefully and continue to demonstrate a strong commitment to achieving our symmetric 2 percent objective. 
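The unit labor cost point above rests on a simple approximation--unit labor cost growth is roughly nominal wage growth minus productivity growth--which the snippet below works through; the productivity figure is an assumed value for illustration and is not quoted in the speech.

```python
# Back-of-the-envelope unit labor cost (ULC) arithmetic:
#   ULC growth ~= nominal wage growth - labor productivity growth.
# The productivity number is an assumption for illustration.

wage_growth = 2.5          # percent per year, average hourly earnings cited above
productivity_growth = 0.5  # percent per year, assumed slow productivity growth
inflation = 1.6            # percent, 12-month headline PCE cited above

ulc_growth = wage_growth - productivity_growth
print(f"Approximate unit labor cost growth: {ulc_growth:.1f} percent")
print(f"Running a bit faster than prices: {ulc_growth > inflation}")
```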
Despite strong job gains, very weak productivity gains have led to disappointingly slow economic growth of only about 2 percent over the course of this expansion (figure 4). While monetary policy can contribute to growth by supporting a durable expansion in a context of price stability, it cannot reliably affect the long-run sustainable level of the economy's growth. The success of monetary policy should be judged by the economy's performance against our statutory mandates of price stability and maximum employment. Today, the economy is as close to our assigned goals as it has been for many years. My baseline expectation is that the economy will continue on a path of growth of about 2 percent, strong job creation and tightening labor markets, and inflation moving up toward our 2 percent target. I expect that unemployment will decline a bit further and remain at low levels for some time, which could draw more workers into the workforce, put upward pressure on wages, or cause businesses to invest more as labor costs rise, all of which I would view as desirable outcomes. Risks to the forecast now seem more balanced than they have been for some time. In particular, the global picture has brightened as growth and inflation have broadly moved up for the first time in several years. Here at home, risks seem both moderate and balanced, including the downside risk of lower inflation and the upside risk of labor market overheating. The healthy state of our economy and favorable outlook suggest that the FOMC should continue the process of normalizing monetary policy. The Committee has been patient in raising rates, and that patience has paid dividends. While the recent performance of the labor market might warrant a faster pace of tightening, inflation has been below target for five years and has moved up only slowly toward 2 percent, which argues for continued patience, especially if that progress slows or stalls. If the economy performs about as expected, I would view it as appropriate to continue to gradually raise rates. I would also see it as appropriate to begin the process of reducing the size of the balance sheet later this year. Of course, both decisions will depend on the performance of the economy. To put this process in context, consider where we have come from. Ten years ago, in the summer of 2007, we were just entering the most painful economic crisis since the Great Depression. The crisis and its aftermath prompted large-scale policy interventions by the Federal Reserve and other authorities to avert the collapse of the financial system and prevent the economy from spiraling into depression. Most of the Federal Reserve's targeted financial measures--such as liquidity facilities to ensure the flow of credit to households and businesses--were withdrawn soon after the crisis as orderly conditions resumed in financial markets. In contrast, the FOMC's easing of monetary policy increased over time as the longer-term economic effects of the crisis gradually became clear. From 2007 through 2013, the FOMC added ever greater support for the economy. From late 2008, with rates pinned at the zero lower bound, the Committee resorted to unconventional policies to put additional downward pressure on long-term rates, including strong calendar-based forward guidance regarding the likely future path of the federal funds rate, and several rounds of large-scale asset purchases (often referred to as quantitative easing (QE)). 
Both the federal funds rate and the balance sheet are currently set at levels intended to provide significant support to economic activity. Normalization of the stance of monetary policy will return both tools to a more neutral setting over time. That process can be said to have begun in 2014, when the FOMC ended its asset purchases and began active discussions on lifting the federal funds rate from its lower bound. Our first rate hike came in December 2015, with another in December 2016, and one additional increase so far this year. The normalization process is projected to have several years left to run. In the case of the federal funds rate, the endpoint of that process will occur when our target reaches the long-run neutral rate of interest. Estimates of that rate are subject to significant uncertainty. The median estimate of its level by FOMC participants in March was 3 percent, more than a full percentage point below pre-crisis estimates. This decline in the long-run neutral rate, and an even larger decline in the short-run neutral rate, imply that even the very low rates of recent years may be providing less support to the economy than may appear. At present, the median FOMC participant estimates that we will reach a long-run neutral level by year-end 2019 if the economy performs about as expected. In September 2014, the FOMC outlined its plans for the balance sheet. That initial guidance has been supplemented over time in other FOMC communications, most recently in the minutes of the May meeting. Here is a summary of the key points: Normalization of the balance sheet will commence only after the normalization of the level of the federal funds rate is well under way. FOMC participants think that this condition will be satisfied later this year if the economy continues broadly on its current path. The balance sheet will be allowed to shrink passively as our holdings of Treasury and agency securities mature (or prepay) and roll off. The process will be gradual and predictable. As noted in the May minutes, although no decisions have been made, the Committee has discussed preannouncing a schedule of gradually increasing caps on the dollar value of securities that would be allowed to run off in a given month. The Committee will continue to use the federal funds rate as its principal tool for adjusting the stance of monetary policy. Once the process of balance sheet normalization has begun, it should continue as planned as long as there is no material deterioration in the economic outlook. In the long run, the balance sheet should be no larger than it needs to be to allow the Committee to conduct monetary policy under its chosen framework. Taken together, the Committee's communications present the broad outline of our likely approach to normalizing the balance sheet. Although the process of normalizing the size of the balance sheet will be in the background, that process will interact with the Committee's decisions regarding the federal funds rate. As the Fed's balance sheet shrinks, debt held by the public will grow, which in theory should tighten financial conditions by putting upward pressure on long-term rates. Any such tightening could affect the Committee's decisions on the federal funds rate. But how big is this effect likely to be? Model-based approaches to that question estimate changes to financial conditions through increases in the term premium as the balance sheet shrinks. These estimates vary but are generally modest. 
One reason is that, for several years, the FOMC has signaled its intention to normalize the balance sheet as economic conditions allow, and so some of the effects of normalization should already be priced in. A recent research paper by Federal Reserve staff estimated that unconventional policies are holding down term premiums by about 100 basis points, but that these effects should decline to about 85 basis points by the end of 2017 as market participants see the normalization process approaching. The same approach suggests that bringing forward the date of the start of the anticipated phasing out of the Federal Reserve's reinvestments from mid-2018 to the end of 2017 should have raised the term premium by only about 5 basis points. Of course, markets sometimes react quite differently than expected, as the 2013 taper tantrum showed. The market's response to recent changes in expectations for reinvestment policy also suggests that there need not be a major reaction when the Committee begins to phase out reinvestments. Long-term rates did not react strongly to the reinvestment discussions in the minutes for the FOMC's March and May meetings, which led market participants to bring forward their expectations about the starting date for this process by about six months. A recent survey of economists also suggests that a gradual, well-telegraphed reduction in the Fed's balance sheet should have modest effects. These results augur well for an orderly phaseout of reinvestments. If changes to reinvestment policy do tighten financial conditions more than anticipated, then I expect that the FOMC would take that into account. Over the next few years, the runoff of assets acquired through QE is expected to reduce the balance sheet substantially below its current level of $4.5 trillion. In the long run, the ultimate size of the balance sheet will depend mainly on the demand for Federal Reserve liabilities--currency, reserves, and other liabilities--and on the Committee's long-run framework for setting interest rates. The next slide compares the Fed's balance sheet of May 2007 with that of May 2017 (figure 8). On the asset side, the balance sheet increased by about $3.5 trillion as the FOMC acquired securities in its QE programs. These assets were matched on the liability side of the Federal Reserve's balance sheet by a $2.2 trillion increase in reserve balances held by commercial banks, a $700 billion increase in currency outstanding, and a $500 billion increase in other liabilities. As can be seen more clearly in the next slide, prior to the crisis, currency was the Fed's main liability (figure 9). Currency outstanding has nearly doubled over the past 10 years to $1.5 trillion, growing at a compound annual rate of 6.8 percent. This growth reflects strong domestic and international demand for U.S. currency, which is expected to continue. The eventual level of demand for reserves is less certain but is highly likely to exceed pre-crisis levels, when reserve balances averaged only about $15 billion. Reserves are the ultimate "safe asset," and demand for safe assets has increased substantially over time because of long-run trends, including regulatory requirements. Other liabilities include the Treasury General Account, the foreign repurchase agreement (or repo) pool, balances held at the Fed by designated financial market utilities, and other items. 
To gain a sense of the possible long-run size of the balance sheet, the next slide shows simulations under three different assumptions for the ultimate level of reserves. These simulations show that, due to the growth of currency and other liabilities, the balance sheet will remain considerably above its pre-crisis levels even if reserves were to fall back to $100 billion (the black line). At its current growth rate, currency in circulation would reach $2 trillion by 2022 and $2.8 trillion in 2027. Even in the low case in which reserves decline to $100 billion, our balance sheet would be about $2.4 trillion in 2022 and would grow from there in line with currency demand. If the long-run level of reserves is $600 billion in 2022, then the balance sheet would be about $2.9 trillion. The appropriate long-run level of reserves will also depend on the operating framework the Committee chooses. Before the crisis, reserves were scarce, and the Committee used open market operations to control the federal funds rate by managing reserve supply. This process was operationally and resource-intensive for the Desk and its counterparties. As a consequence of QE, however, reserves have been highly abundant and will remain so for some years. To affect financial conditions, the Federal Reserve has therefore used administered rates, including the interest rate paid on excess reserves (IOER) and, more recently, the offering rate of the overnight reverse repurchase agreement facility. This framework is simple to operate and has provided good control over the federal funds rate. In November 2016, when the Committee discussed using a floor system as part of its longer-run framework, I was among those who saw such an approach as "likely to be relatively simple and efficient to administer, relatively straightforward to communicate, and effective in enabling interest rate control across a wide range of circumstances." Some have advocated a return to a framework similar to the pre-2007 system, in which the volume of reserves would likely be far below its present level and the federal funds rate would be managed by frequent open market operations. Such a framework remains a feasible option, although, in my view, it may be less robust over time than a floor system. After a tumultuous decade, the economy is now close to full employment and price stability. The problems that some commentators predicted have not come to pass. Accommodative policy did not generate high inflation or excessive credit growth; rather, it helped restore full employment and return inflation closer to our 2 percent goal. The current discussions of the normalization of monetary policy are a result of that success.
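The compound-growth arithmetic behind the long-run balance sheet discussion above can be sketched roughly as follows. The starting currency stock and growth rate come from the passage; holding "other liabilities" at an assumed constant level is a simplification, so the outputs are only roughly in line with, rather than identical to, the simulation figures cited in the speech.

```python
# Rough sketch of the liability-driven floor on the balance sheet:
# currency compounds at its recent growth rate, reserves are scenario inputs,
# and other liabilities are held at an assumed constant level -- a
# simplification, so results only approximate the simulations in the speech.

def project_balance_sheet(years, currency0=1.5, currency_growth=0.068,
                          other_liabilities=0.3, reserves=0.1):
    """Projected balance sheet size in trillions of dollars."""
    currency = currency0 * (1 + currency_growth) ** years
    return currency + reserves + other_liabilities

print(f"Currency alone in 2022: ~${1.5 * 1.068 ** 5:.1f} trillion")
print(f"Currency alone in 2027: ~${1.5 * 1.068 ** 10:.1f} trillion")

for reserves in (0.1, 0.6):  # the $100 billion and $600 billion scenarios
    size = project_balance_sheet(years=5, reserves=reserves)
    print(f"Reserves of ${reserves * 1000:.0f} billion: "
          f"~${size:.1f} trillion balance sheet in 2022")
```

Under these assumptions the projected balance sheet stays well above its pre-crisis size even in the low-reserves scenario, which is the qualitative point of the simulations described above.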
r170620a_FOMC
united states
2017-06-20T00:00:00
Housing and Financial Stability
fischer
0
It is often said that real estate is at the center of almost every financial crisis. That is not quite accurate, for financial crises can, and do, occur without a real estate crisis. But it is true that there is a strong link between financial crises and difficulties in the real estate sector. In their research about financial crises, Carmen Reinhart and Ken Rogoff document that the six major historical episodes of banking crises in advanced economies since the 1970s were all associated with a housing bust. Plus, the drop in house prices in a bust is often bigger following credit-fueled housing booms, and recessions associated with housing busts are two to three times more severe than other recessions. And, perhaps most significantly, real estate was at the center of the most recent crisis. In addition to its role in financial stability, or instability, housing is also a sector that draws heavy government intervention, even in economies that generally rely on market mechanisms. Coming out of the financial crisis, many jurisdictions are undergoing housing finance reforms and enacting policies to prevent the next crisis. Today I would like to focus on where we now stand on the role of housing and real estate in financial crises, and what we should be doing about that situation. We shall discuss primarily the situation in the United States, and to a much lesser extent, that in other countries. Why are governments involved in housing markets? Housing is a basic human need, and politically important--and rightly so. Using a once-popular term, housing is a merit good--it can be produced by the private sector, but its benefit to society is deemed by many great enough that governments strive to make it widely available. As such, over the course of time, governments have supported homebuilding and in most countries have also encouraged homeownership. Governments are involved in housing in a myriad of ways. One way is through incentives for homeownership. In many countries, including the United States, taxpayers can deduct interest paid on home mortgages, and various initiatives by state and local authorities support lower-income homebuyers. France and Germany created government-subsidized home-purchase savings accounts. And Canada allows early withdrawal from government-provided retirement pension funds for home purchases. And--as we all know--governments are also involved in housing finance. They guarantee credit to consumers through housing agencies such as the U.S. Federal Housing Administration, and in some countries the government also guarantees mortgages on banks' books. And at various points in time, jurisdictions have explicitly or implicitly backstopped various intermediaries critical to the mortgage market. Government intervention in the United States has also addressed the problem of the fundamental illiquidity of mortgages. Going back 100 years, before the Great Depression, the U.S. mortgage system relied on small institutions with local deposit bases and lending markets. In the face of widespread runs at the start of the Great Depression, banks holding large portfolios of illiquid home loans had to close, exacerbating the contraction. In response, the Congress established housing agencies as part of the New Deal to facilitate housing market liquidity by providing a way for banks to mutually insure and sell mortgages. 
In time, the housing agencies, augmented by post-World War II efforts to increase homeownership, grew and became the familiar government-sponsored enterprises, or GSEs. The GSEs purchased mortgages from both bank and nonbank mortgage originators and, in turn, bundled these loans and securitized them; these mortgage-backed securities were then sold to investors. The resulting deep securitized market supported mortgage liquidity and led to broader homeownership. While the benefits to society from homeownership could suggest a case for government involvement in securitization and other measures to expand mortgage credit availability, these benefits are not without costs. A rapid increase in mortgage credit, especially when it is accompanied by a rise in house prices, can threaten the resilience of the financial system. One particularly problematic policy is implicit government guarantees of mortgage-related securities: before the crisis, investors treated the GSEs' debt and mortgage-backed securities (MBS) as having an implicit government guarantee, despite the GSEs' representations to the contrary. Because of the perceived guarantee, investors did not fully internalize the consequence of defaults, and so risk was mispriced in the agency MBS market. This mispricing can be notable and is attributable not only to improved liquidity, but also to the implicit government guarantee. Taken together, the government guarantee and resulting lower mortgage rates likely boosted both mortgage credit extended and the rise in house prices in the run-up to the crisis. Another factor boosting credit availability and house price appreciation before the crisis was extensive securitization. In the United States, securitization through both public and private entities weakened the housing finance system by contributing to lax lending standards, rendering the mid-2000s house price bust more severe. Although the causes are somewhat obscure, it does seem that securitization weakened the link between the mortgage loan and the lender, resulting in risks that were not sufficiently calculated or internalized by institutions along the intermediation chain. For example, even without government involvement, in Spain, securitization grew rapidly in the early 2000s and accounted for about 45 percent of mortgage loans in 2007. Observers suggest that Spain's broad securitization practices led to lax lending standards and financial instability. Yet, as the Irish experience suggests, housing finance systems are vulnerable even if they do not rely on securitization. Although securitization in Ireland amounted to only about 10 percent of outstanding mortgages in 2007, lax lending standards and light regulatory oversight contributed to the housing boom and bust in Ireland. To summarize, murky government guarantees, lax lending terms, and securitization were some of the key factors that made the housing crisis so severe. Since then, to damp the house price-credit cycle that can lead to a housing crisis, countries worldwide have worked to create or expand existing macroprudential policies that would, in principle, limit credit growth and the rate of house price appreciation. Most macroprudential policies focus on borrowers: loan-to-value (LTV) ratio limits cap the size of a loan relative to the value of the home, and debt-to-income (DTI) ratio limits aim to prevent borrowers from taking on excessive debt. The limits can also be adjusted in response to conditions in housing markets; for example, the Financial Policy Committee of the Bank of England has the authority to tighten LTV or DTI limits when threats to financial stability emerge from the U.K. housing market. Stricter LTV or DTI limits have found some measure of success; a stylized example of how such limits apply to an individual borrower is sketched below. 
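The following is a small, purely hypothetical illustration of the borrower-focused limits discussed above: it checks a loan application against an assumed LTV cap and an assumed DTI cap. The house price, loan size, income, and cap values are all assumptions for the example, not figures from the speech.

```python
# Hypothetical illustration of borrower-focused macroprudential limits.
# A mortgage application is checked against an assumed loan-to-value (LTV)
# cap and an assumed debt-to-income (DTI) cap.

def passes_macroprudential_limits(loan, home_value, annual_debt_service,
                                  annual_income, ltv_cap=0.80, dti_cap=0.40):
    """Return (ltv, dti, within_limits) for an application under assumed caps."""
    ltv = loan / home_value
    dti = annual_debt_service / annual_income
    return ltv, dti, (ltv <= ltv_cap and dti <= dti_cap)

# Assumed borrower: $300,000 home, $240,000 loan, $21,000 a year in debt
# payments, $60,000 annual income.
ltv, dti, ok = passes_macroprudential_limits(240_000, 300_000, 21_000, 60_000)
print(f"LTV = {ltv:.0%}, DTI = {dti:.0%}, within limits: {ok}")

# A tighter assumed LTV cap of 75 percent would require a larger down payment.
ltv, dti, ok = passes_macroprudential_limits(240_000, 300_000, 21_000, 60_000,
                                             ltv_cap=0.75)
print(f"Under a 75 percent LTV cap, within limits: {ok}")
```

Tightening the cap does not change the borrower's finances; it simply makes the same loan ineligible, which is the channel through which such limits are meant to slow credit growth.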
One study conducted across 119 countries from 2000 to 2013 suggests that lower LTV limits lead to slower credit growth. In addition, evidence from a range of studies suggests that decreases in the LTV ratio lead to a slowing of the rate of house price appreciation. However, some other research suggests that the effectiveness of LTV limits is limited or only temporary. Other macroprudential policies focus on lenders. First and foremost, tightening bank capital regulation enhances loss-absorbing capacity, strengthening financial system resilience. In addition, bank capital requirements for mortgages that increase when house prices rise may be used to lean against mortgage credit growth and house price appreciation. These policies are intended to make bank mortgage lending more expensive, leading borrowers to reduce their demand for credit, which tends to push house prices down. Estimates of the effects of such changes vary widely: Based on a range of estimates from the literature, an increase of 50 percentage points in the risk weights on mortgages would result in a house price decrease of as little as 0.6 percent or as much as 4.0 percent. These policies are more effective if borrowers are fairly sensitive to a rise in interest rates and if migration of intermediation outside the banking sector to nonbanks is limited. Of course, regulatory reforms--and, in some countries, macroprudential policies--are still being implemented, and analysis is currently under way to monitor the effects. So far, research suggests that macroprudential tightening is associated with slower bank credit growth, slower housing credit growth, and less house price appreciation. Borrower, lender, and securitization-focused macroprudential policies are likely all useful in strengthening financial stability. Even though macroprudential policies reduce the incidence and severity of housing-related crises, such crises may still occur. When house prices drop, households with mortgages may find themselves underwater, with the amount of their loan in excess of their home's current price. As Atif Mian and Amir Sufi have pointed out, this deterioration in household balance sheets can lead to a substantial drop in consumption and employment. Extensive mortgage foreclosures--that is, undertaking the legal process to evict borrowers and repossess the house and then selling the house--as a response to household distress can exacerbate the downturn by imposing substantial dead-weight costs and, as properties are sold, causing house prices to fall further. Modifying loans rather than foreclosing on them, including measures such as reducing the principal balance of a loan or changing the loan terms, can allow borrowers to stay in their homes. In addition, it can substantially reduce the dead-weight costs of foreclosure. Yet in some countries, institutional or legal frictions impeded desired mortgage modifications during the recent crisis. And in many cases, governments stepped in to solve the problem. For example, U.S. mortgage loans that had been securitized into private-label MBS relied on the servicers of the loans to perform the modification. However, operational and legal procedures for servicers to do so were limited, and, as a result, foreclosure, rather than modification, was commonly used in the early stages of the crisis. In 2008, new U.S. government policies were introduced to address the lack of modifications. These policies helped in three ways. 
First, they standardized protocols for modification, which provided servicers of private-label securities some sense of common practice. Second, they provided financial incentives to servicers for modifying loans. Third, they established key criteria for judging whether modifications were sustainable or not, particularly limits on mortgage payments as a percentage of household income. This last policy was designed to ensure that borrowers could actually repay the modified loans, which prompted lenders to agree more readily to the modification policies in the first place. Ireland and Spain also aimed to restructure nonperforming loans. Again, government involvement was necessary to push these initiatives forward. In Ireland, mortgage arrears continued to accumulate until the introduction of the Mortgage Arrears Resolution Targets scheme in 2013, and in Spain, about 10 percent of mortgages were restructured by 2014, following government initiatives to protect vulnerable households. Public initiatives promoting socially desirable mortgage modifications in times of crisis tend to be accompanied by explicit support from public funds, even though government guarantees may be absent in normal times. With the recent crisis fresh in mind, a number of countries have taken steps to strengthen the resilience of their housing finance systems. Many of the most egregious practices that emerged during the lending boom in the United States--such as no- or low-documentation loans or negatively amortizing mortgages--have been severely limited. Other jurisdictions are taking actions as well. Canadian authorities withdrew government insurance backing on non-amortizing lines of credit secured by homes. The United States and the European Union required issuers of securitized products to retain some of the credit risk in them to better align incentives among participants (although in the United States, MBS issued by Fannie and Freddie are currently exempt from this requirement). And post-crisis, many countries are more actively pursuing macroprudential policies, particularly those targeted at the housing sector. Several countries, for example, have introduced tighter LTV limits or guidelines for areas with overheating housing markets. Globally, the introduction of new capital and liquidity regulations has increased the resilience of the banking system. In the United States, the GSEs are now the dominant providers of mortgage funding, and the Federal Home Loan Banks (FHLBs) have expanded their balance sheets notably. House prices are now high and rising in several countries, perhaps as a result of extended periods of low interest rates. What should be done as we move ahead? First, macroprudential policies can help reduce the incidence and severity of housing crises. While some policies focus on the cost of mortgage credit, others attempt directly to restrict households' ability to borrow. Each policy has its own merits, and working out their respective advantages is important. Second, government involvement can promote the social benefits of homeownership, but those benefits come at a cost, both directly, for example through the beneficial tax treatment of homeownership, and indirectly through government assumption of risk. To that end, government support, where present, should be explicit rather than implicit, and the costs should be balanced against the benefits, including greater liquidity in housing finance engendered through a uniform, guaranteed instrument.
Third, a capital regime that takes the possibility of severe stress seriously is important to calm markets and restart the normal process of intermediation should a crisis materialize. A well-capitalized banking system is a necessary condition for stability in bank-based financial systems as well as those with large nonbank sectors. This necessity points to the importance of having resilient banking systems and also of stress testing the system against scenarios with sharp declines in house prices. Fourth, rules and expectations for mortgage modifications and foreclosure should be clear and workable. Past experience suggests that both lenders and borrowers benefit substantially from avoiding costly foreclosures. Housing-sector reforms should consider policies that promote efficient modifications in systemic crises. In the United States, as around the world, much has been done. The core of the financial system is much stronger, the worst lending practices have been curtailed, much progress has been made in processes to reduce unnecessary foreclosures, and the actions associated with the Housing and Economic Recovery Act of 2008 created some improvement over the previous ambiguity surrounding the status of government support for Fannie and Freddie. But there is more to be done, and much improvement to be preserved and built on, for the world as we know it cannot afford another pair of crises of the magnitude of the recent housing and financial crises.
r170623a_FOMC
united states
2017-06-23T00:00:00
Central Clearing and Liquidity
powell
1
Thank you for inviting me to speak today. Darrell Duffie and the other organizers have provided a valuable public service in hosting this annual symposium on central clearing. I will start my remarks by taking stock of the progress made in strengthening central counterparties (or CCPs), and then offer some thoughts on central clearing and liquidity risks. The buildup of risk in over-the-counter derivatives positions contributed to the financial crisis and highlighted the risks in derivatives markets. In response, the Group of Twenty nations committed in 2009 to moving standardized derivatives to central clearing. Central clearing serves to address many of the weaknesses exposed during the crisis by fostering a reduction in risk exposures through multilateral netting and daily margin requirements as well as greater transparency through enhanced reporting requirements. Central clearing also enables a reduction in the potential cost of counterparty default by facilitating the orderly liquidation of a defaulting member's positions, and the sharing of risk among members of the CCP through some mutualization of the costs of such a default. But central clearing will only make the financial system safer if CCPs themselves are run safely. Efforts to set heightened expectations for CCPs and other financial market infrastructures have been ongoing for years, with the regulatory community working collectively to clarify and significantly raise expectations. These efforts resulted in the Principles for Financial Market Infrastructures (or PFMI), which was published in 2012 by the Committee on Payments and Market Infrastructures (CPMI) and the International Organization of Securities Commissions (IOSCO). The PFMI lays out comprehensive expectations for CCPs and other financial market infrastructures. Extensive work has been done to implement the PFMI. The joint CCP work plan agreed to by the Financial Stability Board (FSB), CPMI, IOSCO, and the Basel Committee on Banking Supervision (or BCBS) laid out an ambitious program to provide further guidance on the PFMI and better understand interdependencies between CCPs and their members. CPMI and IOSCO will soon publish more granular guidance on CCP resilience, focusing on governance, stress testing credit and liquidity risk, margin, and recovery planning. While this guidance will not establish new standards beyond those set out in the PFMI, I believe that it will encourage CCPs and their regulators to engage in thoughtful dialogue on how they could further enhance their practices. The FSB has led the work on resolution planning for CCPs, publishing draft guidance this past February. The guidance covers a range of topics, including the powers that resolution authorities need in order to effectively resolve a CCP, the potential incentives related to using various loss-allocation tools in resolution, the application of the "no creditor worse off" safeguard, and the formation of crisis management groups--a key step in facilitating cross-border regulatory coordination in the event of a failure of a systemically important CCP. Ensuring the safety of the system also requires an understanding of the interdependencies between CCPs and their clearing members. Work on this is ongoing. Preliminary analysis confirms that there are large interdependencies between banks and CCPs, including common exposures related to financial resources held to cover market and credit risk, as well as common lending and funding arrangements. Having pushed for the move to greater central clearing, global authorities have a responsibility to ensure that CCPs do not themselves become a point of failure for the system.
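As a rough illustration of the multilateral netting benefit mentioned above, the sketch below (a toy example with made-up positions, not a description of any CCP's actual netting process) collapses a ring of gross bilateral obligations into single net positions against the CCP.

# Illustrative multilateral netting through a CCP (hypothetical positions).
from collections import defaultdict

# (payer, receiver, amount): gross bilateral obligations among three members.
bilateral = [("A", "B", 100), ("B", "C", 90), ("C", "A", 80)]

net = defaultdict(float)
for payer, receiver, amount in bilateral:
    net[payer] -= amount      # payer's obligation to the CCP
    net[receiver] += amount   # receiver's claim on the CCP

gross_total = sum(amount for _, _, amount in bilateral)
net_total = sum(v for v in net.values() if v > 0)
print(dict(net))               # {'A': -20.0, 'B': 10.0, 'C': 10.0}
print(gross_total, net_total)  # 270 versus 20: far smaller settlement obligations

Each member now faces a single, smaller exposure to the CCP rather than a web of bilateral exposures, which is one reason authorities pushed standardized derivatives toward central clearing--and one reason the safety of CCPs themselves now matters so much.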
The progress I have just described is helping to meet this responsibility by making central clearing safer and more robust. Global authorities also have a responsibility to ensure that bank capital standards and other policies do not unnecessarily discourage central clearing. In my view, the calibration of the enhanced supplementary leverage ratio (SLR) for the U.S. global systemically important banks (G-SIBs) should be reconsidered from this perspective. A risk- insensitive leverage ratio can be a useful backstop to risk-based capital requirements. But such a ratio can have perverse incentives if it is the binding capital requirement because it treats relatively safe activities, such as central clearing, as equivalent to the most risky activities. There are several potential approaches to addressing this issue. For example, the BCBS is currently considering a proposal that would set a G-SIB's SLR surcharge at a level that is proportional to the G-SIB's risk-based capital surcharge. Taking this approach in the U.S. context could help to reduce the cost that the largest banks face in offering clearing services to their customers. The Federal Reserve is also considering other steps. First, we are developing an interpretation of our rules in connection with the movement of some centrally cleared derivatives to a "settled-to-market" approach. Under this approach, daily variation margin is treated as a settlement payment rather than as posting collateral. Under our capital rules, this approach reduces the need for a bank to hold capital against these exposures under risk-based and supplementary leverage ratios. Second, we are also working to move from the "current exposure method" of assessing counterparty credit risk on derivative exposures to the standardized approach for counterparty credit risk (SA-CCR). The current exposure method generally treats potential future credit exposures on derivatives as a fixed percentage of the notional amount, which ignores whether a derivative is margined and undervalues netting benefits. SA-CCR is a more risk-sensitive measurement of exposure, which would appropriately recognize the counterparty risks on derivatives, including the lower risks on most centrally cleared derivatives. CCPs are different from most other financial intermediaries in the sense that the CCP stands between two parties to a cleared transaction whose payment obligations exactly offset each other. A CCP faces market or credit risk only in the event that one of its members defaults and its required initial margin or other pre-funded financial resources are insufficient to cover any adverse price swings that occur during the period between the time of default and the time that the CCP is able to liquidate the defaulting party's positions. However, like most other financial intermediaries, CCPs do face liquidity risks. Their business model is based on timely payments and the ability to quickly convert either the underlying assets being cleared or non-cash collateral into cash. For this reason, CCPs should carefully consider liquidity when launching new products and only offer clearing of products that can be sold quickly, even in times of stress. Liquidity problems can occur in central clearing even if all counterparties have the financial resources to meet their obligations, if they are unable to convert those resources into cash quickly enough. The amount of liquidity risk that CCPs face can sometimes dwarf the amount of credit or market risk they face. 
This is particularly true for the clearing of cash securities such as Treasuries. In that case, the securities being cleared are extremely safe and likely to rise in value in times of stress. But in contrast to most cleared derivatives, the cash payments involved are very large because counterparties exchange cash for the delivery of the security on a gross basis. I will look at these risks from two perspectives, first in terms of the payments that CCPs must make, and then in terms of the payments they expect to receive. In the case of a member's default, a CCP must be equipped to make the cash payments owed to non-defaulting counterparties when due. This requirement can be met as long as there is sufficient margin, mutualized resources such as the guarantee fund, or the CCP's own resources held in cash and in the required currency. But if those funds are held in securities, then the CCP will need to convert them to cash, either by entering into a repurchase agreement, or using them as collateral to draw on a line of credit. And if the CCP holds either cash or securities denominated in a currency different from the one in which payment must be made, it will need to engage in either a spot FX transaction or an FX swap. Principle seven of the PFMI addresses these liquidity risks, calling for all CCPs to meet a "Cover 1 standard"--that is, to hold enough liquid resources in all relevant currencies to make payments on time in the event of the default of the clearing member that would generate the largest payment obligation under a wide range of potential stress scenarios. More complex CCPs and those with a systemic presence in multiple jurisdictions are encouraged to meet a "Cover 2 standard." The PFMI also provides guidance on the resources that qualify in meeting these requirements: cash held at a central bank or a creditworthy commercial bank, committed lines of credit, committed repurchase agreements, committed FX swap agreements, or highly marketable collateral that can be converted into cash under prearranged and highly reliable funding arrangements. There are two sets of risks involved here. First, where should CCPs put their cash? As central clearing has expanded, CCPs have had to deal with increasingly large amounts of cash margin. CCPs can deposit some of these funds with commercial banks. But regulatory changes have made it more expensive for banks to take large deposits from other financial firms, and in some cases banks may be unwilling to accept more cash from a CCP. And many of the largest banks are also clearing members, which introduces a certain amount of wrong-way risk. A clearing-member default could be especially fraught if the defaulting bank also held large cash balances for the CCP. For this reason, CCPs may prudently place limits on the amount deposited at a given institution. In order to diversify their holdings, many also place cash in the repo market. If it is available, the ability to deposit cash at a central bank allows for another safe, flexible, and potentially attractive option--a subject I will return to later. Second, how can CCPs be assured that they will be able to convert securities into cash or draw on other resources in times of stress? The PFMI uses the words "committed" and "prearranged" in describing qualifying liquid resources. Indeed, the PFMI does not view spot transactions on the open market as reliable sources of liquidity during times of stress. The Federal Reserve has strongly supported this approach.
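The Cover 1 and Cover 2 standards lend themselves to a simple check, sketched below with made-up numbers (this is an illustration of the idea, not any CCP's actual methodology): rank members by the payment obligation their default would generate under a stress scenario and compare the largest one, or the largest two combined, with the qualifying liquid resources available in that currency.

# Illustrative Cover 1 / Cover 2 liquidity check for one currency and one scenario.
def liquidity_shortfall(member_payment_obligations, qualifying_liquid_resources, cover=1):
    """Return (shortfall, need) against the largest `cover` members' obligations."""
    worst = sorted(member_payment_obligations.values(), reverse=True)[:cover]
    need = sum(worst)
    return max(0.0, need - qualifying_liquid_resources), need

obligations = {"member_A": 4.0, "member_B": 7.5, "member_C": 3.2}  # billions, hypothetical
resources = 8.0  # cash, committed lines, committed repos, and the like

print(liquidity_shortfall(obligations, resources, cover=1))  # (0.0, 7.5): Cover 1 met
print(liquidity_shortfall(obligations, resources, cover=2))  # (3.5, 11.5): Cover 2 shortfall

In practice the exercise runs across many scenarios and currencies, and the composition of the qualifying resources matters as much as the total--which is where the distinction between committed and spot sources of liquidity comes in.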
Liquidity plans should not take for granted that, at a time of stress involving a member default, lines of credit, repurchase agreements, or FX swaps could be arranged on the spot. Committed sources of liquidity are more likely to be available. They also allow market participants and regulators to make sure that plans are mutually consistent. If a CCP has arranged for a committed liquidity source, then the provider should account for it in its own plans and demonstrate that it can meet its commitment. Of course, this liquidity is not free, nor should it be. Regulatory changes have forced banks to closely examine their liquidity planning and to internalize the costs of liquidity provision. The costs of committed liquidity facilities will be passed on to clearing members. These costs are perhaps highest in clearing Treasury securities, where liquidity needs can be especially large. To meet its estimated needs, DTCC's Fixed Income Clearing Corporation (FICC) is planning to institute a committed repo arrangement with its clearing members. Despite initial concerns, the industry seems prepared to absorb these costs, but they will not be trivial for many members. While initial and variation margin help mitigate credit risk in central clearing, they can also create liquidity risk. Clearing members and their clients are required to make margin payments to CCPs on a daily basis, and in times of market volatility these payments may rise dramatically. This source of liquidity risk can occur even in the absence of a default. For example, after the UK referendum on Brexit, the resulting price swings triggered many CCPs to make substantial intraday and end-of-day margin calls. Fortunately, members had prepared and were able to make the needed payments, but the sums involved caught many off guard and the experience served as a useful warning. Five CCPs requested $27 billion in additional margin over the two days following the referendum, about five times the average amount. In most cases, clearing members have an hour to meet intraday margin calls. Clearing members have no choice but to hold enough liquid resources to meet the range of possible margin calls, as the consequences of missing a margin call are considerable. Brexit was only the most recent example in which margin calls were unexpectedly large. Margin calls were also quite large after the stock market crash of October 1987. That episode helps to demonstrate how complicated payments flows can be and why liquidity risk also needs to be viewed from a macroprudential perspective, considering potential risks to flows across the system as a whole. On October 19, 1987, the stock market index fell about 20 percent. Margin calls were about 10 times their normal size, and caused a very complicated set of payment flows across multiple exchanges, CCPs, and banks. The next slide helps to represent the ensuing payment flows. Calls requesting payment are on the left in red. When making a margin call, a CCP requests payment from its clearing members. Clearing members in turn request margin payments from the customers for whom they are clearing, and those customers must then direct their bank to make the payment. If everything works as it should, payments (in green, on the right) will ensue. The customer's bank will deliver the requested payment to the clearing member's bank (or payment bank). The payment bank will then deliver the funds to a settlement bank used by the CCP, and the settlement bank will then credit the funds to the CCP.
In theory, each of these payments would have to happen sequentially, but often parties offer intraday credit (represented by the dashed orange lines) to help smooth these flows. For example, the settlement bank may provide intraday credit to the clearing member, sending funds to the CCP before the member has delivered funds to the payment bank or before those funds have been transferred to the settlement bank. Clearing members or the customer's bank may also provide intraday credit to their customers, again making payments before funds have been received in order to help speed the payments chain. Over the course of October 19, 1987, the system worked largely as I have just described. But on October 20, every single link in these payments and credit chains was interrupted. This is represented graphically in the next slide. By that morning, many settlement banks and clearing members had yet to receive offsetting payments for credit that had been extended the previous day. Goldman Sachs and Kidder, Peabody had together extended $1.5 billion in credit that had not yet been paid. As a result, some firms pulled back on providing further credit, which then forced each link of the payments chain to operate sequentially. Payments slowed, with the unintended consequence that uncovered positions grew larger and stayed open for even longer. Without credit, some customers were unable to meet their margin calls and were forced to liquidate their positions. This gridlock sparked fears that a clearing member or even a CCP would be forced to default. The Federal Reserve reacted to this threat by encouraging banks to continue to extend credit and by injecting funds into the system to help ensure that credit was available. The 1987 stock market crash did not leave much lasting impact on the economy, but if these liquidity problems had been allowed to cause the default of a major clearing member or even a CCP then it could have had a much more serious impact. While this might seem like simply an interesting bit of history, the payments chains involved in central clearing are still very similar today. To guard against the same sorts of liquidity risk today, we need to make sure that every link in these chains will work as it is intended to under stress. While neither Brexit nor the October 1987 crash involved a clearing member default, these incidents do point to the potential complications of such a default. I have already discussed the steps that a CCP might need to take in the event of a default to meet its liquidity needs. Some of those needs, such as committed lines of credit or repo agreements, could involve tapping financial resources at the same banks that are clearing members. Thus, clearing members may need to juggle several different liquidity exposures simultaneously in the event of a default. They may face draws on committed sources of liquidity, and if there are market stresses around the default, which seems a near certainty, they may at the same time face a sudden increase in intraday margin calls and their own internal demands for more liquid resources. These are risks that we should seek to understand better. The Federal Reserve is the primary supervisory authority for two designated financial market utilities (or DFMUs), and plays a secondary role relative to the six other DFMUs. As a central bank, we are particularly concerned with liquidity issues. 
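As a stylized illustration of why intraday credit matters in that chain (a cartoon of the dynamics, with hypothetical names and amounts, not a model of the 1987 episode), the sketch below counts how many settlement rounds it takes one margin call to travel from the customer to the CCP with and without credit along the way.

# Cartoon of a margin-payment chain with and without intraday credit.
CHAIN = ["customer", "customer_bank", "payment_bank", "settlement_bank", "CCP"]

def rounds_to_settle(margin_call, own_funds, intraday_credit):
    """Count rounds until the CCP receives the full margin call.

    own_funds: liquid funds each link holds at the start (hypothetical numbers).
    Without intraday credit, each link forwards only what it has already
    received plus its own funds; with credit, every link pays out immediately.
    """
    received = {name: 0.0 for name in CHAIN}
    rounds = 0
    while received["CCP"] < margin_call:
        rounds += 1
        snapshot = dict(received)  # funds on hand at the start of the round
        for payer, payee in zip(CHAIN, CHAIN[1:]):
            if intraday_credit:
                available = margin_call
            else:
                available = min(margin_call, snapshot[payer] + own_funds.get(payer, 0.0))
            received[payee] = max(received[payee], available)
    return rounds

funds = {"customer": 100.0}  # only the customer starts with cash on hand
print(rounds_to_settle(100.0, funds, intraday_credit=True))   # 1 round: legs settle in parallel
print(rounds_to_settle(100.0, funds, intraday_credit=False))  # 4 rounds: one leg at a time

When intermediaries stop extending credit, as they did on October 20, 1987, every leg has to wait for the one before it, payments slow, and uncovered positions stay open longer--exactly the gridlock described above.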
I will address four policy issues that need careful consideration as the public sector and market participants continue to address liquidity risks in central clearing. The 1987 stock market crash showed that we need to look at liquidity risks from a systemwide perspective. That event involved multiple CCPs and also involved multiple links in the payments chains between banks and CCPs. Conducting supervisory stress tests on CCPs that take liquidity risks into account would help authorities better assess the resilience of the financial system. A stress test focused on cross-CCP liquidity risks could help to identify assumptions that are not mutually consistent; for example, if each CCP's plans involve liquidating Treasuries, is it realistic to believe that every CCP could do so simultaneously? Authorities in both the United States and Europe have made progress in conducting supervisory stress tests of CCPs. In the United States, the CFTC conducted a useful set of tests of five major CCPs last year. The tests analyzed each individual institution's ability to withstand the credit risks emanating from the default of one or more clearing members. This was innovative and necessary work. It would be useful to build on it by adding tests that focus on liquidity risks across CCPs and their largest common clearing members. Such an exercise could focus on the robustness of the system as a whole rather than individual CCPs. The European Securities and Markets Authority is already expanding its supervisory stress testing exercise to incorporate liquidity risk. A similar exercise here in the United States should be seriously considered. The industry collectively needs to ensure that the liquidity flows involved in central clearing are handled efficiently and in a way that minimizes potential disruption. As I noted, there was some concern about the size of margin calls following Brexit, and certain CCPs have taken measures to address this. For example, LCH has subsequently made changes to its intraday margining procedures in an effort to reduce liquidity pressures on its clearing members, allowing them to offset losses on their client accounts with gains on the house account. Other CCPs are also actively engaged in efforts to increase their efficiency. FICC is looking at potential solutions using distributed ledger technology to clear both legs of overnight repo trades, which could allow for greater netting opportunities and thereby reduce potential liquidity needs. Several CCPs are also looking at ways to expand central clearing to directly include more buy-side firms, which could also offer greater netting opportunities. Doing so could also offer new sources of liquidity if the new entrants are able to take part in the CCP's committed liquidity arrangements. Diversification of sources of liquidity would offer tangible benefits--CCPs would avoid relying on the same limited set of clearing members for all of their liquidity needs. As one example, the Options Clearing Corporation established an innovative pre-funded, committed repurchase facility with a leading pension fund. As regulators, we should encourage innovations that increase clearing efficiency and reduce liquidity risks where they meet the PFMI and our supervisory expectations. As I discussed earlier, CCPs have a complicated set of decisions on how and where to hold their cash balances. 
Title VIII of the Dodd-Frank Act authorized the Federal Reserve to establish accounts for DFMUs, and we now have accounts with each of the eight institutions that the Financial Stability Oversight Council has so designated. These accounts permit DFMUs to hold funds at the Federal Reserve, but not to borrow from it. Allowing DFMUs to deposit balances at the Federal Reserve helps them avoid some of the risk involved in holding balances with their clearing members. Doing so also provides CCPs with a flexible way to hold balances on days when margin payments unexpectedly spike and it is difficult to find banks that are willing to accept an unexpected influx in deposits. In such a case, it may also be too late in the day to rely on the repo market. The availability of Fed accounts could help avoid potential market disruptions in those types of circumstances. The lessons from Brexit also point to the need for cross-border cooperation. Brexit triggered payments flows to CCPs across many jurisdictions. As far as liquidity risks are concerned, it is immaterial whether a CCP is based in the United States or abroad so long as it clears U.S. dollar denominated assets and must make and receive U.S. dollar payments. There are different possible approaches to such cross-border issues. Efforts to address these liquidity risks should carefully take into consideration the effect that they would have on the broader financial system. For example, splintering central clearing by currency area would fragment liquidity and reduce netting opportunities, which in the case of events like Brexit could actually trigger even greater liquidity risk. In my opinion, we should be searching for cooperative solutions to these issues. In the years following the financial crisis, one of the primary lessons for market participants and their regulators was the criticality of liquidity risk management. Financial firms such as Lehman Brothers and AIG struggled to obtain sufficient liquidity to meet their obligations. Liquidity is also a crucial concern in central clearing, and while regulatory reforms have done much to strengthen both CCPs and their clearing members, we should continue to make progress in creating a more robust and efficient system.
r170626a_FOMC
united states
2017-06-26T00:00:00
Remarks
powell
1
I appreciate the opportunity to speak at the Salzburg Global Seminar. Today I will discuss our current regulatory regime, and areas where we may be able to make adjustments. As always, the views I express here are my own. We need a resilient, well-capitalized, well-regulated financial system that is strong enough to withstand even severe shocks and support economic growth by lending through the economic cycle. The Federal Reserve has approached the post-crisis regulatory and supervisory reforms with that outcome in mind. There is little doubt that the U.S. financial system is stronger today than it was a decade ago. Loss-absorbing capacity among banks is substantially higher as a result of both regulatory requirements and stress testing exercises. The banking industry, and the largest banking firms in particular, faces far less liquidity risk than before the crisis. And progress in resolution planning by the largest firms has reduced the threat that their failure would pose. These efforts have made U.S. banking firms both more robust and more resolvable. Evidence overwhelmingly shows that financial crises can cause severe and lasting damage to the economy's productive capacity and growth potential. Post-crisis reforms to financial sector regulation and supervision have been designed to significantly reduce the likelihood and severity of future financial crises. We have sought to accomplish this goal in significant part by reducing both the probability of failure of a large banking firm and the consequences of such a failure were it to occur. As I mentioned, we have substantially increased the capital, liquidity, and other prudential requirements for large banking firms. These measures are not free. Higher capital requirements increase bank costs, and at least some of those costs will be passed along to bank customers and shareholders. But in the longer term, stronger prudential requirements for large banking firms will produce more sustainable credit availability and economic growth. Our objective should be to set capital and other prudential requirements for large banking firms at a level that protects financial stability and maximizes long-term, through-the-cycle credit availability and economic growth. To accomplish that goal, it is essential that we protect the core elements of these reforms for our most systemic firms in capital and liquidity, stress testing, and resolution. With that in mind, I will highlight five key areas of focus for regulatory reform. The first is simplification and recalibration of regulation of small and medium-sized banks. We are working to build on the relief we have provided in the areas of call reports and exam cycles, by developing a proposal to simplify the generally applicable capital framework that applies to community banking organizations. The second area is resolution plans. The Fed and the Federal Deposit Insurance Corporation believe that it is worthwhile to consider extending the cycle for living will submissions from annual to once every two years, and focusing every other filing on key topics of interest and material changes from the prior full plan submission. We are also considering other changes, as I discussed last week when testifying to Congress. Third, the Federal Reserve is reassessing whether the Volcker rule implementing regulation most efficiently achieves its policy objectives, and we look forward to working with the other four Volcker rule agencies to find ways to improve that regulation.
In our view, there is room for eliminating or relaxing aspects of the implementing regulation in ways that do not undermine the Volcker rule's main policy goals. Fourth, we will continue to enhance the transparency of stress testing and the Comprehensive Capital Analysis and Review (CCAR), and we are considering possible forms of enhanced disclosure, including a range of indicative loss rates predicted by the Federal Reserve's models for various loan and securities portfolios, and information about risk characteristics that contribute to the loss-estimate ranges. We will also provide more detail on the qualitative aspects of stress testing in next week's CCAR disclosure. Finally, the Federal Reserve is taking a fresh look at the enhanced supplementary leverage ratio. We believe that the leverage ratio is an important backstop to the risk-based capital framework, but that it is important to get the relative calibrations of the leverage ratio and the risk-based capital requirements right. U.S. banks today are as strong as any in the world. As we consider the progress that has been achieved in improving the resiliency and resolvability of our banking industry, it is important for us to look for ways to reduce unnecessary burden. We must also be vigilant against new risks that may develop. In all of our efforts, our goal is to establish a regulatory framework that helps ensure the resiliency of our financial system, the availability of credit, economic growth, and financial market efficiency. We look forward to working with our fellow regulatory agencies and with Congress to achieve these important goals. And finally, I would also like to note that work continues to address the risks identified with existing reference rates. Just last week, the Alternative Reference Rates Committee (ARRC) selected a new rate suitable for use with new derivative contracts. I am confident that the broad Treasuries repo rate, which the Federal Reserve Bank of New York has proposed publishing in cooperation with the Office of Financial Research, is based on a deep and actively traded market and will be highly robust. With this choice, the ARRC has taken another step in addressing the risks involved with LIBOR.
r170627a_FOMC
united states
2017-06-27T00:00:00
An Assessment of Financial Stability in the United States
fischer
0
In the years since the start of the global financial crisis, an enormous amount of effort has gone into ensuring that we have a robust financial system that promotes responsible risk taking and an efficient allocation of resources. But despite these efforts, financial stability cannot be taken for granted, for financial decisions that benefit the people who make them can create systemic risk and harm society as a whole. Further, the phenomenon familiar from macroeconomics--and for that matter from life--of decisions that result in short-run happiness and long-run grief is visible also in the area of financial stability. For example, excessive leverage and reliance on short-term funding, which may reward risk takers whose bets pay off, may also increase the risk of fire sales and contagion, creating a fragile financial situation. The disruption in credit intermediation that typically accompanies such episodes can have lasting negative consequences for the real economy and welfare--some of which we are still seeing today. The Federal Reserve's financial stability responsibilities therefore strongly complement its dual-mandate objectives of achieving price stability and full employment. Today I will review the monitoring framework we have implemented at the Federal Reserve, before providing an assessment of current U.S. financial stability conditions. I will conclude by arguing that while significant progress has been made in recent years toward making the financial system more stable and resilient, we should not ever be complacent. We still lack sufficient information to understand some parts of the shadow banking system, and risks sometimes evolve outside the scope of prudential regulation, with potentially negative implications for financial stability. And sometimes we fail to understand the situation in which we find the financial system and the economy. Participants in today's workshops have grappled with the question of what is the best framework to monitor financial stability. One approach is to focus on trends in, and interactions among, financial vulnerabilities across financial institutions, markets, and instruments. Another approach is to track the resilience of institutions, either broad categories or individual systemically important institutions. Good monitoring frameworks combine elements of both. Let me start with the vulnerabilities-focused approach, as developed by Tobias Adrian, Daniel Covitz, and Nellie Liang. That approach defines financial vulnerabilities as a collection of factors that may amplify financial shocks and, when elevated, have the potential to generate systemic risk. The focus is on vulnerabilities rather than shocks, because the timing of shocks, such as sudden drops in asset prices, is inherently hard to predict. Vulnerabilities, in contrast, tend to build up over time, and policies can be designed to help contain these vulnerabilities, reducing the likelihood of systemic events. Some financial vulnerabilities are cyclical in nature, rising and falling over time, while others are structural, stemming from longer-term forces shaping the nature of credit intermediation. Informed by academic research, some of it in-house, we at the Federal Reserve focus on four broad cyclical vulnerabilities: (1) financial-sector leverage, (2) nonfinancial-sector borrowing, (3) liquidity and maturity transformation, and (4) asset valuation pressures. Briefly, leverage, across a range of institutions, is a key amplifier of solvency shocks, leading to a greater chance of a credit crunch or fire sale.
Liquidity and maturity mismatches can generate run risk, leading to fire sales and contagion. Finally, elevated valuation pressures, especially when combined with high leverage, can lead to excessive credit growth. When asset prices are appreciating rapidly and expected to continue to do so, borrowers and lenders are more willing to accept higher degrees of risk and leverage. Using a range of indicators, we assess these cyclical vulnerabilities relative to past experience. That is, we evaluate where the current levels of these indicators stand compared with their historical values, to identify whether they point to a low, average, or high level of vulnerabilities. While we try to rely on quantitative indicators, in the end, this evaluation requires some degree of judgment. We also closely monitor potential structural vulnerabilities, such as funding models and institutions that provide critical plumbing services to the system. Because these structural vulnerabilities are less amenable to traditional quantitative monitoring, their identification and assessment follow a less formal process. I will leave that discussion to another time. As mentioned, complementary to the vulnerabilities-oriented approach is an approach that focuses on institutions. If the financial system is overleveraged, that vulnerability has to be evident at particular institutions. An institutions-oriented framework can help us keep track of sector- or institution-specific structural vulnerabilities that may be masked by our overall assessment and provides additional ways to understand how distress at a particular institution or class of institutions may spill over to the wider financial system. Regardless of whether we are looking at vulnerabilities or institutions, a key feature of this monitoring framework is its forward-looking nature. For example, evidence suggests that periods of elevated risk appetite are frequently accompanied by a rise in leverage at financial intermediaries. This evidence implies that elevated asset valuation pressures today may be indicative of rising vulnerabilities tomorrow. Of course, while a framework provides a disciplined way to evaluate financial stability, we constantly evaluate the framework so that we can identify new risks and vulnerabilities, which may arise as the financial system evolves--for example, in response to market-driven innovation or regulatory reform. Federal Reserve staff research helps us understand and evaluate the evolving, dynamic financial system. Before turning to the assessment of the current state of U.S. financial stability, let me discuss how we communicate our views on this matter. The Federal Reserve, unlike many other central banks, does not publish a financial stability report. The United States already has two congressionally mandated financial stability reports, one authored by the independent Office of Financial Research and a separate report published by the Financial Stability Oversight Council that represents the views of the range of financial regulators, including the Federal Reserve. Additional views of Federal Reserve officials can be reflected in a range of other venues, including, notably, the Board's semiannual Monetary Policy Report, the Board's annual report, and speeches, such as this one.
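One hedged way to make the "compared with past experience" step concrete is to place the latest reading of each indicator within its own historical distribution; the cutoffs below (bottom and top thirds) and the sample series are illustrative assumptions, not the Board's actual thresholds or data.

# Illustrative percentile-based reading of a single vulnerability indicator.
from bisect import bisect_left

def vulnerability_reading(history, current, low_cut=1/3, high_cut=2/3):
    """Classify the current value of an indicator against its own history."""
    ranked = sorted(history)
    percentile = bisect_left(ranked, current) / len(ranked)
    if percentile < low_cut:
        return "low", percentile
    if percentile > high_cut:
        return "high", percentile
    return "average", percentile

# Hypothetical history of, say, a measure of nonfinancial-sector borrowing; the
# current reading sits in the upper third of past values and is flagged as elevated.
history = [-2.0, -1.0, -0.5, 0.0, 0.3, 0.8, 1.2, 1.5, 2.0, 2.4]
print(vulnerability_reading(history, 1.6))  # ('high', 0.8)

A full assessment would look jointly at many such indicators across the four cyclical vulnerabilities and would still apply judgment, which is the spirit of the framework described above.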
That was the framework; now for the current assessment. In the interest of time, my main focus will be on the four cyclical vulnerabilities--leverage, borrowing by households and nonfinancial firms, liquidity and maturity transformation, and asset valuations--but I will also briefly touch on the most salient structural vulnerabilities. To summarize the assessment, overall, a range of indicators point to vulnerability that is moderate when compared with past periods: Leverage in the financial sector is at historically low levels, and, following the reforms of money market mutual funds, vulnerabilities associated with liquidity and maturity transformation appear to have decreased. However, the increase in prices of risky assets in most asset markets over the past six months points to a notable uptick in risk appetites, although this shift has not yet led to a pickup in the pace of borrowing or a sizable rise in leverage at financial institutions. To start with, leverage: Regulatory capital at large banks is now at multidecade highs. The largest banks have already met their fully phased-in capital requirements, including the conservation buffer and the capital surcharge for the global systemically important banks. Also, the largest banks have been able to meet the post-stress capital requirements in the past couple of stress-test exercises run by the Federal Reserve. Measures of earnings strength, such as the return on assets, continue to approach pre-crisis levels at most banks, although with interest rates being so low, the return on assets might be expected to have declined relative to its pre-crisis level--and that fact is also a cause for concern. Turning to borrowing by households and businesses: In the private nonfinancial sector, which includes corporations and households, total debt remains well below its long-run trend, largely driven by subdued borrowing among households. However, the corporate business sector appears to be notably leveraged, with the current aggregate corporate-sector leverage standing near 20-year highs. Some studies, including one by the International Monetary Fund in this April's Global Financial Stability Report, have recently highlighted this vulnerability, so let me briefly offer my perspective. Despite the elevated levels of corporate borrowing, recent developments show signs of improvement. Leverage has declined slightly since its peak a year ago, and firms with high debt growth appear relatively healthy. Interest expenses relative to earnings also declined of late and are below their median value since 2001. Furthermore, the fraction of corporate debt due within one year relative to total debt stands at historically low levels. Thus, positive shocks to interest rates may adversely affect the ability of some firms to service debt, but this risk may have only limited system-wide effects. Nonetheless, elevated leverage leaves the corporate sector vulnerable to other shocks, such as earnings shocks. In the household sector, new borrowing is driven mostly by borrowers with higher credit scores, and the amount of debt that borrowers have relative to their incomes is falling, suggesting that the debt is more manageable. That said, two pockets in the household sector deserve scrutiny. Auto loan balances and delinquency rates are high for borrowers with lower credit scores, meaning that the riskiest borrowers are borrowing more and not paying it back as often. Of note, delinquencies on recently issued auto loans have also increased, indicating that underwriting standards in the auto loan industry may be deteriorating.
Student loan balances keep rising, and delinquency rates on those loans are near historical highs. These strains within the household sector leave such borrowers vulnerable to adverse shocks and probably weigh on their spending. At first glance, one is tempted to say that the potential for this distress to adversely affect the financial system seems moderate, because subprime auto loan and student loan borrowers account for only a small share of other, larger debt categories. But, on second thought, one should remember that pre-crisis subprime mortgage loans were dismissed as a stability risk because they accounted for only about 13 percent of household mortgages--and so one should not take excessive comfort. Turning to liquidity and maturity transformation: Similar to my assessment of leverage, I believe that the primary vulnerability associated with liquidity and maturity transformation--that of a self-fulfilling run--is relatively low. In recent years, banks have shifted away from more run-prone short-term wholesale sources of funds toward more stable sources such as core deposits. Large domestic banks have also significantly boosted their holdings of high-quality liquid assets, making them more resilient to funding stress. In the nonbanking sector, the Securities and Exchange Commission (SEC) revised the regulations governing money market mutual funds; the revisions, first adopted in 2014, aim to reduce the key structural vulnerabilities exposed by the massive and destabilizing run on the funds in late 2008. The second round of reforms went into full effect in October 2016; ahead of this date, $1.2 trillion flowed out of prime money funds--the more fragile funds that also provide direct funding to large banking institutions--toward government money funds, which are constrained to hold government-guaranteed assets. Those assets include repurchase agreements (or repos) with private banks backed by Treasury securities and the liabilities of government-sponsored enterprises, such as, notably, the Federal Home Loan Banks (FHLBs). While the current configuration of money markets reveals a reduced financial stability risk compared with the situation prior to the recent reforms, this configuration may not yet represent the final equilibrium. It will be important to keep an eye on the growth of alternative investment vehicles that perform liquidity transformation in money markets. Of note, in part supported by increased demand from government-only money funds, the FHLB system has increased its issuance of shorter-maturity liabilities, which are more attractive to money funds. In turn, this development has led to an increase in the FHLB system's maturity transformation because its assets--loans to banks and insurance companies--have remained relatively long maturity. As a result, the FHLBs face an increased need to roll over maturing liabilities and thus greater vulnerability should they encounter liquidity pressures. I should note that the FHLBs' regulator, the Federal Housing Finance Agency, identified this vulnerability more than a year ago and is working with the FHLBs to address it. Turning, finally, to asset valuations: Let me conclude my assessment of current financial stability conditions with a discussion of asset valuation pressures. Prices of risky assets have increased in most major asset markets in recent months even as risk-free rates also rose. In equity markets, price-to-earnings ratios now stand in the top quintiles of their historical distributions, while corporate bond spreads are near their post-crisis lows.
Prices of commercial real estate (CRE) have grown faster than rents for some time, and measures of the amount of operating income relative to the sale price of commercial properties--the capitalization rate--have reached historical lows, suggesting continued pressures in the CRE market despite some tightening in credit conditions. Valuation pressures in single-family residential real estate markets appear, at most, modest, with price-to-rent ratios only slightly higher than their long-run trend. The general rise in valuation pressures may be partly explained by a generally brighter economic outlook, but there are signs that risk appetite increased as well. For example, estimates of equity and bond risk premiums are at the lower end of their historical distributions, and, relative to some non-price-based measures of uncertainty, the implied volatility index VIX is particularly subdued. So far, the evidently high risk appetite has not led to increased leverage across the financial system, but close monitoring is warranted. Let me conclude by offering my view on where we stand in our effort to promote financial stability in the United States. There is no doubt that the soundness and resilience of our financial system have improved since the 2007-09 crisis. We have a better capitalized and more liquid banking system, less run-prone money markets, and more robust resolution mechanisms for large financial institutions. However, it would be foolish to think we have eliminated all risks. For example, we still have limited insight into parts of the shadow banking system, and--as already mentioned--uncertainty remains about the final configuration of short-term funding markets in the wake of money fund reform. The U.S. financial system is inherently dynamic, with a range of institutions competing to offer a changing mix of financial products. New financial technologies promise great benefits but will no doubt carry novel risks. As a result, we monitor these vulnerabilities, and we are vigilant with respect to economic and financial developments across markets and institutions within the United States and around the world. And we know that complacency must be avoided.
r170706a_FOMC
united states
2017-07-06T00:00:00
Government Policy and Labor Productivity
fischer
0
I want to talk tonight about labor productivity growth. Labor productivity is the amount of goods and services produced per hour spent on the job. Increases in labor productivity--again, that's the amount of goods and services produced per hour on the job--are a fundamental factor in determining how fast the economy grows, and how fast the average standard of living grows. And productivity growth can be influenced by government policy, about which I also want to say a few words. Labor productivity growth varies a lot from year to year, but it is possible to discern longer historical periods with high or low productivity growth, as shown in figure 1. For example, labor productivity rose at an average annual rate of 3-1/4 percent from 1948 to 1973, whereas in the period 1974 to 2016, the average growth rate of productivity was about 1.7 percent. That is to say that, with the important exception of the information technology (IT) boom beginning in the mid-1990s, the U.S. economy has been in a low-productivity growth period since 1974. The record for the past five years has been particularly dismal. How much does productivity growth matter? A great deal. The person who made that clear, in an article published in 1957, 60 years ago, Professor Robert Solow, is here tonight. That is a pleasure, an honor, a joy, and something of a difficulty for anyone wanting to talk about productivity and its growth in the presence of the master. The reason the rate of productivity growth matters so much is that it is a basic determinant of the rate of growth of average income per capita over long periods. To understand why, one needs to know only the trick of calculating how long it takes something that is growing to double. A good rule of thumb: the time it takes labor productivity (or anything else that is growing) to double can be calculated by dividing 70 by the annual percentage growth rate. When labor productivity was growing at 3-1/4 percent per year--during the 25 years from 1948 to 1973--it took 22 years for labor productivity to double. Looking again at figure 1, in the 42 years from 1974 to 2016, when labor productivity was growing on average at a rate of 1-3/4 percent, it would have taken approximately 41 years for labor productivity to double. There is a vast difference between the prospects facing the young in an economy where incomes per capita are doubling every 22 years and an economy in which incomes are on average doubling only every 41 years. Now, productivity statistics are imperfect in many respects--for example, capturing the value of the seemingly free apps we use on our smartphones is challenging. And many of us who live in the modern age cannot believe that the iPhone has not fundamentally changed our lives. It has certainly changed our lives to some extent, and there is likely some underestimation of productivity growth in the official data. But to figure out whether the current degree of data bias has reduced estimated growth, we have to ask not whether there is bias, but whether the bias has increased. To a first approximation, one could assume that the rate of bias is roughly constant, in which case measurement bias does not account for the estimated decline in productivity growth, and we should not dismiss the slowdown as an artifact of measurement difficulties. That is the conclusion most researchers reach, but the data issue is not settled. As Bob Solow famously said, just before the increase in productivity growth of 1996-2003, "You can see the computer age everywhere but in the productivity statistics."
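For readers who want to check the doubling-time arithmetic used above, a minimal sketch in Python: the exact doubling time is ln(2) divided by ln(1 + g), and the rule of 70 approximates it by dividing 70 by the growth rate expressed in percent.

# Doubling times: exact compounding versus the rule-of-70 approximation.
import math

def doubling_time_exact(growth_rate):
    """Years to double at a constant annual growth rate (e.g., 0.0325 for 3-1/4 percent)."""
    return math.log(2) / math.log(1 + growth_rate)

def doubling_time_rule_of_70(growth_rate):
    return 70 / (growth_rate * 100)

for g in (0.0325, 0.0175):
    print(round(doubling_time_exact(g), 1), round(doubling_time_rule_of_70(g), 1))
# roughly 21.7 and 21.5 years at 3-1/4 percent; about 40 years either way at 1-3/4 percent

Either calculation delivers the same message as the figures in the text: at recent growth rates, doubling living standards takes roughly twice as long as it did in the early postwar decades.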
There are also serious researchers who have made serious arguments that we will soon be seeing more rapid growth in the productivity data. Turning to the factors determining productivity growth: Clearly, a key question for economic forecasters, and even more so for U.S. citizens, and indeed for the entire global economy, is whether we should anticipate a return of the more rapid productivity gains experienced in the IT boom and for the quarter century after the end of World War II, or should instead resign ourselves to tepid economic growth in future years. And a central policy issue is whether government policies can help push the economy toward a higher-productivity regime. In this context, it is useful to think of labor productivity growth as coming from three sources, as shown in figure 2. First, greater investment by firms in tangible equipment and structures, as well as "intangible" investments such as software and product designs, raises labor productivity. Second, improvements in labor quality, or the capabilities of the workforce, contribute as well--through education, training, and experience. Finally, innovations yield more or better output from the same inputs--the same capital and labor--such as the introduction of the assembly line and computer-aided product design. I will consider the role that policy may play through each of these channels. It is noteworthy that most of the recent drop in productivity growth is due to a lower contribution from innovation, although weaker investment has played a role as well. The contribution to labor productivity from labor quality has changed very little. Our prospects for further significant technological innovations are hotly debated. Some observers believe that we have exhausted the low-hanging fruit on the productivity tree and, in particular, that efficiency gains from the use of IT have run their course. Other observers argue that we can reach fruit higher on the tree with each passing year. These observers believe that innovation yields better tools, such as 3-D printers and genetic sequencing equipment, which themselves enable further technological advances. For what it is worth, I believe the early signs of self-driving cars, the emergence of disease treatments based on genetics, and the falling costs for conventional and alternative energy production suggest that we are continuing to innovate, both in IT and in other parts of the economy. One possibility is that we are in a productivity lull while firms reorganize to exploit the latest innovations; it took decades before the full benefits of the steam engine, electrification, and computers were seen. One way to ensure the vigor of innovation is to support research and development (R&D). Private R&D spending fell during the recession but has since recovered. However, government-funded R&D as a share of gross domestic product is at the lowest level in recent history. A great deal of the "R" in overall R&D is government funded and not tied to a specific commercial goal. The applied research built on this basic research ultimately yields productivity gains far into the future. Consequently, the decline in government-funded R&D is disturbing. To raise productivity and economic well-being, firms must adopt innovations that emerge from R&D as quickly as possible. This adjustment may occur as start-ups introduce innovation to the market, as existing innovative firms expand, or as competing firms imitate the innovators.
Recent research suggests that all three of these channels, which reflect the economic dynamism of businesses, have been operating sluggishly of late: New firms are not created as often as in the past, innovative firms are not hiring or investing as aggressively as they once did, and the diffusion of innovations from frontier firms to trailing firms is weak. It is difficult to pinpoint specific policy actions that would address this decline in dynamism. Broadly speaking, however, government policymakers should carefully consider the effects of regulations and tax policy on the free flow of workers, capital, and ideas. In recent years, the contribution to labor productivity growth from investment has declined. Business fixed investment rose roughly 2-1/2 percent per year, on average, from 2004 to 2016, compared with about 5 percent from 1996 to 2003. Some bright spots do exist: Capital expenditure by leading IT companies--Google, Amazon, and the like--has soared since 2010, and investment in the energy sector has returned to life. Nevertheless, firms as a whole seem reluctant to invest. This cautious approach to investment may in part reflect uncertainty about the policy environment. By one measure, U.S. policy uncertainty was elevated for much of the recovery, subsided in 2013, and then rose again late last year, underpinned by uncertainty about policies associated with health care, regulation, taxes, and trade. Reasonable people can disagree about the right way forward on each of those issues, but mitigating the damping effect of uncertainty by providing more clarity on the future direction of government policy is highly desirable--particularly if the direction of policy itself is desirable. Government investment can be an important source of productivity growth as well. For example, the interstate highway system is credited with boosting productivity in the 1950s and 1960s. That highway system and many other federally supported roadways, waterways, and structures have been neglected in recent years. Indeed, real infrastructure spending (that is, adjusted for inflation) has fallen nearly 1 percent per year since 2005. This area of government investment deserves more attention. Also important to raising labor productivity is investment in human capital--workers' knowledge and skills. Such investment is a particular concern because most forecasts anticipate that the long rise in educational attainment--both for college and high school--may soon come to an end. One area where policy may play a role is promoting educational access and readiness for groups for whom educational attainment is relatively low. Recent research has shown a substantial return to public investment in early childhood education for economically disadvantaged groups. Such programs increase high school graduation rates, raise income over the life cycle for both participants and their parents, and produce other socially beneficial outcomes, such as better health. At the other end of the education process, a college degree has long been considered a worthwhile investment, and thus our society should promote access to and readiness for college among a broad range of individuals--in particular through federal support for need-based financial aid. Lastly, I will note that ultimately the return on the human capital embodied in our workforce is closely tied to public health. A rise in morbidity or fall in longevity in the U.S. population is not a concern only for humanitarian reasons.
Workers too ill to perform at their potential represent lost productivity and welfare for society as a whole. Research has shown just such a trend among prime-age non-Hispanic Americans without a college degree. More study is needed to determine what policies would help reverse this trend, and government funding could likely assist the effort. More broadly, programs to promote clean air and drinking water are examples of public health policies that bolster the health and longevity of the present and future workforce as a whole.

Concluding remarks

To conclude, we return to the basic question: How much does productivity growth matter? The basic answer: simple arithmetic says it matters a lot. If labor productivity grows at an average of 2 percent per year, average living standards for our children's generation will be twice what we experienced. If labor productivity grows at an average of 1 percent per year, the difference is dramatic: Living standards will take two generations to double. But fortunately, when it comes to productivity, we are not simply consigned to luck or to fate. Governments can take sensible actions to promote more rapid productivity growth. Broadly speaking, government policy works best when it can address a need that the private sector neglects, including investment in basic research, infrastructure, early childhood education, schooling, and public health. Reasonable people can disagree about the right way forward, but if we as a society are to succeed, we need to follow policies that will support and advance productivity growth. That is easier said than done. But it can be done.
r170706b_FOMC
united states
2017-07-06T00:00:00
The Case for Housing Finance Reform
powell
1
My topic today is the urgent need for fundamental reform of our system of housing finance--the great unfinished business of post-financial crisis reform. The Federal Reserve is not charged with designing or evaluating proposals for housing finance reform. But we are responsible for regulating and supervising banking institutions to ensure their safety and soundness, and more broadly for the stability of the financial system. A robust, well-capitalized, well-regulated housing finance system is vital to achieving those goals, and to the long-run health of our economy. We need a system that provides mortgage credit in good times and bad to a broad range of creditworthy borrowers. While reforms have addressed some of the problems of the pre-crisis system, there is broad agreement that the job is far from done. The status quo may feel comfortable today, but it is also unsustainable. Today, the federal government's role in housing finance is even greater than it was before the crisis. The overwhelming majority of new mortgages are issued with government backing in a highly concentrated securitization market. That leaves us with both potential taxpayer liability and systemic risk. It is important to learn the right lessons from the failure of the old system. We can also apply lessons from post-crisis banking reform. Above all, we need to move to a system that attracts ample amounts of private capital to stand between housing sector credit risk and taxpayers. We should also use market forces to increase competition and help to drive innovation. The global financial crisis ended in 2009, and the economy has just completed its eighth consecutive year of expansion. We are near full employment. The housing market is generally strong, although it is still recovering in some regions. To preserve these gains, we must ensure the stability of our financial system. With that goal in mind, we are near completion of a comprehensive program to raise prudential standards for our most systemically important banks. But fundamental housing finance reform--including reform to address the ultimate status of Fannie Mae and Freddie Mac, two systemically important government sponsored enterprises (GSEs)--remains on the "to do" list. As memories of the crisis fade, the next few years may present our last best chance to finish these critical reforms. Failure to do so would risk repeating the mistakes of the past. For many years, Fannie Mae and Freddie Mac prudently pursued their core mission of enhancing the availability of credit for housing. Beginning in the early 1980s, Fannie and Freddie helped to facilitate the development of the securitization market for home mortgages. They purchased and bundled mortgage loans, and sold the resulting mortgage-backed securities (MBS) to investors. Fannie and Freddie also guaranteed payment of principal and interest on the MBS. With this guarantee in place, MBS investors took the risk of changing interest rates, and the GSEs took the risk of default on the underlying mortgages. Thanks to the growth in securitization, these two GSEs have dominated U.S. housing finance since the late 1980s. The pre-crisis system did its job for many years. By promoting standardization, structuring securities to meet a broad range of investor risk appetites, and issuing guaranteed MBS, Fannie and Freddie brought greater liquidity to mortgage markets and made mortgages more affordable. But the system ultimately failed due to fundamental flaws in its structure.
In the early days of securitization, the chance that either GSE would ever fail to honor its guarantee seemed remote. But the question always loomed in the background: Who would bear the credit risk if a GSE became insolvent and could not perform? Would Congress really allow the GSE to fail to honor its obligations, given the devastating impact that would have on mortgage funding and the housing market? The law stated explicitly that the government did not stand behind the GSEs or their MBS, as Fannie and Freddie frequently pointed out in order to avoid tougher regulation. Nonetheless, investors understandably came to believe that the two GSEs were too-big-to-fail, and priced in an implicit federal government guarantee behind GSE obligations. In the end the investors were right, of course. The implicit government guarantee also meant that investors--including banks, the GSEs themselves, and investors around the world--did not do careful due diligence on the underlying mortgage pools. Thus, securitization also enabled declining lending standards. This was not only a problem of the GSEs--private label securitizations also helped to enable lower underwriting standards. Over time, the system's bad incentives caused the two GSEs to change their behavior and take on ever greater risks. The GSEs became powerful advocates for their own bottom lines, providing substantial financial support for political candidates who supported the GSE agenda. Legislative reforms in the 1990s and the public/private structure led managements to expand the GSEs' balance sheets to enormous size, underpinned by wafer-thin slivers of capital, driving high shareholder returns and very high compensation for management. These factors and others eventually led to extremely lax lending conditions. The early 2000s became the era of Alt-A, low doc, and no doc loans. These practices contributed to the catastrophic failure of the housing finance system. Almost nine years ago, in September 2008, Fannie and Freddie were put into "temporary" conservatorship and received injections of taxpayer funds totaling $187.5 billion. In the end, the system privatized the gains and socialized the losses. The buildup of risks is clear in hindsight. But many officials and commentators raised concerns long before the collapse. The long-standing internal structural weaknesses of the old system ultimately led to disastrous consequences for homeowners, taxpayers, the financial system, and the economy. Before considering the path forward, it is important to acknowledge that today's housing sector is healthier and in some respects safer than it was in 2005. Although there are significant regional differences, national data show that housing prices have fully recovered from their gut-wrenching 35 percent drop during the crisis. Mortgage default rates have returned to pre-crisis levels. Mortgage credit is available and affordable for strong borrowers. There has also been meaningful progress in reforming the old system. In 2008, Congress enacted the Housing and Economic Recovery Act, which, among other things, created the Federal Housing Finance Agency (FHFA). Under the FHFA's oversight, the two GSEs' retained portfolios have declined to about half of their pre-crisis size, and are expected to continue on a downward path. The FHFA and the GSEs have also been working to develop a market for the GSEs to lay off their credit risk.
These innovative transactions have raised about $50 billion in private capital that now stands between taxpayers and mortgage credit risk in the GSEs' portfolios. In addition, the creation of a Common Securitization Platform should strengthen the GSEs' securitization infrastructure and facilitate further reforms with an eye toward enhancing competition. New regulations have also been put in place since the crisis with the goal of encouraging sound underwriting of mortgage loans. Today, lenders must make a good faith effort to determine that the borrower has the ability to repay the mortgage. Moreover, if the lender wants to provide a "qualified mortgage" to the borrower, then the lender needs to meet certain other requirements. For example, some contract features such as an interest-only period or negative amortization, where the loan balance increases even though the borrower is making payments, are taboo. Upfront points and fees are limited too. These reforms represent movement in the right direction, but leave us well short of where we need to be. Despite the GSEs' significant role in this key market, there is no clarity about their future. When they were put into conservatorship, Treasury Secretary Paulson noted that "policymakers must view this next period as a 'time out' where we have stabilized the GSEs while we decide their future role and structure." Almost nine years into this time out, the federal government's domination of the housing sector has grown and is actually greater than it was before the crisis. The GSEs, the Federal Housing Administration, and the Department of Veterans Affairs have a combined market share of about 80 percent of the purchase mortgage market, with the remaining 20 percent held by private financial institutions. After reaching nearly 30 percent of the market before the crisis, private-label securitization has dwindled to almost nothing today. The two GSEs remain in government conservatorship, with associated contingent liabilities to U.S. taxpayers. Fannie and Freddie have remitted just over $270 billion of profits to the Treasury, more than paying back the government's initial investment. However, under the current terms of the contracts that govern their access to Treasury funds, their capital will continue to decline. The two GSEs also have a large amount of MBS and corporate debt outstanding, which is widely held and receives various forms of special regulatory treatment. And because of their scale, these enterprises continue to serve as important standard-setters and significant counterparties to other firms. While mortgage credit is widely available to most traditional mortgage borrowers, those with lower credit scores face significantly higher standards and lower credit availability than before the crisis. We can all agree that we do not want to go back to the poor underwriting standards used by originators prior to the crisis. But it may also be that the current system is too rigid, and that a lack of innovation and product choice has limited mortgage credit availability to some creditworthy households. According to a survey by the American Banker, in 2016 only nine percent of mortgage originations failed to meet the qualified mortgage contract criteria, down from 16 percent in 2013. The same survey reported that almost one-third of U.S. banks make only qualified mortgage loans, despite the fact that FHA- and GSE-eligible mortgages are exempt from the qualified mortgage requirements until January 2021 or until housing finance reform is enacted, whichever date comes first.
The post-crisis reform program for our largest banks presents an appropriate standard against which the housing finance giants should be judged. After eight years of reform, our largest banking institutions are now far stronger and safer. Common equity capital held by the eight U.S. global systemically important banks has more than doubled to $825 billion from about $300 billion before the crisis. After the crisis revealed significant underlying liquidity vulnerabilities, these institutions now hold $2.3 trillion in high-quality liquid assets, or 25 percent of their total assets. Under rigorous annual stress tests, they must demonstrate a high level of understanding of their risks and the ability to manage them, and must survive severely adverse economic scenarios with high levels of post-stress capital. And they must file regular resolution plans that have made them significantly more resolvable should they fail. These measures were implemented to reduce the risk that a future crisis will result in taxpayer support, and to help ensure that the financial system could continue to function even in the event that one of these banks were to default. It is ironic that the housing finance system should escape fundamental reform efforts. The housing bubble of the early 2000s was, after all, an essential proximate cause of the crisis. Housing is the single largest asset class in our financial system, with total outstanding residential real estate owned by households of $24 trillion and roughly $10 trillion in single-family mortgage debt. While post-crisis regulation has addressed mortgage lending from a consumer protection standpoint, the important risks to taxpayers and the broader economy and financial system have not been robustly addressed. The most obvious and direct step forward would be to require ample amounts of private capital to support housing finance activities, as we do in the banking system. We should also strive for a system that can continue to function even in the event of a default of any firm. No single housing finance institution should be too big to fail. Greater amounts of private capital could come through a variety of sources, including through the entry of multiple private guarantors who would insure a portion of the credit risk, through risk-sharing agreements, or through expanded use of credit-risk transfers. Although private capital must surely be part of the reform effort, there may be limits to the amount of risk that we can credibly expect the private sector to insure. It is extremely difficult to appropriately price the insurance of catastrophic risk--the risk of a severe, widespread housing crisis. Both the private-sector insurance industry and government have struggled with this, particularly with how to reconcile the steady collection of premiums with the irregular payout of potentially enormous losses that may be needed only once or twice in a century. Furthermore, losses can be correlated across asset classes and geographies in these catastrophic events, rendering risk-diversification strategies ineffective. Fannie Mae and Freddie Mac have successfully transferred some credit risk to the private sector, but have thus far avoided selling off much of this catastrophic credit risk, arguing that doing so is not economical. Promising legislative initiatives have moved forward in the past but fallen short of enactment, and the air is again thick with housing finance reform proposals.
As I mentioned at the outset, housing finance reform has important implications for the Federal Reserve's oversight of financial institutions, and for the U.S. economy and its financial stability. While I would not presume to judge these reform proposals, I will offer some principles for reform. These principles are based on the lessons learned from the old system's collapse, and from the experience of post-crisis bank reform. First, we ought to do whatever we can to make the possibility of future housing bailouts as remote as possible. Housing can be a volatile sector, and housing is often found at the heart of financial crises. Our housing finance institutions were not--and are not--structured with that in mind. Extreme fluctuations in credit availability for housing hurt vulnerable households, reduce affordability and availability, and, as we have seen, can threaten financial stability. As with banks, the goal should be to ensure that our housing finance system can continue to function even in the face of significant house price declines and severe economic conditions. Changing the system to attract large amounts of private capital would be a major step toward that goal. The question of the government's role in the new system is a challenging one for Congress. Many of the well-known reform proposals include some role for government. Some argue that government cannot avoid bearing the deep-in-the-tail risk of a catastrophic housing crisis. A number of proposals incorporate a government guarantee to cover this risk, to take effect after a significant stack of private capital is wiped out. That brings me to my second principle: If Congress chooses to go in this direction, any such guarantee should be explicit and transparent, and should apply to securities, not to institutions. Reform should not leave us with any institutions that are so important as to be candidates for too-big-to-fail. Third, we should promote greater competition in this market. The economics of securitization do not require a duopoly. Yet there is no way for private firms to acquire a GSE charter and enter the industry. This is akin to having only two banks with federal deposit insurance, which would make competition by other banks very difficult. Greater competition would help to reduce the systemic importance of the GSEs, and spur more innovation. Greater competition also requires a level playing field, allowing secondary market access to a wide-range of lenders and thereby giving homebuyers a choice among many potential mortgage lenders and products. Fourth, it is worth considering simple approaches that restructure and repurpose parts of the existing architecture of our housing finance system. We know that housing reform is difficult; completely redrawing the system may not be necessary and could complicate the search for a solution. Using the existing architecture would allow for a continued smooth, gradual transition. Fifth and last, we need to identify and build upon areas of bipartisan agreement. In my view, at this late stage we should not be holding out for the perfect answer. We should be looking for the best feasible plan to escape the unacceptable status quo. I see two reasons why this is a good time to address the housing finance system's shortcomings. First, the economy and the housing sector are healthy. It would be far more disruptive to implement fundamental structural changes during difficult economic times. Second, memories of the crisis are fading. 
If Congress does not enact reforms over the next few years, we are at risk of settling for the status quo--a government-dominated mortgage market with insufficient private capital to protect taxpayers, and insufficient competition to drive innovation. There is a serious risk, if not a likelihood, that this state of affairs may persist indefinitely, leaving our housing finance system in a semi-permanent limbo. Fortunately, we are blessed with a growing menu of reform options available for public vetting. And there appear to be areas of broad agreement among them. One of those plans, or a combination of different features of various plans, might well suffice to move us to a better system. Housing finance reform will protect taxpayers from another bailout, be good for households and the economy, and go some distance toward mitigating the systemic risk that the GSEs still pose.
r170711a_FOMC
united states
2017-07-11T00:00:00
Cross-Border Spillovers of Balance Sheet Normalization
brainard
0
When the central banks in many advanced economies embarked on unconventional monetary policy, it raised concerns that there might be differences in the cross-border transmission of unconventional relative to conventional monetary policy. These concerns were sufficient to warrant a special Group of Seven (G-7) statement in 2013 establishing ground rules to address possible exchange rate effects of the changing composition of monetary policy. Today the world confronts similar questions in reverse. In the United States, in my assessment, normalization of the federal funds rate is now well under way, and the Federal Reserve is advancing plans to allow the balance sheet to run off at a gradual and predictable pace. And for the first time in many years, the global economy is experiencing synchronous growth, and authorities in the euro area and the United Kingdom are beginning to discuss the time when the need for monetary accommodation will diminish. Unlike in previous tightening cycles, many central banks currently have two tools for removing accommodation. They can therefore pursue alternative normalization strategies--first seeking to guide policy rates higher before initiating balance sheet runoff, as in the United States, or instead starting to shrink the balance sheet before initiating a tightening of short-term rates, or undertaking both in tandem. Shrinking the balance sheet and raising the policy rate can both contribute to achieving the domestic goals of monetary policy, but it is an open question whether alternative normalization approaches might have materially different implications for the composition of demand and for cross-border spillovers, including through exchange rates and other financial channels. Before discussing the cross-border effects of normalization, it is worth noting that the two tools for removing accommodation--raising policy rates and reducing central bank balance sheets--appear to affect domestic output and inflation in a qualitatively similar way. This means that central banks can substitute between raising the policy rate and shrinking the balance sheet to remove accommodation, just as both were used to support the recovery following the Great Recession. Insofar as a range of approaches is likely to be consistent with achieving a central bank's domestic objectives, the choice of normalization strategy may be influenced by other considerations, including the ease of implementing and communicating policy changes, or the desire to minimize possible credit market distortions associated with the balance sheet. In the case of the Federal Reserve, the Federal Open Market Committee (FOMC) decided to delay balance sheet normalization until the federal funds rate had reached a high enough level to enable it to be cut materially if economic conditions deteriorate, thus guarding against the risk of returning to the effective lower bound (ELB) in an environment with a historically low neutral interest rate. The greater familiarity and past experience with the federal funds rate also weighed in favor of this instrument initially. Separately, for those central banks that, unlike the Federal Reserve, moved to negative interest rates, there may be special considerations associated with raising policy rates back into positive territory.
One question that naturally arises is whether the major central banks' normalization plans may have material implications for cross-border spillovers--an important issue that until very recently had received scant attention. This question is a natural extension of the literature examining the cross-border spillovers of the unconventional policy actions taken by the major central banks to provide accommodation. Although this literature suggests there are good reasons to expect broadly similar cross-border spillovers from tightening through policy rates as through balance sheet runoff, the effects may not be exactly equivalent. The balance sheet might affect certain aspects of the economy and financial markets differently than the short-term rate due to the fact that the balance sheet more directly affects term premiums on longer-term securities, while the short-term rate more directly affects money market rates. As a result, similar to the domestic effects, while the international spillovers of conventional and unconventional monetary policy may operate broadly similarly, the relative magnitude of the different channels may be sufficiently different that, on net, the two policy strategies have distinct effects. For example, as will be discussed at greater length shortly, the two strategies may have very different implications for the exchange rate. As was evident in late 2014 and early 2015, and as we have seen again in reverse in recent weeks, in addition to the standard demand and exchange rate channels, expected or actual asset purchases may have spillovers to foreign financial conditions--by lowering term premiums and the associated longer-term foreign bond yields--that are greater than those of conventional monetary policy. To explore possible differences, it is useful to compare two different approaches to policy normalization, each of which is designed to have identical effects on aggregate domestic activity and thus, at least in the long run, on inflation. At one extreme, a central bank could opt to tighten primarily through conventional policy hikes, while maintaining the balance sheet by reinvesting the proceeds of maturing assets. At the other extreme, a central bank could rely primarily on reducing the balance sheet, while keeping policy rates unchanged in the near term. The question is whether there are circumstances in which the choice of normalization strategies, which are similarly effective in achieving domestic mandates, might matter for the global economy. Where the two approaches have entirely equivalent effects, the central bank could freely substitute between them without changing the composition of home demand, and net exports, the exchange rate, and foreign output would also be unaffected. Conversely, under different assumptions about the transmission channels of monetary policy, alternative approaches to normalization can have quite different implications for foreign economies. Most prominently, the exchange rate may be more sensitive to the path of short-term rates than to balance sheet adjustments, as some research suggests. Although several papers using an event study approach find on balance little disparity in the exchange rate sensitivity to short-term compared to long-term interest rates, this lack of empirical consensus may simply reflect the difficulty of disentangling changes in short-term and longer-term interest rates, which are highly correlated.
Indeed, the greater sensitivity of exchange rates to expected short-term interest rates than to term premiums was a key rationale behind the Operation Twist strategy in the early 1960s. At that time, the Federal Reserve undertook large-scale purchases of longer-term Treasury securities to drive down yields and stimulate the economy, which was suffering from an unemployment rate of nearly 7 percent. This policy was combined with a modest increase in short-term interest rates intended to alleviate the capital outflow pressures that threatened the sustainability of the Bretton Woods global monetary system. Ultimately, this policy mix did succeed in reducing long-term interest rates, and also contributed to a reduction in private capital outflows that relieved pressure on U.S. international reserves, at least for a time. Let's turn to a simulation of a highly stylized model to explore how a greater sensitivity of the exchange rate to conventional policy relative to balance sheet actions can make a difference in terms of cross-border transmission. In particular, let's assume a 100-basis-point rise in long-term yields coming from the conventional channel of higher policy rates has double the effect on the exchange rate as a 100-basis-point rise in yields coming from higher term premiums. If a large country, which is already at potential, experiences a favorable domestic demand shock, it would need to tighten monetary policy to return output to potential. If the central bank chooses to use the short-term interest rate as its active policy tool, and keeps its balance sheet on hold, the current and expected path of short-term interest rates rises, putting upward pressure on long-term bond yields and causing the real exchange rate to appreciate. The stronger currency coupled with some initial expansion of domestic demand in turn causes a deterioration in real net exports. Turning to the effects abroad, the decline in domestic real net exports corresponds to an increase in foreign net exports, which will tend to boost foreign GDP, other things being equal. How this affects a particular foreign economy will depend on its circumstances and the corresponding policy response of the foreign central bank. In the case where the foreign economy is pinned at the effective lower bound, the increase in net exports will provide a welcome boost to aggregate demand. By contrast, if foreign output is already near potential, the foreign central bank will need to respond by tightening policy in order to keep its economy in balance. Now, let's instead consider tightening through the balance sheet. If the same amount of policy tightening in the country experiencing a positive demand shock is achieved exclusively through a reduction in the balance sheet, while keeping the policy rate unchanged, the exchange rate would appreciate to a smaller degree, reflecting the lower assumed sensitivity of the exchange rate to the term premium than to policy rates. Net exports would decline by less--reflecting both the smaller exchange rate appreciation and the smaller rise in domestic demand--and similarly this would result in smaller cross-border spillovers to foreign GDP. Thus, for a foreign economy that is at the effective lower bound, tightening in the home country through balance-sheet policy will be less welcome than through short-term rates. The foreign economy will experience less exchange rate depreciation, and so less of a boost to net exports.
In addition, the stimulus to the foreign economy could be further diluted to the extent that the balance sheet policy boosted term premiums on its long-term bonds and hence tightened financial conditions, although this effect has not been built into the simulation model. By contrast, for a foreign economy that is close to potential, adjustment through the balance sheet in the home country will mean less of a need for the foreign central bank to respond by tightening policy than under home country adjustment through conventional policy. So far, we have considered the case of central banks with freely floating exchange rates and well-anchored inflation expectations. What about central banks with managed exchange rates or weakly anchored inflation expectations? To keep the analysis simple, let's assume a foreign central bank aims to completely stabilize its exchange rate vis-à-vis a core country. Let's again consider circumstances in which the core country experiences a positive demand shock that calls for policy tightening. Although the pegging economy is likely to experience spillovers under either approach to normalization in the core country, the spillovers are likely to be greater when the core country tightens through the policy rate. The tightening in the core country will compel the country that is fixing its exchange rate to tighten policy in sync, and the core country's currency will rise more against its trading partners with conventional tightening, leading to greater effective appreciation of the pegging country's currency as well. Although the pegging economy will benefit somewhat from the stronger demand of the core country, that benefit is likely to be outweighed by the adverse effects of a tightening of domestic monetary policy when domestic conditions would not otherwise call for it. Such considerations may have played a role in the market dynamics experienced by China as discussions about initiating rate hikes progressed in the United States in the second half of 2015 and early 2016. Next let's explore alternative approaches to policy normalization by countries facing a similar need to tighten. This question is timely; with synchronous expansions now underway, we may be approaching a turning point before too long. In particular, let's consider the case when two large countries, which are assumed identical for the sake of simplicity, experience the same positive shock to domestic demand. Under these assumptions, if both economies were to choose the same normalization strategy--putting primary reliance on either the balance sheet or short-term interest rates--the implications for the exchange rate and net exports are the same: In both cases, the exchange rate between the two countries does not change, and neither do net exports between the countries. Each central bank would adjust interest rates by the same amount--enough to offset the stimulus from the demand shock--and with interest differentials unchanged, there would be no pressure on the exchange rate between them to move. Of course, if there are other economies in the rest of the world that do not experience the same shock, the choice of normalization strategy does matter, similar to the analysis of spillovers from the single core country, presumably magnified by the larger combined global weight of the two economies. Now let's turn to the case in which the two central banks choose to rely on different policy tools.
In this case, one country responds to the positive shock by hiking its policy rate to reduce output to its initial level, while the second country responds by shrinking its balance sheet. The country that relies on the policy rate to make the adjustment experiences an appreciation in the exchange rate, a deterioration in net exports and some expansion of domestic demand, while the country that chooses to rely solely on the balance sheet for tightening experiences a depreciation of its exchange rate and an increase in net exports. Thus, while both countries achieve their domestic stabilization objectives, whether the requisite policy tightening occurs through increases in policy rates or reductions in the balance sheet matters for the composition of demand, the external balance, and the exchange rate. I highlighted at the outset the commitment adopted by many leading nations to set monetary policy to achieve domestic objectives such that the exchange rate would not be a primary consideration in the setting of monetary policy. In the case that balance- sheet and conventional monetary policies have equivalent effects on both domestic spending and the exchange rate, this common principle is straightforward. But if the cross-border spillovers of reductions in the balance sheet and increases in the policy rate are not equivalent, the sequencing of policy rate and balance sheet normalization could have important implications for the exchange rate and external balance. Finally, in circumstances where a major central bank is continuing to expand its balance sheet or maintaining a large balance sheet over a sustained period, this policy would likely exert downward pressure on term premiums around the globe, especially in those foreign economies whose bonds were perceived as close substitutes. Indeed, until very recently, it had been notable how little long yields moved up in the United States even as discussions of balance sheet normalization have moved to the forefront. This likely reflects at least in part the expectation that ongoing asset purchase programs in other advanced economies would continue holding down long-term yields globally. The tide seems to have turned in recent weeks, as long yields in the U.S. have increased notably on market perceptions that foreign officials are beginning to deliberate their own normalization strategies. I have used a simple stylized model to illustrate circumstances in which the choice of normalization strategies adopted by major central banks can potentially be quite consequential. If anything, the analysis presented here serves to highlight the importance of research assessing this question from both an empirical and theoretical perspective. Let me conclude by returning to the policy choices facing central banks. The Federal Reserve chose to remove accommodation initially through increases in the federal funds rate. In light of recent policy moves, I consider normalization of the federal funds rate to be well under way. If the data continue to confirm a strong labor market and firming economic activity, I believe it would be appropriate soon to commence the gradual and predictable process of allowing the balance sheet to run off. Once that process begins, I will want to assess the inflation process closely before making a determination on further adjustments to the federal funds rate in light of the recent softness in core PCE (personal consumption expenditures) inflation. 
In my view, the neutral level of the federal funds rate is likely to remain close to zero in real terms over the medium term. If that is the case, we would not have much more additional work to do on moving to a neutral stance. I will want to monitor inflation developments carefully, and to move cautiously on further increases in the federal funds rate, so as to help guide inflation back up around our symmetric target. Meanwhile, in recent days, we have begun to hear acknowledgement from other major central banks that they too are seeing conditions that suggest policy normalization could be on the table before too long, against the backdrop of a brighter global outlook. As I just discussed, the pace and timing of how central banks around the world proceed with normalization, and the importance of balance sheet policy relative to changes in short-term rates in these normalization plans, could have important implications for exchange rates and financial conditions globally. The model is a stylized open economy model that includes two symmetric countries linked through trade flows. The model is specified in real terms under the implicit assumption that inflation is constant (so that real and nominal variables move by the same amount). Moreover, the model abstracts from any financial linkages between the two economies, including the possibility that monetary policy actions in one country could directly affect yields in the other (e.g., through portfolio balance channels), though such effects are clearly important empirically. The two countries are assumed to be of equal size. Variables in the foreign country are denoted with an asterisk. In each country, the national accounting identity specifies that output, y, is equal to the sum of absorption d and net exports nx, that is,

y = d + nx,
y* = d* - nx,

where the second equation incorporates the global resource constraint that nx + nx* = 0. Output and absorption (y, y*, d, d*) are expressed in percent deviation from their respective steady states, while net exports are expressed as a share of output and are equal to zero in the steady state (that is, prior to any shocks). Home and foreign absorption depend on long-term interest rates according to the following expressions:

d = -σ(rc + ru) + u,
d* = -σ(rc* + ru*) + u*,

where rc is the component of long-term interest rates that is driven by conventional monetary policy, ru is the component of long-term interest rates that is driven by unconventional monetary policy, and u is an exogenous aggregate demand shock (with autocorrelation given by ρ). These interest rate components are assumed to have identical effects on aggregate demand, with the parameter σ determining the sensitivity of aggregate demand to either component (N.B., interest rates are expressed in percentage-point deviations from the steady state). Net exports are assumed to fall if the real exchange rate (e) appreciates and also if domestic demand is higher relative to foreign demand (since this boosts imports):

nx = -ε e - γ(d - d*),

where ε is the elasticity of net exports with respect to the exchange rate, and γ is the elasticity of net exports to the differential between home and foreign absorption. The real exchange rate is expressed in percent deviation from the steady state. The exchange rate is determined according to an interest rate parity condition, which implies that the exchange rate appreciates when domestic interest rates are higher than foreign interest rates, with elasticities (χc and χu) that can differ depending on whether interest rate movements are driven by conventional or unconventional policy:

e = χc (rc - rc*) + χu (ru - ru*).

The model is closed by specifying the behavior of the monetary authority.
We assume that the monetary authority can adjust either the interest rate associated with conventional policy (rc), or the interest rate linked to balance sheet actions (ru), or both, to affect output (its goal variable). The conventional feedback rule is thus

rc = θc y,

whereas the unconventional feedback rule is

ru = θu y,

with analogous rules in the foreign country; the feedback coefficients determine how aggressively each instrument responds to deviations of output from its steady state. The system above contains 10 equations in 10 endogenous variables (y, y*, d, d*, nx, e, rc, rc*, ru, ru*), as well as two shocks, u and u*, that can move GDP, its components, exchange rates, and interest rates. Figures 1 and 2 show the results of simulating the model under alternative assumptions about the shocks and the monetary policy reaction. In each case, the economy starts in steady state with all variables at zero and experiences a demand shock in period 1 that dies out with an autocorrelation of ρ. All parameter values are reported in Table 1. Figure 1 illustrates the case of a favorable demand shock in the home country. The solid lines illustrate the case when Home uses the short-term interest rate as its active policy tool, and keeps its balance sheet on hold, consistent with a desire to delay balance sheet normalization. The policy reaction is calibrated to be sufficiently aggressive that home GDP always remains at baseline (see column 2 of Table 1). The higher policy rate path (that is, higher rc) causes the long-term interest rate (panel A) to rise, which in turn induces the real exchange rate to appreciate (panel B). The stronger currency and an expansion in domestic absorption (panel C) cause a deterioration in net exports (panel D). At the end of the period shown, domestic demand has nearly returned to baseline, while net exports are just a bit below baseline--consistent with the Home country's GDP remaining at baseline (panel E). Because foreign monetary policy is assumed to remain on hold, foreign GDP (panel F) rises by the improvement in its net exports (that is, by the mirror image of panel D, given that foreign domestic absorption is unchanged). The dashed lines illustrate the case of a favorable demand shock in the Home country when the central bank opts to tighten exclusively through reducing its balance sheet (again, by enough to keep output at potential--see column 3 of Table 1). Long-term interest rates (panel A) rise in response, but the exchange rate appreciates less in this case (panel B), reflecting the lower assumed sensitivity of the exchange rate to unconventional monetary policy actions (χu < χc). Net exports deteriorate by less (panel D), reflecting both the smaller exchange rate appreciation and a smaller rise in absorption (panel C), which translates into less of a boost to foreign GDP (panel F) than when the home country adjusts through conventional policy. Figure 2 shows a simulation in which the demand shock is assumed to be common across countries (u = u*). The home country is assumed to pursue a policy of actively adjusting its policy rate, while the foreign country is assumed to rely exclusively on normalizing through the balance sheet. In each case, the central banks of the two countries tighten policy aggressively enough to keep output at potential (see the parameter settings in column 4 of Table 1).
As policy rates rise in the home country (panel A), and because the exchange rate is more sensitive to policy rates than to the balance sheet, the home country's exchange rate appreciates and its net exports deteriorate, with the mirror image in the foreign country. Although GDP remains at baseline in each country (panels E and F), given our assumption that monetary policy keeps output at potential (which is unchanged), the alternative policy normalization choices clearly have important effects--even under a common shock--on both exchange rates and the composition of demand in each country. In particular, because exchange rates in the foreign country are less sensitive to balance sheet than to interest rate policy, the foreign central bank must enact a relatively larger tightening (a larger rise in its long-term interest rate through balance sheet runoff) in order to keep its GDP at potential.
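To make the mechanics of the stylized model concrete, here is a minimal static sketch in Python of the Figure 1 experiment (a home demand shock with the foreign country on hold). It is an editorial illustration, not the code behind the figures: the parameter values below (sigma, eps, gamma, and the exchange rate elasticities passed as chi) are assumptions chosen for illustration rather than the values in Table 1, and the dynamics governed by the autocorrelation ρ are omitted.

import numpy as np

def home_shock_experiment(chi, sigma=1.0, eps=0.2, gamma=0.1, u=1.0):
    """Solve the static two-country model for a home demand shock u,
    with foreign policy on hold and home output held at potential.

    chi is the exchange rate elasticity of the active home instrument
    (the analogue of chi_c for the policy rate or chi_u for the balance sheet).
    Unknowns: d (home absorption), d_star, nx (home net exports),
    e (real exchange rate, positive = home appreciation), r (active instrument).
    """
    # Equations (rows), with columns ordered [d, d_star, nx, e, r]:
    #   d + sigma*r                      = u   home absorption: d = -sigma*r + u
    #   d_star                           = 0   foreign absorption (no shock, policy on hold)
    #   gamma*(d - d_star) + nx + eps*e  = 0   net exports: nx = -eps*e - gamma*(d - d_star)
    #   e - chi*r                        = 0   exchange rate condition with foreign rates at zero
    #   d + nx                           = 0   home policy keeps output at potential (y = 0)
    A = np.array([
        [1.0,    0.0,   0.0, 0.0,  sigma],
        [0.0,    1.0,   0.0, 0.0,  0.0],
        [gamma, -gamma, 1.0, eps,  0.0],
        [0.0,    0.0,   0.0, 1.0, -chi],
        [1.0,    0.0,   1.0, 0.0,  0.0],
    ])
    b = np.array([u, 0.0, 0.0, 0.0, 0.0])
    d, d_star, nx, e, r = np.linalg.solve(A, b)
    y_star = d_star - nx  # foreign output = d* + nx*, with nx* = -nx
    return {"tightening": r, "exchange_rate": e, "net_exports": nx, "foreign_gdp": y_star}

# The conventional instrument is assumed twice as potent for the exchange rate,
# matching the 2-to-1 assumption discussed in the text.
conventional = home_shock_experiment(chi=2.0)   # tighten via the policy rate
balance_sheet = home_shock_experiment(chi=1.0)  # tighten via the balance sheet
print("conventional :", conventional)
print("balance sheet:", balance_sheet)

With these illustrative numbers, the conventional-rate case produces a larger real appreciation, a larger deterioration in home net exports, and a larger boost to foreign GDP than the balance sheet case, which is the qualitative pattern described for Figure 1.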
r170713a_FOMC
united states
2017-07-13T00:00:00
Cross-Border Spillovers of Balance Sheet Normalization
brainard
0
When the central banks in many advanced economies embarked on unconventional monetary policy, it raised concerns that there might be differences in the cross-border transmission of unconventional relative to conventional monetary policy. These concerns were sufficient to warrant a special Group of Seven (G-7) statement in 2013 establishing ground rules to address possible exchange rate effects of the changing composition of monetary policy. Today the world confronts similar questions in reverse. In the United States, in my assessment, normalization of the federal funds rate is now well under way, and the Federal Reserve is advancing plans to allow the balance sheet to run off at a gradual and predictable pace. And for the first time in many years, the global economy is experiencing synchronous growth, and authorities in the euro area and the United Kingdom are beginning to discuss the time when the need for monetary accommodation will diminish. Unlike in previous tightening cycles, many central banks currently have two tools for removing accommodation. They can therefore pursue alternative normalization strategies--first seeking to guide policy rates higher before initiating balance sheet runoff, as in the United States, or instead starting to shrink the balance sheet before initiating a tightening of short-term rates, or undertaking both in tandem. Shrinking the balance sheet and raising the policy rate can both contribute to achieving the domestic goals of monetary policy, but it is an open question whether alternative normalization approaches might have materially different implications for the composition of demand and for cross- border spillovers, including through exchange rates and other financial channels. Before discussing the cross-border effects of normalization, it is worth noting that the two tools for removing accommodation--raising policy rates and reducing central bank balance sheets--appear to affect domestic output and inflation in a qualitatively similar way. This means that central banks can substitute between raising the policy rate and shrinking the balance sheet to remove accommodation, just as both were used to support the recovery following the Great Recession. Insofar as a range of approaches is likely to be consistent with achieving a central bank's domestic objectives, the choice of normalization strategy may be influenced by other considerations, including the ease of implementing and communicating policy changes, or the desire to minimize possible credit market distortions associated with the balance sheet. In the case of the Federal Reserve, the Federal Open Market Committee (FOMC) decided to delay balance sheet normalization until the federal funds rate had reached a high enough level to enable it to be cut materially if economic conditions deteriorate, thus guarding against the risk of returning to the effective lower bound (ELB) in an environment with a historically low neutral interest rate. The greater familiarity and past experience with the federal funds rate also weighed in favor of this instrument initially. Separately, for those central banks that, unlike the Federal Reserve, moved to negative interest rates, there may be special considerations associated with raising policy rates back into positive territory. One question that naturally arises is whether the major central banks' normalization plans may have material implications for cross-border spillovers--an important issue that until very recently had received scant attention. 
This question is a natural extension of the literature examining the cross-border spillovers of the unconventional policy actions taken by the major central banks to provide accommodation. Although this literature suggests there are good reasons to expect broadly similar cross-border spillovers from tightening through policy rates as through balance sheet runoff, the effects may not be exactly equivalent. The balance sheet might affect certain aspects of the economy and financial markets differently than the short-term rate due to the fact that the balance sheet more directly affects term premiums on longer-term securities, while the short-term rate more directly affects money market rates. As a result, similar to the domestic effects, while the international spillovers of conventional and unconventional monetary policy may operate broadly similarly, the relative magnitude of the different channels may be sufficiently different that, on net, the two policy strategies have distinct effects. For example, as will be discussed at greater length shortly, the two strategies may have very different implications for the exchange rate. As was evident in late 2014 and early 2015, and as we have seen again in reverse in recent weeks, in addition to the standard demand and exchange rate channels, expected or actual asset purchases may have spillovers to foreign financial conditions--by lowering term premiums and the associated longer-term foreign bond yields--that are greater than those of conventional monetary policy. To explore possible differences, it is useful to compare two different approaches to policy normalization, each of which is designed to have identical effects on aggregate domestic activity and thus, at least in the long run, on inflation. At one extreme, a central bank could opt to tighten primarily through conventional policy hikes, while maintaining the balance sheet by reinvesting the proceeds of maturing assets. At the other extreme, a central bank could rely primarily on reducing the balance sheet, while keeping policy rates unchanged in the near term. The question is whether there are circumstances in which the choice of normalization strategies, which are similarly effective in achieving domestic mandates, might matter for the global economy. Where the two approaches have entirely equivalent effects, the central bank could freely substitute between them without changing the composition of home demand, and net exports, the exchange rate, and foreign output would also be unaffected. Conversely, under different assumptions about the transmission channels of monetary policy, alternative approaches to normalization can have quite different implications for foreign economies. Most prominently, the exchange rate may be more sensitive to the path of short-term rates than to balance sheet adjustments, as some research suggests. Although several papers using an event study approach find on balance little disparity in the exchange rate sensitivity to short-term compared to long-term interest rates, this lack of empirical consensus may simply reflect the difficulty of disentangling changes in short-term and longer-term interest rates, which are highly correlated. Indeed, the greater sensitivity of exchange rates to expected short-term interest rates than to term premiums was a key rationale behind the Operation Twist strategy in the early 1960s. At that time, the Federal Reserve undertook large-scale purchases of longer-term Treasury securities to drive down yields and stimulate the economy, which was suffering from an unemployment rate of nearly 7 percent.
This policy was combined with a modest increase in short-term interest rates intended to alleviate the capital outflow pressures that threatened the sustainability of the Bretton Woods global monetary system. Ultimately, this policy mix did succeed in reducing long-term interest rates, and also contributed to a reduction in private capital outflows that relieved pressure on U.S. international reserves, at least for a time. Let's turn to a simulation of a highly stylized model to explore how a greater sensitivity of the exchange rate to conventional policy relative to balance sheet actions can make a difference in terms of cross-border transmission. In particular, let's assume a 100-basis-point rise in long-term yields coming from the conventional channel of higher policy rates has double the effect on the exchange rate as a 100 basis point rise in yields coming from higher term premiums. If a large country, which is already at potential, experiences a favorable domestic demand shock, it would need to tighten monetary policy to return output to potential. If the central bank chooses to use the short-term interest rate as its active policy tool, and keeps its balance sheet on hold, the current and expected path of short-term interest rates rises, putting upward pressure on long-term bond yields and causing the real exchange rate to appreciate. The stronger currency coupled with some initial expansion of domestic demand in turn cause a deterioration in real net exports. Turning to the effects abroad, the decline in domestic real net exports corresponds to an increase in foreign net exports, which will tend to boost foreign GDP, other things being equal. How this affects a particular foreign economy will depend on its circumstances and the corresponding policy response of the foreign central bank. In the case where the foreign economy is pinned at the effective lower bound, the increase in net exports will provide a welcome boost to aggregate demand. By contrast, if foreign output is already near potential, the foreign central bank will need to respond by tightening policy in order to keep its economy in balance. Now, let's instead consider tightening through the balance sheet. If the same amount of policy tightening in the country experiencing a positive demand shock is achieved exclusively through a reduction in the balance sheet, while keeping the policy rate unchanged, the exchange rate would appreciate to a smaller degree, reflecting the lower assumed sensitivity of the exchange rate to the term premium than to policy rates. Net exports would decline by less--reflecting both the smaller exchange rate appreciation and the smaller rise in domestic demand--and similarly this would result in smaller cross- border spillovers to foreign GDP. Thus, for a foreign economy that is at the effective lower bound, tightening in the home country through balance-sheet policy will be less welcome than through short-term rates. The foreign economy will experience less exchange rate depreciation, and so less of a boost to net exports. In addition, the stimulus to the foreign economy could be further diluted to the extent that the balance sheet policy boosted term premiums on its long-term bonds and hence tightened financial conditions, although this effect has not been built into the simulation model. 
By contrast, for a foreign economy that is close to potential, adjustment through the balance sheet in the home country will mean less of a need for the foreign central bank to respond by tightening policy than under home country adjustment through conventional policy. So far, we have considered the case of central banks with freely floating exchange rates and well-anchored inflation expectations. What about central banks with managed exchange rates or weakly anchored inflation expectations? To keep the analysis simple, let's assume a foreign central bank aims to completely stabilize its exchange rate vis-a-vis a core country. Let's again consider circumstances in which the core country experiences a positive demand shock that calls for policy tightening. Although the pegging economy is likely to experience spillovers under either approach to normalization in the core country, the spillovers are likely to be greater when the core country tightens through the policy rate. The tightening in the core country will compel the country that is fixing its exchange rate to tighten policy in sync, and the core country's currency will rise more against its trading partners with conventional tightening, leading to greater effective appreciation of the pegging country's currency as well. Although the pegging economy will benefit somewhat from the stronger demand of the core country, that benefit is likely to be outweighed by the adverse effects of a tightening of domestic monetary policy when domestic conditions would not otherwise call for it. Such considerations may have played a role in the market dynamics experienced by China as discussions about initiating rate hikes progressed in the United States in the second half of 2015 and early 2016. Next, let's explore alternative approaches to policy normalization by countries facing a similar need to tighten. This question is timely; with synchronous expansions now underway, we may be approaching a turning point before too long. In particular, let's consider the case when two large countries, which are assumed identical for the sake of simplicity, experience the same positive shock to domestic demand. Under these assumptions, if both economies were to choose the same normalization strategy--putting primary reliance on either the balance sheet or short-term interest rates--the implications for the exchange rate and net exports are the same: In both cases, the exchange rate between the two countries does not change, and neither do net exports between the countries. Each central bank would adjust interest rates by the same amount--enough to offset the stimulus from the demand shock--and with interest differentials unchanged, there would be no pressure on the exchange rate between them to move. Of course, if there are other economies in the rest of the world that do not experience the same shock, the choice of normalization strategy does matter, similar to the analysis of spillovers from the single core country, presumably magnified by the larger combined global weight of the two economies. Now let's turn to the case in which the two central banks choose to rely on different policy tools. In this case, one country responds to the positive shock by hiking its policy rate to reduce output to its initial level, while the second country responds by shrinking its balance sheet. 
The country that relies on the policy rate to make the adjustment experiences an appreciation in the exchange rate, a deterioration in net exports, and some expansion of domestic demand, while the country that chooses to rely solely on the balance sheet for tightening experiences a depreciation of its exchange rate and an increase in net exports. Thus, while both countries achieve their domestic stabilization objectives, whether the requisite policy tightening occurs through increases in policy rates or reductions in the balance sheet matters for the composition of demand, the external balance, and the exchange rate. I highlighted at the outset the commitment adopted by many leading nations to direct monetary policy toward domestic objectives, such that the exchange rate is not a primary consideration in the setting of monetary policy. If balance-sheet and conventional monetary policies have equivalent effects on both domestic spending and the exchange rate, adhering to this common principle is straightforward. But if the cross-border spillovers of reductions in the balance sheet and increases in the policy rate are not equivalent, the sequencing of policy rate and balance sheet normalization could have important implications for the exchange rate and external balance. Finally, in circumstances where a major central bank is continuing to expand its balance sheet or maintaining a large balance sheet over a sustained period, this policy would likely exert downward pressure on term premiums around the globe, especially in those foreign economies whose bonds were perceived as close substitutes. Indeed, until very recently, it had been notable how little long yields moved up in the United States even as discussions of balance sheet normalization moved to the forefront. This likely reflects, at least in part, the expectation that ongoing asset purchase programs in other advanced economies would continue holding down long-term yields globally. The tide seems to have turned in recent weeks, as long yields in the U.S. have increased notably on market perceptions that foreign officials are beginning to deliberate their own normalization strategies. I have used a simple stylized model to illustrate circumstances in which the choice of normalization strategies adopted by major central banks can potentially be quite consequential. If anything, the analysis presented here serves to highlight the importance of research assessing this question from both an empirical and a theoretical perspective. Let me conclude by returning to the policy choices facing central banks. The Federal Reserve chose to remove accommodation initially through increases in the federal funds rate. In light of recent policy moves, I consider normalization of the federal funds rate to be well under way. If the data continue to confirm a strong labor market and firming economic activity, I believe it would be appropriate relatively soon to commence the gradual and predictable process of allowing the balance sheet to run off. Once that process begins, I will want to assess the inflation process closely before making a determination on further adjustments to the federal funds rate in light of the recent softness in core PCE (personal consumption expenditures) inflation. In my view, the neutral level of the federal funds rate is likely to remain close to zero in real terms over the medium term. If that is the case, we would not have much additional work to do on moving to a neutral stance. 
I will want to monitor inflation developments carefully, and to move cautiously on further increases in the federal funds rate, so as to help guide inflation back up around our symmetric target. Meanwhile, in recent days, we have begun to hear acknowledgement from other major central banks that they too are seeing conditions that suggest policy normalization could be on the table before too long, against the backdrop of a brighter global outlook. As I just discussed, the pace and timing of how central banks around the world proceed with normalization, and the importance of balance sheet policy relative to changes in short-term rates in these normalization plans, could have important implications for exchange rates and financial conditions globally. The model is a stylized open economy model that includes two symmetric countries linked through trade flows. The model is specified in real terms under the implicit assumption that inflation is constant (so that real and nominal variables move by the same amount). Moreover, the model abstracts from any financial linkages between the two economies, including the possibility that monetary policy actions in one country could directly affect yields in the other (e.g., through portfolio balance channels), though such effects are clearly important empirically. The two countries are assumed to be of equal size. Variables in the foreign country are denoted with an asterisk. In each country, the national accounting identity specifies that output, y, is equal to the sum of absorption, d, and net exports, nx, that is:

y = d + nx,
y^* = d^* + nx^*,

where the second equation incorporates the global resource constraint that nx + nx^* = 0. Output (y) and absorption (d) are expressed in percent deviation from their respective steady states, while net exports are expressed as a share of output and are equal to zero in the steady state (that is, prior to any shocks). Home and foreign absorption depend on long-term interest rates according to the following expressions:

d = -\sigma (rc + ru) + u,
d^* = -\sigma (rc^* + ru^*) + u^*,

where rc is the component of long-term interest rates that is driven by conventional monetary policy, ru is the component of long-term interest rates that is driven by unconventional monetary policy, and u is an exogenous aggregate demand shock (with autocorrelation given by \rho). These interest rate components are assumed to have identical effects on aggregate demand, with the parameter \sigma determining the sensitivity of aggregate demand to either component (n.b., interest rates are expressed in percentage-point deviations from the steady state). Net exports are assumed to fall if the real exchange rate (e) appreciates and also if domestic demand is higher relative to foreign demand (since this boosts imports):

nx = -\delta e - \gamma (d - d^*),

where \delta is the elasticity of net exports with respect to the exchange rate, and \gamma is the elasticity of net exports to the differential between home and foreign absorption. The real exchange rate is expressed in percent deviation from the steady state. The exchange rate is determined according to an interest rate parity condition, which implies that the exchange rate appreciates when domestic interest rates are higher than foreign interest rates, with elasticities (\alpha_c and \alpha_u) that can differ depending on whether interest rate movements are driven by conventional or unconventional policy:

e = \alpha_c (rc - rc^*) + \alpha_u (ru - ru^*).

The model is closed by specifying the behavior of the monetary authority. We assume that the monetary authority can adjust either the interest rate associated with conventional policy (rc), or the interest rate linked to balance sheet actions (ru), or both, to affect output (its goal variable). 
The conventional feedback rule is thus:

rc = \phi_c y,

whereas the unconventional feedback rule is:

ru = \phi_u y,

with analogous rules (and coefficients) for rc^* and ru^* in the foreign country. The system above contains 10 equations in 10 endogenous variables (y, y^*, d, d^*, nx, e, rc, rc^*, ru, ru^*), as well as two shocks, u and u^*, that can move GDP, its components, exchange rates, and interest rates. Figures 1 and 2 show the results of simulating the model under alternative assumptions about the shocks and the monetary policy reaction. In each case, the economy starts in steady state with all variables at zero and experiences a demand shock in period 1 that dies out with an autocorrelation of \rho. All parameter values are reported in Table 1. Figure 1 illustrates the case of a favorable demand shock in the home country. The solid lines illustrate the case when Home uses the short-term interest rate as its active policy tool and keeps its balance sheet on hold, consistent with a desire to delay balance sheet normalization. The policy reaction is calibrated to be sufficiently aggressive that home GDP always remains at baseline (see column 2 of Table 1). The higher policy rate path (that is, higher rc) causes the long-term interest rate (panel A) to rise, which in turn induces the real exchange rate to appreciate (panel B). The stronger currency and an expansion in domestic absorption (panel C) cause a deterioration in net exports (panel D). At the end of the period shown, domestic demand has nearly returned to baseline, while net exports are just a bit below baseline--consistent with the Home country's GDP remaining at baseline (panel E). Because foreign monetary policy is assumed to remain on hold, foreign GDP (panel F) rises by the improvement in its net exports (that is, by the mirror image of panel D, given that foreign domestic absorption is unchanged). The dashed lines illustrate the case of a favorable demand shock in the Home country when the central bank opts to tighten exclusively through reducing its balance sheet (again, by enough to keep output at potential--see column 3 of Table 1). Long-term interest rates (panel A) rise in response, but the exchange rate appreciates less in this case (panel B), reflecting the lower assumed sensitivity of the exchange rate to unconventional monetary policy actions (\alpha_u < \alpha_c). Net exports decline by less--reflecting both the smaller exchange rate appreciation and a smaller rise in absorption (panel C)--which translates into less of a boost to foreign GDP (panel F) than when the home country adjusts through conventional policy. Figure 2 shows a simulation in which the demand shock is assumed to be common across countries (u = u^*). The home country is assumed to pursue a policy of actively adjusting its policy rate, while the foreign country is assumed to rely exclusively on normalizing through the balance sheet. In each case, the central banks of the two countries tighten policy aggressively enough to keep output at potential (see the parameter settings in column 4 of Table 1). Because policy rates rise in the home country (panel A) and the exchange rate is more sensitive to policy rates than to the balance sheet, the home country's exchange rate appreciates and its net exports deteriorate, while the foreign country's exchange rate depreciates and its net exports improve. Although GDP remains at baseline in each country (panels E and F), given our assumption that monetary policy keeps output at potential, the alternative policy normalization choices clearly have important effects--even under a common shock--on both exchange rates and the composition of demand in each country. 
In particular, because exchange rates are less sensitive to balance sheet policy than to interest rate policy, the foreign central bank must enact a relatively larger tightening of long-term interest rates through its balance sheet in order to keep its GDP at potential.
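As a rough illustration of the Figure 1 exercise, the stylized model above can be simulated with a short script. The elasticities, shock size, and persistence used below are illustrative assumptions chosen for exposition; they do not reproduce the calibration in Table 1, which is not shown here.

```python
# A minimal sketch of the Figure 1 exercise: the home country is hit by a favorable
# demand shock and tightens a single instrument by just enough to keep its GDP at
# baseline, while foreign policy stays on hold. All parameter values below are
# illustrative assumptions, not the calibration in the speech's Table 1.

SIGMA = 1.0         # sensitivity of absorption to long-term interest rates
DELTA = 0.5         # sensitivity of net exports to the real exchange rate
GAMMA = 0.2         # sensitivity of net exports to relative absorption
ALPHA_CONV = 2.0    # exchange-rate elasticity w.r.t. conventional policy (rc - rc*)
ALPHA_UNCONV = 1.0  # exchange-rate elasticity w.r.t. balance-sheet policy (ru - ru*)
RHO = 0.8           # persistence of the home demand shock
SHOCK0 = 1.0        # initial favorable home demand shock
PERIODS = 8

def simulate(alpha_active):
    """Simulate the home demand shock when home tightens one instrument (with
    exchange-rate elasticity alpha_active) enough to keep home GDP at baseline."""
    path = []
    shock = SHOCK0
    for t in range(PERIODS):
        # Home GDP: y = (1 - GAMMA) * (shock - SIGMA * r) - DELTA * alpha_active * r = 0.
        r = (1.0 - GAMMA) * shock / (DELTA * alpha_active + (1.0 - GAMMA) * SIGMA)
        d = shock - SIGMA * r                 # home absorption
        e = alpha_active * r                  # real exchange rate (home appreciation)
        nx = -DELTA * e - GAMMA * d           # home net exports
        path.append({"t": t, "instrument": round(r, 3), "exchange_rate": round(e, 3),
                     "home_net_exports": round(nx, 3), "foreign_gdp": round(-nx, 3)})
        shock *= RHO
    return path

conventional = simulate(ALPHA_CONV)     # tighten via the policy rate
balance_sheet = simulate(ALPHA_UNCONV)  # tighten via balance sheet runoff

print(conventional[0])   # larger appreciation and larger spillover to foreign GDP
print(balance_sheet[0])  # smaller appreciation and smaller spillover
```

Under these assumed elasticities, tightening through the policy rate produces a larger real appreciation, a larger decline in home net exports, and therefore a larger spillover to foreign GDP than tightening through the balance sheet, consistent with the discussion above.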
r170728a_FOMC
united states
2017-07-28T00:00:00
Strengthening Diversity in Economics
brainard
0
Let me begin by expressing my appreciation to the American Economic Association and the other institutions sponsoring this conference, which for nearly 20 years has focused attention on the benefits of diversity and the need for continued progress in ensuring that the best and the brightest have the opportunity to advance and contribute to the field of economics. I want to especially thank Lisa Cook for giving me the opportunity to be here today. I am particularly pleased to be able to speak with students participating in the AEA's Summer and Mentoring Programs. At the Fed, we share the goal of making sure the group of students who go on to practice economics look more like America. So if you are interested in pursuing economics, we want to make sure you have an opportunity to contribute to the field. Perhaps more than any other profession, it is in the DNA of economists to believe that equality of opportunity is important not only as a matter of fairness, but also to our country's vitality. In addition, economists like to base conclusions on hard numbers. The numbers make clear there is a persistent lack of diversity in the economics profession, which indicates we are falling short of our ideals. The quality of the economics profession and its contributions to society will be stronger when a broader range of people are engaged. We now have substantial empirical evidence documenting the benefits of diversity in broadening the range of ideas and perspectives that are brought to bear on solving problems, and thereby contributing to better outcomes, both in research and in policy. Studies suggest that increased diversity alters group dynamics and decisionmaking in positive ways. As Amanda Bayer and Cecilia Rouse have noted, microeconomic experiments and other research have confirmed these ideas. One experiment found that greater racial diversity helped groups of business students outperform other students in solving problems. And another found similar benefits from gender diversity. A study of 2.5 million research papers across the sciences found that those written by ethnically diverse research teams received more citations and had a greater impact than papers by authors with the same ethnicity. Diversity in the economics profession will bring important insights into the analysis of our economy and financial system and help policymakers make better decisions in promoting a healthier economy. So given that diversity in the economics profession is an important goal, how have we been doing? Unfortunately, by any measure, we are still falling short. Between 1995 and 2014, the share of women obtaining a doctorate in economics held roughly steady in the neighborhood of 30 percent. Among U.S. citizens and permanent residents earning doctorates, the representation of those identifying as black, Hispanic, or Native American among the pool of doctorate recipients improved from 6 percent in 1995 to 11 percent in 2007. But the improvement has since unwound, and the underrepresented minority share stood at about 7 percent in 2014. I respectfully defer to others who have more closely studied the mix of possible reasons for why progress has not been greater, but I will defer to no one in expressing my view that the status quo is not good enough. I've laid out a number of arguments why policy and society at large would be better off if there were more women and minorities in the economics profession. But a more important question for the students here today might reasonably be, what's in it for me? 
One answer is that there are a ton of interesting questions out there for you to solve. Another is the considerable premium in earnings over other academic fields. Even for those students here who are not convinced that a career in economics is ultimately the goal for you, the intellectual framework associated with the study of economics is extremely powerful and applicable to a host of other areas. And who can say now how your career will unfold? I would encourage you to think expansively about your career trajectory. Some of you may already feel that economics is your calling and you may choose to go directly into graduate school with the aim of pursuing doctoral studies. Others may decide to work for a few years and then make a choice about graduate studies once you know more about what kind of job is likely to be most appealing and hopefully have made a dent in your student debt. Just as many people choose to leave the field of economics and pursue other interests at different junctures in their careers, so too there may be multiple points of entry. There are many stories of successful careers in economics that began only after other avenues had been explored. I want to make sure you know that even if you have not specialized in economics as an undergraduate, it is not too late to join. We need economists with diverse experiences and backgrounds, just as we need other sources of diversity. For what it is worth, I did not major in economics as an undergraduate, and I worked for several years in the private sector before I decided to pursue a doctorate in economics. I am glad I did. I enormously value the intellectual framework associated with economics, and my career has provided terrific opportunities both to range far afield and to work within the field. In short, I hope you will think of economics not as shutting doors on other opportunities, but rather as equipping you to pursue a whole new set of opportunities. Given the benefits I have described to the profession and society from increasing diversity in the ranks of economists, as well as the opportunity a degree in economics affords to individuals who pursue the field, it makes sense to look closely, as the AEA and others continue to do, at the reasons more women and minorities do not concentrate in economics in college or depart from economics as they move toward graduation. To the undergraduates in the audience today who are participating in the Summer Program, I would be interested to hear what you believe may be discouraging women and minorities from choosing economics in school and careers, so that we can take steps to reduce any barriers. The AEA and other groups are rightly focused on what universities and other institutions can do to promote diversity in economics. Let me now direct my attention to diversity at my institution, the Federal Reserve, because it is both one of the largest employers of PhD economists in the country and a prominent public institution. The share of women and minorities among economists at the Fed is similar to the numbers for the profession at large, and these numbers have likewise improved fairly little since 2009. Something that has changed over that time, however, is the elevation in importance of diversity efforts, including recruitment. The Board of Governors is making considerable effort to recruit and retain economists who are women and minorities. We are working hard at recruitment at the earliest stages of the career-formation process. 
Every year the Board hires 50 to 60 research assistants (RAs) for two-year terms; the 12 Reserve Banks together hire roughly an equal number. Our research assistant program is to some extent an apprenticeship program for economists and also for financial analysts and those in related fields. Our research assistants are typically recent college graduates and most are at least considering further study of economics. These are sought-after positions because our RAs have the opportunity to work with the Board's leading economists on both research and work that directly supports policymaking. For the Summer Program students here today, I would encourage you to explore the RA program. Serving as an RA either at the Federal Reserve Board or at one of the Reserve Banks has proven to be a good way to learn more about the profession and prepare for graduate school. I can report that five graduates of last year's program are now working at the Board as RAs and several are working at Reserve Banks. For the Federal Reserve System, our recruitment of RAs is a great opportunity to give a wide range of potential newcomers some exposure to what the career pathway of economics looks like. While not all RAs go on to further economics education and training, a considerable share do, and thus our diversity efforts have the potential to significantly affect career pathways later on. Indeed, at the highest levels of that process, the Federal Reserve under Chair Janet Yellen and with the strong support of the Governors and Reserve Bank Presidents is committed to increasing diversity among the top-ranking staff at the Board and among leaders of the Reserve Banks. In the tradition of the discipline of economics, I have cited empirical support for the argument that diversity is good for economics. I know from my own personal experience we can and must do better, and in that spirit I want to tell you about one of my experiences that might resonate with some of you here. When I served at the Treasury Department for President Obama, one of my responsibilities was to represent the United States in the Group of 20 (G-20) deputy ministers of finance and central bank officials. At one of our negotiating meetings, the nearly 50 officials gathered for a group photo. When this photo was distributed to each of us as a memento, it caused a bit of a stir. I remember a colleague from another delegation doing a double take when he looked at the photo. He turned to me and said "Did you realize you are the only woman in this group?" I had indeed already noticed that. And I am happy to say that his delegation added a woman after that. Since that time, we have seen some particularly noteworthy milestones. At the time, there had been only one woman among the G-20 countries serving as head of a central bank, in South Africa, and none among the G-7 advanced economies. Since then, two other women have led G-20 central banks, and now I have the honor of serving with the first woman to lead the Federal Reserve, Janet Yellen. So what's my take away from that experience? It was a good reminder to me that we have to see the challenges that are right in front of our eyes if we want to address them and we should not be satisfied until the people sitting around every decisionmaking table look like America in all its rich diversity. Earlier this year, the Federal Reserve Bank of Atlanta hired Raphael Bostic as its president and chief executive officer, the first African-American to lead a Reserve Bank. 
I look forward to the day when we have moved far beyond all the firsts, when we can see with satisfaction that the people of the Federal Reserve fully reflect the characteristics of the American people. We are working hard to get there, but we still have a long way to go. Of course, getting to that point in the future starts with people like you and the paths you choose today. You should choose the path where you feel you will make your greatest contribution--the one that is right for you. But if economics seems like a great fit, then choose it with the confidence that you have the capacity to make an important contribution that will be valued. So with that, I will be glad to respond to your questions.
r170731a_FOMC
united states
2017-07-31T00:00:00
The Low Level of Global Real Interest Rates
fischer
0
I am very happy to be participating in this conference celebrating Arminio. My tenure at the International Monetary Fund overlapped with the first two and a half years of Arminio's time as president of the Central Bank of Brazil, and in our capacities at the time, we had frequent opportunities to interact and converse. Of course, I watched with admiration the remarkable management of the economy by the Malan-Fraga team in the run-up to the election that brought Lula to power. In particular, Arminio and his Central Bank team's management of the exchange rate--which at one point reached 3.95 reais per dollar--was masterly and put in place a sound foundation for that essential part of Brazil's economic machinery in the years that followed. Subsequently, as I was on the brink of transitioning to the world of central banking early in 2005--that is, prior to taking up my position as governor of the Bank of Israel--Arminio was able to turn the tables and offered me some hard-edged advice on how to be a central banker. I have kept that advice close since then. It comes in the form of six commandments on a small laminated piece of paper. Needless to say, it very much reflects his values and his behavior. Let me quote just three of his rules: "Number 2. Do [the job] in a way that shows you care, but in a way that shows you will serenely pursue your goals"--excellent advice, which is easier said than done; "Number 5. Beware of a tendency to be overly conservative once you start wearing the central bank hat"; and "Number 6. Remember, most people lose half their IQ when they take a job such as this." Now I will turn to the main topic of my discussion, the low level of global real interest rates, an important and distinguishing feature of the current global economic environment. In the United States, the yield on 10-year Treasury bonds is near all-time lows, with the same being true in the euro area, the United Kingdom, and Japan (figure 1). Yields have also declined in many emerging markets, with interest rates falling almost 400 basis points in Korea since the financial crisis and by a similar amount in Israel. As shown in figure 2, the decline has been less apparent in Brazil and South Africa, though interest rates in both countries remain well below previous peaks. In this talk, I will address two questions: Why are interest rates so low? And why has the decline in interest rates been so widespread? Lower inflation explains a portion of the decline in nominal interest rates. Longer-term interest rates reflect market participants' expectations of future inflation as well as the expected path of real, or inflation-adjusted, interest rates. And while lower realized inflation and credible central bank inflation targets have likely stabilized expected inflation at relatively low levels compared with much of the 20th century, inflation-adjusted yields have also notably decreased. The decline in interest rates also does not appear to be primarily an outcome of the economic cycle. Longer-term interest rates in the United States have remained low even as the Federal Reserve has raised the federal funds rate by 100 basis points and as the unemployment rate has declined below the median of FOMC participants' assessments of its longer-run normal level. 
Rather, it appears as though much of the decline has occurred in the equilibrium level of the real interest rate--also known as the natural rate of interest or, alternatively, r*. More than a century ago, Knut Wicksell wrote, "There is a certain level of the average rate of interest which is such that the general level of prices has no tendency to move either upwards or downwards." In recent years, the coincidence of low inflation and low interest rates suggests that the natural rate of interest is likely very low today. Wicksell was clearly referring to the natural rate as the real interest rate when the economy is at full employment. The widely cited methodology of my Federal Reserve colleagues Thomas Laubach and John Williams attempts to gauge the natural rate in the longer run after various shorter-term influences, including the business cycle, have played out. In a recent update of their analysis, they find that the natural rate of interest has declined about 150 basis points in the United States since the financial crisis and is currently about 50 basis points. We must remember, however, that r* is a function and not a constant, and its estimation is subject to a number of assumptions, the modification of which can lead to a wide range of estimates. In an extension of this analysis, shown in figure 3, Laubach, Williams, and Kathryn Holston, also a Federal Reserve colleague, show that the decline in the natural rate of interest is a common feature across a number of foreign economies. The fall in equilibrium interest rates was most pronounced at the time of the financial crisis, but rates have shown little tendency to increase during the long recovery from the crisis. There are many factors that could be holding down interest rates, some of which could fade over time, including the effects of quantitative easing in the United States and abroad and a heightened demand for safe assets affecting yields on advanced-economy government securities. I will focus on some of the more enduring factors that could potentially lower the equilibrium interest rate for some time. In attempting to explain why real interest rates have fallen, a useful starting point is to think of the natural interest rate as the price that equilibrates the economy's supply of saving with the demand for investment in the long run, when the economy is at full employment. With this framework in mind, low interest rates reflect factors that increase saving, depress investment demand, or both. Focusing initially on the United States, I will look at three interrelated factors that are likely contributing to low interest rates: slower trend economic growth, an aging population and demographic developments, and relatively weak investment. I will then discuss global developments and spillovers between countries. But first I would like to interject a quick word on why we as policymakers might be concerned about low interest rates. I highlight three main worries. First, as John Maynard Keynes discussed in the concluding chapters of The General Theory, a low equilibrium interest rate increases the risks of falling into a liquidity trap, a situation where the nominal interest rate is stuck, by an effective lower bound, above the rate necessary to bring the economy back to potential. Relatedly, but more broadly, low equilibrium interest rates are a key pillar of the secular stagnation hypothesis, which Larry Summers has carried forward during the past few years. 
Second, a low natural rate could potentially hurt financial stability if it leads investors to reach for yield or hurts financial firms' profitability. And, third--and perhaps most troubling--a low equilibrium rate sends a powerful signal that the growth potential of the economy may be limited. Let me start with slow trend growth. One factor contributing to low equilibrium interest rates in the United States has been a slowdown in the pace of potential, or trend, growth. According to the Congressional Budget Office (CBO), potential output growth is currently around 1.5 percent, compared with a pace about double that, on average, in the two decades leading up to the financial crisis. A prime culprit in the growth slowdown has been the slow rate of labor productivity growth, which has increased only 1/2 percent, on average, over the past five years, compared with a 2 percent growth rate over the period from 1976 to 2005. A declining rate of labor force growth has also worked to push down trend growth. The CBO is projecting that the potential labor force in the United States will grow at about 1/2 percent per year over the next decade, less than half the pace observed, on average, in the two decades before the financial crisis. Slower growth can both boost saving and depress investment. As households revise down their expectations for future income growth, they become less likely to borrow and more likely to save. Likewise, slower growth diminishes the number of business opportunities that can be profitably undertaken, weighing on investment demand. The aging of the population can work to lower the equilibrium interest rate beyond its effect on the size of the labor force and trend growth. As households near retirement, they tend to save more, anticipating having to run down their savings after they leave the labor force. Federal Reserve economists, in one study, estimate that higher saving by near-retirement households could be pushing down the longer-run equilibrium federal funds rate relative to its level in the 1980s by as much as 75 basis points. Another factor weighing on equilibrium real interest rates has been the recent weakness of investment. What explains the tepid response of capital spending to historically low interest rates? As mentioned earlier, low productivity growth has certainly been a contributing factor, as firms see fewer profitable investment opportunities. But elevated uncertainty, both political and economic, has likely also played a role. For one, uncertainty about the outlook for government policy in health care, regulation, taxes, and trade can cause firms to delay projects until the policy environment clarifies. Firms also seem quite uncertain about the disruptive capacity of new technologies. Technological developments appear to be rapidly reshaping entire industries--in retail, transportation, and communications. Elevated uncertainty about the continued viability of long-standing business models could be weighing on investment decisions. Relatedly, it is possible that as the economy evolves in response to new technologies, production is becoming less capital intensive than it was in earlier decades. Another possible explanation for the weakness of investment in the United States has been a decrease in competition within industries, as evidenced by decreasing firm entry and exit rates as well as increased industry concentration. Less competition allows firms to maintain high profits while lowering the pressure on them to increase production to maintain market share. 
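To put the saving-investment framing described above in the simplest possible terms, consider a minimal sketch in which desired saving and desired investment are linear in the real rate; the schedules and coefficients below are illustrative assumptions, not estimates for the U.S. economy.

```python
# A minimal sketch of the saving-investment framing of the natural rate: r* is the
# real rate at which desired saving equals desired investment at full employment.
# The linear schedules and the coefficients below are illustrative assumptions.

def natural_rate(s0, s1, i0, i1):
    """Solve saving s0 + s1*r = investment i0 - i1*r for the real rate r."""
    return (i0 - s0) / (s1 + i1)

baseline = natural_rate(s0=18.0, s1=0.5, i0=20.0, i1=0.5)            # r* = 2.0 percent
aging_saves_more = natural_rate(s0=19.0, s1=0.5, i0=20.0, i1=0.5)    # higher saving -> r* = 1.0
weaker_investment = natural_rate(s0=18.0, s1=0.5, i0=19.0, i1=0.5)   # weaker investment -> r* = 1.0

print(baseline, aging_saves_more, weaker_investment)
```

In this stylized setup, either a rise in desired saving (for example, from an aging population) or a fall in desired investment lowers the rate at which the two schedules cross--that is, the natural rate.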
In an earlier discussion, I attempted to quantify the effect that these factors--slow growth, demographics, and investment--might be having on the long-run equilibrium rate in the United States. In that exercise, the slowdown in growth appeared likely to be the primary factor depressing the long-run equilibrium rate, although the contributions from demographics and weak investment demand were also sizable. Up until now, I have looked primarily at factors within the United States. However, as I have pointed out earlier, the decline in interest rates is a global phenomenon. Why has the decline in interest rates been so widespread? One important reason is that many of the same factors that have been driving down the equilibrium interest rate in the United States have operated with equal or even greater force in many foreign economies. The slowdown in labor productivity growth has been widespread across many countries. Likewise, the advanced economies and some emerging markets have experienced demographic shifts that are in some cases much more pronounced than in the United States, with the working-age population in some countries even declining over the past decade. Another explanation is that we live in an integrated global economy where economic developments in one country spill over into other countries via trade and capital flows as well as prices, including interest rates and exchange rates. In the most general sense, these spillovers are captured in the pattern of current account balances, shown in figure 4. If we abstract from a somewhat sizable statistical discrepancy, the sum of global current account balances should be equal to zero, as, in the aggregate, one country's deficit must be matched by a surplus in some configuration of other countries--but it is not always apparent who is spilling over onto whom. Prior to the financial crisis, it was widely speculated that foreign developments were depressing U.S. interest rates. Former Chairman Bernanke characterized the foreign forces acting on U.S. interest rates as the "global saving glut," with particular reference to emerging market economies that were running persistent current account surpluses, sometimes as a result of specific policy decisions regarding exchange rates, reserve accumulation, and fiscal policy. The global saving glut was also a factor in the "Greenspan conundrum," or the observation that a series of Federal Reserve rate hikes over the period from 2004 to 2006 seemed to have little effect on longer-term interest rates in the United States. As shown in figure 5, the deterioration of the U.S. deficit in the early 2000s was matched by growing surpluses in the emerging markets, particularly in emerging Asia and China as well as OPEC. The explosive growth of the U.S. current account deficit from 2001 to 2006, coincident with falling interest rates both in the United States and globally, supports the notion that higher foreign saving relative to foreign investment was likely holding down U.S. interest rates at the time. What can the distribution of global current accounts tell us about international spillovers in the post-crisis era? As shown in figure 6, the most notable development has been the almost exact reversal of the expansion of the U.S. current account deficit observed during the time of the global saving glut. Has the global saving glut of the mid-2000s faded away? Falling interest rates over the period that the U.S. deficit narrowed suggest not. 
If a shrinking supply of foreign saving, the reversal of the global saving glut, was behind the narrowing of the U.S. deficit, then the tendency would have been for equilibrium real interest rates to have increased. Rather, falling equilibrium rates suggest that falling U.S. demand for foreign savings has precipitated the narrowing of the U.S. current account deficit. U.S. demand likely decreased for the reasons discussed earlier, including slowing growth, demographics, and weak investment demand. Does the marked narrowing of the U.S. current account deficit post-crisis suggest that the United States has been the primary source of downward pressure on global interest rates over the past decade? Certainly, if the United States had maintained its previous deficit, interest rates would likely be higher around the world. However, the financial crisis revealed that the U.S. capacity to absorb global savings at the pace observed prior to the crisis was unsustainable. Rather, an alternative explanation would be that the sharp decline in global interest rates post-crisis reflects factors that were likely well in train before the financial crisis. The downward trend in interest rates would have been more pronounced earlier in the decade had not elevated, and ultimately unsustainable, borrowing in the United States slowed the decline in interest rates in the years immediately preceding the crisis. This narrative is consistent with empirical evidence that suggests that the slowdowns in global productivity growth and labor force growth, both key factors in the slowing pace of global growth and the downward pressure on interest rates, predate the global financial crisis. It is notable in figure 6 that the euro area has also seen a sizable increase in its current account position post-crisis, suggesting that developments in Europe have also played a role in pushing down interest rates. The increase in the euro-area current account in part reflects sharp reversals in the current account deficits of Greece, Portugal, Spain, and Ireland--all countries that had witnessed large increases in their deficits during the global saving glut period prior to the crisis, in a pattern similar to that experienced by the United States. However, the euro-area increase also reflects increased surpluses in Germany and the Netherlands, countries that were already in considerable surplus during the pre-crisis period. Given the potential risks around low interest rates I discussed earlier, including the impact on the effectiveness of monetary policy and financial stability concerns, what should policymakers do to address the problem? Monetary policy has a role to play. Transparent and sound monetary policy can boost confidence in the stability of the growth outlook, an outcome that can in turn alleviate precautionary demand for savings and encourage investment, pushing up the equilibrium interest rate. But, as I have said before, monetary policy is not a panacea. Also, to repeat myself, policies to boost productivity growth and the longer-run potential of the economy are more likely to be found in effective fiscal and regulatory measures than in central bank actions. This statement is true not only in the United States, but also around the globe. But it is not to say that monetary policy is irrelevant to the growth rate of the economy.
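The identification argument in the preceding paragraphs--that a narrower U.S. deficit accompanied by falling rates points to weaker U.S. demand for foreign saving rather than a smaller supply of it--can be put in the same simple supply-and-demand terms; the schedules and numbers below are illustrative assumptions.

```python
# A minimal sketch: treat foreign saving supplied to the United States as upward
# sloping in the real rate and U.S. demand for foreign saving as downward sloping.
# Both a supply shift and a demand shift narrow the deficit, but they move the
# equilibrium rate in opposite directions. All figures are illustrative assumptions.

def equilibrium(supply_intercept, demand_intercept, supply_slope=0.5, demand_slope=0.5):
    """Solve supply_intercept + s*r = demand_intercept - d*r for (rate, deficit)."""
    rate = (demand_intercept - supply_intercept) / (supply_slope + demand_slope)
    deficit = supply_intercept + supply_slope * rate
    return rate, deficit

baseline = equilibrium(supply_intercept=4.0, demand_intercept=8.0)      # rate 4.0, deficit 6.0
less_supply = equilibrium(supply_intercept=3.0, demand_intercept=8.0)   # deficit falls, rate rises
less_demand = equilibrium(supply_intercept=4.0, demand_intercept=7.0)   # deficit falls, rate falls

print(baseline, less_supply, less_demand)
```

In this stylized setup, only the demand shift is consistent with the combination actually observed--a narrower deficit together with lower equilibrium rates.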
r170825a_FOMC
united states
2017-08-25T00:00:00
Financial Stability a Decade after the Onset of the Crisis
yellen
1
A decade has passed since the onset of the most severe financial crisis in the United States since the Great Depression. Already, for some, memories of this experience may be fading--memories of just how costly the financial crisis was and of why certain steps were taken in response. Today I will look back at the crisis and discuss the reforms policymakers in the United States and around the world have made to improve financial regulation to limit both the probability and the adverse consequences of future financial crises. A resilient financial system is critical to a dynamic global economy--the subject of this conference. A well-functioning financial system facilitates productive investment and new business formation and helps new and existing businesses weather the ups and downs of the business cycle. Prudent borrowing enables households to improve their standard of living by purchasing a home, investing in education, or starting a business. Because of the reforms that strengthened our financial system, and with support from monetary and other policies, credit is available on good terms, and lending has advanced broadly in line with economic activity in recent years, contributing to today's strong economy. At the same time, reforms have boosted the resilience of the financial system. Banks are safer. The risk of runs owing to maturity transformation is reduced. Efforts to enhance the resolvability of systemic firms have promoted market discipline and reduced the problem of too-big-to-fail. And a system is in place to more effectively monitor and address risks that arise outside the regulatory perimeter. Nonetheless, the scope and complexity of financial regulatory reforms demand that policymakers and researchers remain alert to both areas for improvement and unexpected side effects. The Federal Reserve is committed to continuing to evaluate the effects of regulation on financial stability and on the broader economy and to making appropriate adjustments. I will start by reviewing where we were 10 years ago. I will then walk through some key reforms our country has put in place to diminish the chances of another severe crisis and limit damage during times of financial instability. After reviewing these steps, I will summarize indicators and research that show the improved resilience of the U.S. financial system--resilience that is due importantly to regulatory reform as well as actions taken by the private sector. I will then turn to the evidence regarding how financial regulatory reform has affected economic growth, credit availability, and market liquidity. The U.S. and global financial system was in a dangerous place 10 years ago. U.S. house prices had peaked in 2006, and strains in the subprime mortgage market grew acute over the first half of 2007. By August, liquidity in money markets had deteriorated enough to require the Federal Reserve to take steps to support it. And yet the discussion here at Jackson Hole in August 2007, with a few notable exceptions, was fairly optimistic about the possible economic fallout from the stresses apparent in the financial system. As we now know, the deterioration of liquidity and solvency within the financial sector continued over the next 13 months. Accumulating strains across the financial system, including the collapse of Bear Stearns in March 2008, made it clear that vulnerabilities had risen across the system. 
As a result, policymakers took extraordinary actions: The Federal Open Market Committee sharply lowered the federal funds rate, and the Federal Reserve, in coordination with the Treasury Department and other agencies, extended liquidity facilities beyond the traditional banking sector, applying to the modern structure of U.S. money markets the dictum of Walter Bagehot, conceived in the 19th century, to lend freely against good collateral at a penalty rate. Still, the deterioration in the financial sector continued, with Fannie Mae and Freddie Mac failing in early September. But the deterioration from early 2007 until early September 2008--already the worst financial disruption in the United States in many decades--was a slow trickle compared with the tidal wave that nearly wiped out the financial sector that September and led to a plunge in economic activity in the following months. Not long after Fannie and Freddie were placed in government conservatorship, Lehman Brothers collapsed, setting off a week in which American International Group, Inc. (AIG), came to the brink of failure and required large loans from the Federal Reserve to mitigate the systemic fallout; a large money market fund "broke the buck" (that is, was unable to maintain a net asset value of $1 per share) and runs on other money funds accelerated, requiring the Treasury to provide a guarantee of money fund liabilities; global dollar funding markets nearly collapsed, necessitating coordinated action by central banks around the world; the two remaining large investment banks became bank holding companies, thereby ending the era of large independent investment banks in the United States; and the Treasury proposed a rescue of the financial sector. Within several weeks, the Congress passed legislation establishing the Troubled Asset Relief Program; the Federal Reserve initiated further emergency lending programs; and the Federal Deposit Insurance Corporation (FDIC) guaranteed a broad range of bank debt. Facing similar challenges in their own jurisdictions, many foreign governments also undertook aggressive measures to support the functioning of credit markets, including large-scale capital injections into banks, expansions of deposit insurance programs, and guarantees of some forms of bank debt. Despite the forceful policy responses by the Treasury, the Congress, the FDIC, and the Federal Reserve as well as authorities abroad, the crisis continued to intensify: The vulnerabilities in the U.S. and global economies had grown too large, and the subsequent damage was enormous. From the beginning of 2008 to early 2010, nearly 9 million jobs, on net, were lost in the United States. Millions of Americans lost their homes. And distress was not limited to the U.S. economy: Global trade and economic activity contracted to a degree that had not been seen since the 1930s. The economic recovery that followed, despite extraordinary policy actions, was painfully slow. These painful events renewed efforts to guard against financial instability. The Congress, the Administration, and regulatory agencies implemented new laws, regulations, and supervisory practices to limit the risk of another crisis, in coordination with policymakers around the world. The vulnerabilities within the financial system in the mid-2000s were numerous and, in hindsight, familiar from past financial panics. Financial institutions had assumed too much risk, especially related to the housing market, through mortgage lending standards that were far too lax and contributed to substantial overborrowing. 
Repeating a familiar pattern, the "madness of crowds" had contributed to a bubble, in which investors and households expected rapid appreciation in house prices. The long period of economic stability beginning in the 1980s had led to complacency about potential risks, and the buildup of risk was not widely recognized. As a result, market and supervisory discipline was lacking, and financial institutions were allowed to take on high levels of leverage. This leverage was facilitated by short-term wholesale borrowing, owing in part to market-based vehicles, such as money market mutual funds and asset-backed commercial paper programs, that allowed the rapid expansion of liquidity transformation outside of the regulated depository sector. Finally, a self-reinforcing loop developed, in which all of the factors I have just cited intensified as investors sought ways to gain exposure to the rising prices of assets linked to housing and the financial sector. As a result, securitization and the development of complex derivatives products distributed risk across institutions in ways that were opaque and ultimately destabilizing. In response, policymakers around the world have put in place measures to limit a future buildup of similar vulnerabilities. The United States, through coordinated regulatory action and legislation, moved very rapidly to begin reforming our financial system, and the speed with which our banking system returned to health provides evidence of the effectiveness of that strategy. Moreover, U.S. leadership of global efforts through bodies such as the Basel Committee on Banking Supervision, the Financial Stability Board (FSB), and the Group of Twenty has contributed to the development of standards that promote financial stability around the world, thereby supporting global growth while protecting the U.S. financial system from adverse developments abroad. Preeminent among these domestic and global efforts have been steps to increase the loss-absorbing capacity of banks, regulations to limit both maturity transformation in short-term funding markets and liquidity mismatches within banks, and new authorities to facilitate the resolution of large financial institutions and to subject systemically important firms to more stringent prudential regulation. Several important reforms have increased the loss-absorbing capacity of global banks. First, the quantity and quality of capital required relative to risk-weighted assets have been increased substantially. In addition, a simple leverage ratio provides a backstop, reflecting the lesson imparted by past crises that risk weights are imperfect and a minimum amount of equity capital should fund a firm's total assets. Moreover, both the risk-weighted and simple leverage requirements are higher for the largest, most systemic firms, which lowers the risk of distress at such firms and encourages them to limit activities that could threaten financial stability. The largest firms are also subject to the Federal Reserve's annual Comprehensive Capital Analysis and Review (CCAR) and its supervisory stress tests. In addition to contributing to greater loss-absorbing capacity, the CCAR improves public understanding of risks at large banking firms, provides a forward-looking examination of firms' potential losses during severely adverse economic conditions, and has contributed to significant improvements in risk management. 
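To illustrate why a simple leverage ratio serves as a backstop to the risk-weighted requirement described above, here is a minimal sketch; the stylized balance sheet, risk weights, and thresholds are illustrative assumptions rather than any actual regulatory calibration.

```python
# A minimal sketch of the two capital measures described above: a risk-weighted
# ratio and a simple (unweighted) leverage ratio. The balance sheet and risk
# weights below are illustrative assumptions, not the actual Basel calibration.

def risk_weighted_ratio(equity, exposures, risk_weights):
    """Equity as a share of risk-weighted assets."""
    rwa = sum(amount * risk_weights[asset] for asset, amount in exposures.items())
    return equity / rwa

def leverage_ratio(equity, exposures):
    """Equity as a share of total (unweighted) assets."""
    return equity / sum(exposures.values())

exposures = {"treasuries": 300.0, "mortgages": 400.0, "corporate_loans": 300.0}
risk_weights = {"treasuries": 0.0, "mortgages": 0.5, "corporate_loans": 1.0}
equity = 60.0

print(risk_weighted_ratio(equity, exposures, risk_weights))  # 60 / 500  = 12 percent of RWA
print(leverage_ratio(equity, exposures))                     # 60 / 1000 = 6 percent of assets
# If risk weights understate true risk, the leverage ratio still forces a minimum
# amount of equity against total assets.
```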
Reforms have also addressed the risks associated with maturity transformation. The fragility created by deposit-like liabilities outside the traditional banking sector has been mitigated by regulations promulgated by the Securities and Exchange Commission affecting prime institutional money market funds. These rules require prime institutional funds to use a floating net asset value, among other changes, a shift that has made these funds less attractive as cash-management vehicles. The changes at money funds have also helped reduce banks' reliance on unsecured short-term wholesale funding, since prime institutional funds were significant investors in those bank liabilities. Liquidity risk at large banks has been further mitigated by a new liquidity coverage ratio and a capital surcharge for global systemically important banks (G-SIBs). The liquidity coverage ratio requires that banks hold liquid assets to cover potential net cash outflows over a 30-day stress period. The capital surcharge for U.S. G-SIBs links the required level of capital for the largest banks to their reliance on short-term wholesale funding. While improvements in capital and liquidity regulation will limit the reemergence of the risks that grew substantially in the mid-2000s, the failure of Lehman Brothers demonstrated how the absence of an adequate resolution process for dealing with a failing systemic firm left policymakers with only the terrible choices of a bailout or allowing a destabilizing collapse. In recognition of this shortcoming, the Congress adopted the orderly liquidation authority in Title II of the Dodd-Frank Wall Street Reform and Consumer Protection Act--a resolution mechanism for systemically important firms to be used instead of bankruptcy proceedings when necessary to preserve financial stability. The orderly liquidation authority contains a number of tools, including liquidity resources and temporary stays on the termination of financial contracts, that would help protect the financial system and economy from the severe adverse spillovers that could occur if a systemic firm failed. Importantly, any losses incurred by the government in an orderly liquidation authority resolution would not be at the expense of taxpayers, since the statute provides that all such losses must be borne by other large financial firms through subsequent assessments. In addition, the Congress required that the largest banks submit living wills that describe how they could be resolved under bankruptcy. And the Federal Reserve has mandated that systemically important banks meet total loss-absorbing capacity requirements, which require these firms to maintain long-term debt adequate to absorb losses and recapitalize the firm in resolution. These enhancements in resolvability protect financial stability and help ensure that the shareholders and creditors of failing firms bear losses. Moreover, these steps promote market discipline, as creditors--knowing full well that they will bear losses in the event of distress--demand prudent risk-taking, thereby limiting the problem of too-big-to-fail. Financial stability risks can also grow large outside the regulated banking sector, as amply demonstrated by the events of 2007 and 2008. In response, a number of regulatory changes affecting what is commonly referred to as the shadow banking sector have been instituted. A specific example of such risks, illustrative of broader developments, was the buildup of large counterparty exposures through derivatives between market participants and AIG that were both inappropriately risk-managed and opaque. 
To mitigate the potential for such risks to arise again, new standards require central clearing of standardized over-the-counter derivatives, enhanced reporting requirements for all derivatives, and higher capital as well as margin requirements for noncentrally cleared derivatives transactions. Another important step was the Congress's creation of the Financial Stability Oversight Council (FSOC). The council is responsible for identifying risks to financial stability and for designating those financial institutions that are systemically important and thus subject to prudential regulation by the Federal Reserve. Both of these responsibilities are important to help guard against the risk that vulnerabilities outside the existing regulatory perimeter grow to levels that jeopardize financial stability. The evidence shows that reforms since the crisis have made the financial system substantially safer. Loss-absorbing capacity among the largest banks is significantly higher, with Tier 1 common equity capital more than doubling from early 2009 to now. The annual stress-testing exercises in recent years have led to improvements in the capital positions and risk-management processes among participating banks. Large banks have cut their reliance on short-term wholesale funding essentially in half and hold significantly more high-quality, liquid assets. Assets under management at prime institutional money market funds that proved susceptible to runs in the crisis have decreased substantially. And the ability of regulators to resolve a large institution has improved, reflecting both new authorities and tangible steps taken by institutions to adjust their organizational and capital structure in a manner that enhances their resolvability and significantly reduces the problem of too-big-to-fail. The progress evident in regulatory and supervisory metrics has been accompanied by shifts in private-sector assessments that also suggest enhanced financial stability. Investors have recognized the progress achieved toward ending too-big-to-fail, and several rating agencies have removed the government support rating uplift that they once accorded to the largest banks. Credit default swaps for the large banks also suggest that market participants assign a low probability to the distress of a large U.S. banking firm. Market-based assessments of the loss-absorbing capacity of large U.S. banks have moved up in recent years, and market-based measures of equity now lie in the range of book estimates of equity. To be sure, market-based measures may not reflect true risks--they certainly did not in the mid-2000s--and hence the observed improvements should not be overemphasized. But supervisory metrics are not perfect, either, and policymakers and investors should continue to monitor a range of supervisory and market-based indicators of financial system resilience. Economic research provides further support for the notion that reforms have made the system safer. Studies have demonstrated that higher levels of bank capital mitigate the risk and adverse effects of financial crises. Moreover, researchers have highlighted how liquidity regulation supports financial stability by complementing capital regulation. Economic models of the resilience of the financial sector--so-called top-down stress-testing models--reinforce the message from supervisory stress tests that the riskiness of large banks has diminished over the past decade. 
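As a rough sketch of what such a top-down calculation involves--not the Federal Reserve's supervisory models or scenarios--one can apply assumed stress loss rates to a stylized portfolio and check whether post-stress capital remains above a minimum threshold; every number below is an illustrative assumption.

```python
# A minimal sketch of a top-down stress-loss calculation: apply assumed loss rates
# to a stylized portfolio and check whether post-stress capital stays above a
# minimum ratio. Portfolio, loss rates, and the 5 percent floor are illustrative
# assumptions, not supervisory scenarios or requirements.

def post_stress_capital_ratio(capital, portfolio, loss_rates):
    """Capital remaining after stress losses, as a share of remaining assets."""
    losses = sum(amount * loss_rates[asset] for asset, amount in portfolio.items())
    remaining_assets = sum(portfolio.values()) - losses
    return (capital - losses) / remaining_assets

portfolio = {"mortgages": 400.0, "corporate_loans": 300.0, "securities": 300.0}
stress_loss_rates = {"mortgages": 0.04, "corporate_loans": 0.08, "securities": 0.02}
capital = 100.0

ratio = post_stress_capital_ratio(capital, portfolio, stress_loss_rates)
print(round(ratio, 3), ratio >= 0.05)  # 0.057 True -> stays above a 5 percent floor
```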
Similarly, model-based analyses indicate that the risk of adverse fire-sale spillovers across banks or broker-dealers has been substantially mitigated. I suspect many in this audience would agree with the narrative of my remarks so far: The events of the crisis demanded action, needed reforms were implemented, and these reforms have made the system safer. Now--a decade from the onset of the crisis and nearly seven years since the passage of the Dodd-Frank Act and international agreement on the key banking reforms--a new question is being asked: Have reforms gone too far, resulting in a financial system that is too burdened to support prudent risk-taking and economic growth? The Federal Reserve is committed individually, and in coordination with other U.S. government agencies through forums such as the FSOC and internationally through bodies such as the Basel Committee on Banking Supervision and the FSB, to evaluating the effects of financial market regulations and considering appropriate adjustments. Furthermore, the Federal Reserve has independently taken steps to evaluate potential adjustments to its regulatory and supervisory practices. For example, the Federal Reserve initiated a review of its stress tests following the 2015 cycle, and this review suggested changes to reduce the burden on participating institutions, especially smaller institutions, and to better align the supervisory stress tests with regulatory capital requirements. In addition, a broader set of changes to the new financial regulatory framework may deserve consideration. Such changes include adjustments that may simplify regulations applying to small and medium-sized banks and enhance resolution planning. More broadly, we continue to monitor economic conditions, and to review and conduct research, to better understand the effect of regulatory reforms and possible implications for regulation. I will briefly summarize the current state of play in two areas: the effect of regulation on credit availability and on changes in market liquidity. The effects of capital regulation on credit availability have been investigated extensively. Some studies suggest that higher capital weighs on banks' lending, while others suggest that higher capital supports lending. Such conflicting results in academic research are not altogether surprising. It is difficult to identify the effects of regulatory capital requirements on lending because material changes to capital requirements are rare and are often precipitated, as in the recent case, by financial crises that also have large effects on lending. Given the uncertainty regarding the effect of capital regulation on lending, rulemakings of the Federal Reserve and other agencies were informed by analyses that balanced the possible stability gains from greater loss-absorbing capacity against the possible adverse effects on lending and economic growth. This ex ante assessment pointed to sizable net benefits to economic growth from higher capital standards--and subsequent research supports this assessment. The steps to improve the capital positions of banks promptly and significantly following the crisis, beginning with the Supervisory Capital Assessment Program, have resulted in a return of lending growth and profitability among U.S. banks more quickly than among their global peers. 
While material adverse effects of capital regulation on broad measures of lending are not readily apparent, credit may be less available to some borrowers, especially homebuyers with less-than-perfect credit histories and, perhaps, small businesses. In retrospect, mortgage borrowing was clearly too easy for some households in the mid-2000s, resulting in debt burdens that were unsustainable and ultimately damaging to the financial system. Currently, many factors are likely affecting mortgage lending, including changes in market perceptions of the risk associated with mortgage lending; changes in practices at the government-sponsored enterprises and the Federal Housing Administration; changes in technology that may be contributing to entry by nonbank lenders; changes in consumer protection regulations; and, perhaps to a limited degree, changes in capital and liquidity regulations within the banking sector. These issues are complex and interact with a broader set of challenges related to the domestic housing finance system. Credit appears broadly available to small businesses with solid credit histories, although indicators point to some difficulties facing firms with weak credit scores and insufficient credit histories. Small business formation is critical to economic dynamism and growth. Smaller firms rely disproportionately on lending from smaller banks, and the Federal Reserve has been taking steps and examining additional steps to reduce unnecessary complexity in regulations affecting smaller banks. Finally, many financial market participants have expressed concerns about the ability to transact in volume at low cost--that is, about market liquidity, particularly in certain fixed-income markets such as that for corporate bonds. Market liquidity for corporate bonds remains robust overall, and the healthy condition of the market is apparent in low bid-ask spreads and the large volume of corporate bond issuance in recent years. That said, liquidity conditions are clearly evolving. Large dealers appear to devote less of their balance sheets to holding inventories of securities to facilitate trades and instead increasingly facilitate trades by directly matching buyers and sellers. In addition, algorithmic traders and institutional investors are a larger presence in various markets than previously, and the willingness of these institutions to support liquidity in stressful conditions is uncertain. While no single factor appears to be the predominant cause of the evolution of market liquidity, some regulations may be affecting market liquidity somewhat. There may be benefits to simplifying aspects of the Volcker rule, which limits proprietary trading by banking firms, and to reviewing the interaction of the enhanced supplementary leverage ratio with risk-based capital requirements. At the same time, the new regulatory framework overall has made dealers more resilient to shocks, and, in the past, distress at dealers following adverse shocks has been an important factor driving market illiquidity. As a result, any adjustments to the regulatory framework should be modest and preserve the increase in resilience at large dealers and banks associated with the reforms put in place in recent years. So where do we stand a decade after the onset of the most severe financial crisis since the Great Depression? 
Substantial progress has been made toward the Federal Reserve's economic objectives of maximum employment and price stability, in putting in place a regulatory and supervisory structure that is well designed to lower the risks to financial stability, and in actually achieving a stronger financial system. Our more resilient financial system is better prepared to absorb, rather than amplify, adverse shocks, as has been illustrated during periods of market turbulence in recent years. Enhanced resilience supports the ability of banks and other financial institutions to lend, thereby supporting economic growth through good times and bad. Nonetheless, there is more work to do. The balance of research suggests that the core reforms we have put in place have substantially boosted resilience without unduly limiting credit availability or economic growth. But many reforms have been implemented only fairly recently, markets continue to adjust, and research remains limited. The Federal Reserve is committed to evaluating where reforms are working and where improvements are needed to most efficiently maintain a resilient financial system. Moreover, I expect that the evolution of the financial system in response to global economic forces, technology, and, yes, regulation will result sooner or later in the all-too-familiar risks of excessive optimism, leverage, and maturity transformation reemerging in new ways that require policy responses. We relearned this lesson through the pain inflicted by the crisis. We can never be sure that new crises will not occur, but if we keep this lesson fresh in our memories--along with the painful cost that was exacted by the recent crisis--and act accordingly, we have reason to hope that the financial system and economy will experience fewer crises and recover from any future crisis more quickly, sparing households and businesses some of the pain they endured during the crisis that struck a decade ago.
r170830a_FOMC
united states
2017-08-30T00:00:00
The Role of Boards at Large Financial Firms
powell
1
Good morning. Thank you to President Evans for inviting me to speak here today about the role of boards of directors of large banking firms. Ten years ago this month, the world witnessed the first tremors of what we now think of as the Global Financial Crisis and the subsequent Great Recession. For the United States and many other countries, this would turn out to be the most painful economic period since the Great Depression. In the wake of the crisis, governments around the world instituted a wide range of reforms that were designed to reduce the likelihood and severity of a recurrence. In the United States, the core elements of those reforms included significantly higher capital standards; new liquidity requirements; forward-looking stress tests; and resolution planning. Our largest banking firms are without question much stronger than before the crisis. We are nearing completion of the major parts of this reform program, and are undertaking a thorough review to help assure that the reforms we put in place are both effective and efficient. During the crisis, some large banking firms incurred massive losses. Some of these losses were from products - such as super-senior collateralized debt obligations (CDOs) or structured investment vehicles (SIVs) - whose risks were not even on the radar screen of the firm's board of directors. After the crisis, the Federal Reserve significantly raised our expectations for the boards of directors of large banking firms. Taking risk in service of clients is an essential part of the business of banking, including credit risk, interest rate risk, and various forms of operational risk. Our reforms were designed to assure that boards of directors understand and approve the strategy of the company and the risks inherent in that strategy, and that the institution has the capital, liquidity, and risk management capabilities necessary to manage those risks. Today, the role of a director of a large banking firm is more expansive, more challenging, and more important than ever. Boards now oversee management's participation in highly challenging annual exercises, such as stress testing, capital planning, and resolution planning, that have fundamentally changed the business of our largest institutions. Boards now more carefully evaluate the compensation practices of these large institutions to assure that they reinforce positive incentives and discourage unwanted risk taking. Across a range of responsibilities, we simply expect much more of boards of directors than ever before. There is no reason to expect that to change. We do take seriously our obligation to assess whether our reforms are achieving their desired effects, without imposing unnecessary burden. In 2014, we began a review of these higher expectations for directors. We found that many boards have significantly improved their practices. We also found some ways to make our reforms both more effective and more efficient. For example, while directors generally say that they understand and embrace their more challenging responsibilities, we consistently hear that directors feel buried in hundreds or even thousands of pages of highly granular information, to the point where more important strategic issues are crowded out of board deliberations. Some of this granular information was likely driven by our supervisory guidance, which included specific expectations not only for the management of the institution, but also for the board of directors. 
Over time, this guidance has increased the number of specific directives aimed at boards well into the hundreds, which may have fostered a "check-the-box" approach by boards. There is also a widespread feeling that our supervision seems to have downplayed the difference in roles between boards and management. Our current ratings system for bank holding companies, which for large banking firms would be replaced by the currently proposed LFI ratings system, refers to the "board and senior management" as a subcomponent rating of risk management. We have also combined the roles of the board and senior management in many of our supervisory feedback letters. After careful consideration, last month we proposed a new framework for our oversight of boards. In formulating this proposal, we had discussions with academics, consultants, legal practitioners, and directors of banking firms. Let me start by saying what the new approach will not do. We do not intend that these reforms will lower the bar for boards or lighten the loads of directors. The new approach distinguishes the board from senior management so that we can spotlight our expectations of effective boards. The intent is to enable directors to spend less board time on routine matters and more on core board responsibilities: overseeing management as they devise a clear and coherent direction for the firm, holding management accountable for the execution of that strategy, and ensuring the independence and stature of the risk management and internal audit functions. These were all areas that were found wanting in the financial crisis, and it is essential that boards get these fundamentals right. Our new proposal will move to a principles-based approach. We have identified five common attributes that effective boards should exhibit, and for which we will have high expectations. This principles-based approach recognizes that large firms have a broad range of business models, structures, and practices. While we want to be clear about our expectations, we also want to give directors the flexibility to meet them in a manner that works for their particular boards. First, an effective board should guide the development of a clear and coherent strategy for the firm and set the types and levels of risks it is willing to take. Alignment of business strategy and risk appetite should minimize the firm's exposure to large and unexpected losses. In addition, the firm's risk management capabilities need to be commensurate with the risks it expects to take. Second, an effective board should actively manage its information flow and deliberations, so that the board can make sound, well-informed decisions that take into account risks and opportunities. Third, an effective board should hold senior management accountable for implementing the firm's strategy and risk appetite and maintaining the firm's risk management and control framework. Fourth, an effective board should ensure the independence and stature of the independent risk management and internal audit functions. It is difficult to overstate the importance of this. Risk management systems and controls may discourage or limit certain revenue-generating opportunities. Failure to ensure the independence of these functions from the revenue generators and risk takers has been shown to be dangerous, and this is something for which the board is accountable. 
Finally, an effective board should have a composition, governance structure, and set of established practices that are appropriate in light of the firm's size, complexity, scope of operations, and risk profile. Boards need to be aware of their own strengths and weaknesses, and to ensure that directors bring an appropriately diverse range of skills, knowledge, experience, and perspective. Significant events, such as an unexpected loss or compliance failure, should cause boards to reflect and reassess their structure, composition, and processes. An effective board takes a preventative approach and engages in probing self-assessments regularly and systematically. Before I conclude, let me say a few words about an aspect of the proposal that has attracted some attention, which is the reversal of a relatively recent practice of directing all examination and inspection findings--what we call "matters requiring attention" (MRAs) and "matters requiring immediate attention" (MRIAs)--to the board as well as to management. This practice resulted in boards of directors reviewing and signing off on management's compliance with every MRA and MRIA. When we began that practice in 2013, our intention was to ensure that directors were in a position to hold management accountable in addressing risk management shortcomings. By 2014, we realized that the practice was "almost surely distracting from strategic and risk-related analyses and oversight by boards". For perspective, a large banking firm may have one hundred or more MRAs outstanding at a given time, many of which are at a level of granularity that is more appropriate for management to remediate, with board oversight. The new proposed framework is designed to make boards more effective in holding management accountable in these efforts. While we have proposed that most MRAs and MRIAs be addressed in the first instance to management and not to the board, the board would continue to receive MRAs where board practices are at issue or where management has failed to promptly and adequately take the required actions. The board would also continue to receive copies of examination and inspection reports, including formal communications with the institution. In the parlance of the proposed guidance I just outlined, we fully expect the board to actively manage the information flow related to MRAs and to hold management accountable for remediating them. In doing so, a board may choose to track progress and closure of MRAs through an appropriate board committee, rather than getting into the granular detail on every individual MRA. We need financial institutions that are strong enough to support economic growth by lending through the economic cycle. To achieve that goal, we need strong and effective boards of directors at firms of all sizes. A strong and effective board provides strategic leadership and oversight, which is much more challenging and important than checking off lists of assigned tasks. I look forward to our continuing dialogue on this subject today and in the months to come, and reviewing carefully the comments received on the proposal.
r170905a_FOMC
united states
2017-09-05T00:00:00
Understanding the Disconnect between Employment and Inflation with a Low Neutral Rate
brainard
0
Overall, the U.S. economy remains on solid footing, against the backdrop of the first synchronized global economic expansion we have seen in many years and accommodative financial conditions. This benign outlook is clouded somewhat by uncertainty about government funding and the fiscal outlook, and geostrategic risk has risen. While the heartbreaking human toll exacted by Hurricane Harvey is already all too clear, it will take some time to assess the macroeconomic impact. The labor market continues to bring more Americans off the sidelines and into productive employment, which is a very welcome development. Nonetheless, there is a notable disconnect between signs that the economy is in the neighborhood of full employment and a string of lower-than-projected inflation readings, especially since inflation has come in stubbornly below target for five years. With normalization of the federal funds rate under way and the start of gradual balance sheet normalization widely anticipated, I will want to take some time to assess the path of the federal funds rate that will best support a sustainable move in inflation to our 2 percent goal. Achieving our inflation objective is especially important, given the apparent persistently low level of the neutral rate and the resulting limited room for maneuver above the effective lower bound. Let me start by reviewing the economic outlook. There has been a noteworthy pickup in business investment this year compared with last year. Investment in the equipment and intellectual property category has risen at an annual rate of 6 percent so far this year after remaining roughly flat last year. The latest data on orders and shipments of capital equipment suggest that solid growth will likely continue in the second half of the year. In addition, oil drilling has rebounded this year after dropping sharply last year, although Hurricane Harvey creates uncertainty about drilling in coming months. While lackluster consumer spending was one of the key reasons for the weak increase in first-quarter gross domestic product (GDP), growth in personal consumption expenditures (PCE) bounced back strongly in the second quarter, and recent readings on retail sales suggest another solid increase in consumer spending this quarter. Of course, the likely economic effects of Hurricane Harvey raise uncertainties about the economic outlook for the remainder of the year. Based on past experience, it appears likely that the hurricane will have a notable effect on GDP in the current quarter, although output is likely to rebound by the end of the year. According to the U.S. Department of Energy, between 20 and 30 percent of the nation's oil refining capacity was shut down at the peak last week, and it is estimated that about 50 percent of petrochemical production was similarly shut down. Some oil production has also been disrupted. These developments have put upward pressure on gasoline prices. Based on previous hurricane events, the increase in gasoline prices should be short-lived, but this outcome is uncertain and will depend on the extent of damage to refining capacity. Improvements in the labor market have continued. According to last Friday's labor market report, nonfarm payrolls have increased around 185,000 per month over the three months through August, about the same as the average monthly gains last year. The unemployment rate has been roughly flat for the past several months at 4.4 percent, which is 1/2 percentage point lower than at the same time last year. 
The employment-to-population ratio for prime-age workers has also improved over the past year, although it is still 2 percentage points lower than its pre-crisis peak in 2007. Earlier this year, many observers saw the prospect of fiscal stimulus as presenting the possibility of a substantial boost to domestic demand. Since then, however, many commentators have downgraded their assessments of the extent and timing of fiscal stimulus, and I have revised my outlook as well. That said, we are seeing synchronized global economic growth for the first time in many years. Foreign economies--including Canada, the euro area, and China--have posted robust GDP growth so far this year. This improvement has been reflected here at home in dollar depreciation; higher earnings and stock prices; tighter risk spreads; and an increase in net exports, which made a small positive contribution in the first half of this year after holding down GDP growth over the past several years. In addition, there are indications that, before too long, central banks in several major economies could begin normalizing monetary policy, in many cases through adjustments to their balance sheets as well as their policy rates. Those changes in foreign monetary policies could have important implications for term premiums and, in turn, longer-term Treasury rates, depending on the timing and approach. Despite this benign picture for the U.S. economy and continued increases in resource utilization, core inflation, as measured by changes in the PCE price index for items other than food and energy, slowed by almost 1/2 percentage point relative to the pace a year ago. Indeed, recent readings put both overall and core inflation at only 1.4 percent, well below our 2 percent objective. To what extent does it make sense to look through the recent low inflation readings on the grounds that they are transitory? It appears that temporary factors, such as discounted cell phone plans, are pushing down inflation to some extent this year. By the same token, it is likely that other temporary factors--for example, prescription drug prices--boosted inflation last year. Going forward, we should see a temporary boost to headline inflation due to Hurricane Harvey's effect on gasoline prices that I mentioned earlier. Temporary factors, by their nature, have little implication for the underlying trend in inflation. In contrast, what is troubling is five straight years in which inflation fell short of our target despite a sharp improvement in resource utilization. It is instructive to put the shortfall in inflation in recent years in perspective by comparing inflation in the past few years with the last time the economy was in the neighborhood of full employment--namely, just before the financial crisis. In particular, over the past three years, unemployment has averaged roughly 5 percent. Similarly, over the three years ending in early 2007--before the unemployment rate started rising--the unemployment rate also averaged 5 percent. Despite a similar degree of resource utilization, core inflation averaged 2.2 percent from 2004 to 2007, notably higher than the comparable three-year average inflation rate today of 1.5 percent. Why is inflation so much lower now than it was previously? The fact that the period from 2004 to 2007 had inflation around target with similar unemployment rates casts some doubt on the likelihood that resource utilization is the primary explanation. Similarly, a 12-quarter average is typically long enough that temporary factors should not be the dominant concern. 
One key factor that may have played a role in the past three years is the decline in import prices, reflecting the dollar's surge, especially in 2015. By contrast, in the 2004-07 period, non-oil import prices increased at roughly a 2 percent annual rate and had a more neutral effect on inflation. Nonetheless, while the decline in non-oil import prices likely accounts for some of the weakness in inflation over the past few years, these prices have begun rising again in the past year at a time when inflation remains relatively low. So if import prices, resource utilization, and transitory factors together do not provide a complete account, why has inflation been so much lower in the past few years than it was previously? In many of the models economists use to analyze inflation, a key feature is "underlying," or trend, inflation, which is believed to anchor the rate of inflation over a fairly long horizon. Underlying inflation can be thought of as the slow-moving trend that exerts a strong pull on wage and price setting and is often viewed as related to some notion of longer-run inflation expectations. There is no single highly reliable measure of that underlying trend or the closely associated notion of longer-run inflation expectations. Nonetheless, a variety of measures suggest underlying trend inflation may currently be lower than it was before the crisis, contributing to the ongoing shortfall of inflation from our objective. That conclusion is suggested by estimates based on time-series models, longer-run expectations from the University of Michigan Surveys of Consumers and Survey of Professional Forecasters, and market-based measures of inflation compensation. Starting with time-series models, one model that has been used by a variety of researchers suggests that underlying trend inflation may have moved down by perhaps as much as 1/2 percentage point over the past decade. Market-based measures of inflation compensation provide another read on inflation expectations. Comparing the three-year period ending in the second quarter of this year with the three-year period ended just before the financial crisis, 10-year-ahead inflation compensation based on TIPS is lower. Survey-based measures of inflation expectations are also lower. The Michigan survey measure of median household expectations of inflation over the next five to 10 years suggests a 1/4 percentage point downward shift over the most recent three-year period compared with the pre-crisis years, similar to the five-year, five-year forward forecast for the consumer price index from the Survey of Professional Forecasters. Why might underlying inflation expectations have moved down since the financial crisis? One simple explanation may be the experience of persistently low inflation: Households and firms have experienced a prolonged period of inflation below our objective, and that may be affecting their perception of underlying inflation. A related explanation may be the greater proximity of the federal funds rate to its effective lower bound due to a lower neutral rate of interest. By constraining the amount of policy space available to offset adverse developments using our more effective conventional tools, the low neutral rate could increase the likely frequency of periods of below-trend inflation. In short, frequent or extended periods of low inflation run the risk of pulling down private-sector inflation expectations. 
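For concreteness, the "time-series models" of underlying inflation mentioned above are typically variants of an unobserved-components decomposition. The following is a minimal sketch in a common textbook form, with symbols chosen for exposition; it is not necessarily the specific model referenced in the remarks.

% Observed inflation = slow-moving trend + transitory noise;
% the trend itself drifts slowly, modeled here as a random walk.
\[ \pi_t = \tau_t + \eta_t, \qquad \tau_t = \tau_{t-1} + \epsilon_t \]
% Filtering observed inflation \pi_t through this model yields an estimate of
% the trend \tau_t; it is this estimated trend that such models suggest has
% drifted down by roughly 1/2 percentage point over the past decade.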
Given today's circumstances, with the economy near full employment and inflation below target, how should the FOMC achieve its dual-mandate goals? Some might conclude that preemptive tightening is appropriate on the grounds that monetary policy operates with long lags and that, according to the Phillips curve, inflation will inevitably accelerate as the labor market continues to tighten. However, in today's economy, there are reasons to worry that the Phillips curve will not prove very reliable in boosting inflation as resource utilization tightens. Since 2012, inflation has tended to change relatively little as the unemployment rate has fallen considerably, from 8.2 percent to 4.4 percent. In short, the Phillips curve appears to be flatter today than it was previously. This is also apparent in a number of advanced foreign economies, where declines in their unemployment rates to relatively low levels have failed to generate significant upward pressures on inflation. Given the flatness of the Phillips curve, it could take a considerable undershooting of the natural rate of unemployment to achieve our inflation objective if we were to rely on resource utilization alone. For all these reasons, achieving our inflation target on a sustainable basis is likely to require a firming in longer-run inflation expectations--that is, the underlying trend. The key question in my mind is how to achieve an improvement in longer-run inflation expectations to a level that will allow us to achieve our inflation objective. The persistent failure to meet our inflation objective should push us to think broadly about diagnoses and solutions. The academic literature on monetary policy suggests a variety of prescriptions for preventing a lower neutral rate of interest from eroding longer-run inflation expectations. One feature that is common to many proposals is that the persistence of the shortfall in inflation from our objective should be one of the considerations in setting monetary policy. Most immediately, we should assess inflation developments closely before making a determination on further adjustments to the federal funds rate. This brings me to the implications for monetary policy. A key upcoming decision for the Committee is when to commence balance sheet normalization. I consider normalization of the federal funds rate to be well under way, which satisfies the criterion for commencing balance sheet normalization. The approaching change to our reinvestment policy has been clearly communicated and is well anticipated. In principle, the FOMC could use both the balance sheet and the federal funds rate as active tools for setting monetary policy. However, I view the federal funds rate as the preferred active tool, because its effect on financial conditions and the economy has been more extensively tested and therefore is better understood than changes to the balance sheet. As a result, once we set in motion the change in balance sheet policy, as long as the economy evolves broadly as expected, we should allow the balance sheet to run off in the background at the gradual pace that was announced. We would primarily look to ongoing adjustments in the federal funds rate to calibrate the stance of monetary policy as economic conditions evolve. Once balance sheet normalization is under way, I will be looking closely at the evolution of inflation before making a determination about further adjustments to the federal funds rate. 
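To make the "flatter Phillips curve" reasoning above concrete, here is a minimal sketch using a textbook expectations-augmented specification; the functional form and the illustrative slope value are assumptions for exposition, not the speaker's model.

% Expectations-augmented Phillips curve (illustrative):
%   \pi_t : inflation;  \pi_t^e : longer-run (trend) inflation expectations;
%   u_t : unemployment rate;  u_t^* : natural rate;  \kappa > 0 : slope.
\[ \pi_t = \pi_t^{e} - \kappa\,(u_t - u_t^{*}) + \varepsilon_t \]
% A "flat" curve means a small \kappa. If, say, \kappa = 0.1, holding
% unemployment a full percentage point below u^* lifts inflation by only about
% 0.1 percentage point, so closing a sizable shortfall from 2 percent through
% resource utilization alone would require a large, sustained undershoot of
% the natural rate, unless trend expectations \pi^e themselves move back up.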
We have been falling short of our inflation objective not just in the past year, but over a longer period as well. My own view is that we should be cautious about tightening policy further until we are confident inflation is on track to achieve our target. Unless we expect inflation to move quickly back toward target--or there are indications that the short-run neutral rate has moved up further--a variety of empirical estimates suggest we could approach neutral without too many additional rate increases. Many forecasters assume that the neutral rate of interest is very low currently, and that it is likely to be low relative to historical norms in the longer run. For example, the well-known Laubach-Williams model currently suggests an estimate of the longer-run neutral federal funds rate that is actually slightly below zero. And in the most recent Summary of Economic Projections (SEP), the median longer-run nominal federal funds rate was 3 percent, which implies the long-run real federal funds rate would only be 1 percent, lower than its average of around 2-1/2 percent in earlier decades. These estimates suggest that the neutral rate of interest is likely to rise only modestly in the medium term. It is worth remembering, in addition, that the Federal Reserve's balance sheet policy may be reinforcing this tendency over the next several years. A recent study suggests balance sheet runoff could boost the level of the term premium on the 10-year Treasury yield by about 40 basis points over the first few years. Typical rules of thumb suggest that such an increase in term premiums would imply a decrease in the short-run neutral rate of interest. Although the FOMC expects to begin normalizing its balance sheet relatively soon, several foreign central banks are continuing their purchases of longer-term assets in their own currencies. Because longer-term government securities in the major economies are close substitutes, the ongoing balance sheet programs of some foreign central banks will likely continue to hold down U.S. longer-term interest rates. But with economies abroad strengthening, it may not be too long before some foreign central banks will end their net purchases and, eventually, begin reducing their balance sheets. As that happens, the current downward pressure on longer-term interest rates from foreign spillovers will abate. For these reasons, my current expectation is that the short-run neutral rate of interest may not rise much over the medium term. But this is an open question and bears close monitoring. Of course, it is entirely possible that other factors will be working to offset this downward pressure on the equilibrium funds rate--as could be the case, for instance, if fiscal stimulus is greater than many observers currently expect. To the extent that the neutral rate remains low relative to its historical value, there is a high premium on guiding inflation back up to target so as to retain space to buffer adverse shocks with conventional policy. In this regard, I believe it is important to be clear that we would be comfortable with inflation moving modestly above our target for a time. In my view, this is the clear implication of the symmetric language in the FOMC's statement on longer-run goals and monetary policy strategy. Before concluding, it is worth considering the possible implications of a sustained period of low neutral rates and low unemployment for financial imbalances. 
Historically, extended periods with very low unemployment rates tend to be associated with below-average spreads of expected returns on risky assets over safe interest rates--low bond risk premiums, for example, or low equity premiums. Although, to some extent, low risk premiums and rising asset valuations may be consistent with strong economic fundamentals, such as low default rates and strong corporate earnings, there have also been episodes when a very strong economy and low unemployment rate have led to overvaluation of asset prices, underpricing of risk, and growing financial imbalances. Thus, in today's environment, it is important to be vigilant to the signs that asset valuations appear to be elevated, especially in areas such as commercial real estate and corporate bonds, as well as the exceptionally low levels of expected volatility. Nonetheless, there are few signs of a dangerous buildup of leverage or of maturity transformation, which have traditionally been important contributors to financial instability. This is due, in no small measure, to the improvements in capital, liquidity, and risk management made by the financial institutions at the core of the system, which are associated with post-crisis financial reforms, as well as money market reform and the greater transparency in the derivatives markets. To conclude, much depends on the evolution of inflation. If, as many forecasters assume, the current shortfall of inflation from our 2 percent objective indeed proves transitory, further gradual increases in the federal funds rate would be warranted, perhaps along the lines of the median projection from the most recent SEP. But, as I noted earlier, I am concerned that the recent low readings for inflation may be driven by depressed underlying inflation, which would imply a more persistent shortfall in inflation from our objective. In that case, it would be prudent to raise the federal funds rate more gradually. We should have substantially more data in hand in the coming months that will help us make that assessment.
r170926b_FOMC
united states
2017-09-26T00:00:00
Why Persistent Employment Disparities Matter for the Economy's Health
brainard
0
I want to compliment the organizers and others for gathering an outstanding group of researchers and papers for this conference. Understanding why some groups persistently fare better than others in the job market and how these disparities may affect the economy's overall performance is vitally important to the Federal Reserve. While opportunity and inclusion have long been central to American values, it is increasingly clear that they are also central to the strength of our economy. As directed by the Congress, the Federal Reserve's dual mandate is to promote maximum employment and stable prices. In fulfilling its dual mandate, the Federal Open Market Committee (FOMC) has set a target of 2 percent for inflation but does not have a similarly fixed numerical goal for maximum employment. That is because the level of maximum employment depends on "nonmonetary factors that affect the structure and dynamics of the labor market," which "may change over time and may not be directly measurable." Understanding how close the labor market is to our full-employment goal requires consulting a variety of evidence along with a healthy dose of judgment. The recognition that maximum employment evolves over time to reflect changes in the economic landscape serves us well by requiring FOMC participants to develop a nuanced understanding of labor market developments. This approach to maximum employment has allowed the FOMC to navigate the current expansion in a way that has likely brought more people back into productive employment than might have been the case with a fixed unemployment rate target based on pre-crisis standards. This is especially true at a time when the traditional Phillips curve relationship is flatter than in the past, which means that price inflation is likely to be less informative regarding labor market tightness than it was previously. It therefore seems particularly valuable to look beyond inflation and headline unemployment to assess the strength of the labor market. Even when aggregate economic statistics look strong, studying geographic areas and demographic groups that are not faring as well can point to ways of further improving the economy's performance. The Federal Reserve is also keenly interested in disparities in employment, labor force participation, income, and wealth because they may have implications for the growth capacity of the economy. When we consider appropriate monetary policy, we need to have a good sense of how fast the economy can grow without fueling excessive price inflation. At a time when the retirement of the baby-boom generation looks likely to be something of a drag on the growth of the labor force, it is especially important to consider whether relatively low levels of employment and labor force participation for some prime working-age groups represent slack that, if successfully tapped, could increase the labor force and boost economic activity. More broadly, when a person who was previously unemployed or discouraged secures a job, not only does it boost the economy, but that person also may gain a greater sense of economic security, self-sufficiency, and self-worth and be better able to invest in their family's future. With a richer understanding of economic or social barriers that inhibit labor market success and prosperity for some groups, we may better grasp how much these individuals can be helped by broad economic expansion and how much targeted intervention is required through other policy means. 
There is also an important connection between the economy's potential growth rate and equality of opportunity. Large disparities in opportunity based on race, ethnicity, gender, or geography mean that the enterprise, exertion, and investments of households and businesses from different groups are not rewarded commensurately. To the extent that disparities in income and wealth across race, ethnicity, gender, or geography reflect such disparities in opportunity, families and small businesses from the disadvantaged groups will then underinvest in education or business endeavors, and potential growth will fall short of the levels it might otherwise attain. Aside from reducing the long-run productive potential of the economy, persistently high levels of income and wealth inequality may also have implications for the robustness of consumer spending, which accounts for roughly two-thirds of aggregate spending in the United States. The gaps in household income and wealth between the richest and poorest households are at historically high levels, as income and wealth have increasingly accrued to the very richest households. For example, results from the latest Survey of Consumer Finances (SCF), which will be released soon, indicate that the share of income held by the top 1 percent of households reached 24 percent in 2015, up from 17 percent in 1988. The share of wealth held by the top 1 percent rose to 39 percent in 2016, up from 30 percent in 1989. Some research suggests that widening income and wealth inequality may damp consumer spending in the aggregate, as the wealthiest households are likely to save a much larger proportion of any additional income they earn relative to households in lower income groups that are likely to spend a higher proportion on goods and services. When we disaggregate the economy-wide labor market statistics, we find significant and persistent racial and ethnic disparities. In August, the national unemployment rate of 4.4 percent, which is low by historical standards, masked substantial differences across different demographic groups. As shown in figure 1, unemployment rates ranged from 3.9 percent for whites and 4 percent for Asians to notably higher rates for Hispanics and African Americans. Labor force participation rates, shown in figure 2, also differ substantially, although by less than unemployment rates, with the rate for African Americans lowest at 62.2 percent. These differences are not a recent development--similar differences across racial and ethnic dimensions have existed for as long as these data have been collected. Even more striking, a significant portion of the gaps in unemployment rates across racial and ethnic groups cannot be attributed to differences in their underlying characteristics, such as age and education levels. Although the differences in employment rates between racial and ethnic groups are still quite large, they have narrowed recently, after having widened considerably during the recession, and are near their lowest levels in decades. Differences in unemployment rates across racial and ethnic groups tend to widen sharply during recessions, as less advantaged groups shoulder an outsized share of total layoffs, and these differences shrink during recoveries. For example, in the second quarter of 2017, the unemployment rate for black adult men was a little more than 3 percentage points higher than for white adult men. This differential, while sizable, is nonetheless close to the smallest gap seen since comparable data became available in the mid-1970s. 
Differences in unemployment rates are similarly near historical lows for black women relative to white women, and for Hispanics relative to whites. Since racial disparities tend to get smaller throughout the course of an economic expansion, it seems likely that racial differences in unemployment rates will continue to shrink if the overall unemployment rate falls further. More broadly, the persistent disparities in employment outcomes are mirrored in significant and persistent racial and ethnic differences in families' income and wealth. According to forthcoming findings from the latest SCF and as shown in figure 3, the average income for white families in 2015 was about $123,000 per year, considerably higher than the averages for black and Hispanic families. Disparities in wealth, shown in figure 4, are even larger: Average wealth holdings for white families in 2016 far exceeded those of black families. Moreover, these racial and ethnic gaps in average family income and wealth have generally widened rather than narrowed over the past few decades. Based on SCF data, median family wealth has grown much more rapidly for white families than for other families over the past few decades, while median family incomes have risen by about the same amount for white, black, and Hispanic families. As the economic expansion continues and brings more Americans off the sidelines and into productive employment, it seems likely that the positive trends in employment and participation rates for historically disadvantaged groups will continue. That said, the benefits of a lengthy recovery can only go so far, as the research points to some barriers to labor market success for particular groups that appear to be structural. After controlling for sectoral and educational differences, the research suggests that these factors include discrimination as well as differences in access to quality education and informal social networks that may be an important source of information and support regarding employment opportunities. While the policy tools available to the Federal Reserve are not well suited to addressing the barriers that contribute to persistent disparities in labor market outcomes, understanding these barriers and efforts to address them is vital in assessing maximum employment as well as potential growth. The Federal Reserve System benefits not only from our engagement with research, statistics, and surveys, but also from our presence in communities all across America. This local presence, by design, provides valuable perspectives on how Americans in different communities are experiencing the economy and the varied challenges that lie beneath the aggregate numbers. While traveling around the country with our community development staff, I have been struck by the widening gulf between the economic fortunes of our large metropolitan areas and those of our small cities, towns, and rural areas. The statistics bear this out. Over the past 30 years, the convergence in income across regions of the country has slowed dramatically. Much of the gains in employment, income, and wealth since the end of the recession, and more broadly over the past few decades, have accrued to workers and families in larger cities. Since some workers and families may find it difficult to move, this concentration of economic opportunities in larger cities may have adverse implications for the well-being of these households and, potentially, the growth capacity of the economy as a whole. 
Although pockets of opportunity and poverty are found in large metropolitan and rural areas alike, a greater share of the new jobs and business establishments created during the recovery that followed the Great Recession has been in larger metro areas than was the case in previous recoveries. In countless rural towns and small cities we are seeing how a deep economic setback can leave a profound and long-lasting mark. These experiences challenge common assumptions about the ability of local economies to recover from a setback. This could be the legacy of the concentrated presence of an industry that experiences decline due to trade or technology, or it could be the byproduct of a lack of connectivity--whether by highways or broadband. Technological change, globalization, and other shifts in demand and costs are not new to the U.S. economy, but there are troubling signs that less diversified or connected localities have a diminished ability to adapt. And the evidence suggests that concentrated economic shocks and the associated labor market stress also have broader consequences for health and mortality. To provide some sense of the magnitudes, on average over the past year the unemployment rate for adults of prime working age (25 to 54) was about 1 percentage point higher in nonmetropolitan areas than in larger metro areas. But there is an even greater gap in labor force engagement, as can be seen in figure 5. The participation rate for prime-age adults in larger metro areas is currently nearly 3-1/2 percentage points above the participation rate for prime-age adults in nonmetro areas. Interestingly, the geographic participation rate gap between more and less populous areas is apparent for all races as well as, in recent years, for both men and women. This gap in labor force participation between large cities and other areas has widened substantially since just before the Great Recession: Since 2007, the participation rate for prime-age adults in nonmetro areas has fallen nearly 3 percentage points, as compared with less than 1 percentage point on net in larger metro areas. Indeed, since 2007, the large decline in labor force participation in small metro and rural areas can explain about 40 percent of the economy-wide decline in prime-age labor force participation, even though these areas account for only about 25 percent of the population. Before discussing possible contributors to this growing participation gap, it is important to emphasize that less populous areas appear to be falling behind in ways beyond these employment outcomes. Based on forthcoming SCF data, for example, the average annual income for families in metro areas was about $54,000 higher than for families in nonmetro areas, and the average wealth holdings for families in metro areas exceeded average wealth for families in nonmetro areas by nearly $500,000--and these gaps have more than doubled over the past three decades. The gaps in many other measures of well-being have widened as well. In small towns and rural areas, college attainment rates have increased by less, disability rates have increased by more, divorce rates have risen by more, and mortality rates due to lung disease, cancer, or cardiovascular disease have either improved by less or worsened by more. Opioid use is also most prevalent in less populous metro and rural areas. I have seen many of these challenges firsthand. 
In the small towns and hollers of eastern Kentucky, I visited with community development financial institutions that are trying to plug the gap in access to credit so that small businesses can continue operating and hiring locally, and so that families can access housing that is safe and affordable. In rural communities in the Mississippi Delta, I learned about diminished access to financial services available to rural residents, which can be a barrier to housing and business investment and pose vexing challenges to local governments. In Texas, I learned about barriers to economic development in the rural colonias areas on the southern border associated with underinvestment in physical and broadband infrastructure. As we consider the long-term health of the U.S. economy, it is important to better understand the decade-long decline in aggregate labor force participation. It is striking that in larger metro areas, the labor force participation rate for prime-age men has recently retraced much of the decline experienced during the recession, while in smaller metro and rural areas, the labor force participation rate remains well below its pre-recession level, with only modest improvements of late. The evidence increasingly suggests that much of the decline relates to a sustained decline in job opportunities for prime-age men, especially less-educated prime-age men, resulting in languishing wages relative to other groups. Indeed, it is notable that the striking decrease in labor force participation rates for nonmetro areas relative to large metro areas is highly concentrated among adults with no more than a high school education, who comprise a larger share of the prime-age population in nonmetro areas. The labor force participation rate for adults with no more than a high school education has fallen to 72 percent in nonmetro areas--about 3-1/2 percentage points below the rate in larger metro areas. Although the precise causes of this decline are still not fully settled, one contributing factor is advancing automation and computerization. Another contributor is globalization. For example, a growing body of research has identified a steeper decline in the employment and labor force attachment of prime-age men in areas of the country that specialized in the industries that were most negatively affected by increased imports from China. Research suggests that some of the decline in prime-age labor force participation relates to some individuals' reduced ability or desire to work, in some cases resulting directly from the ongoing decline in job opportunities. There are many reasons why some prime-age men may be less willing or able to work. One possibility is that the unusually long spells of nonemployment associated with the Great Recession may have eroded job skills and informal employment networks. Another possibility that is increasingly in focus is that physical disabilities, as well as sharp increases in opioid use, have increasingly inhibited some individuals from participating in the labor force. The fraction of prime-age men receiving disability insurance benefits has increased from 1 percent in the late 1970s to 3 percent more recently. Recent research also finds that among all prime-age men who are not in the labor force, about one-third reported having at least one disability, and nearly one-half reported taking pain medications daily. 
These supply-side explanations may be related to the drop in labor demand: the despair associated with diminished prospects of a stable and quality job may lead to substance abuse and related health or mortality concerns. At least some of these explanations potentially relate to the growing divide between large metro areas and other areas of the country. As noted earlier, the opioid epidemic appears to be particularly acute in smaller cities and rural areas. In addition, employment in nonmetro areas tends to be more concentrated in manufacturing, which is the sector that has experienced the largest decline in employment from automation and globalization. Similarly, research suggests that workers in less populous areas have been more likely to be directly affected by increased import competition from China due to the geographical distribution of industries. And for many less populous areas, job opportunities are less diverse than in bigger cities, so that when a plant shuts down, there are fewer local alternative job opportunities for unemployed workers, especially with comparable levels of employment security or benefits. These striking results naturally raise the question of whether we are seeing heightened migration from the less populous areas to the larger metros with greater economic opportunity. A conventional assumption in economics is that regional differences should narrow over time as workers move toward areas where jobs are more plentiful and wages are higher. In reality, Americans' propensity to move is currently at its lowest level in many decades. In 2016, the fraction of the population that had moved within the United States in the past year was 11 percent, down from 17 percent or more in the early 1980s, with the steepest decline in the fraction of people moving longer distances, across county or state lines. The evidence suggests that the decline in geographic mobility cannot be fully explained by population aging, by the housing boom and bust, by changes in the composition of industries, by the increasing ease of telecommuting from longer distances, or by the rise in dual-earner households, which may make work-related relocation more difficult. Some of the decline may be related to changes in the labor market, perhaps because workers are more likely to perceive that job opportunities are no better elsewhere, and consequently that the labor market returns to switching jobs or locations--in terms of better wages or higher job quality--have declined. Also, zoning requirements may be boosting housing costs in cities where job opportunities are most abundant, such as San Francisco, pricing out many potential workers and inhibiting migration. Whatever the reason, the fact that families are less likely to move now than in the past suggests that many of those in less populous areas are not able to access the economic opportunity present in denser and more diversified large metropolitan areas at a time when the gap in labor market outcomes for larger metros relative to other areas continues to grow. The Federal Reserve is deeply engaged in understanding disparities through our data collection, research collaboration, and community development work. One way we seek to obtain a clearer picture is by collecting data ourselves. For instance, some of the data I have cited today come from the Federal Reserve's triennial Survey of Consumer Finances, which provides detailed information on income and wealth holdings by demographic groups. 
The Survey of Household Economics and Decisionmaking provides a portrait of household finances, employment, housing, and debt; the Survey of Young Workers provides insights into younger adults' employment experiences soon after entering the labor force; and the Enterprising and Informal Work Activities Survey provides information about income-generating activities that are often outside the scope of other employment and income surveys. Across the Federal Reserve System, a variety of initiatives are aimed at understanding economic disparities and how to foster more-inclusive growth. One such initiative brings together researchers from a variety of fields to analyze barriers to economic opportunity and advancement. The Economic Growth and Mobility Project at the Federal Reserve Bank of Philadelphia aims to bring together researchers with community stakeholders to focus on differences in poverty and economic mobility across demographic characteristics. The Investing in America's Workforce Initiative is a collaboration between the Federal Reserve System and academic research institutions to promote investment in workforce skills that better align with employers' needs. All of that brings me to today's conference, which I am confident will make an important contribution to this mission. I am heartened to see so many researchers and practitioners from a variety of backgrounds focused on these important issues. This conference is part of our efforts to hear from experts with diverse backgrounds and perspectives to better understand the nature and implications of labor market disparities. A deeper understanding of labor market disparities is central to the mission of the Federal Reserve because it may help us better assess full employment, where resources may be underutilized, and the likely evolution of the labor market and overall economic activity. We look forward to hearing what you have to say about these important questions and learning what other questions are in need of attention.
r170926a_FOMC
united states
2017-09-26T00:00:00
Inflation, Uncertainty, and Monetary Policy
yellen
1
I would like to thank the National Association for Business Economics for inviting me to speak today and for the vital role the association plays in fostering debate on important economic policy questions. Today I will discuss uncertainty and monetary policy, particularly as it relates to recent inflation developments. Because changes in interest rates influence economic activity and inflation with a substantial lag, the Federal Open Market Committee (FOMC) sets monetary policy with an eye to its effects on the outlook for the economy. But the outlook is subject to considerable uncertainty from multiple sources, and dealing with these uncertainties is an important feature of policymaking. Key among current uncertainties are the forces driving inflation, which has remained low in recent years despite substantial improvement in labor market conditions. As I will discuss, this low inflation likely reflects factors whose influence should fade over time. But as I will also discuss, many uncertainties attend this assessment, and downward pressures on inflation could prove to be unexpectedly persistent. My colleagues and I may have misjudged the strength of the labor market, the degree to which longer-run inflation expectations are consistent with our inflation objective, or even the fundamental forces driving inflation. In interpreting incoming data, we will need to stay alert to these possibilities and, in light of incoming information, adjust our views about inflation, the overall economy, and the stance of monetary policy best suited to promoting maximum employment and price stability. Let me begin by reviewing recent inflation developments and the economic outlook. As the solid blue line in figure 1 indicates, inflation as measured by the price index for personal consumption expenditures (PCE) has generally run below the FOMC's 2 percent longer-run objective since that goal was announced in January 2012. Core inflation, which strips out volatile food and energy prices, has also fallen persistently short of 2 percent (the red dashed line). Furthermore, both overall and core inflation, after moving up appreciably last year, have slipped again in recent months. Sustained low inflation such as this is undesirable because, among other things, it generally leads to low settings of the federal funds rate in normal times, thereby providing less scope to ease monetary policy to fight recessions. In addition, a persistent undershoot of our stated 2 percent goal could undermine the FOMC's credibility, causing inflation expectations to drift and actual inflation and economic activity to become more volatile. As noted in its recent statement, the FOMC continues to anticipate that, with gradual adjustments in the stance of monetary policy, inflation will rise and stabilize at around 2 percent over the medium term. This expectation is illustrated by the green stars, which represent the medians of the inflation projections submitted by FOMC participants at our meeting last week. In part, this expectation reflects the significant improvement in labor market conditions over the past few years. As shown in figure 2, the unemployment rate (the blue line) now stands at 4.4 percent, somewhat below the median of FOMC participants' estimates of its longer-run sustainable level (the black line). As the green stars indicate, labor market conditions are expected to strengthen a bit further.
The inflation outlook also reflects the Committee's judgment that inflation expectations will remain reasonably well anchored at a level consistent with PCE price inflation of 2 percent in the long run, and that the restraint imposed in recent years by a variety of special factors, including movements in the relative prices of food, energy, and imports, will wane in coming quarters. To understand this assessment, it is useful to decompose the forces driving movements in inflation since the financial crisis, as estimated using a simple model of inflation that I presented in a speech two years ago. Figure 3 reports this decomposition as the contributions made by various factors to the shortfall of PCE price inflation from 2 percent, year by year. As illustrated by the purple dotted portion of the bars, labor underutilization, or "slack," accounts for a shrinking share of the shortfall since 2012 and is now having a negligible effect. By comparison, the influence of changes in relative food, energy, and import prices--the solid blue and checkered red portions--has been more substantial in the past few years, although their contribution is estimated to have greatly diminished this year. Not surprisingly, the simple model does not account for all of the year-to-year movements in inflation. As indicated by the green striped portion of the bars, the residual component of the shortfall was modestly positive on average from 2008 through last year. This year, however, inflation has been unexpectedly weak from the model's perspective. This unusually large error does not necessarily imply that inflation is more likely to continue to come in on the low side in coming years. Some of the recent decline in inflation, although not all, reflects idiosyncratic shifts in the prices of some items, such as the large decline in telecommunication service prices seen earlier in the year, that are unlikely to be repeated. As the green dashed line in figure 4 illustrates, if the average change in consumer prices each month is calculated excluding items whose price changes are outliers on both the high and low side, the resulting "trimmed mean" measure of inflation shows less of a slowdown this year. Based on analyses of this sort, my colleagues and I currently think that this year's low inflation is probably temporary, so we continue to anticipate that inflation is likely to stabilize around 2 percent over the next few years. But our understanding of the forces driving inflation is imperfect, and we recognize that something more persistent may be responsible for the current undershooting of our longer-run objective. Accordingly, we will monitor incoming data closely and stand ready to modify our views based on what we learn. Although we judge that inflation will most likely stabilize around 2 percent over the next few years, the odds that it could turn out to be noticeably different are considerable. This point is illustrated by figure 5. Here the red line indicates the median of the latest inflation projections submitted by FOMC participants that I showed previously. The pertinent feature of this figure is the blue shaded region around the red line, which shows a 70 percent confidence interval around FOMC participants' median outlook. The width of this region reflects the average accuracy of inflation projections made by private and government forecasters over the past 20 years. 
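To make the confidence band described above more concrete, the short Python sketch below treats the historical root mean squared error of one-year-ahead inflation forecasts as the standard deviation of a normal distribution centered on the median projection. The numerical inputs (a 2 percent median and a 0.95 percentage point forecast error) are hypothetical placeholders rather than the figures underlying the speech; the point is only to show how a band of roughly that width implies a sizable probability of inflation outcomes above 3 percent or below 1 percent.

import math

def normal_cdf(x, mu, sigma):
    # Cumulative distribution of a normal random variable with mean mu and std sigma.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

median_projection = 2.0   # hypothetical median inflation projection, percent
rmse = 0.95               # hypothetical RMSE of past one-year-ahead forecasts, pct. points

# Under normality, a 70 percent central interval spans about +/- 1.04 standard deviations.
z70 = 1.036
lower = median_projection - z70 * rmse
upper = median_projection + z70 * rmse
print(f"Approximate 70% interval: {lower:.1f}% to {upper:.1f}%")

# Probability that inflation falls below 1 percent or above 3 percent.
p_tails = normal_cdf(1.0, median_projection, rmse) + (1.0 - normal_cdf(3.0, median_projection, rmse))
print(f"Probability outside the 1-to-3 percent range: {p_tails:.0%}")

With these placeholder inputs, the tail probability comes out near 30 percent, which is the order of magnitude discussed in the text.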
As the figure shows, based on that history, there is a 30 percent probability that inflation could be greater than 3 percent or less than 1 percent next year. Most of this uncertainty reflects the influence of unexpected movements in oil prices and the foreign exchange value of the dollar, as well as that of idiosyncratic developments unrelated to broader economic conditions. These factors could easily push overall inflation noticeably above or below 2 percent for a time. But such disturbances are not a great concern from a policy perspective because their effects fade away as long as inflation expectations remain anchored. For this reason, the FOMC strives to look through these transitory inflation effects when setting monetary policy. Such was the case when rising oil prices pushed headline inflation noticeably above 2 percent for several years prior to the financial crisis. Similarly, the Committee substantially discounted the reductions in inflation that occurred from 2014 through 2016 as a result of the decline in oil prices and the effects of the dollar's appreciation on import prices. A more important issue from a policy standpoint is that some key assumptions underlying the baseline outlook could be wrong in ways that imply that inflation will remain low for longer than currently projected. For example, labor market conditions may not be as tight as they appear to be, and thus they may exert less upward pressure on inflation than anticipated. Alternatively, long-run inflation expectations, which have an important influence on actual inflation, may not be consistent with the FOMC's 2 percent goal. More broadly, the conventional framework for understanding inflation dynamics could be misspecified in some fundamental way. Let's now consider each of these possibilities in turn. The unemployment rate consistent with long-run price stability at any time is not known with certainty; we can only estimate it. The median of the longer-run unemployment rate projections submitted by FOMC participants last week is around 4-1/2 percent. But the long-run sustainable unemployment rate can drift over time because of demographic changes and other factors, some of which can be difficult to quantify--or even identify--in real time. For these and other reasons, the statistical precision of such estimates is limited, and the actual value of the sustainable rate could well be noticeably lower than currently projected. Thus, although FOMC participants generally view current labor utilization as probably somewhat greater than what can be sustained in the longer run, the statistical evidence from past experience does not rule out the possibility that some slack still remains in the labor market. If so, the economy could sustain a higher level of employment and output in the longer run than now anticipated--a very beneficial outcome, albeit one that would require recalibrating monetary policy over time in order to reap those benefits and compensate for the accompanying reduction in inflationary pressures. A related question is whether the unemployment rate alone is an adequate gauge of economic slack for the purposes of explaining inflation. Although the unemployment rate is probably the best single summary measure of labor utilization, some indicators have shown less improvement since the financial crisis. As the solid blue line in figure 6 illustrates, the employed share of the "prime-age worker" population--that is, persons from ages 25 to 54--remains noticeably below the 2007 level. 
But employment rates for this group may now be permanently lower than in the past as a result of declining employment opportunities for less-skilled workers, a rising number of people receiving disability insurance, and other worrisome trends. Similarly, although the share of part- time workers who would like a full-time job is still somewhat above where it stood before the last two recessions (the dashed red line), it could reflect a structural change in firms' reliance on part-time labor. In addition, these two measures have to be weighed against other labor indicators that have either returned to, or are currently above, their pre-recession levels. As shown in figure 7, those indicators include the quits rate (the short-dashed blue line), household perceptions of job availability (the short-and-long- dashed green line), the jobs opening rate (the long-dashed red line), and the percentage of small firms finding it hard to fill jobs (the solid black line). On balance, the unemployment rate probably is correct in signaling that overall labor market conditions have returned to pre-crisis levels. But that return does not necessarily demonstrate that the economy is now at maximum employment because, due to demographic and other structural changes, the unemployment rate that is sustainable today may be lower than the rate that was sustainable in the past. In that regard, some observers have pointed to the continued subdued pace of wage growth as evidence that the economy is not yet back to full employment. As shown in figure 8, labor compensation as measured by the employment cost index (the short- dashed red line) has been growing at more or less the same rate since 2014, and hourly compensation in the nonfarm business sector (the short-and-long-dashed green line)--a quite noisy measure, even after smoothing--is actually growing more slowly. But growth in average hourly earnings (the solid blue line) and the Atlanta Fed's Wage Growth Tracker (the long-dashed black line) have clearly picked up. In addition, productivity growth has been quite weak in recent years, and empirical analysis suggests that it is has been holding down aggregate growth in labor compensation independent of labor utilization in recent years. An analysis of the pattern of wage growth at the U.S. state level also suggests that subdued growth for the country as a whole probably reflects sluggish productivity or some other factor common to all states, because cross-state differences in wage growth are about what one would expect given cross-state differences in unemployment rates. Finally, I would note that the percentage of firms planning wage increases has moved back up to its pre-recession level, many firms report difficulties in finding qualified workers, and some have responded by expanding training programs and offering signing bonuses--possible harbingers of stronger wage gains to come. Overall, I view the data we have in hand as suggesting a generally healthy labor market, not one in which substantial slack remains or one that is overheated. That said, the evidence does not allow for any definitive assessment, so policymakers must remain open minded on this question and its implications for reaching our inflation goal. Another source of uncertainty concerns inflation expectations. 
In standard economic models, inflation expectations are an important determinant of actual inflation because, in deciding how much to adjust wages for individual jobs and prices of goods and services at a particular time, firms take into account the rate of overall inflation they expect to prevail in the future. Monetary policy presumably plays a key role in shaping these expectations by influencing the average rate of inflation experienced in the past over long periods of time, as well as by providing guidance about the FOMC's objectives for inflation in the future. Even so, economists' understanding of exactly how and why inflation expectations change over time is limited. Moreover, we have to contend with the fact that we do not directly observe the inflation expectations relevant to wage and price setting. Instead, we can only imperfectly infer how they might have changed based on survey responses and other data. The FOMC's outlook depends importantly on the view that longer-run inflation expectations have been stable for many years at a level consistent with PCE price inflation that will average around 2 percent in the longer run. Provided this stability continues, standard models suggest that actual inflation should stabilize at about 2 percent over the next two or three years in an environment of roughly full employment, absent any future shocks. However, there is a risk that inflation expectations may not be as well anchored as they appear and perhaps are not consistent with our 2 percent goal. To assess this risk, the FOMC considers a variety of survey measures of expected longer-run inflation, some of which are shown in figure 9. Long-range projections of PCE price inflation made by private forecasters, the solid red line, have been remarkably stable for many years, as have been the longer-run inflation expectations reported in surveys of financial market participants (not shown). Households' longer-term expectations as reported in the University of Michigan Surveys of Consumers, the short-dashed blue line, have also been fairly stable overall since the late 1990s. That said, results from this survey, as well as a survey of consumers carried out by the Federal Reserve Bank of New York, do hint that expectations may have slipped a bit over the past two or three years. If so, stabilizing inflation at around 2 percent could prove to be more difficult than expected. In theory, differences between yields on conventional Treasury securities and Treasury Inflation-Protected Securities (TIPS) provide information about inflation expectations in that they measure the compensation received by investors for exposing themselves to future changes in consumer prices. As indicated by the long-dashed green line, TIPS inflation compensation for the five-year period starting five years from now has fallen roughly 1 percentage point over the past three years. This decline could be interpreted as a significant drop in market participants' expectations for the most likely outcome for inflation in the longer run. However, research suggests that the fall in TIPS inflation compensation instead primarily reflects a decline in inflation risk premiums and differences in the liquidity of nominal and indexed Treasury securities. This research notwithstanding, the notable decline in inflation compensation may be a sign that longer-term inflation expectations have slipped recently.
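To illustrate the mechanics behind the TIPS-based measure just discussed, the Python sketch below computes breakeven inflation compensation as the gap between nominal and indexed yields and then backs out the five-year, five-year-forward measure with a simple averaging approximation. The yield figures are hypothetical placeholders, not data cited in the speech, and the approximation ignores coupon and convexity effects as well as the risk-premium and liquidity adjustments noted in the text.

# Illustrative calculation of TIPS-based inflation compensation.
# All yields are hypothetical placeholders, in percent per year.
nominal_5y, nominal_10y = 1.90, 2.30   # conventional Treasury yields
tips_5y, tips_10y = 0.20, 0.45         # TIPS (real) yields

# Breakeven inflation compensation: nominal yield minus real yield.
breakeven_5y = nominal_5y - tips_5y
breakeven_10y = nominal_10y - tips_10y

# Approximate forward compensation for years 5 through 10:
# the 10-year average is roughly the mean of the two 5-year spans.
forward_5y5y = 2 * breakeven_10y - breakeven_5y

print(f"5-year breakeven compensation:   {breakeven_5y:.2f}%")
print(f"10-year breakeven compensation:  {breakeven_10y:.2f}%")
print(f"5-year, 5-year-forward measure:  {forward_5y5y:.2f}%")

As the surrounding discussion emphasizes, this quantity measures compensation rather than pure expectations, so movements in it can reflect changing risk premiums or liquidity conditions as well as shifts in expected inflation.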
Another risk is that our framework for understanding inflation dynamics could be misspecified in some fundamental way, perhaps because our econometric models overlook some factor that will restrain inflation in coming years despite solid labor market conditions. One possibility in this vein is a continuation of the subdued growth in health-care prices that we have seen in recent years--a sector-specific factor not controlled for in standard models. Because health care accounts for a large share of total consumer spending, this slow growth has restrained overall inflation materially and may continue to do so for some time. A similar situation occurred during the 1990s, when a significant shift in health insurance enrollment away from fee-for-service and toward HMO (that is, Health Maintenance Organization) plans reduced cost pressures and held down overall inflation for several years. If these sorts of favorable supply-type shocks continue, achieving our 2 percent inflation goal over the medium term may require a more accommodative stance of monetary policy than might otherwise be appropriate. Some commentators have conjectured that, because of rising trade volumes and the integration of production chains across countries, U.S. inflation now depends on global resource utilization, not just on conditions here at home and those effects arising through movements in energy and import prices. However, studies of this issue do not, on balance, provide much empirical support for this possibility. Moreover, foreign economic growth has firmed this year and the global economy appears to have largely recovered, so any influence that global resource utilization might have on U.S. inflation would presumably be small. Nevertheless, increased competition from the integration of China and other emerging market countries into the world economy may have materially restrained price margins and labor compensation in the United States and other advanced economies. In fact, one study concludes that most of the decline in the labor share of national income in the United States since the late 1980s can be attributed to offshoring of labor-intensive production. If this restraint on the labor share continues to build over the next few years (and not merely holds steady), then it could indirectly hold down the growth of domestic wages and prices in ways not captured by conventional models. More speculatively, changes in the structure of the domestic economy may also be altering inflation dynamics in ways not captured by conventional models. The growing importance of online shopping, by increasing the competitiveness of the U.S. retail sector, may have reduced price margins and restrained the ability of firms to raise prices in response to rising demand. That said, the economy overall appears to have become more concentrated and less dynamic in recent years, which may tend to increase firms' pricing power. Because these changes occur slowly, determining their complex effects on the economy will, as a practical matter, require studying data over a considerable time. Finally, I would note the possibility that inflation may rise more sharply in response to robust labor market conditions than anticipated. The influence of labor utilization on inflation has become quite modest over the past 20 years, implying that the inflationary consequences of misjudging the sustainable rate of unemployment are low. 
But we cannot be sure that this modest sensitivity will persist in the face of strong labor market conditions, given that we do not fully understand how it came to be so modest in the first place. Although the evidence is weak that inflation responds in a nonlinear manner to resource utilization, this risk is one that we cannot entirely dismiss. What are the policy implications of these uncertainties? For one, my colleagues and I must be ready to adjust our assessments of economic conditions and the outlook when new data warrant it. In this spirit, FOMC participants--like private forecasters-- have reduced their estimates of the sustainable unemployment rate appreciably over the past few years in response to the continual flow of information about the always changing economy. To the extent these assessments change over time, so too will the outlook and judgments about the appropriate stance of monetary policy. Importantly, even if resource utilization is currently lower than we estimate or if longer-run inflation expectations are running at levels consistent with longer-run PCE price inflation somewhat below 2 percent, the FOMC can still achieve its inflation goal. Under those conditions, continuing to revise our assessments in response to incoming data would naturally result in a policy path that is somewhat easier than that now anticipated--an appropriate course correction that would reflect our commitment to maximum employment and price stability. Similar considerations apply to other important sources of uncertainty, such as the value of the neutral real interest rate--that is, the inflation-adjusted level of the federal funds rate consistent with keeping the economy operating on an even keel. Estimates of this rate have declined considerably in recent years, and, by some estimates, the real neutral rate is currently close to zero. But the neutral rate changes over time as a result of the interaction of many forces, including demographics, productivity growth, fiscal policy, and the strength of global demand, so its value at any point in time cannot be estimated or projected with much precision. My FOMC colleagues and I will therefore need to continue to reassess and revise our assessments of the neutral rate in response to incoming data and adjust monetary policy accordingly. How should policy be formulated in the face of such significant uncertainties? In my view, it strengthens the case for a gradual pace of adjustments. Moving too quickly risks overadjusting policy to head off projected developments that may not come to pass. A gradual approach is particularly appropriate in light of subdued inflation and a low neutral real interest rate, which imply that the FOMC will have only limited scope to cut the federal funds rate should the economy be hit with an adverse shock. But we should also be wary of moving too gradually. Job gains continue to run well ahead of the longer-run pace we estimate would be sufficient, on average, to provide jobs for new entrants to the labor force. Thus, without further modest increases in the federal funds rate over time, there is a risk that the labor market could eventually become overheated, potentially creating an inflationary problem down the road that might be difficult to overcome without triggering a recession. Persistently easy monetary policy might also eventually lead to increased leverage and other developments, with adverse implications for financial stability. 
For these reasons, and given that monetary policy affects economic activity and inflation with a substantial lag, it would be imprudent to keep monetary policy on hold until inflation is back to 2 percent. To conclude, standard empirical analyses support the FOMC's outlook that, with gradual adjustments in monetary policy, inflation will stabilize at around the FOMC's 2 percent objective over the next few years, accompanied by some further strengthening in labor market conditions. But the outlook is uncertain, reflecting, among other things, the inherent imprecision in our estimates of labor utilization, inflation expectations, and other factors. As a result, we will need to carefully monitor the incoming data and, as warranted, adjust our assessments of the outlook and the appropriate stance of monetary policy. But in making these adjustments, our longer-run objectives will remain unchanged--to promote maximum employment and 2 percent inflation. The inflation model used in the decomposition procedure includes two equations: an identity for the change in the price index for total personal consumption expenditures (PCE) and a simple reduced-form forecasting equation for core PCE price inflation. The identity is $\pi_t = \pi_t^{C} + \omega_t^{E}\pi_t^{E} + \omega_t^{F}\pi_t^{F}$, where $\pi_t$ and $\pi_t^{C}$ denote growth rates (expressed as annualized log differences) of total and core PCE prices, respectively. $\pi_t^{E}$ and $\pi_t^{F}$ are annualized growth rates for prices of consumer energy goods and services and prices of food and beverages, both expressed relative to core PCE prices, and $\omega_t^{E}$ and $\omega_t^{F}$ are the weights of energy and food in total consumption. The core inflation forecasting equation is of the form $\pi_t^{C} = \mu\pi_t^{e} + (1-\mu)\pi_{t-1}^{C} + \beta u_t + \gamma m_t + \varepsilon_t$, where $\pi_t^{e}$ is expected long-run inflation; $u_t$ denotes the level of resource utilization; $m_t$ controls for the effect of changes in the relative price of core imported goods; $\varepsilon_t$ is a white-noise error term; and the coefficients are ordinary least squares estimates. For estimation purposes, $u_t$ is approximated using the unemployment rate less the Congressional Budget Office's (CBO) historical series for the long-run natural rate. From 2007 to the present, $\pi_t^{e}$ is approximated using median survey forecasts of long-run PCE inflation; from the early 1990s to 2006:Q4, the series is based on the median long-run forecasts of inflation as measured by the consumer price index (CPI), less a constant adjustment of 40 basis points to put them on a PCE basis; and before then, $\pi_t^{e}$ is approximated by the long-run inflation expectations reported in the Hoey survey. The relative import price term, $m_t$, is defined as the annualized growth rate of the price index for core imported goods (defined to exclude petroleum, natural gas, computers, and semiconductors) less the lagged four-quarter change in core PCE inflation, all multiplied by the share of nominal core imported goods in nominal GDP. To decompose recent movements in inflation into its various components, the series used in the inflation model--for which complete quarterly data are available only through 2017:Q2 in most cases--are first extended through the end of 2017. In the case of inflation, the extensions are consistent with the medians of Federal Open Market Committee (FOMC) participants' projections for total and core PCE inflation in 2017 that were reported at the press conference following the September 2017 FOMC meeting. Resource utilization over the second half of 2017 is defined to be consistent with the median of FOMC projections for the 2017:Q4 unemployment rate less the CBO's estimates of the historical path of the long-run natural rate.
The CBO's 2017 estimate is slightly higher than the median of FOMC participants' most recent projections of the normal longer-run level of the unemployment rate. For changes in the price of core imports, the 2017:H2 extrapolations are based on a regression of this series on current and lagged changes in exchange rates. This approach predicts that core import prices should rise about 4-1/2 percent at an annual rate in the second half of this year. Energy and food prices over the second half of 2017 are assumed to rise at annual rates of 10 percent and 1.6 percent, respectively; these assumptions (which take into account published monthly PCE data through July and published CPI data through August, as well as recent movements in gasoline prices in the wake of Hurricanes Harvey and Irma) ensure that the combined contribution of food and energy prices to inflation in 2017 is consistent with the median difference between FOMC participants' projections for total and core inflation. Finally, nominal spending shares for food, energy, and core imports are assumed to remain unchanged at their 2017:Q2 levels, and long-run inflation expectations are assumed to remain constant at 2 percent. After computing historical tracking errors for the two equations of the model, the final step in the decomposition procedure is to run a sequence of counterfactual simulations in which each explanatory variable of the model is, in turn, set to zero and the model is simulated; the resulting difference between actual inflation and its simulated value equals the historical contribution of that particular factor. Importantly, the simulations are all dynamic in that the lagged inflation term in the core inflation equation is set equal to its simulated value in the preceding period rather than its actual value. As a result, the decompositions incorporate the effects of changes in lagged inflation that are attributable to previous movements in the explanatory variables. The estimated equation for the employment cost index (ECI) is of the form $w_t = c + \mu\pi_t^{e} + \beta_1 u_t + \beta_2 \Delta u_t + \gamma \bar{q}_t + \eta_t$, where $w_t$ is the annualized log difference of the ECI for hourly compensation of private industry workers; $\pi_t^{e}$ and $u_t$ have the same definition as the corresponding variables from the PCE price inflation model; $\Delta u_t$ denotes the first difference of $u_t$; $\bar{q}_t$ is a moving average of an estimate of trend productivity growth for the business sector; and $\eta_t$ is an error term. The coefficients are obtained from a restricted least squares regression. Trend productivity growth is estimated as the low-frequency component of the annualized log difference of business-sector output per hour from the Bureau of Labor Statistics Productivity and Costs report. The moving average of trend productivity growth (which is used in the estimation) is computed as a geometrically declining weighted average, $\bar{q}_t = \lambda q_t + (1-\lambda)\bar{q}_{t-1}$, where $q_t$ denotes trend productivity growth and where the moving average is initialized in 1955:Q1 with that quarter's estimate of the trend growth rate. The model is used to compute a decomposition of ECI growth following a procedure similar to that used to construct the decomposition for core PCE price inflation. The table summarizes the results of this decomposition over various periods; note that the column labeled "Slack" combines the effects of $u_t$ and $\Delta u_t$, the effect of the model's constant term is included in the column labeled "Trend productivity," and the column labeled "Other" gives the contributions of the model's tracking errors.
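The counterfactual decomposition procedure described in this note can be illustrated with a stylized Python sketch: core inflation follows a reduced-form equation in expected inflation, its own lag, slack, and relative import prices, and each factor's contribution is measured by re-simulating the model dynamically with that factor set to zero. The coefficients and data paths below are illustrative placeholders, not the estimates or series underlying the figures in the speech.

import numpy as np

# Placeholder coefficients for the reduced-form core inflation equation:
# pi_core[t] = mu*pi_exp[t] + (1 - mu)*pi_core[t-1] + beta*slack[t] + gamma*rel_import[t] + resid[t]
mu, beta, gamma = 0.25, -0.1, 1.0

def simulate(pi_exp, slack, rel_import, resid, pi_core0):
    # Dynamically simulate core inflation given paths for each driver,
    # using the simulated (not actual) lag of inflation each period.
    path = []
    lag = pi_core0
    for t in range(len(slack)):
        pi = mu * pi_exp[t] + (1 - mu) * lag + beta * slack[t] + gamma * rel_import[t] + resid[t]
        path.append(pi)
        lag = pi
    return np.array(path)

# Placeholder annual data for a handful of years.
pi_exp     = np.array([2.0, 2.0, 2.0, 2.0])      # expected long-run inflation
slack      = np.array([3.0, 2.0, 1.0, 0.2])      # unemployment gap, percentage points
rel_import = np.array([0.0, -0.3, -0.2, 0.0])    # weighted relative import price changes
resid      = np.array([0.1, -0.1, 0.0, -0.3])    # tracking errors

baseline = simulate(pi_exp, slack, rel_import, resid, pi_core0=1.8)

# Contribution of each factor = baseline minus a counterfactual with that factor zeroed out.
zeros = np.zeros_like(slack)
contrib_slack  = baseline - simulate(pi_exp, zeros, rel_import, resid, 1.8)
contrib_import = baseline - simulate(pi_exp, slack, zeros, resid, 1.8)
contrib_resid  = baseline - simulate(pi_exp, slack, rel_import, zeros, 1.8)

for name, contrib in [("slack", contrib_slack), ("import prices", contrib_import), ("residual", contrib_resid)]:
    print(name, np.round(contrib, 2))

Because the lagged inflation term is taken from the simulated path rather than the data, each factor's contribution also captures the effects it transmits through lagged inflation, which is the feature of the dynamic simulations emphasized in the note.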
r170927a_FOMC
united states
2017-09-27T00:00:00
Labor Market Disparities and Economic Performance
brainard
0
I would like to thank President George and her staff at the Federal Reserve Bank of Kansas City for inviting me to participate in today's conference, which is a joint effort across the Federal Reserve System. As you may know, this is the second minority bankers forum hosted by the Federal Reserve System. Diversity and inclusion are important to the strength of the banking industry, and therefore I am pleased that the Federal Reserve System is sponsoring this conference. Among other benefits, diversity and inclusion strengthen organizations by improving deliberations and decisionmaking. Banks with more diverse workforces are also better able to reach broader groups of customers, especially customers who have historically been underbanked, when they show that serving those customers is a top priority. One of the most effective ways of showing a commitment to a diverse customer base is showing a commitment to a diverse workforce. We now have substantial empirical evidence documenting the benefits of diversity in broadening the range of ideas and perspectives that are brought to bear on solving problems, and thereby contributing to better outcomes, in research, policy, and business. Studies suggest that increased diversity alters group dynamics and decisionmaking in positive ways. Microeconomic experiments and other research have confirmed these ideas. One experiment found that greater racial diversity helped groups of business students outperform other students in solving problems. And another found similar benefits from gender diversity. A study of 2.5 million research papers across the sciences found that those written by ethnically diverse research teams received more citations and had a greater influence than papers by authors with the same ethnicity. As a bank regulator, the Federal Reserve has an interest in promoting diversity in the financial services industry. For example, the Federal Reserve Bank of Chicago is actively engaged in developing diverse talent in the financial services industry as a key partner in an initiative of about 20 financial services organizations that are working together to increase the representation of Latinos and African Americans, at all levels, within the Chicago-area financial services industry. The members of this initiative recognize they have a collective interest in developing a diverse pipeline of talent. They are more likely to be successful in achieving their internal pipeline goals if they succeed in improving diversity and inclusion in the finance workforce more broadly. Another example of our work to promote diversity at banking institutions is the set of standards developed by the Board's Office of Minority and Women Inclusion to promote transparency and awareness of diversity policies and practices within banking institutions. The Board encourages banking institutions not only to provide their policies, practices, and self-assessment information to the Board, but also to disclose this information to the public. We hope that this new self-assessment tool will help individual banks achieve their own diversity and inclusion goals and contribute to greater diversity in the banking industry more generally. I also want to take this opportunity to update you on what we are doing to support minority depository institutions (MDIs). Since some of you work for MDIs, I want to emphasize that the Federal Reserve recognizes their vital role in serving low- and moderate-income and minority communities as well as bringing diversity to the field of banking.
I make a point of meeting with the leadership of MDIs as I travel around the country so I can hear firsthand about the challenges these institutions face and the important work they are doing to provide financial services to minority and historically underserved populations. I have heard from CEOs of MDIs about how challenging it can be to simultaneously serve their target market and be a bank of choice in a competitive banking market. Some have managed to thread that needle successfully by remaining firmly rooted and accessible in the communities they serve as well as maintaining their focus on trust, loyalty, and personalized relationship banking, which is a particular strength and competitive advantage for MDIs. The Federal Reserve has developed a national outreach program called the Partnership for Progress to assist MDIs in confronting business model challenges, cultivating safe banking practices, and competing more efficiently. Recently, the Board doubled its resources for this program to better support MDIs. We brought to bear the resources of our community development function, which promotes economic growth and financial stability in lower-income communities. Combining the resources of our banking and supervision staff with our community development staff allows us to be more creative in supporting MDIs around the country. Recognizing that there has been a dearth of MDI research in recent decades, last year, for the first time, we commissioned two external researchers to develop papers on MDIs. The resulting new papers, along with another written by economists at the Federal Reserve Bank of Chicago, were presented at the interagency biennial MDI conference this past April in Los Angeles. We are in the process of commissioning research for next year in an effort to deepen our understanding of these unique institutions. Turning to the workforce more broadly, I want to spend some of my time today talking about disparities and share some findings from a conference we hosted yesterday on labor market disparities. In fulfilling its dual mandate, the Federal Open Market Committee (FOMC) has set a target of 2 percent for inflation but does not have a similarly fixed numerical goal for maximum employment. That is because the level of maximum employment depends on "nonmonetary factors that affect the structure and dynamics of the labor market," which "may change over time and may not be directly measurable." Understanding how close the labor market is to our full-employment goal requires consulting a variety of evidence along with a healthy dose of judgment. This approach to maximum employment has allowed the FOMC to navigate the current expansion in a way that has likely brought more people back into productive employment than might have been the case with a fixed unemployment rate target based on pre-crisis standards. The Federal Reserve is also keenly interested in disparities in employment, labor force participation, income, and wealth because they may have implications for the growth capacity of the economy. When we consider appropriate monetary policy, we need to have a good sense of how fast the economy can grow without fueling excessive price inflation.
At a time when the retirement of the baby-boom generation looks likely to be something of a drag on the growth of the labor force, it is especially important to consider whether relatively low levels of employment and labor force participation for some prime working-age groups represent slack that, if successfully tapped, could increase the labor force and boost economic activity. More broadly, when a person who was previously unemployed or discouraged secures a job, not only does it boost the economy, but that person also may gain a greater sense of economic security, self-sufficiency, and self-worth and be better able to invest in their family's future. With a richer understanding of economic or social barriers that inhibit labor market success and prosperity for some groups, we may better grasp how much these individuals can be helped by broad economic expansion and how much targeted intervention is required through other policy means. There is also an important connection between the economy's potential growth rate and equality of opportunity. Large disparities in opportunity based on race, ethnicity, gender, or geography mean that the enterprise, exertion, and investments of households and businesses from different groups are not rewarded commensurately. To the extent that disparities in income and wealth across race, ethnicity, gender, or geography reflect such disparities in opportunity, families and small businesses from the disadvantaged groups will then underinvest in education or business endeavors, and potential growth will fall short of the levels it might otherwise attain. Aside from reducing the long-run productive potential of the economy, persistently high levels of income and wealth inequality may also have implications for the robustness of consumer spending, which accounts for roughly two-thirds of aggregate spending in the United States. The gaps in household income and wealth between the richest and poorest households are at historically high levels, as income and wealth have increasingly accrued to the very richest households. For example, results from the latest Survey of Consumer Finances (SCF), which will be published soon, indicate that the share of income held by the top 1 percent of households reached 24 percent in 2015, up from 17 percent in 1988. The share of wealth held by the top 1 percent rose to 39 percent in 2016, up from 30 percent in 1989. Some research suggests that widening income and wealth inequality may damp consumer spending in the aggregate, as the wealthiest households are likely to save a much larger proportion of any additional income they earn relative to households in lower income groups that are likely to spend a higher proportion on goods and services. When we disaggregate the economy-wide labor market statistics, we find significant and persistent racial and ethnic disparities. In August, the national unemployment rate of 4.4 percent, which is low by historical standards, masked substantial differences across different demographic groups. As shown in figure 1, unemployment rates ranged from 3.9 percent for whites and 4 percent for Asians to noticeably higher rates for Hispanics and African Americans. Labor force participation rates, shown in figure 2, also differ substantially, although by less than unemployment rates, with the rate for African Americans lowest at 62.2 percent. These differences are not a recent development--similar differences across racial and ethnic dimensions have existed for as long as these data have been collected.
Even more striking, a significant portion of the gaps in unemployment rates across racial and ethnic groups cannot be attributed to differences in their underlying characteristics, such as age and education levels. Although the differences in employment rates between racial and ethnic groups are still quite large, they have narrowed recently, after having widened considerably during the recession, and are near their lowest levels in decades. Differences in unemployment rates across racial and ethnic groups tend to widen sharply during recessions, as less advantaged groups shoulder an outsized share of total layoffs, and these differences shrink during recoveries. For example, in the second quarter of 2017, the unemployment rate for black adult men was a little more than 3 percentage points higher than for white adult men. This differential, while sizable, is nonetheless close to the smallest gap seen since comparable data became available in the mid-1970s. Differences in unemployment rates are similarly near historical lows for black women relative to white women, and for Hispanics relative to whites. Since racial disparities tend to get smaller throughout the course of an economic expansion, it seems likely that racial differences in unemployment rates will continue to shrink if the overall unemployment rate falls further. More broadly, the persistent disparities in employment outcomes are mirrored in significant and persistent racial and ethnic differences in families' income and wealth. According to forthcoming findings from the latest SCF and as shown in figure 3, the average income for white families in 2015 was about $123,000 per year, considerably higher than the averages for black and Hispanic families. Disparities in wealth, shown in figure 4, are even larger: Average wealth holdings for white families in 2016 were several times as large as those of black and Hispanic families. Moreover, these racial and ethnic gaps in average family income and wealth have generally widened rather than narrowed over the past few decades. Based on SCF data, median family wealth has grown much more rapidly for white families than for other families over the past few decades, while median family incomes have risen by about the same amount for white, black, and Hispanic families. As the economic expansion continues and brings more Americans off the sidelines and into productive employment, it seems likely that the positive trends in employment and participation rates for historically disadvantaged groups will continue. That said, the benefits of a lengthy recovery can only go so far, as the research points to some barriers to labor market outcomes for particular groups that appear to be structural. After controlling for sectoral and educational differences, the research suggests that these factors include discrimination as well as differences in access to quality education and informal social networks that may be an important source of information and support regarding employment opportunities. While the policy tools available to the Federal Reserve are not well suited to addressing the barriers that contribute to persistent disparities in labor market outcomes, understanding these barriers and efforts to address them is vital in assessing maximum employment as well as potential growth. The Federal Reserve is deeply engaged in understanding disparities through our data collection, research collaboration, and community development work. One way the Federal Reserve seeks to obtain a clearer picture is by collecting data ourselves.
For instance, some of the data I have cited today come from the Federal Reserve's triennial Survey of Consumer Finances, which provides detailed information on income and wealth holdings by demographic groups. The Survey of Household Economics and Decisionmaking provides a portrait of household finances, employment, housing, and debt; the Survey of Young Workers provides insights into younger adults' employment experiences soon after entering the labor force; and the Enterprising and Informal Work Activities Survey provides information about income-generating activities that are often outside the scope of other employment and income surveys. And, as indicated in the minutes published after each meeting of the Federal Open Market Committee, policymakers regularly review the labor experiences of different racial and ethnic groups as background for the FOMC's deliberations on monetary policy. Across the Federal Reserve System, a variety of initiatives are aimed at understanding economic disparities and how to foster more-inclusive growth. One such initiative brings together researchers from a variety of fields to analyze barriers to economic opportunity and advancement. The Economic Growth and Mobility Project at the Federal Reserve Bank of Philadelphia aims to bring together researchers with community stakeholders to focus on differences in poverty and economic mobility across demographic characteristics. The Investing in America's Workforce Initiative is a collaboration between the Federal Reserve System and academic research institutions to promote investment in workforce skills that better align with employers' needs. In short, a deeper understanding of labor market disparities is central to the mission of the Federal Reserve because it may help us better assess full employment, where resources may be underutilized, and the likely evolution of the labor market and overall economic activity. Let me conclude by thanking you again for having me here today and allowing me the opportunity to explore some of the factors that influence labor market disparities.
r170928a_FOMC
united states
2017-09-28T00:00:00
The Independent Bank of England--20 Years On
fischer
0
It is a pleasure to speak at this commemoration of 20 years of Bank of England monetary policy independence. In my remarks today, I will consider how central banking in the United Kingdom and the United States has evolved in response to the challenges of recent years. I will address a limited set of topics and will have time to touch only briefly on each. These remarks are divided into three parts. In the first part, I will discuss several aspects of aggregate monetary policy: central bank independence, policy transparency, and policy tools. In the second part, I consider differences between the approaches of the two banks to their lender-of-last-resort function. And, third, I will close with brief reflections on the central bank's responsibility for financial stability. We start by considering a central bank that influences economic activity only through its influence over the general level of interest rates. More than two decades ago, Guy Debelle and I offered two terms--goal independence and instrument independence--to describe such a central bank's degree of independence. Our definitions were as follows: "A central bank has goal independence when it is free to set the final goals of monetary policy. . . . A bank that has instrument independence is free to choose the means by which it seeks to achieve its goals." With the reforms of 1997, the Bank of England achieved instrument independence--something the Federal Reserve had already long had. From that point, the Bank's newly established Monetary Policy Committee (MPC), rather than the Treasury, set the policy interest rate. Inflation targeting, which began in the United Kingdom in 1992, continued under the new system and was codified in the Bank of England Act 1998. The MPC was given an explicit numerical inflation target, corresponding to effective price stability, alongside an implied stabilization goal for real economic activity. Consequently, the Bank of England from 1997 had the combination that Debelle and I advocated: instrument independence but not goal independence. We also judged that the vagueness of the Federal Reserve's statutory objectives meant that the Federal Reserve "has considerable goal independence." Today both the FOMC and the MPC have a numerical inflation goal--2 percent. However, this numerical goal is not specified in legislation in either country. In the United Kingdom, the Chancellor of the Exchequer sets the MPC's inflation target: currently 2 percent per year for the U.K. consumer price index. In the United States, in the FOMC's Statement on Longer-Run Goals and Monetary Policy Strategy, which is reaffirmed annually, Committee participants have judged that the longer-run inflation objective that corresponds to the Federal Reserve's mandate is a rate of 2 percent per year, as measured by the change in the price index for personal consumption expenditures. In practice there is little difference between the policy goals of the two central banks, or between the variables that are targeted. But there is a subtle difference between them in terms of who sets the inflation target. To date, that difference has not generated any major divergence between the approaches to monetary policy of the two central banks. Although much has changed in central banking since the 1990s, the case for instrument independence and against goal independence remains sound. The case against goal independence is that it is not appropriate in a democracy: goals should reflect the preferences of society at large and should not be determined by unelected officials. The goals of monetary policy should be set by the central government, as is the case in both the United States and the United Kingdom.
We all know what the goals of the Bank of England and the Fed are: For both, the goals are stability of the price level and full employment. Although there is a hint of a lexicographic ordering in favor of the rate of inflation in the Bank of England's mandate, the Federal Reserve's dual mandate unambiguously places equal weight on both goals. Across the world, there seems to be a preference for inflation either to be the only goal variable or to be lexicographically emphasized over unemployment. But I do not regard that difference as very significant. There is no central bank that--in Mervyn King's terminology--is an "inflation nutter." Instrument independence is necessary because, without it, the central bank is unable to set the stance of monetary policy that it believes to be most consistent with achievement of the statutory mandate. That said, there are, to be sure, degrees of instrument independence, and one can easily envisage central banks with considerable instrument independence whose actions on their instruments are constrained by law or by decisions by the treasury. For example, in a country that has committed its central bank to enforce an exchange rate band (with the exchange rate free to move within the band), the central bank will not be fully instrument independent, though it may have a substantial amount of independence with respect to the precise settings of the policy interest rate and other monetary tools. Historically, central banks were frequently created to finance public spending. However, these banks--such as those in Sweden and the United Kingdom--were later sometimes given their operational independence in part to create a greater separation of monetary policy from fiscal policy--in particular, in order to ensure that they would not be vulnerable to pressure to finance the government budget (that is, to monetize public spending). Around the world, the fear of the central bank losing its ability to meet its price stability goal may well have been the prime reason for governments to implement instrument independence. Fundamentally, however, central bank instrument independence is desirable because monetary policy is an esoteric and complicated art or science, involving technical judgments that have economy-wide and often long-lasting consequences: A separate institution is needed to manage and take responsibility for monetary policy. Central banks have over the years--in some cases over a few centuries--amassed an expertise and developed a character that makes them the natural candidates to perform the functions expected of such institutions. As the Governor of the Riksbank has observed, the most important skill and reputation that a central bank needs is one of reliability . That is, both the central government and the general public should be confident that their central bank can be relied on to deliver a stable price level and close to full employment, along with financial stability. If such independent institutions did not exist, we would have to invent them; if they exist and perform well, the country is blessed; and if they exist and perform badly, they need to be reformed--by a change in the laws by which they are governed, by changes in their structure, or by changes in personnel--and sometimes all of the above. In a paper written for the Bank of England's tercentenary, I considered how an instrument-independent central bank might conduct itself. 
My discussion noted that an independent central bank should adhere to the "principle of accountability to the public of those who make critically important policy decisions." Accountability, in the senses I defined it, included the requirement "to explain and justify its policies to the legislature and the public"--that is, policy transparency and communications. Policy transparency, public accountability, and policy communications form the quid pro quo of central bank independence, and they can also contribute to achievement of macroeconomic goals. These were points the FOMC recognized in its Statement on Longer-Run Goals and Monetary Policy Strategy, which observed that clarity concerning policy decisions "increases the effectiveness of monetary policy, and enhances transparency and accountability, which are essential in a democratic society." In my coverage of these issues today, however, I would like to concentrate on the Bank of England, which in the past quarter-century has been an innovator in several ways with regard to accountability and transparency. Calls for more transparency concerning U.K. monetary policy and the Bank of England's monetary actions predated independence. For example, in the late 1950s, Richard Sayers observed: "It may not be wise to turn the central bank into a goldfish bowl, but at least some relaxation of the traditional secretiveness would make for better health in the nation's monetary affairs." And some Bank communications vehicles, such as the economic analysis in the Quarterly Bulletin and testimony and speeches by the Governor and other Bank officials, were of long standing by the mid-1990s. But the Bank of England made further strides toward improved transparency and communications during the 1990s. In 1993, it initiated the Inflation Report. From the beginning, the Inflation Report was intended to increase transparency about the U.K. monetary policy reaction function--that is, the connection between policy instruments and economic variables, including the goal variables. After independence, the Inflation Report presented the MPC's inflation forecast. Furthermore, alongside other Bank statements, the Inflation Report provides a publicly available analysis of the economy and of economic implications of developments like Brexit. The content reflects the Bank's change in focus--from markets in the pre-inflation-targeting era to macroeconomic implications of financial and other developments. The Bank expanded its policy communications after 1997, publishing MPC analogues to the FOMC releases (some of them only recent innovations by the Fed), namely, postmeeting MPC statements and meeting minutes. And an innovation of Mervyn King in the early years of inflation targeting that has continued in the era of independence is the press conference. Here, senior Bank figures discuss the MPC's forecast and the state of the economy. This innovation was a forerunner of the Federal Reserve Chair's press conference, begun in 2011, in which the Chair describes the latest policy decision together with the Summary of Economic Projections (SEP). The MPC and FOMC have extensively--particularly in recent years--used two monetary policy tools other than decisions on the current short-term interest rate. These tools are forward guidance and asset purchases. Monetary authorities used to be reluctant to discuss the future course of the policy rate. By 1997, however, there was widespread recognition of the merits of clarity on the reaction function and of having long-term interest rates incorporate accurate expectations of future policy.
These considerations led to the FOMC's use of forward guidance regarding the short-term interest rate in its postmeeting statements during the mid-2000s. The MPC, in contrast, for a long time generally preferred to let markets infer likely future rates from the extensive communication it provided about its reaction function. Beginning in 2008, with the policy rate at or near its lower bound, regular forward guidance acquired new efficacy. Through forward guidance, additional accommodation from short-term interest rate policy could be provided by lengthening the period over which the policy rate was expected to remain at its lower bound. The knowledge that the short-term policy rate likely would be lower for longer would put downward pressure on longer-term rates. The FOMC has provided forward guidance on the policy rate in its postmeeting statements ever since the target federal funds rate was brought to the lower bound in December 2008. In addition, the SEP shows individual FOMC participants' expectations regarding the policy rate, though it does not identify the individuals in the interest rate dot plot. In the United Kingdom, the MPC started providing forward guidance in its postmeeting statements in 2013. As the policy rate--Bank Rate--is still at its lower bound in the United Kingdom, it remains to be seen whether MPC forward guidance will continue during policy firming. For its part, the FOMC has provided forward guidance in its policy statements during the tightening phase that began with the increase in the target range for the federal funds rate in December 2015. I expect that the Bank of England will also likely continue to use forward guidance when it begins to raise the policy interest rate above its effective lower bound. Asset purchases are less of a new tool than forward guidance. In the early post-World War II era, the Federal Reserve at times sought to influence long-term interest rates directly by transacting in longer-term Treasury securities. By 1997, however, monetary policy operations in longer-term securities markets had fallen into disuse. The financial crisis changed matters, with both countries' central banks expanding their balance sheets through large-scale purchases of longer-dated securities to put downward pressure on longer-term interest rates and set in motion movements in asset prices and borrowing costs that would stimulate spending by households and businesses. It is widely, though not universally, recognized that these asset purchases helped contain the economic downturns in the United Kingdom and the United States and underpinned the subsequent recovery in each country. This experience raises the question of whether the balance sheet will continue to be a routine tool of monetary policy once interest rates normalize. The FOMC has indicated its preference that, barring large adverse shocks to the economy, adjustments to the federal funds rate will be the main means of altering the stance of monetary policy. The second section concerns the role of the lender of last resort. The financial crisis and recession confirmed the value of central bank tools that affect the financial system, beyond those most associated with monetary policy. One of these tools is the discount window or lender-of-last-resort function.
As of the mid-2000s, the posture of the Bank of England and the Federal Reserve toward the lender-of-last-resort function reflected the principles enunciated by Bagehot and the long absence of a severe financial crisis: The discount rate stood above the key policy rate by a fixed amount; the discount window was not used for macroeconomic stabilization; and depository institutions were the users of the discount window, typically on a short-term basis. In 1978, Rudi Dornbusch and I noted that the lender-of-last-resort function should imply that "the central bank steps in to ensure that funds are available to make loans to firms which are perfectly sound but, because of panic, are having trouble raising funds." In the financial crisis that started in 2007, wide-ranging measures were taken along these lines. In order to improve the functioning of U.S. credit markets, the Federal Reserve made numerous changes to its lending arrangements: The spread of the discount rate over the policy rate was lowered, lending was extended to include loans to nondepository financial institutions, and facilities were created allowing longer maturity of, and broader collateral for, loans than was usual for the central bank. After its own experience during the financial crisis, the Bank of England permanently widened lender-of-last-resort access to include not only commercial banks, but also other systemically important financial institutions. In the United States, the discount rate has for several years been back to its normal relationship with the policy rate, and the special lending facilities have long since been wound up. Emergency lending facilities of the type seen during the financial crisis remain feasible, if needed, though their usage has not been incorporated into the Federal Reserve's routine lender-of-last-resort powers, along the lines of the changes seen in the United Kingdom. Instead, the restriction on their deployment has been tightened by requiring approval by the U.S. Secretary of the Treasury. Discount window lending puts public funds at risk--though I stress that the Federal Reserve's lending during the crisis did not, in fact, lead to any losses. The lender of last resort is also a less impersonal, and more allocative, device than aggregate monetary policy tools, because it involves direct lending by the central bank instead of an attempt by monetary policy to alter the overall cost of private-sector borrowing. For these reasons, the lender-of-last-resort function is bound to be more rule driven than interest rate policy, and it is inevitably associated with collateral arrangements and other safeguards to protect against losses and with strict eligibility criteria. Our third and final section concerns the financial stability responsibility of the central bank. This responsibility has been subject to considerable institutional change over the past two decades. Until 1997, the Bank of England had wide supervisory and regulatory powers. With the reforms of the late 1990s, the Bank had a deputy governor responsible for financial stability, but regulatory powers were largely moved to a separate agency, the Financial Services Authority (FSA). The financial crisis demonstrated that financial imbalances can ultimately endanger macroeconomic stability and highlighted the need for enhanced central bank oversight of the financial system. In the post-crisis era, the FSA became two separate regulatory authorities, one of which--the Prudential Regulation Authority, created in 2012--is part of the Bank of England. In effect, regulatory powers have largely returned to the Bank.
It is fair to say that the Bank was initially glad to cede many of its financial powers, but that it was later even more glad to have those powers restored. Financial supervision has also been reformed in the United States in light of the crisis. The Federal Reserve, which always had regulatory powers, received enhanced authority and devoted more resources to financial stability. In both countries, it remains the case that not all financial stability responsibilities rest with the central bank-- so it is less independent in this area than in monetary policy proper--and the central bank's tools for achieving financial stability are still being refined. Indeed, as I have noted previously, a major concern of mine is that the U.S. macroprudential toolkit is not large and not yet battle tested. The Federal Reserve and the Bank of England benefit from each other's experience as they develop and improve arrangements to meet their financial stability responsibilities. One major innovation that deserves mention is that the Bank of England has two policy committees: Alongside the MPC is the Financial Policy Committee (FPC). Although they coordinate and have partially overlapping memberships, the MPC and FPC are distinct committees. Why have both the MPC and the FPC? I offer a few possible reasons. First, not all of a central bank's responsibilities typically rest with its monetary committee. This is true not only of the Bank of England, but also of the Federal Reserve: Our financial regulatory authority resides in the Board of Governors, not the FOMC. Second, aggregate monetary policy tools--typically, one is talking of the policy interest rate--are often blunt weapons against financial imbalances, so deploying them might produce a conflict between financial stability and short-term economic stabilization. Macroprudential tools may be more direct and more appropriate for fostering financial stability. Third, financial policy might need less frequent adjustment than monetary policy. Perhaps reflecting this judgment, the FPC meets on a quarterly basis, which contrasts with the MPC's eight meetings a year. The lower frequency of meetings may also reflect the desirability of a relatively stable regulatory structure; financial tools likely should not be as continuously data dependent as monetary policy tools. It is clear that the U.K. institutional framework for the preservation of financial stability has much to be said for it. But it also seems clear that there is no uniquely optimal set-up of the framework for the maintenance of financial stability that is independent of the size and scale of the financial system of the country or of its political and financial history. It has been more than 20 years since the Bank of England celebrated its 300th birthday with a conference focused on central bank independence. Since then, central banks' operating frameworks have undergone substantial changes, many in response to the financial crisis. But the case for monetary policy independence set out in the 1990s remains sound, and monetary policy independence is now widely accepted in the United Kingdom, as it long has been in the United States. It is also clear that central bank responsibilities other than policy rate decisions--specifically, the lender-of-last-resort function and financial stability--are closely connected with monetary policy and that these responsibilities play a prominent role in macroeconomic stabilization. 
Let me conclude by observing that, while the crisis and its aftermath motivated central banks to reappraise and adapt their tools, institutions, and thinking, future challenges will doubtless prompt further reforms. Or, if I may be permitted a few final words on my way out the door, the watchwords of the central banker should be "vigilans," because history and financial markets are masters of the art of surprise, and "Never say never," because you will sometimes find yourself having to do things that you never thought you would.
r171004a_FOMC
united states
2017-10-04T00:00:00
Welcoming Remarks
yellen
1
I am very glad to welcome everyone here to this conference sponsored by the Federal Reserve System and the Conference of State Bank Supervisors. Since the first of these gatherings five years ago, this research and policy conference has established itself as an important event where industry leaders, academics, and supervisors discuss the latest research and exchange ideas about promoting a healthy and growing community banking industry. All of us share an interest in seeing that community banks continue their vital role in their customers' lives and in a strong and stable U.S. financial system. The Fed has been working hard to ensure that its regulation and supervision of banks are tailored appropriately to the size, complexity, and role different institutions play in the financial system. For community banks, which by and large avoided the risky business practices that contributed to the financial crisis, we have been focused on making sure that much-needed improvements to regulation and supervision since the crisis are appropriate and not unduly burdensome. One way we are doing this is through the regulatory review required by the Economic Growth and Regulatory Paperwork Reduction Act (EGRPRA). The first step we took, pursuant to EGRPRA, was to listen to industry and others with a stake in how community banks are supervised, which was accomplished through soliciting written comments and by holding six outreach meetings in 2014 and 2015. Our EGRPRA report, issued in March, was focused on community banks, and it noted steps that had already taken place, for example, to simplify Call Report requirements and expand the number of firms eligible for less frequent examinations. Since that report, the Board of Governors, along with the Federal Deposit Insurance Corporation and the Office of the Comptroller of the Currency, has proposed a rule expanding the number of commercial real estate transactions that will no longer require an appraisal, allowing for a less detailed evaluation. And just last week, the Fed, along with other regulators, took a significant step to reduce the regulatory burden on community banks and other smaller and less complex institutions by proposing to simplify several requirements in the regulatory capital rule. We have done this because we have an abiding commitment to consider how our decisions affect institutions and the customers they serve. We are well aware that community banks serve communities, businesses, and households that are often underserved by larger institutions and offer more extensive and more personalized services than are often otherwise available. We know that community bankers are part of the communities they serve, and they are often better able to understand the needs and the aspirations of their customers. We hope that the research presented at this conference stimulates discussion about the leading policy issues facing the industry and supervisors. As in the past, I see that this year's agenda includes presentations related to several important issues, including the effect of supervision on risk-taking and the effects of greater competition on community banks. I am cheered to see that finance students at the University of Akron remain interested enough in community banking to participate in a case-study competition related to community banking, and I am pleased that the conference will hear a presentation of the winning case study. In closing, let me thank the conference organizers from the Fed and the Conference of State Bank Supervisors, the scholars and others making presentations, and all of you for attending. I hope you have a terrific conference.
r171005a_FOMC
united states
2017-10-05T00:00:00
Treasury Markets and the TMPG
powell
1
I am honored to join you to celebrate the first 10 years of the Treasury Market Practices Group (TMPG). The TMPG has become an essential forum where industry participants gather under the auspices of the Federal Reserve Bank of New York to address market practice issues as they arise in the fast evolving markets for Treasury securities. Outside this room, you are competitors, and that vigorous competition serves your firms, your customers, and ultimately the U.S. taxpayer. But when members of the TMPG attend meetings, they bring their long experience and deep expertise to bear to safeguard the functioning and overall health of these markets. As I have heard a number of people say, TMPG members check their partisan interests at the door. The TMPG is the place where market participants recognize and address their responsibilities to each other. I know that they take that responsibility seriously, and I encourage all of the market participants here today to take the TMPG's recommendations just as seriously and to adopt them as best practices that will enhance the market's functioning. I first encountered Treasury markets in a serious way 25 years ago, when I served as Under Secretary of the Treasury for Finance. The Treasury market made national headlines when we learned that a Salomon Brothers trader had repeatedly circumvented Treasury auction rules to corner the market for the on-the-run two-year Treasury. As it became clear that Salomon's senior management had known about the issue for several months without alerting regulators, the scandal threatened to bring down one of the largest financial firms of that time. Over one memorable August weekend, we first prohibited the firm from dealing in government securities on behalf of customers, and then reduced that sanction as top Salomon management left the firm and Warren Buffett, then a large Salomon shareholder, agreed to assume the chairmanship of the board of directors. This event takes up a chapter in Buffett's biography, and it is a good illustration of why we need the TMPG. I reread that chapter every couple of years, and it still gives me nightmares. After the dust settled, we had to grapple with the wider implications of the scandal for the market itself and particularly the role of regulatory oversight. We certainly could have used a TMPG to help us with this work, but none existed at that time. Our recommendations were summarized in a joint report to the Congress issued by the Treasury Department, the Securities and Exchange Commission, and the Federal Reserve Board. Among the issues we considered were how to encourage a level playing field for all market participants, and when and where regulators should respond to the periodic technical difficulties the market sometimes experienced. As we drew up the report, we were deeply aware of the importance that Treasury markets held for the American economy. The sale of Treasury bonds, notes, and bills finances the U.S. government, and those securities are in turn a primary vehicle for savings for a wide range of U.S. households. Treasury securities are also an important source of collateral within the financial system. This last role has become all the more critical in recent years as regulations have required banks to hold larger amounts of high-quality liquid assets so that they can safely meet their potential liquidity needs. We also knew that, despite the misconduct by Salomon Brothers, Treasury markets generally worked quite well. These markets are and have long been among the deepest and most liquid markets in the world.
It was important to make the auction system more robust to potential manipulation, and we made several recommendations to do so and to open up the process to greater competition. However, in other areas that already worked well, we chose, for example, to open up issuance of securities that were in short supply (or "squeezed") when needed rather than to risk negatively affecting the functioning of Treasury markets by instituting more invasive regulation. There is certainly a role for regulation, but regulation should always take into account the impact that it has on markets--a balance that must be constantly weighed. More regulation is not the best answer to every problem. There is also a role for a body such as the TMPG to address market problems. The TMPG's approach to "fails" in the Treasury and mortgage-backed securities markets provides a good example. Fails--that is, the failure to deliver collateral in either a cash or repurchase agreement transaction--impose a cost on the party expecting delivery. If the practice were to become too frequent, then it could seriously impair market functioning. But at the same time, it doesn't make sense to simply outlaw the practice, since that limits flexibility in a way that may not always be called for and would likely reduce market liquidity. The TMPG's recommendations were carefully calibrated to set financial incentives that would minimize fails and have been a marked success. The TMPG also plays an important role in helping to "fill in the cracks" between the competing regulations that various Treasury market participants face. Many different institutions and individuals depend on these markets, and the regulatory system reflects that. The Government Securities Act gave the Treasury Department some rulemaking authority over all government securities brokers and dealers. But the act also required these firms to register with the SEC. At the same time, the Federal Reserve regulates many of the banks that are active in these markets, and the Federal Reserve Bank of New York also plays an essential role, both as fiscal agent for the U.S. government and, by nature of its frequent activity in these markets, in conducting monetary policy operations. And of course, the Commodity Futures Trading Commission regulates Treasury futures markets. The agencies work well together, but there is real value in having an industry group help to identify issues that cross regulatory boundaries. I'll give two examples. First, in 2016, the TMPG conducted a study of financial benchmarks and uncovered uses of ICAP's federal funds open rate that had not previously been well understood. As ICAP decided to stop publishing the rate, the TMPG also helped to guide market participants to an alternative that is aligned with the International Organizations of Securities Commissions Principles for Financial Benchmarks while steering the market away from the London Interbank Offered Rate (LIBOR) as a potential alternative--a move that now seems prescient given the subsequent news around the long-run risks to LIBOR. More recently, the TMPG has been busy creating a map of clearing and settlement in Treasury markets, work that I am sure all of the regulatory agencies will find to be of great interest. All of this is incredibly valuable. In fact, my only regret about the TMPG is that we didn't think of creating something like it earlier. As regulators, we fully support your work, and will continue to make sure that our own rules support these markets. 
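Returning briefly to the fails-charge incentive mentioned above: the sketch below illustrates how such a charge can be structured so that the penalty for failing to deliver is largest when short-term rates are near zero and fades as rates normalize. The speech does not spell out the formula, so the 3 percent threshold, the 1/360 day count, and the function name are assumptions chosen for illustration rather than a statement of the TMPG's exact terms.

```python
# Stylized daily fails charge, in the spirit of the incentive discussed above.
# The 3 percent threshold and 1/360 day count are illustrative assumptions.

def daily_fails_charge(settlement_amount, reference_rate, threshold=0.03):
    """Charge accrued for one day on a failed delivery.

    The charge rate is the shortfall of the reference policy rate below the
    threshold (never negative), so the penalty is largest when rates are near
    zero and disappears once rates are at or above the threshold.
    """
    charge_rate = max(threshold - reference_rate, 0.0)
    return settlement_amount * charge_rate / 360.0


# Example: a hypothetical $50 million fail with the policy rate at 1.25 percent
# accrues roughly $2,430 per day; at a 3 percent policy rate the charge is zero.
print(round(daily_fails_charge(50_000_000, 0.0125), 2))
print(round(daily_fails_charge(50_000_000, 0.03), 2))
```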
The Financial Industry Regulatory Authority's (FINRA's) new collection of the Treasury transactions of its members through its Trade Reporting and Compliance Engine, or TRACE system, will do much to improve the regulatory agencies' understanding of the dynamics of this market. But as in the past, we should also be concerned about creating a level playing field. For that reason, the Board of Governors is continuing to negotiate with FINRA for it to act as our agent in collecting similar data from banks. We do not want to create a regulatory arbitrage where the same activity done within a broker-dealer is treated differently than when it is done within a bank. The TMPG and similar groups around the world play an important role by helping both regulators and industry leaders address market concerns before they threaten market function. I thank the members for their service and look forward to today's discussions.
r171012a_FOMC
united states
2017-10-12T00:00:00
Rethinking Monetary Policy in a New Normal
brainard
0
Bernanke proposes an approach to policy that is elegant and straightforward to communicate. I will focus on those elements that I find particularly relevant for the challenges faced by policymakers and suggest some implications and complications. My comments are not intended to address current policy. Policymakers in advanced economies are confronting a different constellation of challenges today than those that dominated the canon of U.S. monetary policymaking over the previous half-century, which I refer to as the "new normal." A key feature of the new normal is that the neutral interest rate--the level of the federal funds rate that is consistent with the economy growing close to its potential rate, full employment, and stable inflation--appears to be much lower than it was in the decades prior to the crisis. In the most recent Summary of Economic Projections (SEP), the median FOMC participant expected a longer-run real federal funds rate, after subtracting inflation, of 3/4 percent, down sharply from the 2-1/4 percent value when this projection was first published in the January 2012 SEP and from the 2-1/2 percent average value in the decades prior to the financial crisis. The low level of the neutral rate limits the amount of space available for cutting the federal funds rate to offset adverse developments and thereby can be expected to increase the frequency and duration of periods when the policy rate is constrained by the effective lower bound, unemployment is elevated, and inflation is below target. In this environment, frequent or extended periods of low inflation run the risk of pulling down private-sector inflation expectations, which could amplify the degree and persistence of shortfalls of inflation, thereby making future lower bound episodes even more challenging in terms of output and employment losses. To the extent it is weighing on longer-run inflation expectations, the persistently low level of the neutral federal funds rate may be a factor contributing to the persistent shortfall of U.S. inflation from the FOMC's target. Further complicating the ability of central banks to achieve their inflation objectives in today's new normal is the very flat Phillips curve observed in the United States and many other advanced economies, which makes the relationship between labor market conditions and price inflation more tenuous. For instance, inflation has remained stubbornly below the FOMC's 2 percent target for the past five years even as unemployment has fallen from 8.2 percent to 4.2 percent, a level that most experts believe is in the vicinity of full employment. Bernanke's paper provides an excellent review of the Federal Reserve's efforts to operate in this new environment and makes some interesting new proposals. Reflecting on the Fed's available "policy toolbox," Bernanke concludes that the available tools are not likely to be sufficient and proposes a framework that relies on forward guidance with commitment to help central banks achieve their inflation and employment objectives. The academic literature on monetary policy suggests a variety of prescriptions for preventing a lower neutral rate of interest from eroding longer-run inflation expectations. The paper argues convincingly that many of these proposals present practical difficulties that would create a very high bar for their adoption. For instance, raising the inflation target sufficiently to provide meaningfully greater policy space could engender public discomfort or, at the other extreme, risk unmooring inflation expectations.
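The policy-space arithmetic behind this concern, and behind proposals to raise the inflation target, can be made explicit. The calculation below is a stylized sketch: it combines the 3/4 percent longer-run real rate and the 2-1/2 percent pre-crisis average cited above with the 2 percent inflation objective, and it treats the resulting nominal neutral rate as the room available for conventional rate cuts, which is a simplification.

```latex
% Stylized policy-space arithmetic (illustrative, using figures cited in the text).
% Fisher relation: nominal neutral rate \approx real neutral rate + inflation objective.
\begin{aligned}
i^{*}_{\text{new normal}} &\approx r^{*} + \pi^{*} = 0.75\% + 2\% = 2.75\%,\\
i^{*}_{\text{pre-crisis}} &\approx 2.5\% + 2\% = 4.5\%.
\end{aligned}
% Conventional easing room above the effective lower bound shrinks from roughly
% 450 basis points to roughly 275 basis points; a higher \pi^{*} would mechanically
% restore some of that room, which is the appeal--and the risk--discussed here.
```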
The transition to a notably higher target is likely to be challenging and could heighten uncertainty. As I have noted previously, the persistence of the shortfall in inflation from our objective is an important consideration for monetary policy. The makeup principle, in which policy would make up for past misses of the inflation target, is not reflected in most standard monetary policy frameworks, although it is an important precept in theory. Some of the proposals that have been advanced to implement this principle present some difficulties. For example, while price-level targeting would be helpful in the aftermath of a recession that puts the economy at the effective lower bound, it could require tightening into a negative supply shock, which is a very unattractive feature, as Bernanke points out. Bernanke proposes a framework that avoids this undesirable possibility by implementing a temporary price-level targeting framework only in periods where conventional policy is constrained by the lower bound. Bernanke's proposal thus has the advantage of maintaining standard practice in normal times while proposing a makeup policy in periods when the policy rate is limited by the lower bound and inflation is below target. His proposed temporary price-level target would delay the liftoff of the policy rate from the lower bound until the average inflation over the entire lower bound episode has reached 2 percent and full employment is achieved. This type of policy, which would result in temporary overshooting of the inflation target in order to make up for the previous period of undershooting, is designed to, in Bernanke's words, "calibrate the vigor of the policy response . . . to the severity of the episode." The proposed temporary price-level targeting policy is designed to address what I see as one of the key challenges facing policymakers. Following deep recessions of the type we experienced in 2008-09, there appears to be an important premium on "normalization." This was apparent in 2010, for instance, when there was substantial pressure among Group of Twenty officials to commit to timelines and targets for reducing fiscal support and to articulate exit principles for monetary policy. That inclination proved premature, as was evident from the subsequent intensification of the euro-area crisis. Moreover, the benchmark for "normal" tends to be defined in terms of pre-crisis standards that involved policy settings well away from the lower bound, at least initially, because it may take some time to learn about important changes in underlying financial and economic relationships. For example, the factors underlying what we now understand to be the new normal of persistently low interest rates were in many cases initially viewed as temporary headwinds. In these circumstances, a standard policy framework calibrated around the pre-crisis or "old" normal may be biased toward underachieving the inflation target in a low neutral rate environment. The kind of policy framework that Bernanke proposes, which pre-commits to implementing the makeup principle based on the actual observed performance of inflation during a lower bound episode, could guard against premature liftoff and help prevent the erosion of longer-term inflation expectations. Monetary policymakers operate in an environment of considerable uncertainty and therefore have to weigh the risks of tightening too little or too late against those of tightening too much or too soon. While past experience has conditioned U.S.
policymakers to be highly attentive to the risks associated with a breakout of inflation to the upside, as in the 1970s, they balance these risks against those associated with undershooting the inflation target persistently, as in Japan in the late 1990s and the 2000s. In weighing these risks, the standard approach is typically designed to achieve "convergence from below," in which inflation gradually rises to its target. Given the lags in the effects of monetary policy, convergence from below would necessitate raising interest rates preemptively, well in advance of inflation reaching its target. Moreover, particularly in the early stage of a recovery, this kind of preemptive approach tends of necessity to rely on economic relationships derived from pre-crisis observations, when policy rates were comfortably above the lower bound. During a period when the policy rate is limited by the lower bound, Bernanke's proposal would represent a substantial departure from the standard approach. While a standard policy framework would tend to prescribe that tightening should start preemptively, well before inflation reaches target, Bernanke's temporary price-level target proposal would imply maintaining the policy rate at the lower bound well past the point at which inflation has risen above target. In principle, policymakers would have to be willing to accept elevated rates of above-target inflation for a time following a lengthy period of undershooting. Just as policymakers could run a risk of low inflation becoming entrenched in the standard preemptive framework, so, too, there are risks in the temporary price-level target framework. One risk is that the public, seeing elevated rates of inflation, may start to doubt that the central bank is still serious about its inflation target. It is worth noting that the policy is motivated by the opposite concern--that convergence from below, following an extended lower bound episode, may lead to an unanchoring of inflation expectations to the downside. Still, a conscious policy of overshooting may be difficult to calibrate, especially since the large confidence intervals around inflation forecasts suggest that the risks of an undesired overshooting are nontrivial. A related risk is that the central bank would lose its nerve: Maintaining the interest rate at zero in the face of a strong economy and inflation notably above its target would place a central bank in uncomfortable territory. One additional challenge of the proposed framework is specifying a path for the policy rate immediately following liftoff that smoothly and gradually eases inflation back down to target and facilitates a gradual adjustment of the labor market. In the proposed framework, once the cumulative average rate of inflation during the lower-bound period has reached the target of 2 percent, policy would revert to a standard policy rule. This implies that a standard policy rule would kick in at a point when inflation is above target and the economy is at or beyond full employment. Even with a smoothing (inertial) property, a standard policy rule could result in a relatively sharp path of tightening, and the anticipation of the steep post-liftoff rate path itself could undo some of the benefits associated with the framework. Thus, there would likely need to be a transitional framework to guide policy initially post-liftoff that might make both communications and policy somewhat more complicated.
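To make the mechanics described above concrete, the sketch below encodes a liftoff test based on average inflation over the lower-bound episode and a simple inertial rule thereafter. It is a minimal sketch of the general idea, not the calibration in Bernanke's paper: the parameter values, the specific inertial rule, and the function names are assumptions chosen for illustration.

```python
# Stylized sketch of a temporary price-level-targeting liftoff test plus an
# inertial policy rule after liftoff. All parameter values are illustrative.

PI_STAR = 2.0      # inflation objective, percent
R_STAR = 0.75      # assumed longer-run real neutral rate, percent
LOWER_BOUND = 0.0  # effective lower bound on the policy rate, percent
RHO = 0.85         # smoothing (inertia) coefficient, assumed


def ready_for_liftoff(inflation_during_elb, unemployment_gap):
    """Lift off only once average inflation over the lower-bound episode has
    reached the objective and the economy is at (or beyond) full employment."""
    avg_inflation = sum(inflation_during_elb) / len(inflation_during_elb)
    return avg_inflation >= PI_STAR and unemployment_gap <= 0.0


def inertial_rule(prev_rate, inflation, unemployment_gap):
    """A generic inertial (smoothed) Taylor-type rule used after liftoff."""
    neutral_nominal = R_STAR + PI_STAR
    unconstrained = (neutral_nominal
                     + 1.5 * (inflation - PI_STAR)
                     - 1.0 * unemployment_gap)
    smoothed = RHO * prev_rate + (1.0 - RHO) * unconstrained
    return max(LOWER_BOUND, smoothed)


# Example: inflation has averaged 2.1 percent over the episode and the
# unemployment gap is slightly negative, so the test signals liftoff.
if ready_for_liftoff([1.2, 1.8, 2.4, 2.6, 2.5], unemployment_gap=-0.3):
    print(inertial_rule(prev_rate=0.0, inflation=2.5, unemployment_gap=-0.3))
```

Anticipating the relatively steep path such a rule implies immediately after liftoff is precisely the transitional communications problem noted above.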
The temporary price-level targeting framework proposed by Bernanke is appealing on a conceptual level because it proposes a simple and clear mechanism to help policymakers deal with the challenges posed by the lower bound on the policy rate in an environment of uncertainty. The reality is more complicated, however, especially if, as the paper suggests, many central banks in advanced economies are likely to operate with an additional tool when the policy rate is constrained. In the paper, Bernanke cites Chair Yellen's 2016 Jackson Hole speech, which suggests that in a recession, the FOMC could be expected to turn to large-scale asset purchases as well as forward guidance after the federal funds rate is lowered to zero. Today, when many central banks in advanced economies are operating with two distinct tools, policymakers consider the effects of the balance sheet as well as the policy rate in their assessment of the extent of accommodation provided by monetary policy. In the United States, from the time tapering was first discussed to the September 2017 meeting, when the path for balance sheet runoff was adopted, FOMC minutes and statements suggest that participants considered the degree of accommodation provided by both policy tools in their discussions of the sequencing and timing of changes to policy settings. Discussions about the sequencing of "normalization" and the delay of balance sheet runoff "until normalization of the level of the federal funds rate is well under way" effectively consider the extent to which maintaining the balance sheet may continue to provide makeup support for the economy while enabling the policy rate to escape the lower bound earlier than otherwise in a low neutral rate environment. As Bernanke acknowledges, now that many central banks have developed playbooks specifying the operational modalities associated with asset purchases, and there is some familiarity with their effects on asset prices and financial conditions, there is a greater likelihood that asset purchases would become a part of the policy reaction function, along with forward guidance, during lower-bound episodes. Yet, as I have noted previously in the international context, asset purchases can complicate policy frameworks and communications, because their deployment and withdrawal has tended to be discontinuous and discrete and thus may be associated with greater uncertainty about the policy reaction function. It appears the public closely follows statements about both the policy rate and asset purchases to glean possible information about the future overall stance of monetary policy. This suggests there may be benefits in communications and predictability of a unified policy framework across the tools that is more predictable and continuous. Relatedly, one helpful elaboration of the framework Bernanke proposes might be to incorporate a unified measure, or shadow rate, that would capture the degree of policy accommodation provided through the combined settings of both asset purchases and the policy rate. Moving away from the policy proposal in the paper, there are two other aspects of a low neutral rate world that I want to touch on briefly: cross-border spillovers and financial imbalances. The new normal appears to be characterized by low neutral rates and a weak relationship between overall inflation and unemployment not only in the United States, but also in many other advanced economies with lower-bound episodes likely to be more prevalent. 
The current environment appears also to evidence intensified cross-border feedback into financial conditions. In this kind of environment, it is conceivable that the kind of committed forward guidance associated with the temporary price-level targeting framework proposed by Bernanke, by helping rule out anticipation of a standard preemptive tightening, could help avoid unwarranted premature tightening through the exchange rate. Given available data, it is difficult to disentangle whether the heightened cross-border feedback effects are attributable to the low level of neutral rates, particular features of today's lower-bound episodes, or the interaction of the policies adopted by many central banks. In any case, recent Federal Reserve staff analysis suggests that cross-border spillovers have increased notably since the crisis and are quite large. For instance, European Central Bank policy news that leads to a 10 basis point decrease in the German 10-year term premium is associated with a roughly 5 basis point decrease in the U.S. 10-year term premium; by contrast, these spillovers were smaller in the years leading up to the crisis. Moreover, news about policy rates and term premiums appears to have quite different effects on exchange rates, such that the ordering of policy normalization can have important implications for exchange rates and associated financial conditions, as I discussed earlier this year. Recent staff estimates suggest that news about expected changes in the policy rate tends to have a large spillover through the exchange rate, whereas news about changes in term premiums tends to lead to corresponding cross-border changes in term premiums, as discussed previously, with much smaller effects on the exchange rate. Moreover, the exchange rate effect of changes in short-term rates is much greater than it was pre-crisis. For instance, policy news that leads to a 25 basis point increase in the expected interest rate portion of the 10-year Treasury yield is associated with a roughly 3 percentage point appreciation in the dollar, which is three times greater than the response pre-crisis. By contrast, policy news surrounding a change in U.S. term premiums has a muted effect on the exchange rate both now and pre-crisis. Finally, a low neutral rate environment may also be associated with a heightened risk of asset price bubbles, which could exacerbate the tradeoff for monetary policy between achieving the traditional dual-mandate goals and preventing the kinds of imbalances that could contribute to financial instability. Standard asset-valuation models suggest that a persistently low neutral rate, depending on the factors driving it, could lead to higher ratios of asset prices to underlying income flows--for example, higher ratios of prices to earnings for stocks or higher prices of buildings relative to rents. If asset markets were highly efficient and participants had excellent foresight, this would not necessarily lead to imbalances. However, to the extent that financial markets extrapolate price movements, markets may not transition smoothly to asset valuations that reflect underlying fundamentals but may instead evidence periods of overshooting. Such forces may have played a role in both the stock market boom that ended in the bust of 2001 and the house price bubble that burst in 2007-09. The risks of such financial imbalances may be greater in the context of the kind of explicit inflation target overshooting policies proposed in the paper.
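The asset-valuation logic referenced above can be illustrated with a standard constant-growth (Gordon) valuation formula; the choice of this particular model and the numbers below are assumptions for illustration, not estimates drawn from the speech.

```latex
% Constant-growth valuation: the price-dividend ratio rises as the discount rate falls.
\frac{P}{D} \;=\; \frac{1}{\,r + \rho - g\,}
% with r the neutral real rate, \rho a risk premium, and g trend payout growth.
% Illustration (assumed numbers): if \rho - g = 1.75\%, then r = 2.5\% gives
% P/D \approx 24, while r = 0.75\% gives P/D \approx 40 -- a markedly higher
% valuation ratio with no change in fundamentals other than the lower neutral rate.
```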
Again, if market participants were perfectly rational, overshooting policies would not likely pose financial stability risks. But the combination of low interest rates and low unemployment that would prevail during the inflation overshooting period could well spark capital markets to overextend, leading to financial imbalances. Macroprudential tools are the preferred first line of defense to address such financial imbalances, which should in principle enable monetary policy to focus on price stability and macroeconomic stabilization. But the development and deployment of macroprudential tools is still relatively untested in the U.S. context, and the toolkit is limited. Although important research suggests that the situations under which monetary policy should take financial imbalances into account are likely to be very rare, some recent research has pointed out that the case in favor of taking financial imbalances into account is strengthened when the consequences of financial crises are long lasting. In this case, another complication of a persistently low neutral rate may be a sharper tradeoff between achieving the traditional dual-mandate objectives and avoiding financial stability risks, which may make it even more difficult to achieve our price-stability objective.
r171012b_FOMC
united states
2017-10-12T00:00:00
Prospects for Emerging Market Economies in a Normalizing Global Economy
powell
1
Thank you for inviting me to speak here at the Institute of International Finance Annual Membership meeting. I am pleased to note that there have been signs lately that a sustainable global recovery may finally be materializing. This is certainly good news, although significant risks and uncertainties remain. One important question is how the emerging market economies (EMEs) will fare as global monetary conditions normalize. In our intertwined world, prospects for these economies are a significant driver of prospects for the United States and other advanced economies. In my remarks today I will argue that, despite the risks and uncertainties, EMEs are likely to manage that normalization reasonably well. As many observers have noted, EME economic prospects are strongly linked to the evolution of capital flows. Accordingly, I will first review the recent rebound in EME capital inflows and analyze the drivers of this rebound. Against this backdrop, I will then discuss how the prospects for EMEs depend on three factors: Vulnerabilities in the EMEs themselves; the evolution of advanced-economy monetary conditions, including those in the United States; and market responses to that evolution. As always, my comments here represent my own views. After real GDP growth plummeted in many EMEs during the Global Financial Crisis (GFC), the subsequent rebound proved to be short lived and was followed by a notable, widespread fall in EME growth as advanced economies remained sluggish, economic imbalances in China mounted, and commodity prices plunged. Lately, however, the streak of weak growth in the EMEs appears to have been broken: The downward trend in Chinese growth has flattened, growth in other EMEs has picked up some, and Brazil seems to be moving into recovery mode. The factors that underlie the pickup in EMEs to a large extent represent a reversal of developments that led to the slowing. The improvement in the performance of the advanced economies has become more widespread. Chinese authorities have bolstered their economy by providing more credit stimulus. And commodity prices have bounced back from their lows in early 2016, bolstering activity and allaying financial stability concerns in commodity-exporting economies. These developments have also contributed to a modest reversal of the slowdown in global trade seen in recent years. A rebound in capital flows has come along with the pickup in economic performance in the EMEs. Slide 2 shows net private capital flows to EMEs--the difference between gross private inflows and gross private outflows. These private net inflows are quite volatile, as the experience of the past 10 years shows. Strong pre-GFC net inflows to major emerging markets (the black line)--hovering in the neighborhood of 3 to 4 percent of EME gross domestic product (GDP)--were interrupted by a collapse during the crisis, but inflows quickly recovered and stayed strong through 2010. After that, net inflows trended down for several years and turned negative by 2015. Part of this retrenchment reflected Chinese net inflows turning into net outflows due to what might be considered special circumstances--notably, changes in expectations of Chinese exchange rate policy. But even taking China out of the picture, as shown by the dashed blue line, there was a clear downward trend in net inflows. Over the past couple of years, however, net inflows have recovered and have averaged, if China is excluded, 0.7 percent of GDP in 2016 and about 1-1/2 percent of GDP in early 2017.
As shown in slide 3, other measures of capital flows, such as flows into EME investment funds, show an even sharper rebound. The recent recovery of investor appetite for EME exposure has shown up in asset prices as well. Emerging-market credit spreads have declined, and equity prices have risen (slide 4). These developments are not occurring in isolation, but in the context of a general improvement in the global outlook and in investor risk sentiment. The improvement in economic fundamentals raises the following question: To what extent can the recent recovery in EME capital flows be explained by these better economic fundamentals? One way to shed light on this question is to compare the recent behavior of EME capital flows with what we might expect from a model of these flows based on historical data. In a recent study, Federal Reserve staff regressed net private capital inflows into several key EMEs on measures of investment opportunities in these economies, monetary policy variables, and risk sentiment variables. As can be seen in slide 5, by comparing the solid and dashed lines, the model does a fairly good job overall of fitting the data. It is instructive to look at what the model tells us about the slowing of flows between 2010 and 2015. Note that the falloff in commodity prices (the red portion of the bars) was the largest contributor to the slowdown in flows. The decline in economic growth differentials between the EMEs and advanced economies (the yellow portions) was also a major contributor. In fact, growth differentials became a slightly negative contributor in 2015 after being substantially positive in 2010. Monetary policy variables (the blue portions) also became less of a factor in 2015 in driving flows to EMEs. As for the recent rebound in flows, over the past year the model's predicted net inflows (the dashed line) have actually been significantly above actual net inflows (the solid line), suggesting that there is some room for flows to increase further without raising concerns. The model attributes the recovery of flows primarily to the turnaround in commodity prices and, to a substantially lesser extent, to improvements in risk sentiment (as seen by some waning of the negative contribution from the slashed green bars). The growth differential is not playing a major role because the rise in EME growth has been accompanied by a rise in advanced-economy growth. All in all, this evidence suggests that the recent pickup of capital flows to EMEs has not outrun its fundamental determinants, which provides some encouragement that these flows will not reverse themselves and endanger EME prospects, a situation that is also encouraging for U.S. prospects. Some observers have noted that the risk of a reversal of EME capital flows may become more pronounced as U.S. and global interest rates return to more normal levels. These developments could encourage capital to return to the advanced economies and, by raising domestic interest rates and putting downward pressures on emerging market currencies, could also enlarge EME debt burdens. In assessing this risk, as I mentioned earlier, three elements are important: first, the vulnerabilities in the EMEs themselves; second, the evolution of advanced-economy monetary policies; and, third, how markets might respond to that evolution. Let me discuss each of these elements in turn. There is clear empirical evidence that the response of EME financial markets to different shocks, including changes in U.S.
interest rates, depends importantly on the state of economic fundamentals in the EMEs themselves. For example, Bowman and coauthors document in their study that a deterioration in a country's economic conditions significantly increases its vulnerability to adverse effects from changes in U.S. interest rates. A case in point is the so-called taper tantrum in 2013, when rises in sovereign bond spreads were significantly greater in those EMEs with greater relative vulnerabilities. There is little doubt that over the past couple of decades, EME macroeconomic fundamentals and policy frameworks have improved substantially. One way you can see this improvement is through an index of aggregate EME vulnerability (the black line in slide 6), which is based on data for a variety of economic variables from 13 major economies. According to this index, EME vulnerabilities today stand well below those in the 1990s--a period during which financial crises in EMEs were much more prevalent. That said, the vulnerability index has been trending up since 2008. Part of this increase in the vulnerability index can be attributed to a run-up in bank credit to the private sector, which brings me to a key risk for EME prospects: the position of EME corporates. Observers have been expressing concerns about the mounting levels of corporate debt and the risk that a normalization of global conditions could exacerbate debt service burdens of EME corporations--particularly those with elevated levels of dollar-denominated debt--by raising global interest rates, boosting the value of the dollar, and perhaps damping economic activity. Given the prominence of this risk, I will discuss EME corporates in a bit more detail. Since 2008, the debt of EME nonfinancial corporations has tripled in dollar value, reaching roughly $27 trillion in the first quarter of 2017. As a share of GDP, as shown by the black line in slide 7, it has nearly doubled, to over 100 percent of GDP. China's situation is distinct from many other EMEs. On the one hand, as can be seen by the red line, its corporate debt, at 170 percent of GDP now, is much higher than most other EMEs and substantially above the level we saw in East Asia before the Asian crisis. On the other hand, Chinese corporates are much less exposed to changes in exchange rates and global interest rates. But the rising amount of debt by itself does not tell us whether this debt is excessive and how vulnerable EME corporates are to global monetary and market shocks. For that assessment we need to drill down deeper into the health of the corporate sector. In a recent study, Beltran and coauthors undertake such an analysis using a common metric of debt service capacity--the interest coverage ratio, or ICR, which is the ratio of earnings to interest expense. All else being equal, this ratio is lower for firms that are less profitable, more leveraged, and have a higher cost of borrowing. Using firm-level data, the authors classify the debt of those firms with an ICR of less than 2 as "debt-at-risk." They find, as shown by the black line in slide 8, that this measure of risky EME corporate debt has almost tripled since 2011 to about 30 percent of GDP. But this share is still considerably lower than the 46 percent of GDP debt-at-risk in East Asia on the eve of the Asian crisis (the horizontal dashed black line in the chart). For China, though, the debt-at-risk now exceeds what we saw in East Asia before the Asian crisis.
Outside of China (the dashed blue line), EME debt-at-risk, at about 10 percent of GDP, seems much more manageable. However, as can be seen by the blue portions of the bars in slide 9, debt-at-risk in a number of EMEs, including South Korea, India, Turkey, and Brazil, exceeds that average level. How will EME corporate debt fare going forward as global normalization proceeds? The results of the study I just discussed imply that a 1 percentage point increase in EME corporate borrowing costs by itself would not be so problematic, at least outside of China. What this shock would do to debt-at-risk is shown by the red cross-hatched portions of the bars in the chart. But it would be a bigger deal if the rise in borrowing costs was accompanied by a more generalized adverse turn of events in EMEs, modeled here as a 20 percent earnings reduction and a 20 percent hit to the value of EME currencies against the dollar. The estimated effects of these additional shocks on debt-at-risk are shown by the slashed red portions of the bars. In this case, aggregate EME debt-at-risk rises from about 30 percent of GDP to around the level seen prior to the Asian financial crisis. Notably, the increase comes mainly from China, where debt-at-risk jumps to about 85 percent of GDP. Outside of China, risky debt also rises substantially but seemingly not to levels that would be considered unmanageable. Overall, based on this analysis, I would conclude that corporate debt represents a moderate degree of vulnerability for EME prospects. The situation is not alarming, but risks are significant and bear close watching, especially in China. What of the evolution of monetary conditions in the advanced economies? I will confine myself here to Fed policy. One factor that favors easier adjustment in EMEs is that U.S. monetary policy normalization has been and should continue to be gradual, as long as the U.S. economy evolves roughly as expected. Since the start of normalization in December 2015, the federal funds rate has risen to about 1-1/4 percent from its effective lower bound (slide 10). The median of FOMC participants' projections has the federal funds rate reaching 2.9 percent by the end of 2020, fairly close to what is regarded by the median participant as its long-run value and significantly below its average value in the years prior to the GFC. As reflected in the FOMC's recent communications, the shrinkage of the Fed's balance sheet is also expected to proceed quite gradually, with slowly phased-in increases in caps on the monthly reductions in the Federal Reserve's security holdings. The expectation of gradual policy normalization should reduce the likelihood of outsized movements in interest rates. Indeed, even if we add, say, a 50 basis point term premium to the expected long-run federal funds rate, this value would still leave long-term U.S. interest rates (shown in slide 11) well below their pre-GFC averages. As long as global financial conditions normalize in an orderly fashion, EMEs should have sufficient time to adjust. And, as we saw earlier, interest rate changes of this magnitude should not lead to generalized corporate distress in EMEs, although undoubtedly some corporates are more exposed and could experience difficulties. All that said, market movements can be noisy, which brings me to what I believe is the most uncertain element--the potentially volatile behavior of markets even in an environment of relatively contained EME vulnerabilities and of gradual and clearly communicated advanced-economy monetary policies.
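The interest-coverage-ratio metric and the shock scenario described above can be summarized in a short sketch. The ICR threshold of 2 and the shock sizes follow the description in the text; the example firms, their balance sheets, and the way the currency depreciation is passed through to interest expense are hypothetical simplifications.

```python
# Stylized debt-at-risk calculation using the interest coverage ratio (ICR),
# with an illustrative stress scenario. All firm data below are hypothetical.

ICR_THRESHOLD = 2.0  # debt of firms with earnings/interest expense below 2 is "at risk"

# Each hypothetical firm: debt, earnings, and interest expense in local-currency
# units, plus the share of its debt denominated in dollars.
firms = [
    {"debt": 100.0, "earnings": 30.0, "interest": 10.0, "fx_debt_share": 0.6},
    {"debt": 200.0, "earnings": 25.0, "interest": 14.0, "fx_debt_share": 0.5},
    {"debt": 150.0, "earnings": 60.0, "interest": 12.0, "fx_debt_share": 0.1},
]


def debt_at_risk(firms, rate_shock_pp=0.0, earnings_hit=0.0, depreciation=0.0):
    """Share of total debt owed by firms whose ICR falls below the threshold.

    rate_shock_pp adds to the implied borrowing cost, earnings_hit scales
    earnings down, and depreciation raises the local-currency cost of the
    dollar-denominated portion of interest payments (a simplifying assumption).
    """
    total = sum(f["debt"] for f in firms)
    at_risk = 0.0
    for f in firms:
        implied_rate = f["interest"] / f["debt"]              # implied borrowing cost
        interest = (implied_rate + rate_shock_pp) * f["debt"]
        interest *= 1.0 + depreciation * f["fx_debt_share"]   # FX pass-through
        earnings = f["earnings"] * (1.0 - earnings_hit)
        if earnings / interest < ICR_THRESHOLD:
            at_risk += f["debt"]
    return at_risk / total


print("baseline share of debt at risk:", round(debt_at_risk(firms), 2))
print("stressed share of debt at risk:",
      round(debt_at_risk(firms, rate_shock_pp=0.01,
                         earnings_hit=0.20, depreciation=0.20), 2))
```

The aggregate figures cited in the speech are, in spirit, this kind of share computed from firm-level data and scaled by GDP rather than by total debt.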
So far, markets have behaved in a manner consistent with a relatively benign scenario for EMEs: Risk sentiment is holding up, credit spreads in emerging markets have been declining, equities are up, long-term yields have hardly budged, and the dollar has been declining. Markets, however, can turn on a dime, and reactions can be outsized. This concern may be especially relevant at present, given the low level of volatility and elevated asset prices in global markets, which may increase the likelihood and severity of an adjustment. Most of the time, bouts of market turbulence lead to relatively quick corrections that leave markets more resilient without substantially depressing global growth. The taper tantrum of 2013 that I mentioned earlier is a good example. Ultimately, the policy adjustments made by some of the most affected economies, along with the more realistic appraisal of risks by global investors, likely left the global economy in a somewhat better position than before the episode. That said, however, market tantrums pose complex economic and financial challenges, and such episodes carry a significant risk of snowballing into something bigger that more substantially threatens the economic expansion. To conclude, I have suggested that the most likely outcome is that the challenges posed to EMEs by the normalization of global financial conditions will be manageable. So far, capital flows have been moving in line with market fundamentals. Although EME vulnerabilities have been rising, they are still well below the levels of the crisis-prone years of the 1980s and 1990s. Global monetary conditions are expected to normalize only gradually, as the Federal Reserve and other advanced-economy central banks continue to stress clear communication and transparency. And the reaction of EME financial markets so far has been benign. But significant risks of more adverse scenarios remain. The corporate debt situation in EMEs has been worsening, particularly in China, and market reactions to even small surprises can be unpredictable and outsized. Even with these risks, however, the best thing the Federal Reserve can do--not just for the United States, but for the global economy at large--is to keep our house in order through the continued pursuit of our dual mandate. Finally, it bears remembering that Fed policy normalization is occurring not in isolation, but in the context of a solid U.S. economic recovery, which should benefit all economies around the world.
r171015a_FOMC
united states
2017-10-15T00:00:00
The U.S. Economy and Monetary Policy
yellen
1
I would like to thank the Group of Thirty for inviting me to participate in this seminar and the Inter-American Development Bank for hosting this event. My comments today will focus on U.S. economic prospects and monetary policy. Economic activity in the United States has been growing moderately so far this year, and the labor market has continued to strengthen. The terrible hurricanes that hit Texas, Florida, Puerto Rico, and our neighbors in the Caribbean caused tremendous damage and upended many lives, and our hearts go out to those affected. While the effects of the hurricanes on the U.S. economy are quite noticeable in the short term, history suggests that the longer-term effects will be modest and that aggregate economic activity will recover quickly. Starting with the labor market, through August, payroll job gains averaged 170,000 per month this year, down only a little from the average pace of gains in 2016 and still well above estimates of the pace necessary to absorb new entrants to the labor force. In September, payrolls were reported to have declined 33,000, but that weakness reflected the effects of Hurricane Irma, which hit Florida during the reference week for the September labor market surveys. I would expect employment to bounce back in subsequent months as communities recover and people return to their jobs. Other aspects of the jobs report for September were strong. The unemployment rate, which seems not to have been noticeably affected by the hurricanes, declined further to 4.2 percent, down about 1/2 percentage point from the end of 2016 and below the median of Federal Open Market Committee (FOMC) participants' estimates of its longer-run normal level. Labor force participation continues to strengthen relative to a downward trend that reflects, in part, the aging of the population. Other labor market indicators, including the rates of job openings and the number of people who voluntarily quit their jobs, also point to strength. Wage indicators have been mixed, and the most recent news, on average hourly earnings through September, was encouraging. On balance, wage gains appear moderate, and the pace seems broadly consistent with a tightening labor market once we account for the disappointing productivity growth in recent years. I expect the labor market to strengthen further as economic growth continues. The hurricanes will likely result in some hit to GDP growth in the third quarter but a rebound thereafter, and smoothing through those movements, I'm expecting growth that continues to exceed potential in the second half of the year. The latest projections from FOMC participants have a median of 2-1/2 percent GDP growth this year. Growth of consumer spending has been supported by the ongoing job gains and relatively high levels of household wealth and consumer sentiment. Business investment has strengthened this year following surprising weakness in 2016. The faster gains partly reflect an upturn in investment in the energy sector as oil prices have firmed. But the gains have been broader than that, and some measures of business sentiment remain quite strong. Exports also have risen this year, as growth abroad has solidified and the exchange value of the dollar has declined somewhat. My fellow FOMC participants and I perceive that risks to global growth have receded somewhat and expect growth to continue to improve over the near term. The biggest surprise in the U.S. economy this year has been inflation. 
Earlier this year, the 12-month change in the price index for personal consumption expenditures seemed consistent with the view that inflation had been held down by both the sizable fall in oil prices and the appreciation of the dollar starting around mid-2014, and that these influences have diminished significantly by this year. Accordingly, inflation seemed well on its way to the FOMC's 2 percent inflation objective on a sustainable basis. Inflation readings over the past several months have been surprisingly soft, however, and the 12-month change in core PCE prices has fallen to 1.3 percent. The recent softness seems to have been exaggerated by what look like one-off reductions in some categories of prices, especially a large decline in quality-adjusted prices for wireless telephone services. More generally, it is common to see movements in inflation of a few tenths of a percentage point that are hard to explain, and such "surprises" should not really be surprising. My best guess is that these soft readings will not persist, and with the ongoing strengthening of labor markets, I expect inflation to move higher next year. Most of my colleagues on the FOMC agree. In the latest Summary of Economic Projections, my colleagues and I project inflation to move higher next year and to reach 2 percent by 2019. To be sure, our understanding of the forces that drive inflation is imperfect, and we recognize that this year's low inflation could reflect something more persistent than is reflected in our baseline projections. The fact that a number of other advanced economies are also experiencing persistently low inflation understandably adds to the sense among many analysts that something more structural may be going on. Let me mention a few possibilities of more fundamental influences. First, given that estimates of the natural rate of unemployment are so uncertain, it is possible that there is more slack in U.S. labor markets than is commonly recognized, which may be true for some other advanced economies as well. If so, some further tightening in the labor market might be needed to lift inflation back to 2 percent. Second, some measures of longer-term inflation expectations have edged lower over the past few years in several major economies, and it remains an open question whether these measures might be reflecting a true decline in expectations that is broad enough to be affecting actual inflation outcomes. Third, our framework for understanding inflation dynamics could be misspecified in some way. For example, global developments--perhaps technological in nature, such as the tremendous growth of online shopping--could be helping to hold down inflation in a persistent way in many countries. Or there could be sector-specific developments--such as the subdued rise in medical prices in the United States in recent years--that are not typically included in aggregate inflation equations but which have contributed to lower inflation. Such global and sectoral developments could continue to be important restraining influences on inflation. Of course, there are also risks that could unexpectedly boost inflation more rapidly than expected, such as resource utilization having a stronger influence when the economy is running closer to full capacity. In this economic environment, with ongoing improvements in labor market conditions and softness in inflation that is expected to be temporary, the FOMC has continued its policy of gradual policy normalization. 
As the Committee announced after our September meeting, we are initiating our balance sheet normalization program this month. That program, which was described in the June Addendum to the Policy Normalization Principles and Plans, will gradually scale back our reinvestments of proceeds from maturing Treasury securities and principal payments from agency securities. As a result, our balance sheet will decline gradually and predictably. The program phases in gradually rising caps on the amounts of securities allowed to run off each month. By limiting the volume of securities that private investors will have to absorb as we reduce our holdings, the caps should guard against outsized moves in interest rates and other potential market strains. Changing the target range for the federal funds rate is our primary means of adjusting the stance of monetary policy. Our balance sheet is not intended to be an active tool for monetary policy in normal times. We therefore do not plan on making adjustments to our balance sheet normalization program. But, of course, as we stated in June, the Committee would be prepared to resume reinvestments if a material deterioration in the economic outlook were to warrant a sizable reduction in the federal funds rate. Also at our September meeting, the Committee decided to maintain its target for the federal funds rate. We continue to expect that the ongoing strength of the economy will warrant gradual increases in that rate to sustain a healthy labor market and stabilize inflation around our 2 percent longer-run objective. That expectation is based on our view that the federal funds rate remains somewhat below its neutral level--that is, the level that is neither expansionary nor contractionary and keeps the economy operating on an even keel. The neutral rate currently appears to be quite low by historical standards, implying that the federal funds rate would not have to rise much further to get to a neutral policy stance. But we expect the neutral level of the federal funds rate to rise somewhat over time, and, as a result, additional gradual rate hikes are likely to be appropriate over the next few years to sustain the economic expansion. Indeed, FOMC participants have built such a gradual path of rate hikes into their projections for the next couple of years. Of course, policy is not on a preset course. I have spoken about some of the uncertainties associated with the inflation outlook in particular, and we will be paying close attention to the inflation data in the months ahead. But uncertainty about the outlook is by no means limited to inflation. As always, the Committee will adjust the stance of monetary policy in response to incoming economic information and the evolution of the economic outlook to achieve its objectives of maximum employment and stable prices. Moreover, we are mindful of the possibility that shifting expectations concerning the path of U.S. policy can lead to spillovers to other economies via financial markets and the value of the dollar. We remain committed to communicating as clearly and effectively as possible to help mitigate the risk of sudden changes in the policy outlook among market participants that could spur unintended effects in global financial markets.
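As a concrete illustration of the cap mechanism described at the start of these remarks, the sketch below computes, for a few hypothetical months, how much maturing principal is allowed to run off and how much is reinvested. The dollar amounts are hypothetical, and the cap figures shown are only a paraphrase of the schedule announced in the June 2017 addendum (Treasury caps starting at $6 billion per month and stepping up over time); the addendum itself is the authoritative source.

```python
# Illustrative runoff-cap mechanics (hypothetical amounts; cap levels paraphrased from the
# June 2017 addendum and not authoritative).

def monthly_runoff(maturing, cap):
    """Principal allowed to roll off is capped; anything above the cap is reinvested."""
    runoff = min(maturing, cap)
    reinvested = maturing - runoff
    return runoff, reinvested

# Hypothetical months: (maturing Treasury principal in $billions, prevailing monthly cap in $billions)
months = [(18.0, 6.0), (3.0, 6.0), (25.0, 12.0)]
for maturing, cap in months:
    runoff, reinvested = monthly_runoff(maturing, cap)
    print(f"maturing ${maturing:5.1f}bn, cap ${cap:5.1f}bn -> "
          f"runoff ${runoff:5.1f}bn, reinvested ${reinvested:5.1f}bn")
```

Because only the amount above the cap is reinvested, the balance sheet shrinks by at most the cap in any given month, which is what makes the decline gradual and predictable.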
r171018a_FOMC
united states
2017-10-18T00:00:00
Financial Innovation: A World in Transition
powell
1
We live in a world defined by the rapid pace of technological change. Four of the five largest U.S. companies by market capitalization are classified as "technology companies," where the term describes the products that these companies sell and how they operate. Thanks to decades of investment in information technology, especially in electronic communication networks, consumers now expect services to be available instantly at their fingertips. This statement is true for almost every industry and every aspect of daily life, including financial transactions. This evening, I will consider how technology is changing the delivery of retail banking and payments services. I will discuss the roles of banks, fintech companies, and other stakeholders in moving the United States forward to a better payment system. I will also review the Federal Reserve's collaboration with these payment system stakeholders in pursuing that goal. I will argue that, for policymakers as well as the private sector, the challenge is to embrace technology as a means of improving convenience and speed in the delivery of financial services, while also assuring the security and privacy necessary to sustain the public's trust. As always, the views I express here are my own. As with so many sectors of the economy, technology is transforming the retail banking sector. The banking industry has traditionally been characterized by physical branches, privileged access to financial data, and distinct expertise in analyzing such data. But in today's world companies need not be bound by physical infrastructure and related overhead expenses. For example, companies can take advantage of an explosion in available data, and leverage advances in computing power, via cloud computing, analytical tools, and off-the-shelf machine learning tools, to make sense of those data. The banking industry is adjusting to this world, and facing significant challenges to traditional banking business models. For example, today financial technology can support access to credit through innovative approaches to gathering and analyzing data. Historically, a customer seeking a loan has provided financial statements to a bank or other traditional lending institution. More recently, the use of a fintech platform may allow a lender to quickly monitor and analyze more up-to-date data from a broader range of sources, including those outside of the traditional lending process, to verify an applicant's identity and make inferences about the applicant's overall financial health. For example, a business loan applicant could submit information such as shipping data or customer reviews as additional input to more traditional data sources. With this additional information, the bank would have a more complete picture of an applicant's day-to-day activity and overall financial capacity, and potentially a greater ability to provide credit to customers, including some who might have been otherwise denied a loan based on traditional data. Fintech firms are also finding ways to use banks' data, in some cases without entering into an explicit partnership with the bank. With customers' permission, fintech firms have increasingly turned to data aggregators to "screen scrape" information from financial accounts. In such cases, data aggregators collect and store online banking logins and passwords provided by the bank's customers and use them to log directly into the customer's banking account. 
This information can be used to provide consumers with convenient real-time snapshots of their financial information across multiple banks and accounts. These examples highlight that there is a balance that needs to be achieved in this innovative environment. On the one hand, new technologies have enabled banks and other firms to find different ways of meeting consumers' demand for speed and convenience. On the other hand, these same technologies raise new considerations about data security and safety, as well as consumer privacy and protection. Policymakers and the financial industry must assure that enhanced convenience and speed in financial services do not undermine the safety, security, and reliability of those services. Technology is also shaping changes in retail payments. As with retail banking, retail payments will need to evolve to meet consumer expectations of constant connectivity and instant access while assuring security and privacy. It is not news that consumers' lives, including the way they pay, are now intertwined with mobile phone usage. While the overall amount of time we spend on our phones continues to grow, the duration of individual phone sessions is actually shrinking. In late 2015, Google estimated that the average mobile session lasts only 70 seconds, and may be repeated dozens of times per day. As a result, payment innovators have had to create new ways to move money that are not only fast and mobile-focused, but also sufficiently "frictionless" that consumers can now fit commerce into these brief interludes. This development has ushered in a world of multiple smartphone apps that allow for "instant" payments. We can use a payments app to move funds instantly to anyone who has that app. Some banks have similarly collaborated to build faster payments applications that leverage their deposit account systems. And we are already moving to a world in which we need not open a special app or go to our bank's website in order to send money. Many people here will have taken an Uber or Lyft, and then paid your driver without relaunching the app, much less reaching for your wallet. Similarly, payment providers can now leverage the application programming interfaces (APIs)-- essentially the protocols--of smartphone messaging services to integrate their payment tools directly into messaging applications: Nowadays, consumers can simply "attach" money while messaging a friend. Innovation in retail payments can also offer tangible benefits to consumers beyond convenience. Improvements in security, such as our ability to authenticate consumers and detect fraudulent transactions, are also possible through innovation. For instance, mobile payments introduce a wide array of ways to authenticate a consumer's identity, including two-factor authentication codes sent via text message to the phone; biometrics, like a fingerprint or face scan; device identification information; IP address; and geolocation data. Similarly, increased access to transaction data and cloud computing resources means that we have smarter, faster computational processes--like enhanced neural networks--to detect payments that do not match a consumer's spending patterns and help prevent fraudulent transactions. Both security and convenience are crucial elements for successful payments innovation. Consumers will not store their funds in a system that is not secure and will not want to transfer funds out of an otherwise secure system if the process is cumbersome. 
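The pattern-matching idea described above can be illustrated with a deliberately simple statistical stand-in for the machine-learning models the remarks mention: score each transaction by how far it sits from the customer's typical spending, and route outliers for review. Everything in the sketch--the data, the z-score rule, and the threshold--is hypothetical.

```python
# Toy illustration of pattern-based fraud screening (a simple statistical stand-in for the
# neural-network models mentioned in the text; data and threshold are hypothetical).
from statistics import mean, stdev

def anomaly_score(history, amount):
    """How many standard deviations a transaction amount sits from the customer's usual spending."""
    mu, sigma = mean(history), stdev(history)
    return (amount - mu) / sigma if sigma > 0 else 0.0

past_purchases = [12.50, 8.75, 23.10, 15.00, 9.99, 31.40, 18.25]  # hypothetical recent card activity
for amount in (21.00, 950.00):
    score = anomaly_score(past_purchases, amount)
    verdict = "review" if abs(score) > 3 else "approve"
    print(f"${amount:>8.2f}  z-score={score:6.1f}  -> {verdict}")
```

Production systems combine many more signals--device identification, geolocation, merchant history--and learned models rather than a single threshold, but the basic flow of scoring a transaction and then approving or escalating it is the same.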
The examples I have highlighted so far illustrate payments innovations from fintech firms and banks alike. I want to spend a moment highlighting the special role of banks in the payments process, and how banks are needed in order to create innovations that can be used broadly across the economy. The traditional role of banks in the payments process has been to hold deposits and enable their transfer from one individual or business to another. A depositor might withdraw cash from the bank's ATM to pay a friend or write a check to make a payment. Over time, we have moved from ATMs and paper checks toward electronic payments and online payments through banking platforms--payment methods for which banks are still perceived as essential. More recently, consumer-facing technology has become front and center. At times, the payments process is so seamlessly integrated that one can forget that there is even a bank in the process, as with the Uber and Lyft example. But despite this shift in focus, payments innovation is still fundamentally about how, when, and where an individual's deposits can be held, transferred, and packaged with other information. And banks are still important players in making that happen. Even where this reality is obscured by several layers of technology, there is almost always a bank involved in consumer transactions. Given their importance in holding and transferring funds, banks continue to have a key role to play in the design and safety of more efficient retail payment systems. Without bank participation, it would be difficult to change how funds are transferred in a way that brings pervasive benefits to consumers. For example, if the aim is to capture the speed and continuous nature of today's commerce in the payment system as a whole--as has become a focus for many countries, including the United States--it would be difficult to do so without banks allowing the transfer of their deposits on a 24x7 real-time basis. Of course, individual payment systems are already doing this for consumers within their own network. But achieving these benefits on a broad scale would be challenging without the banking system's participation, because of the large role banks have in holding and transferring funds. All this is to say that we are at a critical juncture in the payment system's evolution, where technology is rapidly changing many facets of the payments process. Fintech firms and banks are seizing these technological changes in their own ways. But a collective and collaborative effort by all payment stakeholders will also be important as the United States works to achieve a payment system that has broad reach and can seamlessly integrate with other systems to transfer funds in a reliable, secure, and convenient manner. When we pay with cash or write a check, we don't spend a lot of time worrying about who our recipient banks with; that universality seems an appropriate standard for new payment options as well. At the Federal Reserve, we believe it is important to embrace opportunities provided by technological change to improve the convenience and safety of the U.S. payment system. About five years ago, we launched our payment system improvement initiative, which committed the Federal Reserve to working with the full range of payments system stakeholders to achieve a faster, more secure payment system. We saw that technology was transforming the nature of commerce and end-user expectations for payment services. 
We saw some players coming to market with innovative product offerings, but it was a fragmented approach. Meanwhile, other countries were advancing on initiatives to improve the speed and safety of their payment systems, creating a gap between the U.S. payment system and those abroad. While the Federal Reserve does not have plenary authority over payment systems, as is the case in some other countries, we have often played an important role as a leader and catalyst for change. It was in this role that we issued a call to action asking stakeholders to come together in pursuit of a better payment system for the future--focusing on speed, security, efficiency, international payments, and collaboration. We believe a collaborative approach ensures that change is designed by those whose commitment and expertise are needed to improve the payment system. Stakeholders--including banks, fintech companies, consumer groups, regulators, and others--answered our call to action, signing up for the two task forces convened by the Federal Reserve; around 200 joined the Secure Payments Task Force alone. Let me first touch upon the Faster Payments Task Force, which has recently completed its work. The Faster Payments Task Force's mission was to identify and assess alternative approaches for implementing a safe, ubiquitous, faster payments system in the United States. The task force began its work by developing a set of effectiveness criteria laying out desirable attributes for faster payment solutions covering the broad categories of ubiquity, efficiency, safety and security, speed, legal framework, and governance. While the task force was focused on improving speed and convenience, it also underscored the importance of safety and security by devoting 11 of its 36 criteria to those objectives. The task force encouraged its members to submit proposals for faster payment solutions that would meet the criteria that its members had agreed upon. A diverse range of task force members rose to the challenge by submitting 16 proposals to be vetted against its criteria. These proposals represent a broad universe of creative and innovative ways to deliver faster payments by embracing technology. They range in structure from solutions that use a centralized clearing and settlement mechanism to others that focus on distributed networks. Some are based on traditional assets held in transaction accounts, and others depend on new asset forms like digital currencies. The role of the task force process was not to recommend or implement a faster payment solution, but rather to offer a range of ideas to move the United States further along the path to a better payment system. We believe that the task force has successfully carried out this role. We are very grateful to the members of the Faster Payments Task Force for all of their work and for the collaborative spirit they brought to the job. But there is more to be done to advance our collective vision of a ubiquitous, real-time, secure future payment system. Last month, the Federal Reserve reaffirmed its commitment to that vision in a follow-up paper that outlines refreshed strategies and tactics that we, in collaboration with the payment industry, will employ to make further progress. I will mention just a few. One of the recommendations from the Faster Payments Task Force work was to establish an industry governance framework for collaboration and decision-making on faster payments. 
To move forward in creating this framework, the task force established the Governance Framework Formation Team to develop, publish, and solicit public comment on a proposal for a governance framework. This work group will carry out many of the task force recommendations, and the Federal Reserve, at the request of the task force, is chairing and facilitating this effort. In addition, the Federal Reserve is considering providing settlement services--a traditional core function of a central bank--to address the future needs of a ubiquitous real-time retail payments environment. We plan to actively engage with the industry and other stakeholders to further understand gaps and requirements for real-time retail payments settlement and assess alternative models that will support needs over the long term. We also plan to explore and assess the need, if any, for other related Federal Reserve services or capabilities. In carrying out this assessment, we will be guided by current and potential market developments and challenges, as well as our long-established criteria for offering new products and services. These criteria include the need to fully recover costs over the long term; the expectation that the new service will yield clear public benefit; and the expectation that other providers alone cannot be expected to provide the service with reasonable effectiveness, scope, and equity. The Federal Reserve will also continue to support the ongoing work of the Secure Payments Task Force. This task force has been working to educate stakeholders on payment security practices, risks, and actions that could enhance payment security. These are challenging topics, because they require stakeholders to be open and forthcoming about potential vulnerabilities if there is to be substantial progress. The Federal Reserve will also pursue two new efforts focused on security. Early in 2018, we plan to launch a study analyzing payment security vulnerabilities. This study is similar to other research efforts that the Federal Reserve has pursued to build foundational and collective understanding of the U.S. payment system. We also plan to build upon the contributions of the Secure Payments Task Force to establish work groups focused on approaches for reducing the cost and prevalence of specific payment security vulnerabilities. In a world of ever-escalating threats to the integrity of our payment system, this collective action is needed to sustain public confidence. These were just a few of our new initiatives. The package of next steps the Federal Reserve outlined in its recent paper confirms that we remain steadfast in our commitment to work with industry and other stakeholders to achieve a better payment system through both leadership and action. Rapidly changing technology is providing a historic opportunity to transform our daily lives, including the way we pay. Fintech firms and banks are embracing this change, as they strive to address consumer demands for more timely and convenient payments. A range of innovative products that seamlessly integrate with other services is now available at our fingertips. It is essential, however, that this innovation not come at the cost of a safe and secure payment system that retains the confidence of its end users. The examples I have drawn upon today highlight that fintech firms and banks must each play a role in assuring that enhancements to convenience and speed do not undermine safety and security. 
More broadly, the Faster and Secure Payments Task Forces demonstrate the importance of broad and diverse stakeholder input, which is essential if the United States is to implement safe, ubiquitous real-time retail payments. Working together, we can achieve a safe and fast payments system that meets the evolving needs of consumers and our dynamic economy.
r171020a_FOMC
united states
2017-10-20T00:00:00
A Challenging Decade and a Question for the Future
yellen
1
I am delighted to address the National Economists Club, and I am also honored on this occasion to be associated with Herb Stein, whose public service and scholarship--characterized by careful analysis, clear-eyed pragmatism, and sharp wit--exemplified the best in our profession. Herb was willing to consider new ideas and new approaches to government policy, and that openness fits with the subject of my remarks today. Namely, I will discuss the unconventional monetary policy tools used by the Federal Reserve since the start of the financial crisis and Great Recession and the role that those tools may play in addressing future economic challenges. Nearly 10 years ago, with our nation mired in its worst economic and financial crisis since the Great Depression, the Federal Reserve confronted a key challenge to the pursuit of its congressionally mandated goals of maximum employment and price stability: how to support a weakening U.S. economy once our main conventional policy tool, the federal funds rate, had been lowered to essentially zero. Addressing that problem eventually led to a second challenge: how to ensure that we could scale back monetary policy accommodation in an orderly fashion once it was no longer needed. Failure to meet either challenge would have significantly compromised our ability to foster maximum employment and price stability, leading to serious consequences for the livelihoods of millions of Americans. I will argue today that we have met the first challenge and have made good progress to date in meeting the second. Thanks in part to the monetary policy accommodation provided in the aftermath of the crisis--especially through enhanced forward rate guidance and large-scale asset purchases--the U.S. economy has made great strides. Indeed, with the economy now operating near maximum employment and inflation expected to rise to the FOMC's 2 percent objective over the next couple of years, the FOMC has been scaling back the accommodation provided in response to the Great Recession. In no small part because of our authority to pay interest on excess reserves, the process of removing policy accommodation is working well. After discussing a few issues related to our recent decision to start reducing the size of the Federal Reserve's balance sheet, I will address a key question: What is the appropriate future role of the unconventional policy tools that we deployed to address the Great Recession? While I believe that influencing short-term interest rates should continue to be our primary monetary policy lever in normal times, our unconventional policy tools will likely be needed again should some future economic downturn drive short-term interest rates back to their effective lower bound. Indeed, empirical analysis suggests that the neutral federal funds rate--defined as the level of the federal funds rate that is neither expansionary nor contractionary when the economy is operating near its potential--is much lower than in previous decades. Consequently, the probability that short-term interest rates may need to be reduced to their effective lower bound at some point is uncomfortably high, even in the absence of a major financial and economic crisis. I will return to the question about the future of our various policy tools, but first I would like to review our experience this decade, which I view as instructive for addressing that question. A substantial body of evidence suggests that the U.S. 
economy is much stronger today than it would have been without the unconventional monetary policy tools deployed by the Federal Reserve in response to the Great Recession. Two key tools were large-scale asset purchases and forward guidance about our intentions for the future path of short-term interest rates. The rationale for those tools was straightforward: Given our inability to meaningfully lower short-term interest rates after they reached near-zero in late 2008, the FOMC used increasingly explicit forward rate guidance and asset purchases to apply downward pressure on longer-term interest rates, which were still well above zero. Longer-term interest rates reflect, in part, financial market participants' expectations of the future path of short-term interest rates. As a result, FOMC communications that affect those expectations--such as the enhanced forward rate guidance provided in our post-meeting statements in the aftermath of the Great Recession--can affect longer-term interest rates. In addition, longer-term interest rates include a term premium, which is the compensation demanded by investors for bearing the interest rate risk associated with longer-term securities. When the Federal Reserve buys longer-term securities in the open market, the remaining stock of securities available for purchase by the public declines, which pushes the prices of those securities up and thus depresses their yields by lowering the term premiums embedded in those yields. Several studies have found that our forward rate guidance and asset purchases did appreciably reduce longer-term interest rates. The FOMC's goal in lowering longer-term interest rates was to help the U.S. economy recover from the recession and stem the disinflationary forces that emerged from it. Some have suggested that the slow pace of the economic recovery proves that our unconventional policy tools were ineffective. However, one should recognize that the recovery could have been much slower in the absence of our unconventional tools. Indeed, the evidence strongly suggests that forward rate guidance and securities purchases--by substantially lowering borrowing costs for millions of American families and businesses and making overall financial conditions more accommodative--did help spur consumption and business spending, lower the unemployment rate, and stave off disinflationary pressures. Other central banks also deployed unconventional policy tools in the years that followed the financial crisis. Evidence accumulated from their experience also supports the notion that these tools have helped stimulate economic activity in their countries after their short-term interest rates were lowered to near-zero--and, in some cases, even below zero. By 2014, the U.S. economy was making notable progress toward the FOMC's goals of maximum employment and price stability. The unemployment rate had dropped to 6 percent by midyear--well below its Great Recession peak of 10 percent--and other measures of labor market conditions were also showing significant improvement. In addition, inflation, as measured by the change in the price index for personal consumption expenditures, had reached about 1-3/4 percent by mid-2014 after hovering around 1 percent in the fall of 2013. Reflecting that progress, the Federal Reserve's focus was shifting from providing additional monetary policy accommodation to scaling it back. 
A key question for the FOMC then was how to reduce the degree of accommodation in the context of a vastly expanded Federal Reserve balance sheet. One possible approach was to start by reducing the Federal Reserve's securities holdings while short-term interest rates remained at the lower bound. We could allow securities to roll off the Federal Reserve's balance sheet and even sell securities, thereby putting upward pressure on long-term rates while calibrating the pace and configuration of the reduction in our holdings as warranted by our maximum employment and price stability objectives. Eventually, once our securities holdings had shrunk sufficiently, the FOMC could start nudging up its short-term interest rate target. One problem of this "last in, first out" approach was that the FOMC does not have any experience in calibrating the pace and composition of asset redemptions and sales to actual and prospective economic conditions. Indeed, as the so-called taper tantrum of 2013 illustrated, even talk of prospective changes in our securities holdings can elicit unexpected abrupt changes in financial conditions. Given the lack of experience with reducing our asset holdings to scale back monetary policy accommodation and the need to carefully calibrate the removal of accommodation, the FOMC opted to allow changes in the Federal Reserve's securities holdings to play a secondary role in the Committee's normalization strategy. Rather than balance sheet shrinkage, the FOMC decided that its primary tool for scaling back monetary policy accommodation would be influencing short-term interest rates. As we explained in our "normalization principles" issued in September 2014, the FOMC decided to maintain the overall size of the Federal Reserve's securities holdings at an elevated level until sometime after the FOMC had begun to raise short-term interest rates. Once normalization of the level of the federal funds rate was "well under way" and the Committee judged that the economic expansion was strong enough that further increases in short-term interest rates were likely to be warranted, the FOMC would gradually and predictably reduce the size of the balance sheet by allowing the Federal Reserve's securities holdings to "run off"--that is, we would allow our balance sheet to shrink passively by not reinvesting all of the principal payments from our securities. One advantage of the FOMC's chosen approach to scaling back accommodation is that both the FOMC and the public have decades of experience with adjustments in short-term interest rates in response to changes in economic conditions. Nonetheless, the post-crisis environment presented a new test to the FOMC's ability to influence short- term interest rates. Before the crisis, the FOMC could raise the federal funds rate--the rate at which banks with excess reserves lend to banks with a reserve need--by removing a small amount of reserves from the banking system. That would translate into a higher federal funds rate because reserves were relatively scarce to begin with. The intuition was simple: The FOMC would signal that it was going to tighten conditions in the reserve market, and the cost of obtaining reserves in the market--the federal funds rate--would rise. Other market interest rates would then increase accordingly. After the crisis, however, reserves were plentiful because the Federal Reserve funded its large-scale asset purchases through adding reserves to the system--crediting the bank accounts of those who were selling assets to the Fed. 
Moreover, in light of the FOMC's decision not to sell the longer-term securities it acquired, reserves were likely to remain plentiful for the foreseeable future. Consequently, when the time came to remove accommodation, a key question for the Committee was how to raise the federal funds rate in an environment of abundant reserves. An important part of the answer to that question came in the Federal Reserve's authority to pay interest on excess reserves. The Congress granted the Federal Reserve that authority in 2006, to become effective in 2011. However, in the fall of 2008, the Congress moved up the effective date to October 2008. Having authority to pay interest on excess reserves means that the Federal Reserve can influence the federal funds rate and other short-term interest rates regardless of the amount of excess reserves in the banking system. The mechanics of the new framework are straightforward: Banks will generally only provide short-term funding at an interest rate around or above what they could earn at the Fed. As a result, if the Federal Reserve raised the rate it paid, other short-term lending rates would likely rise as well. This new approach for raising short-term interest rates is working well: Since December 2015, we have raised the interest paid on excess reserves and the target range for the federal funds rate by 100 basis points, and the effective federal funds rate has risen accordingly. In light of our recent decision to start reducing our securities holdings this month, I would like to discuss a few aspects of our balance sheet strategy. The FOMC anticipated that its decision to maintain the size of the Federal Reserve's securities holdings at an elevated level until sometime after the beginning of rate hikes would keep some downward pressure on longer-term interest rates well after the end of its asset purchase programs. Although estimates of the effect of our securities holdings on longer-term interest rates are subject to uncertainty, a recent study reported that the Federal Reserve's securities holdings were reducing the term premium on the 10-year Treasury yield by roughly 1 percentage point at the end of 2016. The guidance that the FOMC would eventually start a gradual and predictable reduction of the Federal Reserve's securities holdings implied that the downward pressure on longer-term yields would likely diminish over time as financial market participants came to expect that the start of balance sheet normalization was nearing. Indeed, with that process now under way, it is likely that our securities holdings are now depressing the term premium on the 10-year yield by somewhat less than the 1 percentage point estimate reported for late last year. Several factors suggest that the downward pressure on term premiums exerted by our securities holdings is likely to diminish only gradually as our holdings shrink. For instance, as I have already noted, our intention to reduce our balance sheet by reducing reinvestment of repayments of principal on our holdings--rather than selling assets--has been well communicated for several years now. As a result, we do not anticipate a jump in term premiums as our balance sheet reduction plan gets under way. In addition, the maturity distribution of our securities holdings is such that it will take some years for the size of our holdings to normalize via runoff. 
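For readers who want the bookkeeping behind statements like the roughly 1 percentage point estimate above, the standard textbook decomposition of an n-period yield (an identity, not a formula from these remarks) separates the two channels discussed here, where $y_t^{(n)}$ is the n-period yield, $r_{t+i}$ the future short-term rate, and $TP_t^{(n)}$ the term premium:

$$
y_t^{(n)} \;=\; \underbrace{\frac{1}{n}\sum_{i=0}^{n-1} E_t\!\left[r_{t+i}\right]}_{\text{expected average short rate}} \;+\; \underbrace{TP_t^{(n)}}_{\text{term premium}}
$$

Forward rate guidance works mainly on the first term, while asset purchases and the size of the balance sheet work mainly on the second; the estimate cited above attributes about 1 percentage point of downward pressure on the 10-year yield to a lower $TP_t^{(n)}$.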
The judgment that the downward pressure on term premiums will decline only gradually as we reduce the size of our balance sheet stands in sharp contrast to evidence suggesting that this pressure built up rather quickly when we were expanding our balance sheet. To understand this contrast, remember that, unlike our plan to shrink our balance sheet, the various phases of our asset purchases had, to differing degrees, an element of surprise, with asset purchase announcements occasionally leaving a distinct imprint on the path of longer-term yields. Moreover, each of our asset purchase programs resulted in a rapid increase in our securities holdings during a relatively short period, whereas the normalization process will play out gradually over many years. I have focused thus far on the likely response of term premiums to our balance sheet reduction plan. Let me turn my attention briefly to the likely response of longer-term yields, which, as I have noted, reflect both a term premium component and expectations of the future path of short-term interest rates. While the available evidence points to a strong reaction of longer-term yields to our asset purchases, it is conceivable that those yields will react much more modestly to our balance sheet reduction plan. Consider, for instance, a hypothetical scenario in which the FOMC has decided not to rely on balance sheet reduction to scale back accommodation, choosing instead to continue to reinvest indefinitely all principal payments from the Federal Reserve's securities holdings. If financial market participants perceived no change in the economic outlook and no intention on the part of the FOMC to alter the overall stance of monetary policy, the FOMC's inclination to leave the size of the balance sheet unchanged would be taken as an indication that the FOMC would instead rely more on increases in short-term interest rates to scale back accommodation, resulting in a faster pace of short-term interest rate hikes. On net, longer-term yields may be little affected by this hypothetical scenario: While the decreased emphasis on balance sheet reduction would depress term premiums and hold longer-term yields lower, the expected faster pace of short-term interest rate increases would push longer-term yields higher. As the financial crisis and Great Recession fade into the past and the stance of monetary policy gradually returns to normal, a natural question concerns the possible future role of the unconventional policy tools we deployed after the onset of the crisis. My colleagues on the FOMC and I believe that, whenever possible, influencing short-term interest rates by targeting the federal funds rate should be our primary tool. As I have already noted, we have a long track record using this tool to pursue our statutory goals. In contrast, we have much more limited experience with using our securities holdings for that purpose. Where does this assessment leave our unconventional policy tools? I believe their deployment should be considered again if our conventional tool reaches its limit--that is, when the federal funds rate has reached its effective lower bound and the U.S. economy still needs further monetary policy accommodation. Does this mean that it will take another Great Recession for our unconventional tools to be used again? Not necessarily. Recent studies suggest that the neutral level of the federal funds rate appears to be much lower than it was in previous decades. 
Indeed, most FOMC participants now assess the longer-run value of the neutral federal funds rate as only 2-3/4 percent or so, compared with around 4-1/4 percent just a few years ago. With a low neutral federal funds rate, there will typically be less scope for the FOMC to reduce short-term interest rates in response to an economic downturn, raising the possibility that we may need to resort again to enhanced forward rate guidance and asset purchases to provide needed accommodation. Of course, substantial uncertainty surrounds any estimates of the neutral level of short-term interest rates. In this regard, there is an important asymmetry to consider. If the neutral rate turns out to be significantly higher than we currently estimate, it is less likely that we will have to deploy our unconventional tools again. In contrast, if the neutral rate is as low as we estimate or even lower, we will be glad to have our unconventional tools in our toolkit. The bottom line is that we must recognize that our unconventional tools might have to be used again. If we are indeed living in a low-neutral-rate world, a significantly less severe economic downturn than the Great Recession might be sufficient to drive short-term interest rates back to their effective lower bound. Let me conclude with a brief summary. As a result of the Great Recession, the Federal Reserve has confronted two key challenges over the past several years: One, the FOMC had to provide additional policy accommodation after short-term interest rates reached their effective lower bound; and two, subsequently, as we made progress toward the achievement of our mandate, we had to start scaling back that accommodation in the presence of a vastly expanded Federal Reserve balance sheet. Today I highlighted two points about the FOMC's experience with those challenges. First, the monetary policy tools that the Federal Reserve deployed in the immediate aftermath of the crisis--explicit forward rate guidance, large-scale asset purchases, and the payment of interest on excess reserves--have helped us overcome these challenges. Second, in light of evidence suggesting that the neutral level of short-term interest rates is significantly lower than it was in previous decades, the likelihood that future monetary policymakers will have to confront those two challenges again is uncomfortably high. For this reason, we must keep our unconventional policy tools ready to be deployed again should short-term interest rates return to their effective lower bound.
r171102a_FOMC
united states
2017-11-02T00:00:00
Introductory Remarks
powell
1
Good morning. I am sorry that I am not able to be with you today at the FRBNY for this important meeting. I thought that I would begin by discussing the LIBOR-related events that have brought us here, and then talk about the way forward. LIBOR gained negative public attention when reports began to surface during the financial crisis that employees at some banks had attempted to manipulate the rate by altering the quotes they submitted for use in the calculation of LIBOR. A number of agencies, including the Financial Conduct Authority (FCA), took the lead in investigating and prosecuting the cases of LIBOR manipulation that were uncovered. The Federal Reserve also joined in international efforts to strengthen LIBOR. Among other things, we joined the ICE Benchmark Administration (IBA) Oversight Committee and worked intensively with international authorities and IBA in developing and encouraging the reforms set out in IBA's Roadmap for LIBOR. But at the same time, as we and other authorities collected data on the transactions underlying banks' submissions to LIBOR, we began to see that those transactions were relatively few and far between. As a result, many began to question whether LIBOR could be truly and permanently fixed. To be sure, much has been done to address the cases of attempted manipulation, and LIBOR has much stronger governance in place than it did before the crisis. The question instead was whether there were enough actual LIBOR transactions to form a stable basis for this critical rate. Since many of the pressures around LIBOR stem from the low level of underlying transactions, let me share some data regarding activity in U.S. dollar wholesale funding markets. Today, there are 17 banks that submit quotes in support of dollar LIBOR. Some have suggested requiring more banks to submit LIBOR data, but doing so would not materially improve the situation. The panels in Figure 1 show the distribution of daily aggregate wholesale dollar funding volumes for the 30 global systemically important banks (or GSIBs). The data here include all of the Eurodollar, federal funds, CD, and commercial paper transactions that the Federal Reserve has access to--the most complete picture of U.S. dollar unsecured funding that I am aware of. For one-month funding, shown in the top panel, the median daily volume of transactions by these banks since money-market reforms took effect last year was just over $1 billion. For three-month funding (the middle panel), the most heavily referenced LIBOR tenor, the median is less than $1 billion per day. On some days we see less than $100 million. If we compare this to the more than $100 trillion in outstanding volumes of U.S. dollar LIBOR contracts, it should be clear that the activity in this market is minuscule compared to the size of the contracts written on it. In our view, it would not be feasible to produce a robust, transaction-based rate constructed from the activity in wholesale unsecured funding markets. A transactions-based rate from this market would be fairly easy to manipulate given such a thin level of activity, and the rate itself would likely be quite volatile. Thus, LIBOR seems consigned to rely primarily on some form of expert judgment rather than direct transactions. As we discussed these issues with the officials in the United Kingdom who oversee and regulate LIBOR, we also became aware that they were receiving a steady stream of requests, and sometimes demands, from banks seeking to leave the LIBOR panels. 
The use of expert judgment in submissions allows LIBOR to be published every day, but many banks are now understandably uncomfortable with being asked to provide judgment about something that they do very little of. In his July speech, Andrew Bailey discussed the efforts of the FCA to keep these banks on the panels. Market participants should understand that the official sector has done everything it can to stabilize and strengthen LIBOR. Without the intervention of U.K. authorities, LIBOR would be in a weaker state today. But that balancing act has grown increasingly difficult. As time has passed, some banks have grown more resistant to public-sector entreaties to remain on the panels. As you know, one bank left the U.S. dollar panel last year. At the same time, we have had to confront the fact that, if banks could not be persuaded to voluntarily remain on panels, then the legal powers to compel them to do so were limited. Under European Union benchmark regulations, which LIBOR will soon be subject to, authorities can only compel submissions to a critical benchmark for a period of two years. Given this time limit, brokering a voluntary agreement with the submitting banks to stay on for a longer period was the last, best choice that authorities had available to guarantee some further period of stability for LIBOR. CFTC Chairman Chris Giancarlo and I have publicly supported the FCA's efforts to secure an agreement with the submitting banks to stay on through the end of 2021, and we have encouraged the U.S. banks that submit to LIBOR to cooperate with FCA's effort. Of course, LIBOR may remain viable well past 2021, but we do not think that market participants can safely assume that it will. Users of LIBOR must now take into account the risk that it may not always be published. While the public's understanding of this risk has increased significantly since Andrew Bailey's speech, the official sector has been concerned about it for some years, as reflected for example in our public comments and in the annual reports of the Financial Stability Oversight Council. Given our understanding of the risks to LIBOR, the Federal Reserve convened the Alternative Reference Rates Committee (ARRC) in cooperation with the Treasury Department and the CFTC. Consistent with recommendations from the Financial Stability Board (FSB), we charged the ARRC with identifying a robust alternative to U.S. dollar LIBOR and with developing a plan to encourage its use in some derivatives and other transactions as appropriate. The ARRC has accomplished the things that we asked of it. I want to thank the members for their work and also to extend my special thanks to Sandie O'Connor as the chair of the ARRC. Sandie will discuss the ARRC's work shortly, but I'll make a few points. First, I'd note that, like most market participants, the ARRC members initially had a difficult time conceiving of any kind of transition from LIBOR. As is the case for all of you, a transition will be a complicated task for the broker-dealers and other members of the ARRC, and will involve significant costs. Over time, however, I think the ARRC members have developed a greater understanding of the risks to LIBOR and now see that, despite those complications and costs, a transition may prove necessary. I also think that, having had time to consider the transition plans that Sandie will talk about, ARRC members have become more comfortable with the idea that a transition is feasible, even if the necessity of achieving it is regrettable. 
Second, it is clear that any rate the ARRC selected as a potential alternative needed to be highly robust. There would be no point in selecting a rate that might find itself quickly in the same kinds of conditions that LIBOR is in now. In our view, the ARRC has chosen the most robust rate available. In general, only overnight unsecured or secured funding markets appeared to have enough underlying transactions to produce a robust rate. The overnight Treasury repo market is the largest and most active market in any tenor of U.S. rates markets. Figure 2 shows that volumes in this market, at about $700 billion per day or more, are much larger than the volumes in overnight unsecured markets, even much larger than estimates of the volume in Treasury bills, and they dwarf the volumes in other term markets. The alternative reference rate needs to be able to stand the weight of having trillions of dollars written on it, and the ARRC has definitely met this standard in choosing SOFR, the Secured Overnight Financing Rate. Third, we charged the ARRC with devising plans for a voluntary transition that encouraged the use of their recommended rate where appropriate. We have never told anyone that they cannot use LIBOR. The ARRC did consider whether other cash products could move from LIBOR to the rates it evaluated, but their paced transition plan has focused on derivatives because that is where the largest gross exposures to LIBOR are, and because it may be easier for many derivatives transactions to move away from LIBOR to a new rate. Now, however, market participants have realized that they may need to more seriously consider transitioning other products away from LIBOR, and the ARRC has expanded its work to help ensure that this can be done in a coordinated way that avoids unnecessary disruptions. Sandie will discuss plans to eventually create a term reference rate, which may help to smooth any transition. That term reference rate would have to be built by first developing futures and OIS markets that reference SOFR. It will likely never be as robust as SOFR itself, and so derivatives transactions will almost certainly need to be based on the overnight rate, but a term reference rate could conceivably be used in some loan or other contracts that currently reference LIBOR. All of the work that we will talk about today will help, but we have to acknowledge that the transition will be complicated. Unfortunately, as I have discussed, we cannot guarantee that it won't be necessary. The most complicated aspects involve the legacy contracts that reference LIBOR, many of which do not have strong language in place if LIBOR were to stop publication. The International Swaps and Derivatives Association (ISDA) has been working to devise better language for derivatives, and ISDA's Scott O'Malia and Katherine Tew Darras will discuss that work later this morning. As people now consider the risks around LIBOR for other types of contracts, they will need to go through their documentation to understand what the fallback language is and how it can be improved. We will also discuss those issues today. This is important work, both for the parties to these contracts and for our financial stability. While there may be no perfect contract language or fallback, good risk management requires that we work together to find language and fallbacks that are robust and that limit unintended valuation changes. So, while much work has been done, there is more still to do. I have been heartened in seeing that many market participants are already confronting these issues. For some, Andrew Bailey's speech was a difficult wake-up call. 
But the efforts we have undertaken with the ARRC show what is possible when the official sector works collectively with market participants. If market participants are willing to continue to work together, then we can safely achieve the transitions needed to create a better and more robust system that will help to ensure our ongoing financial stability.
r171107a_FOMC
united states
2017-11-07T00:00:00
Remarks accepting the 2017 Paul H. Douglas Award for Ethics in Government
yellen
1
Thank you for the honor of sharing this award with such a worthy person, my friend and former colleague Ben Bernanke, and also for the honor of being associated with the exemplary life and legacy of Paul Douglas. Sen. Douglas's contributions to ethics in government are an important aspect of that legacy, but since Ben and I are the first economists to win this award, I would also like to give due credit to Professor Douglas for his contributions to economics. Working with the mathematician Charles Cobb, Douglas gathered data and created a statistical model that advanced economists' understanding of the relative contributions to production made by capital and labor. This work demonstrated many of the methods that economists would come to use and continue to use to this day. Douglas constructed empirical measures of key economic concepts. He then employed what were, for the time, advanced statistical techniques to analyze these data, thereby shedding light on a basic economic relationship. As one recent commentator put it, work such as Douglas's was part of "a growing literature . . . [that] played an important role in shaping the approach to combining statistical methods and economic theory that would become the standard econometric practice in the later decades of the 20th century." But, of course, Paul Douglas was much more than an economist. From an early age, he always seemed to be the man in the middle, seen by the two sides of a dispute as intelligent, knowledgeable, and impartial enough to be trusted to seek a solution. He mediated labor disputes and advised local government officials in Illinois. He also helped governors in Pennsylvania and New York develop what became Social Security, unemployment insurance, and the idea of publicly owned power utilities. After losing his first election for the U.S. Senate in 1942, at the age of 49 he enlisted in the U.S. Marines and was awarded a Bronze Star and Purple Heart for his service in the Pacific. Later, after he was elected to the Senate, Douglas was a champion of consumer protection laws, including the 1968 Truth in Lending Act, and was a leading supporter of civil rights legislation. One of Paul Douglas's most important achievements in public life was to promote ethics in government. He was raised with a strong sense of right and wrong and was heavily influenced by the philosopher John Stuart Mill to believe that ethical behavior was also an eminently practical approach to life. As a college professor, he entered public life in Chicago in the 1920s somewhat reluctantly to fight the spectacular political and commercial corruption that was strangling the city. Once elected to office, he faced the challenge of how to act ethically when corruption was so universal that the city's aldermen were expected to funnel large sums to their constituents, based on the presumption that all politicians were "on the take." If he refused bribes and then had no money to hand out, how would he convince people that he wasn't simply refusing to share the proceeds? The key was transparency and communication. Everyone who asked for a large payment was handed a mimeographed statement titled "Please Help Me to Be an Honest Alderman." At first, the public, inured to corruption, either disbelieved him or thought him a sucker for refusing to take bribes. But over time, Douglas's stand changed both the demands made by supplicants and his constituents' expectations for ethical behavior by a government official. 
Similarly, in 1939, Paul Douglas was one of the first public officials in America to publish a full accounting of his personal finances. In his memoir, Douglas admits that this and other ethical rules he imposed on himself were often a nuisance and, in some cases, probably went further than needed to demonstrate his honesty. But he believed that the public's trust was so fundamental to the effectiveness of government that such steps were appropriate. As a senator, he continued to publish his finances and observed strict limits on the value of personal gifts he received from those seeking benefits from him as a public official. He strongly believed that these gift limits also made him more effective as a public servant, and he campaigned for legislation that ultimately led to the widespread requirement of financial disclosure and limits on accepting gifts. I share Paul Douglas's view that behaving ethically is both the right thing to do and, in practical terms, helps maintain the trust the public places in those who act on its behalf. The Federal Reserve's very effectiveness in setting monetary policy depends on the public's assured confidence that we act only in its interest. We must act ethically, and we must demonstrate our ethical standards in ways that leave little room for doubt. I am grateful to share this award with Ben, and I am even more grateful for the example Paul Douglas set for ethics in government that has guided countless public officials since his day and also shaped the public's expectation that its leaders will put the public's interests first.
r171115a_FOMC
united states
2017-11-15T00:00:00
Regional Food Systems and Community Development
brainard
0
Thank you for inviting me. I appreciate the opportunity to listen and learn about the important work that you do. One of the Federal Reserve's responsibilities is to understand how communities across America are experiencing the economy. That is why Congress established a network of Federal Reserve Banks and Branches across the country. Our local presence gives us valuable opportunities to engage with communities across a broad spectrum--from those who are thriving to those who are confronting challenges, and from inner-city neighborhoods to rural towns. One of the lessons that our community engagement has taught us is that there is an important connection between the strength of regional food systems and community health. Today's meeting is one in a series that Federal Reserve Banks are holding across the country to talk about local efforts to support regional food systems, and how such efforts can advance communities' goals. These meetings build on the Federal Reserve Board's release last August of a publication that explores the community and economic development potential of investing in regional food systems. Through this collaborative research effort, we learned several important lessons about what works to strengthen regional food systems. First, we learned that appropriately tailored investments in regional food systems have the potential to support the creation of new jobs and small businesses in local communities, as well as to improve farm profitability and financial resilience. Second, we learned that in order to take advantage of new business opportunities in the regional food sector, entrepreneurs need access to capital, specialized knowledge, and general business skills. Unfortunately, one or more of these is often missing from historically marginalized communities. We also learned that organizations across the country are filling these gaps and empowering communities to take advantage of opportunities. I understand that many of you in this room are engaged in this important work; for that I thank you and hope that this meeting will help advance your efforts. As a result of an intentional focus on equity and inclusiveness, these investments can create new access to economic opportunity for segments of communities that have often faced challenges, such as people of color, recent immigrants, the formerly homeless, and the previously incarcerated. Lastly, we learned that no organization has all of the resources or expertise necessary to effectively carry out this work alone. As such, long-term partnerships and collaboration are necessary for success in implementing regional food strategies. Because of the critical role of multi-sectoral partnerships, I am heartened to see so many of you here today to talk about the current state of regional food system investment in New England, to build new relationships, and to strengthen existing ones. It is this ongoing commitment to working together that will carry you through to the next stage of regional food systems development. Later today, I will be meeting with several local organizations that not only bring important resources and expertise to their communities, but also exemplify this important dedication to partnership and cross-sector collaboration. The Urban Farming Institute of Boston demonstrates dedication to cross-sector collaboration through their Farmer Training Program, many of whose graduates go on to work for or start up regional food enterprises. 
They also advance the broader regional food system by co-hosting an annual conference that brings together farmers, policymakers, investors, and other stakeholders. Likewise, CommonWealth Kitchen has emerged as an important hub where farmers, entrepreneurs, universities, investors, and other stakeholders come together in ways that not only advance regional food systems, but also increase access to economic opportunity for people impacted by racial, social, or economic inequality. In this way, CommonWealth Kitchen is dedicated to addressing Greater Boston's growing wealth divide by promoting inclusive entrepreneurship through an integrated approach that links education, training, and manufacturing to a strong network of industry partners, including anchor institutions. These are just two examples of the great work going on in New England, and I look forward to visiting both organizations later today. This morning, I am eager to learn about the work of the many organizations represented here, especially the barriers and opportunities you face when trying to invest in this sector, and how your investments are linking more families and communities to meaningful economic opportunities. Attending events like this and visiting communities around the country provide me with opportunities to speak with families, farmers, small business owners, investors, bankers, and other community members about their experiences in the economy. These conversations help me to develop a granular and very human understanding of the economy that, when combined with the information provided by our traditional research and data collection efforts, is an important consideration informing judgments about policy. A community-level understanding of the economy is especially important in today's economy, where we see welcome strength in the aggregate statistics coexisting side-by-side with important disparities at the community and family levels. In addition to disparities based on the community where a family lives and significant and persistent racial disparities, I have been struck while traveling around the country by the widening gulf between the economic fortunes of our large metropolitan areas and those of our small cities, towns, and rural areas. The statistics bear this out: the convergence in income across regions of the country has slowed dramatically over recent decades. Much of the gain in employment, income, and wealth since the end of the recession--and more broadly over the past few decades--has accrued to workers and families in larger cities. If some workers and families find it difficult to move, this concentration of economic opportunities in larger cities may have adverse implications for the well-being of these households and, potentially, the economy overall. In my visits, I have also been heartened by the efforts of local partners to address these disparities and improve their communities, including those aimed at capitalizing on local food-based assets to advance economic opportunity and address food insecurity. For instance, in El Paso, I visited with several vibrant community organizations that were running community gardens, local nutrition and farming educational outreach programs, a commercial kitchen, and a food pantry to improve nutrition and access to fresh food in an area that lacked full-scale grocery stores. 
In the Mississippi Delta, I met with people involved in an interesting collaboration between a local entrepreneur, a community development financial institution, the engineering department of a local community college, local farmers, and local food organizations--the aim of which was to produce biofuels from food waste. Of course, pockets of both opportunity and persistent poverty are found in large metro and rural areas alike. In fact, a recent report found Boston to have one of the highest rates of income inequality among the 100 largest metropolitan areas in the United States, despite the overall strength of Boston's economy. Findings like this remind us that not all communities are well positioned to access the opportunities available in the economy, even those in their own backyard. Our research suggests there are things that can be done to improve the likelihood that an area will be a community of opportunity. Some localities have fared better than others in this respect, and their successes can provide us with actionable lessons. For instance, the Federal Reserve Bank of Boston undertook an in-depth study of 25 medium-sized cities nationwide that had experienced a post-industrial decline and identified 10 that experienced an economic resurgence. The study found that the critical determinant of success was the ability of leaders in those cities to collaborate across sectors around a long-term vision for revitalization. To encourage such collaboration, the Boston Fed has facilitated Working Cities Challenges in Massachusetts, Rhode Island, and Connecticut that reward effective public-private collaboration to reach communitywide goals. For instance, one of the winning cities, Lawrence, Massachusetts, set goals of increasing the income of parents with children in the Lawrence Public School system by 15 percent, dramatically increasing parental engagement in the schools, and tracking the impact of these efforts on student achievement. A cross-section of partners from the public, private, nonprofit, and philanthropic sectors sought to achieve this and, to date, their efforts have placed over 200 parents in new positions paying 25 percent higher wages on average than their previous jobs, with 200 more parents in the training pipeline. Learning about what works reinforces for me the importance of events like this one: events that bring together different stakeholders to talk about the future of their communities and how they can work together to advance common goals. Thank you for being our valued partners in this important work.
r171116a_FOMC
united states
2017-11-16T00:00:00
Where Do Consumers Fit in the Fintech Stack?
brainard
0
The new generation of fintech tools offers the potential to help consumers manage their increasingly complicated financial lives, but also poses risks that will need to be managed as the marketplace matures. In many ways, the new generation of fintech tools can be seen as the financial equivalent of an autopilot. The powerful new fintech tools represent the convergence of numerous advances in research and technology--ranging from new insights into consumer decisionmaking to a revolution in available data, cloud computing, and artificial intelligence (AI). They operate by guiding consumers through complex decisions by offering new ways of looking at a consumer's overall financial picture or simplifying choices, for example with behavioral nudges. As consumers start to rely on financial autopilots, however, it is important that they remain in the driver's seat and have a good handle on what is happening under the hood. Consumers need to know and decide who they are contracting with, what data of theirs is being used by whom and for what purpose, how to revoke data access and delete stored data, and how to seek relief if things go wrong. In short, consumers should remain in control of the data they provide. In addition, consumers should receive clear disclosure of the factors that are reflected in the recommendations they receive. If these issues can be appropriately addressed, the new fintech capabilities have enormous potential to deliver analytically grounded financial services and simplified choices, tailored to the consumers' needs and preferences, and accessible via their smartphones. When the first major "credit card," the Diner's Club Card, was introduced in 1949, consumers could only use the cardboard card at restaurants and, importantly, only if they paid the entire amount due each month. Today, the average cardholder has about four credit cards, and the Federal Reserve Bank of New York estimates that American consumers collectively carry $785 billion in credit card debt. When signing up for a credit card, consumers face a bewildering array of choices. Half of consumers report that they select new cards based on reward programs, weighing "cash back" offers against "points" with their credit card provider that may convert into airline or hotel "miles," which may have varying values depending on how they are redeemed. In some cases, rewards may apply to specific spending categories that rotate by quarter and require that consumers re-register each term, and the rewards may expire or be forfeited under complicated terms. In some cases, the choices may be confusing. Let's take the example of zero percent interest credit card promotions. A consumer may choose a zero percent interest credit card promotion and expect to pay no interest on balances during a promotional period, after which any balances are assessed at a higher rate of interest going forward. But if a consumer instead chooses a zero percent interest private-label credit card with deferred interest and has a positive balance when the promotional period expires, interest could be retroactively assessed for the full time they held a balance during the promotional period. Even sophisticated consumers could be excused for confusing these products. As it turns out, it is often the most vulnerable consumers who have to navigate the most complicated products. 
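A hypothetical, simplified calculation makes the distinction concrete. The dollar amounts and the 25 percent rate below are illustrative assumptions, and actual issuers compute retroactive interest on daily balances under the terms of the specific agreement, but the sketch shows why the two promotions can look identical for twelve months and then diverge sharply at the end.

```python
# Hypothetical and simplified: comparing a standard zero percent promotion
# with a deferred-interest promotion when a small balance remains at the end
# of a 12-month promotional period. Actual issuers compute retroactive
# interest on daily balances under the terms of the specific agreement.

purchase = 1_000.00   # charged on day one of the promotion
remaining = 50.00     # balance still unpaid when the promotion ends
apr = 0.25            # annual rate that applies after the promotional period

# Standard promotion: interest starts accruing only on the leftover balance.
standard_first_month = remaining * apr / 12

# Deferred-interest promotion: interest is assessed retroactively on the
# balance carried during the whole promotional period (approximated here by
# the average balance at a simple monthly rate for 12 months).
average_balance = (purchase + remaining) / 2
deferred_retroactive = average_balance * (apr / 12) * 12

print(f"Standard promo, first month of interest:     ${standard_first_month:.2f}")
print(f"Deferred-interest promo, retroactive charge: ${deferred_retroactive:.2f}")
```

With these hypothetical numbers, the standard promotion would charge about a dollar in the first month after the promotion ends, while the deferred-interest promotion would assess roughly $130 all at once.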
For instance, one recent study of the credit card market found that the average length of agreements for products offered to subprime consumers was 70 percent longer than agreements for other products. The complexity multiplies when we go beyond credit cards and consider other dimensions of consumers' financial lives. One survey found that nearly a quarter of the Americans that don't maintain bank accounts are concerned that bank fees are too unpredictable. Even though mortgage debt is over two-thirds of household debt, nearly half of consumers don't comparison shop before taking out a mortgage. Student loans now make up 11 percent of total household debt, more than twice their share before the financial crisis. Over 11 percent of student debt is more than 90 days delinquent or in default--and researchers at the Federal Reserve Bank of New York estimate that this figure may understate the problem by as much as half. Today, consumers navigate numerous weighty financial responsibilities for themselves and their dependents. It seems fair to assume they could use some help managing this complexity. In the Federal Reserve's Survey of Household Economics and Decisionmaking (SHED), more than half of respondents reported that their spending exceeded their income in the prior year. Indeed, 44 percent of SHED respondents reported that they could not cover an emergency expense costing $400 without selling something or borrowing money. Given the complexity and importance of these decisions, it is encouraging to see the fast-growing development of advanced, technology-enabled tools to help consumers navigate the complex issues in their financial lives. These tools build on important advances in our understanding of consumer financial behavior and the applications, or "app," ecosystem. Researchers have invested decades of work exploring how consumers actually make decisions. We all tend to use shortcuts to simplify financial decisions, and it turns out many of these can prove faulty, particularly when dealing with complex problems. For example, empirical evidence consistently shows that consumers overvalue the present and undervalue the future. Researchers have documented that consumers make better savings decisions when they are presented with fewer options. They have shown the importance of "anchoring" bias--the tendency to place disproportionate weight on the first piece of information presented. This bias can lead consumers either to make poor financial choices or instead to tip the scales in favor of beneficial choices, as with automatic savings defaults. The same nudge can help in the right circumstances or instead backfire in surprising ways. These behavioral insights are especially powerful when paired with the remarkable advances we have seen in the technological tools available to the average consumer, especially through their smartphones. The vast majority of the U.S. adult population has a mobile phone, and most of those phones are smartphones. Smartphone use is prevalent even among the unbanked and underbanked populations. Survey evidence suggests we are three times more likely to reach for our phone than our significant other when we first wake up in the morning. Some evidence suggests that smartphones are already helping consumers make better financial decisions. The Board's 2016 survey of consumers and mobile financial services (SCMF) found that 62 percent of mobile banking users checked their account balances on their phones before making a large purchase, and half of those that did so decided not to purchase an item as a result. 
In addition, 41 percent of smartphone owners checked product reviews or searched product information online while shopping in a retail store, and 79 percent of those respondents reported changing their purchase decision based on the information they accessed on their smartphone. And those use cases just scratch the surface of what is possible. First of all, the smartphone platform has become a launch pad for a whole ecosystem of apps created by outside developers for a wide variety of services, including helping consumers manage their financial lives. Second, the smartphone ecosystem puts the enormous computing power of the cloud at the fingertips of consumers. Interfacing with smartphone platforms and other apps, outside developers can tap the computing power of the leading cloud computing providers in building their apps. Importantly, cloud computing offers not only the power to process and store data, but also powerful algorithms to make sense of it. Due to an early commitment to open-source principles, app developers have open access to many of the same machine-learning and artificial intelligence tools that power the world's largest internet companies. Further, the major cloud computing providers have now taken these free building blocks and created different machine-learning and artificial intelligence stacks on their cloud platforms. A developer that wants to incorporate artificial intelligence into their financial management app can access off-the-shelf models from cloud computing providers, potentially getting to market faster than by taking the traditional route of finding training data and building out models in-house from scratch. Third, fintech developers can also draw from enormous pools of data that were previously unavailable outside of banking institutions. Consumer financial data are increasingly available to developers via a new breed of business-to-business suppliers, called data aggregators. These companies enable outside developers to access consumer account and transactional information typically stored by banks. But aggregators do more than just provide access to raw data. They facilitate its use by developers, by cleaning the data, standardizing it across institutions, and offering their own application programming interfaces for easy integration. Further, similar to cloud computing providers, data aggregators are also beginning to provide off-the-shelf product stacks on their own platforms. This means that developers can quickly and easily incorporate product features, such as predicting creditworthiness, determining how much a consumer can save each month, or creating alerts for potential overdraft charges. Researchers have documented the benefits of tailored one-on-one financial coaching. Until recently, though, it has been hard to deliver that kind of service affordably and at scale, due to differences in consumers' circumstances. Let's again consider the example of deferred interest credit cards. It turns out only a small minority of consumers miss the deadlines for repaying promotional balances and are charged retroactive interest payments, and they typically have deep subprime scores. Similarly, for consumers that opt into overdraft products on their checking accounts, 8 percent of consumers pay 75 percent of the fees. Up until now, it has been hard for consumers to understand those odds and objectively assess whether they are likely to be in the group of customers that will face challenges with a particular financial product. 
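One of the off-the-shelf features mentioned above, alerts for potential overdraft charges, can be sketched in a few lines. The data format, function, and cushion below are hypothetical assumptions rather than any particular aggregator's actual interface; a real product would also need to handle pending transactions, scheduled bills, and multiple accounts.

```python
# Hypothetical sketch of an overdraft-alert feature of the kind described
# above. The data format, function, and cushion are assumptions, not any
# particular aggregator's actual interface; a real product would also handle
# pending transactions, scheduled bills, and multiple accounts.

from datetime import date

# Transactions as an aggregator might return them after cleaning and
# standardizing across institutions: (date, description, amount).
transactions = [
    (date(2017, 11, 1), "Payroll deposit", 1500.00),
    (date(2017, 11, 3), "Rent", -1200.00),
    (date(2017, 11, 10), "Groceries", -180.00),
    (date(2017, 11, 12), "Streaming subscription", -15.00),
]

def overdraft_alert(starting_balance, transactions, upcoming_charges, cushion=50.0):
    """Warn if known upcoming charges would push the projected balance below a cushion."""
    balance = starting_balance + sum(amount for _, _, amount in transactions)
    projected = balance - sum(upcoming_charges)
    if projected < cushion:
        return f"Heads up: a projected balance of ${projected:.2f} could mean an overdraft."
    return None

alert = overdraft_alert(starting_balance=40.00, transactions=transactions,
                        upcoming_charges=[120.00])
if alert:
    print(alert)
```

With the figures above, the projected balance of $25 falls below the $50 cushion, so the consumer would be alerted before the charge hits.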
The convergence of smartphone ubiquity, cloud computing, data aggregation, and off-the-shelf AI products offers the potential to make tailored financial advice scalable. For instance, a fintech developer could pair historical data about how different types of consumers fare with a specific product, on the one hand, with a consumer's particular financial profile, on the other hand, to make a prediction about how that consumer is likely to fare with the product. Since the early days of internet commerce, developers have tried to move beyond simple price comparison tools to offer tailored "agents" for consumers that can recommend products based on analyses of individual behavior and preferences. Today, a new generation of personal financial management tools seems poised to make that leap. When a consumer wishes to select a new financial product, he or she can now solicit options from a number of websites and mobile apps. These new comparison sites can walk the consumer through a wide array of financial products, offering to compare features like rewards, fees, and rates, or tailoring to a consumer's stated goals. Some fintech advisors ask consumers to provide access to their bank accounts, retirement accounts, college savings accounts, and other investment platforms in order to enable a fintech advisor to offer a consumer a single, nearly complete picture of his or her balances and cash flows across different institutions. In reviewing the advertising, terms and conditions, and apps of an array of fintech advisors, it appears that many of these tools offer advanced data analysis, machine learning, and even artificial intelligence to help consumers cut down on unnecessary spending, set aside money for savings, and use healthy nudges to improve their financial decisions. For instance, a fintech advisor may help a consumer automate savings "rules," like rounding up charges and putting the difference into savings, enabling these small balances to accumulate over time or setting a small amount of money aside every time a consumer spends money on little splurges. The early stages of innovation inevitably feature a lot of learning from trial and error. Fortunately, as the fintech ecosystem advances, there are useful experiences and good practices to draw upon from the evolution of the commercial internet. To begin with, one internet adage is that if a product is free, "you are the product." In this vein, fintech advisors frequently offer free services to consumers and earn their revenue from the credit cards and other financial products that they recommend through lead generation. Of course, many fintech advisors are not lead generators. Some companies offer fee-for-service models, with consumers paying a monthly fee for the product. Other companies are paid by employers, who then provide the products free of charge to their employees as an employee benefit. In these cases, they likely have quite different business models. But for those services that do act as lead generators, there are important considerations about whether and how best to communicate information to the consumer about the nature of the recommendations being made. For instance, according to some reports, fintech advisors can make between $100 and $700 in lead generation fees for every customer that signs up for a credit card they recommend. In many cases, a fintech advisor may describe their service as providing tailored advice or making recommendations as they would to friends and family. 
In such cases, a consumer might not know whether the order in which products are presented by a fintech assistant is based on the product's alignment with his or her needs or different considerations. Different fintech advisors may order the lists they show consumers using different criteria. A product may be at the top of the advisor's recommendations because the sponsoring company has paid the advisor to list it at the top, or the sponsoring company may pay the fintech assistant a high fee, contingent upon the consumer signing up for the product. Alternatively, a fintech advisor may change the order of the loan offers or credit cards based on the likelihood that the consumer will be approved. Moreover, in some cases, the absence of lead generation fees for a particular product may impact whether that product is on the list shown to consumers at all. There appears to be a wide variety of practices regarding the prominence and placement of advertising and other disclosures relative to the advice and recommendations such firms provide. Overall, fintech assistants have increasingly improved the disclosures that explain to consumers how they get paid, but this is still a work in progress. The good news is that these challenges are not new. The experience with internet search engines outside of financial products, such as Google, Bing, and Yahoo!, as well as with other product comparison sites, such as Travelocity and Yelp, may provide useful guidance. As consumers and businesses have adapted to the internet, we have, collectively, adopted norms and standards for how we can expect search and recommendation engines to operate. In particular, we generally expect that search results will be included and ranked based on what's organically most responsive to the search--unless it is clearly labeled otherwise. Accordingly, when we search for a product, we now know to look for visual cues that identify paid search results, usually in the form of a text label like "Sponsored" or "Ad", different formatting, and visually separating advertising from natural search results. Even when an endorsement is made in a brief Twitter update, we now expect disclosures to be clear and conspicuous. As fintech advisors evolve to engage consumers in new ways, disclosure methodologies will no doubt be expected to adapt as well. For instance, some personal financial management tools now interact with consumers via text message. If consumers move to a world in which most of their interactions with their advisors occur via text-messaging "chatbots"--or voice communication--I am hopeful that industry, regulators, consumers, and other stakeholders will work together to adapt the norms to distinguish between advice and sponsored recommendations. While the lead generation revenue model presents some familiar issues that are readily apparent, under the hood, fintech relationships raise even more complex issues for consumers in knowing who they are providing their data to, how their data will be used, for how long, and what to expect in the case of a breach or fraud. Let me briefly touch on each issue in turn. Often, when a consumer signs up with a fintech advisor or other fintech app, they are asked to log into their bank account in order to link the fintech app with their bank account data. In reviewing apps' enrollment processes, it appears that consumers are often shown log-in screens featuring bank logos and branding, prompting consumers to enter their online banking logins and passwords. 
In many cases, the apps note that they do not store the consumers' banking credentials. When the consumer logs on, he or she is often not interfacing with a bank's computer systems, but rather, providing the bank account login and password to a data aggregator that provides services to the fintech app. In many cases, the data aggregator may store the password and login and then use those credentials to periodically log into the consumer's bank account and copy available data, ranging from transaction data, to account numbers, to personally identifiable information. In other cases, things work differently under the hood. Some banks and data aggregators have agreed to work together to facilitate the ability to share data with outside developers in authorized ways. These agreements may delineate what types of data will be shared, and authorization credentials may be tokenized so that passwords are never stored by the aggregator. It is often hard for the consumer to know what is actually happening under the hood of the financial app they are accessing. In most cases, the log-in process does not do much to educate the consumer on the precise nature of the data relationship. The screen-scraping log-in process usually displays the bank's logo and branding but infrequently shows the logo or name of the data aggregator. In reviewing many apps, it appears that the name of the data aggregator is frequently not disclosed in the fintech app's terms and conditions, and a consumer generally would not easily see what data is held by a data aggregator or how it is used. The apps, websites, and terms and conditions of fintech advisors and data aggregators often do not explain how frequently data aggregators will access a consumer's data or how long they will store that data. This is a relatively young field, but one that is growing fast, and myriad questions about the consumer's ability to opt out of collection and to control stored data will need to be addressed appropriately. In examining the terms and conditions for a number of fintech apps, it appears that consumers are rarely provided information explaining how they can terminate the collection and storage of their data. For instance, when a consumer deletes a fintech app from his or her phone, it is not clear this would guarantee that a data aggregator would delete the consumer's bank login and password, nor discontinue accessing transaction information. If a consumer severs the data access, for instance by changing banks or bank account passwords, it is also not clear how he or she can instruct the data aggregator to delete the information that has already been collected. Given that data aggregators often don't have consumer interfaces, consumers may be left to find an email address for the data aggregator, send in a deletion request, and hope for the best. If things go wrong, consumers may have limited remedies. In reviewing terms, it appears that many fintech advisors include contractual waivers that purport to limit consumers' ability to seek redress from the advisor or an underlying data aggregator. In some cases, the terms and conditions assert that the fintech developer and its third-party service providers will not be liable to consumers for the performance of or inability to use the services. It is not uncommon to see terms and conditions that limit the fintech advisor's liability to the consumer to $100. 
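A minimal, hypothetical sketch may help clarify what each of the two data-access models described above has to store. The class names and fields are illustrative assumptions, not any specific bank's or aggregator's design.

```python
# Hypothetical sketch contrasting the two access models described above; the
# class names and fields are illustrative, not any bank's or aggregator's design.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ScreenScrapingLink:
    """Credential-based access: the aggregator stores the consumer's own bank
    login and password and replays them to log in and copy available data."""
    bank_username: str
    bank_password: str   # the consumer's actual credentials, held by the aggregator
    # Ending access typically requires the consumer to change the bank password.

@dataclass
class TokenizedLink:
    """Permissioned access: the bank issues a scoped, revocable token, so the
    consumer's password is never stored by the aggregator."""
    access_token: str                  # issued by the bank, not the consumer's password
    permitted_data: Tuple[str, ...]    # e.g., ("balances", "transactions")
    expires_on: Optional[str] = None   # tokens can be time-limited and revoked

# Example: the tokenized link exposes only what the consumer agreed to share.
link = TokenizedLink(access_token="tok_example", permitted_data=("balances", "transactions"))
print(link.permitted_data)
```

The practical difference is that, in the tokenized model, access can be narrowed or revoked at the bank without the consumer having to change a password.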
Under regulations such as Regulation E, consumers have had protections to mitigate their losses in the event of erroneous or fraudulent transactions that would otherwise impact their credit and debit cards, such as data breaches. Those protections are not absolute, however. In particular, if a consumer gives another person an "access device" to their account and grants them authority to make transfers, then the consumer is "fully liable" for transfers made by that person, even if that person exceeds his or her authority, until the consumer notifies the bank. As the industry matures, the various stakeholders will need to develop a shared understanding of who bears responsibility in the event of a breach. So what can be done to make sure consumers have the requisite information and control to remain squarely in the driver's seat? Establishing and implementing new norms is in the shared interest of all of the participants in the fintech stack. For instance, in the case of credit cards, mortgages, and many other products, it is often banks or parties closely affiliated with banks that pay fees to fintech advisors to generate leads for their products, pursuant to a contract. Through these contractual relationships with fintech advisors, banks have considerable influence in the lead generation relationship, including through provisions describing how a sponsored product should be described or displayed. Banks have a stake in ensuring that their vendors and third-party service providers act appropriately, that consumers are protected and treated fairly, and that the banks' reputations aren't exposed to unnecessary risk. Likewise, some of the leading speech-only financial products are currently credit card and bank products. Accordingly, banks have incentives to invest in innovating the way they disclose information to consumers, as they also invest in new ways of interacting with them. As for consumers' relationships with data aggregators, there's an increasing recognition that consumers need better information about the terms of their relationships with aggregators, more control over what is shared, and the ability to terminate the relationship. We have spoken to data aggregators who recognize the importance of finding solutions to many of the complex issues involved with the important work of unlocking the potential of the banking stack to developers. And while there are some difficult issues in this space, other issues seem relatively straightforward. It shouldn't be hard for a consumer to be informed who they are providing their credentials to. Consumers should have relatively simple means of being able to consent to what data are being shared and at what frequency. And consumers should be able to stop data sharing and request the deletion of data that have been stored. Responsibility for establishing appropriate norms in the data aggregation space should be shared, with banks, data aggregators, fintech developers, consumers, and regulators all having a role. Banks and data aggregators are negotiating new relationships to determine how they can work together to provide consumers access to their data, while also ensuring that the process is secure and leaves consumers in the driver's seat. In many cases, banks themselves were the original customers of data aggregators, and many continue to use these services. According to public filings, more than half of the 20 largest banks are customers of data aggregators. 
As customers of data aggregation services themselves, banks have an opportunity to ensure that the terms of data provision protect consumers' data and that those data are handled appropriately. Regulators also recognize that there may be opportunities to provide more clarity about how expectations for third-party risk management apply in this sector, as well as in other areas experiencing significant technological change. Through external outreach and internal analysis, we are working to determine how best to encourage socially beneficial innovation in the marketplace, while ensuring that consumers' interests are protected. We recognize the importance of working together and the potential to draw upon existing policies, norms, and principles from other spaces. Consumers may not fully understand the differences in regulations across financial products or types of financial institutions, or whether the rules change when they move from familiar search and e-commerce platforms to the fintech stack. Consumers, as well as the market as a whole, will benefit if regulators coordinate to provide more unified messages and support the development of standards that serve as a natural extension of the common-sense norms that consumers have come to expect in other areas of the commercial internet. The combination of technologies that put vast computing power, rich data sets, and artificial intelligence onto simple smartphone apps, together with important research into consumer financial behaviors, has great potential to help consumers navigate their complex financial lives more effectively, but there are also important risks. I am hopeful that fintech developers, data aggregators, bank partners, consumers, and regulators will work together to keep consumers in the driver's seat as we move forward with these new technologies. If we work together effectively toward this goal, the fintech stack may be able to offer enormous benefits to the consumers it aims to serve, while appropriately identifying and managing the risks.
r171130a_FOMC
united states
2017-11-30T00:00:00
Thoughts on Prudent Innovation in the Payment System
quarles
0
It is a pleasure to be with you today to talk about financial stability and fintech. Being in the Treasury's beautifully restored Cash Room calls to mind the themes of both history and finance. History, because the room was constructed just after the Civil War. Finance, because it was designed to be the Treasury's bank and was originally used to conduct daily banking and payment transactions with other banks and the general public. Of course the functionality of this room has changed over the decades as the financial system landscape has evolved, and today we are holding a conference here to discuss financial innovation. Before I begin, I will note that my colleague, Jay Powell, was supposed to be your speaker today. However, as you can likely surmise, he had to attend to other matters and asked me to speak in his place. So to carry through the theme of today's topic, one might call Jay the innovator on payments issues at the Federal Reserve, as he has spoken extensively on the subject for many years. And one might think of me as tech support on these issues for the day. New technologies have brought tremendous, positive change to our lives, raising productivity and living standards and contributing to economic growth. In the past few years, innovation has profoundly transformed industries such as retail shopping, the media, and even transportation by providing greater speed, convenience, and competition. Not surprisingly, both the banking industry and technology firms have also been seeking innovations in financial services that mirror and complement changes that have been made in other industries. Innovation is coming to finance with changes to consumer lending, financial advice, and retail payments, to name a few. The Federal Reserve, itself, is engaged in a multiyear effort to address challenges and opportunities in the current payment system. During my experience in the realm of private equity, I had a chance to interact with many new firms in these areas. The pace of innovation was often dizzying. In my new role as Vice Chairman for Supervision at the Federal Reserve, I see innovation as something that can and should be fostered, but of course I must also scrutinize these innovations from a different perspective. That is to say, it is appropriate not only to evaluate the potential of innovations to improve on existing services, but also to judge their ramifications for the safety and soundness of the institutions we supervise and for financial stability--the topic of this conference. Although many of these technologies are still nascent, it is important to have an eye on the potential financial stability implications both in the short- and long-run. Payment systems need to be resilient during adversity. Without that resilience, we could face a sudden loss of public confidence and the seizing up of systems and critical activities. With the Cash Room's perspective on history and daily financial operations in mind, I would like to concentrate my remarks on the U.S. payment system, which is a critical foundation on which financial transactions and the conduct of business take place. New technologies are being proposed that could alter the design of our payment system. Today, I will talk about the necessary trust and confidence that the system requires, the tension between the need for financial stability and the need to innovate, and the challenges that digital currencies, in particular, present relative to the current system. 
These considerations highlight the need for a prudent approach to innovation in payment systems. Payment systems are both financial networks and technical networks. In simple terms, there is an asset that functions as money that can be transferred by households and businesses to buy goods and services, which, along with their financial institutions, make up the bulk of the financial network. Today, the predominant forms of money used in payment systems are Federal Reserve notes and reserve balances as well as transaction balances at depository institutions. Payment systems also require a technical network to hold and transfer money. In earlier times, the network was less technical in nature and almost exclusively designed for the logistical storage and transfer of physical forms of money. Today, the main payments networks use centralized technology to process and safeguard the public's electronic funds transfers. Regulated banking institutions provide deposit money to the public and are a main source of trust for these systems. Transfers of balances on the books of these institutions are at the center of the public's transactions, with the Federal Reserve Banks playing a central, supporting role in interbank clearing and settlement for the most critical systems. A great amount of resources and effort goes into the networks that make the overall payment system safe, efficient, and resilient. It is fair to say that the general public places a great deal of trust in the components of the overall system to safeguard their money and operate as planned every day, and that trust is necessary for the system to work. From the perspective of financial stability, if the safety and integrity of the institutions and assets at the heart of a system erode or the transfer operations are not dependable, then the necessary trust and confidence may quickly erode, and the system may cease to function as needed. With a steady diet of news about the effect of electronic networks, personal devices, apps, and more on U.S. industries, many question the effect of these technologies on the payment system. I think we should recognize that there can be a tension between the need for financial stability in the overall payment system and the need to innovate to keep up with the demands of modern technology and lifestyles. However, we should also recognize that this tension is not necessarily troubling. By definition, innovation means doing something new, which usually involves taking risk in furtherance of some gain. But at the same time, we should be vigilant in balancing the benefits of innovation with the safe and reliable operation of systems and critical activities. From an analytical perspective, payment systems typically increase in value as more people use them, which in turn makes them more attractive to others. In addition, there is an inverse relationship between the volume of users of a system and the cost of production--more users lower the cost of production. Until recently, these features may have hindered innovation by presenting high barriers to entry and may have also fostered greater concentration into a few key entities that could become systemically important. Of course, technology may be able to reduce the effect of at least some of these hurdles by, for example, attracting high numbers of users quickly or reducing the costs of production. However, the effect of reducing technological barriers for financial stability is not clearly positive or negative. 
For example, on the one hand, new market participants attracted by lower barriers to entry may introduce new and unknown risks in the payment system. On the other hand, new market participants may relieve the concentration of activity in a limited number of players. Thus, the potential tension between innovation and stability can be more difficult to manage in the case of payment systems as compared to other industries that are less affected by these hurdles. One sensible approach to risk management would emphasize "starting small" and taking small risks. But unless a payment system grows a fairly large network of users in a reasonable time, it is unlikely to achieve the scope and scale it needs to be successful. Conversely, if a system attempts to start on a large scale and is successful, there will surely be questions about resilience in adversity, particularly if cutting-edge technologies and methods are used to handle people's money. The essential problem is how to achieve scale and manage financial and technical risk at the same time. Not surprisingly, because striking the right balance takes time, genuine innovation in payment systems over history has often been measured in decades, not years. As part of the new technology associated with fintech, we are now seeing the emergence of privately developed digital currencies using new decentralized technologies. Fundamental to these digital currencies is the establishment of a new asset, the unit of the digital currency--for example, a bitcoin--and a new record-keeping and transfer mechanism that enables users to store and trade those units--for example, a blockchain--often without reliance on traditional financial institutions. I believe the financial industry is increasingly recognizing that we should separate the concept of digital currencies from the innovative new technologies that they have employed to transfer assets. Those technologies, such as distributed ledgers, may offer useful new ways to store, transfer, and protect data and traditional financial assets. The industry is now moving cautiously from pilot projects in many of these areas to the use of these new technologies in limited production settings. This cautious approach to using new technology appears to reflect the weight of responsibility the financial industry bears for protecting both their customers and their reputations. Continued monitoring of developments is in order, and time will tell how these new technologies--and others--can contribute to a safe and secure payment system and broader financial system. The Federal Reserve has been actively monitoring these developments and will continue to do so. But when we examine the assets at the center of digital currency systems, I think we should begin to think clearly about the long-term properties we seek for large-scale payment networks and systems used by the general public. Today, the vast majority of our payments by volume and value are processed by regulated financial institutions. In the U.S. payment system, digital currencies are a niche product that sometimes garners large headlines. But from the standpoint of analysis, the "currency" or asset at the center of some of these systems is not backed by other secure assets, has no intrinsic value, is not the liability of a regulated banking institution, and in leading cases, is not the liability of any institution at all. Indeed, how to treat and define this new asset is complicated. 
While these digital currencies may not pose major concerns at their current levels of use, more serious financial stability issues may result if they achieve wide-scale usage. Risk management can act as a mitigant, but if the central asset in a payment system cannot be predictably redeemed for the U.S. dollar at a stable exchange rate in times of adversity, the resulting price risk and potential liquidity and credit risk pose a large challenge for the system. During times of crisis, the demand for liquidity can increase significantly, including the demand for the central asset used in settling payments. Even private-sector banks and certainly nonbanks can have a hard time meeting large-scale demands for extra liquidity at the very time when their balance sheets may be in question. Moreover, this inability to meet the demand for extra liquidity can have spillover effects to other areas of the financial system. Earlier in our history, the United States frequently witnessed bank runs that severely disrupted financial and economic activity, an example of what can happen when people lose faith in a payment system. In response, Congress ultimately introduced both a central bank and deposit insurance programs to help regulate fluctuations in the supply of liquidity in order to keep prices stable. Without the backing of a central bank asset and institutional support, it is not clear how a private digital currency at the center of a large-scale payment system would behave, or whether the payment system would be able to function, in times of stress. Given that privately developed digital currencies may raise important financial stability issues tied to the value of the asset, some have argued that central banks should begin to issue their own digital currency as a 21st century analogue to paper currency. I would urge caution, particularly for countries like the United States with highly developed banking systems and ongoing robust demand for physical cash. As a practical matter, I believe that consideration of a central-bank-issued digital currency for the general public would require extensive reviews and consultations about legal issues, as well as a long list of risk issues, including the potential deployment of unproven technology, money laundering, cybersecurity, and privacy, to name a few. I am particularly concerned that a central-bank-issued digital currency that's held widely around the globe could be the subject of serious cyberattacks and could be widely used in money laundering and terrorist financing. The effect of all this would be to significantly divert our focus from work to improve or establish new private-sector retail payment systems based on existing institutions. The prospect of a government-sponsored digital currency might even derail private-sector plans to enhance the payment services provided to their customers, thereby significantly disrupting the financial networks that exist today in ways that could create instability. For example, if payment activity radically shifted from using deposits at financial institutions to using central-bank-issued digital currency, deposits could significantly shrink and potentially disrupt financial institutions' ability to make loans that spur economic activity. That said, research into digital currency issues, including highly liquid and secure limited-purpose digital currencies for use as a settlement asset for wholesale payment systems, should continue. As technologies are developed and refined, old issues are resolved and new issues arise. 
Other countries may have different environments and experiences. We should always be open to learning and understanding from the experiences of others. For the United States, the alternative to privately issued digital currency is not necessarily a publicly issued digital currency. Instead, the near-term alternative is to build on the trusted foundations of the existing payment system and work to improve private-sector payment services. Importantly, this means looking to the banking system, which holds the bulk of the transaction deposits in this country, to improve services. This began a number of years ago with internet banking. Today, many banks offer around-the-clock internet-based access to accounts as well as mobile banking and payment capabilities. Many banks typically allow real-time or near real-time transfers of funds among their own customer base. What does not yet exist in the United States is the sort of ubiquitous, real-time payment system that would allow banks and their customers to make transfers and settlements of funds across the banking system instantly, conveniently, and securely all the time. As my colleague at the Board, Jay Powell, recently discussed at a conference in New York, the Federal Reserve has been working with the banking industry and a wide range of other payment system stakeholders to better understand the consequences of this state of affairs and support efforts to expand the available options through our payment system improvement initiative. For example, based on recommendations from the industry, the Federal Reserve is currently studying potential improvements in its settlement services--a traditional core function of a central bank--that could address the future needs of a ubiquitous real-time retail payments environment. Building on our existing banking system also makes sense from a financial stability perspective. Federally insured and supervised institutions are the core of our current payment system and largely address the potential financial stability problem of relying on payment systems with unbacked and unregulated digital currencies at their heart. But leveraging our existing banking system does not suggest that there is no room for new or emerging institutions and technologies. Indeed, there are a number of promising avenues that would allow the innovations that appear to be of the greatest interest to households and business--attributes like instant payment capabilities and around-the-clock operations--to be offered using a variety of existing and new technologies without requiring significant tradeoffs in safety and resiliency. To conclude, our financial stability requires that the payment system be reliable and dependable so that the public can trust it. As a result, there can be a tension between innovation and the need for financial stability in the overall payment system. Innovation must therefore account for the effects that it has on both the financial and technology networks that make up our payment system. The innovation that is beginning to flow from the development of digital currencies--and other technologies--will likely have a long-run effect on the technical networks and the business processes used in the payment system and the wider financial system. Privately developed digital currencies as currently configured would raise concerns about the effect on financial stability if they take on more prominence in the payments and overall financial system. 
Central bank digital currencies are also not immune to a broad range of risks and could even adversely affect financial stability. As such, central banks should tread cautiously as they contemplate issuing them. But this does not mean that we should avoid further innovation. Working cooperatively, private-sector participants and central banks can pursue innovation that strikes the right balance--improving the technical networks without generating financial stability concerns. I am optimistic that the Federal Reserve's work with the payments industry will facilitate a future with a safe and more efficient payment system.
r171213a_FOMC
united states
2017-12-13T00:00:00
Workforce Development in Today's Economy
brainard
0
I would like to thank the staff at the Federal Reserve Bank of New York for inviting me to attend this awards ceremony. As many of you may already know, the Leading the Way campaign is a joint effort between the Federal Reserve Bank of New York and the P-TECH schools of the Greater Rochester area. It is wonderful to be part of an event that celebrates collaboration between educational institutions, local government, the Federal Reserve, and employers as they work together to provide appropriate training for young workers to fill local job needs. Collaboration creates a more vibrant workforce that is connected to actual jobs. The Federal Open Market Committee (FOMC) is one of the main policymaking committees of the Federal Reserve System. Our main job on the FOMC is to set interest rates to achieve the maximum level of employment in the country and keep prices stable. The nation's unemployment rate is currently around 4.1 percent, which is a relatively low level by historical standards. What that means for the young people participating in this program is that this is a great time to find a job. Moreover, not only is the national unemployment rate low, so are unemployment rates for young people and members of ethnic and racial groups that have traditionally faced greater challenges in the labor market. As many of you know, during the deep recession of 2008 and 2009, many Americans were so discouraged by poor job prospects that they stopped even looking for work. That meant that too many Americans were sitting on the sidelines. In the past few years, the job market has gotten so strong that many of these people have come off the sidelines--and many are now back at work. The Rochester area, like much of the country, has now recovered from the Great Recession. The education and health-care sectors continue to be the main drivers of job growth. And the local community colleges have partnered with local employers to align their curriculums with the employers' job openings, to increase workers' connections to jobs, and to become a resource for regional job information. Finally, the local government has worked to contend with a long-term loss of jobs in manufacturing and a more recent loss of jobs in professional services by attracting investment in a world-class photonics hub in Rochester. I often meet with people who run large companies to get their take on the strength of the economy. Whereas a few years ago, they might not have been engaged in a lot of hiring, today they tell me that it is becoming more challenging to find well-qualified workers for the job openings they want to fill. That's good news for those of you participating in the P-TECH program. It means employers are looking for graduates of programs just like yours, which are attuned to the job opportunities of employers. P-TECH Rochester programs offer students professional mentors, job shadowing opportunities, and on-the-job experience to ensure that they are ready to contribute positively to the workplaces of local employers. I also understand that companies are investing more in on-the-job training and are strengthening efforts to retain more of the workers they attract, which means younger workers--such as P-TECH graduates--have a better chance of securing jobs that lead to long-lasting careers. By any measure, the market for younger workers today looks much better than it did in the years just after the Great Recession, when unemployment rates for teenagers were above 20 percent. 
In fact, firms are turning more and more to programs like P-TECH to help them find and train the workers they need. So the skills you are gaining today are likely to provide pathways not only to the jobs of today but also to the careers of the future. The Federal Reserve Bank of New York is one of the facilitators of the Leading the Way campaign. The Federal Reserve System has a long-standing interest in understanding labor market dynamics and promoting workforce development opportunities. Programs like this help workers prepare for jobs and help firms invest in workforce development. Together, these efforts help make the economy more productive and help us achieve our goal of maximum employment. In addition to the Federal Reserve Bank of New York, other Reserve Banks are active in this area: the Federal Reserve Bank of Boston has focused on labor markets in small industrial cities through its Working Cities Challenge, and the Federal Reserve Bank of Dallas has focused on the importance of infrastructure, including broadband, for economic inclusion. The Federal Reserve Bank of Atlanta has been a leader in establishing the Investing in America's Workforce Initiative, an effort across the Federal Reserve System to reframe training expenditures as investments in human capital rather than costs. In October, Atlanta established the Center for Workforce and Economic Opportunity to focus on employment policies and labor market issues that affect low- and moderate-income individuals. The System also recently hosted a national conference in Austin to promote the importance of investing in America's workforce. As employers' demand for specific education and training increases, collaborative initiatives, like those exhibited through the P-TECH programs of Greater Rochester, will help minimize the skills mismatch appearing in many communities with burgeoning job markets. Since 2013, the Federal Reserve Board has been administering the Survey of Young Workers to learn more about work experiences and future prospects for 18-to-30-year-olds nationwide. When I look at the survey findings, I am struck by how well the P-TECH program addresses many of the issues young people identified in the survey. As a first example, the survey findings highlight the important correlation between postsecondary education and labor market outcomes. The survey data indicate that higher levels of educational attainment are associated with higher earnings, greater job satisfaction, and increased optimism about one's job future. P-TECH focuses on providing students with postsecondary credentials, helping to address one of the survey's main concerns. Second, responses to the survey underscore the importance of young workers receiving appropriate information that enables them to select an educational program that provides better labor market outcomes. The survey found that more than 30 percent of young adults did not receive information about jobs and careers in high school and college. P-TECH is focused on providing information about educational opportunities and jobs to students before they enter high school. Third, the survey found that many young workers are not employed in fields aligned with their education. Fewer than half (45 percent) of the young workers surveyed said they were employed in a career field that is closely related to their educational and training background. P-TECH focuses on aligning education with actual jobs. Finally, the survey found that steady employment is very important to young workers. 
In 2015, young adults had a strong preference for steady employment (62 percent) over higher pay (36 percent). And, among the respondents who preferred steady employment, 80 percent would rather have one steady job than a stream of steady jobs over the next five years. The work that P-TECH students do directly with employers helps steer them toward long-term, steady employment. In conclusion, I am proud that the Federal Reserve is contributing to the efforts of the P-TECH program in the Rochester area, and I look forward to the video presentations we will have shortly. I want to congratulate in advance the winners of the video competition, who will also be announced shortly. I also want to congratulate all of you for being part of a 21st century education program that will help connect you to pathways of opportunity and help enable you to make important contributions to the vitality of your community.
r180119a_FOMC
united states
2018-01-19T00:00:00
Early Observations on Improving the Effectiveness of Post-Crisis Regulation
quarles
0
It is a pleasure to be here with you at the American Bar Association Banking Law Committee annual meeting. Thank you to Meg Tahyar, my longtime friend and colleague, for inviting me to speak today. These are still the early days of my tenure at the Federal Reserve--last weekend marked my first three months as the first Vice Chairman for Supervision. In those three months, people have had a lot of questions for me, but the most frequently asked question has been: What's next? Today I hope to give you some insights into how I am approaching the work of evaluating and improving the post-crisis regulatory regime and to outline some specific areas that are emerging as areas of focus early in my tenure. Some of those areas are closer to being ready for action, while others are topics that I believe are important and would benefit from more attention and discussion. My hope is that you will come away from our time together with a better sense of my preliminary thinking for charting a course forward on financial regulation. Before I delve into specifics, let me say a few words about the principles that are guiding my approach to evaluating changes to the current regime. The body of post-crisis financial regulation is broad in scope, complicated in detail, and extraordinarily ambitious in its objectives. Core aspects of that project have resulted in critical gains to our financial system: higher and better quality capital, an innovative stress testing regime, new liquidity regulation, and improvements in the resolvability of large firms. We undoubtedly have a stronger and more resilient financial system due in significant part to the gains from those core reforms. These achievements are consistent with the responsibility of the Federal Reserve to be a steward of a safe financial system, and with the goal of maintaining the ability of banks to lend through the business cycle. That said, the Federal Reserve and our colleagues at other agencies have now spent the better part of the past decade building out and standing up the post-crisis regulatory regime. At this point, we have completed the bulk of the work of post-crisis regulation, with an important exception being the U.S. implementation of the recently concluded Basel III "end game" agreement on bank capital standards at the Basel Committee. As such, now is an eminently natural and expected time to step back and assess those efforts. It is our responsibility to ensure that they are working as intended and--given the breadth and complexity of this new body of regulation--it is inevitable that we will be able to improve them, especially with the benefit of experience and hindsight. In undertaking this review and assessment, in addition to ensuring that we are satisfied with the effectiveness of these regulations, I believe that we have an opportunity to improve the efficiency, transparency, and simplicity of regulation. By efficiency I mean the degree to which the costs of regulation--whether in reduced economic growth or in increased frictions in the financial system--are outweighed by its benefits. In other words, if we have a choice between two methods of equal effectiveness in achieving a goal, we should strive to choose the one that is less burdensome for both the system and regulators. Efficiency of regulation can be improved through a variety of means. For example, it can mean achieving a given regulation's objective using fewer tools. 
It can mean addressing unintended adverse consequences to the industry and the broader public from a regulation or eliminating perverse incentives created by a regulation. It can mean calibrating a given regulation more precisely to the risks in need of mitigation. It can also mean simpler examination procedures for bank supervisors, or less intrusive examinations for well-managed firms. In our approach to assessing post-crisis regulation, we should consider all of these ways of improving efficiency. Transparency is an objective that ought to particularly resonate with this audience. As lawyers, we were all trained to view transparency as a necessary precondition to the core democratic ideal of government accountability--the governed have a right to know the rules imposed on them by the government. In addition, as any good lawyer also recognizes, there are valuable, practical benefits to transparency around rulemaking; even good ideas can improve as a result of exposure to a variety of perspectives. Finally, simplicity of regulation is a principle that promotes public understanding of regulation, promotes meaningful compliance by the industry with regulation, and reduces unexpected negative synergies among regulations. Confusion that results from overly complex regulation does not advance the goal of a safe system. When I arrived at the Federal Reserve, the early stages of reflection on how to improve the cost-benefit balance of post-crisis regulation had already begun, mainly in a few narrow areas of focus. These were areas of low-hanging fruit in which relatively broad consensus was reached that efficiency enhancements were available with no material cost to the resiliency or resolvability of the banking system. My colleague and Chairman-nominee Jay Powell spoke about five of these areas last summer when he served as the Board's oversight governor for supervision and regulation: small bank capital simplification, burden reduction in resolution planning, enhancements to stress testing, leverage ratio recalibration, and Volcker rule simplification. I wholeheartedly support these initiatives, and I am pleased that some of them have progressed even in the months since the summer. The banking agencies recently proposed changes to the capital rules for smaller firms, consistent with last year's Economic Growth and Regulatory Paperwork Reduction Act report, which is a positive step toward meaningful burden relief for smaller banks. The Federal Reserve, along with the Federal Deposit Insurance Corporation, extended the upcoming resolution planning cycles for the eight most systemic domestic banking firms and for foreign banks with limited U.S. operations in order to allow for more time between submissions. I believe we should continue to improve the resolution planning process in light of the substantial progress made by firms over the past few years, including a permanent extension of submission cycles from annual to once every two years and reduced burden for banking firms with less significant systemic footprints. And, most recently, the Federal Reserve released a package of proposed enhancements to the transparency of our stress testing program, which is currently out for comment. The progress you have seen in those areas represents constructive early steps. Leverage ratio recalibration also is among the Federal Reserve's highest-priority, near-term initiatives. 
We have made considerable progress on that front in the past few months, and I expect that you will see a proposal on this topic relatively soon. Finally, the relevant agencies have begun work on a proposal to streamline the Volcker rule. This project is a comprehensive and substantial undertaking as well as a five-agency endeavor. As such, it will naturally take a bit of work for the agencies to coalesce around a thoughtful Volcker rule 2.0 proposal for public review. Volcker rule reform remains a priority in the Federal Reserve's regulatory efforts. With that update on the familiar, I will turn to my own impressions of what is next for post-crisis regulation. In my early days as the Vice Chairman for Supervision, I asked our staff to conduct a comprehensive review of the regulations in the core areas of reform that I outlined earlier--capital, stress testing, liquidity, and resolution. The objective is to consider the effect of those regulatory frameworks on resiliency and resolvability of the financial system, on credit availability and economic growth, and more broadly to evaluate their costs and benefits. This is a comprehensive and serious process, and work is still underway. I should note, however, that I have already formed views on a few areas that warrant more focus and that I will be working with my colleagues on the Board to consider constructively. I will start with the issue of tailoring supervision and regulation to the size, systemic footprint, risk profile, and business model of banking firms. The Federal Reserve has devoted considerable energy in its post-crisis regulatory work to incorporate the tailoring concept in its regulation and supervision across the spectrum of small, medium, and large firms. A recent example of this approach is our late 2017 proposal to simplify capital requirements for small- and medium-sized banking firms. In my view, there is further work for the Federal Reserve and the other banking agencies to do on the tailoring front. I would emphasize that tailoring is not an objective limited in scope to a subset of the smallest firms. As my colleagues and I have said before, the character of our regulation should match the character of the risk at the institution. Accordingly, we should also be looking at additional opportunities for more tailoring for larger, non-Global Systemically Important Banks, or non-G-SIBs. In this regard, I support congressional efforts regarding tailoring, whether by raising the current $50 billion statutory threshold for application of enhanced prudential standards or by articulating a so-called factors-based threshold. Irrespective of where the legislative efforts land, I believe we at the Federal Reserve have the responsibility to do further tailoring for the institutions that remain subject to our rules so that regulation matches the risk of the firm. Take, for example, large non-G-SIBs whose failure would not individually pose a risk to U.S. financial stability. Even without financial stability implications, the distress or failure of these firms still could harm the U.S. economy by, for example, significantly disrupting the flow of credit to households and businesses. In my view, this tranche of the U.S. banking system ought to be subject to regulations that are generally stricter than those that apply to small banking firms, but that are also meaningfully less strict than those that apply to the G-SIBs. 
The Board has effected this sort of G-SIB versus non-G-SIB tailoring among large banks in many areas of the regulatory framework. Most notably, each of the risk-based capital requirements, leverage requirements, stress testing requirements, and total loss-absorbing capacity (TLAC) requirements is calibrated substantially more strictly for G-SIBs than for large non-G-SIBs. However, in some key regulations, there is no distinction between the requirements for large non-G-SIBs and G-SIBs. Liquidity regulation, for example, does not have a G-SIB versus non-G-SIB gradation. In particular, the full liquidity coverage ratio (LCR) requirement and internal stress testing requirements of enhanced prudential standards apply to large, non-G-SIB banks in the same way that they apply to G-SIB banks. I believe it is time to take concrete steps toward calibrating liquidity requirements differently for large non-G-SIBs than for G-SIBs. And I see prospects for further liquidity tailoring in that the content and frequency of LCR reporting are the same for the range of firms currently subject to the modified LCR as they are for the large non-G-SIBs that are subject to the full LCR. We should also explore opportunities to apply additional tailoring for these firms in other areas, such as single counterparty credit limits and resolution planning requirements. Another area that I think we should revisit is the set of "advanced approaches" thresholds that identify internationally active banks. These thresholds are significant not only for identifying which banking firms are subject to the advanced approaches risk-based capital requirements, but also for identifying which firms are subject to various other Basel Committee standards, such as the supplementary leverage ratio, the countercyclical capital buffer, and the LCR. The metrics used to identify internationally active firms--$250 billion in total assets or $10 billion in on-balance-sheet foreign exposures--were formulated well over a decade ago, were the result of a defensible but not ineluctable analysis, and have not been refined since then. We should explore ways to bring these criteria into better alignment with our objectives. A third area in which I will be working with my Board colleagues is a meaningful simplification of our framework of loss absorbency requirements. There are different ways to count the number of loss absorbency constraints that our large banking firms face--which is perhaps in itself an indication of a surfeit of complexity if we can't be perfectly sure of how to count them--but the number I come up with is 24 total requirements in the framework. While I do not know precisely the socially optimal number of loss absorbency requirements for large banking firms, I am reasonably certain that 24 is too many. Candidates for simplification include: elimination of the advanced approaches risk-based capital requirements; one or more ratios in stress testing; and some simplification of our TLAC rule. I am not the first Federal Reserve governor to mention some of these possibilities, and we should put them back on the table in the context of a more holistic discussion of streamlining these requirements. Let me be clear, however, that while I am advocating a simplification of large bank loss absorbency requirements, I am not advocating an enervation of the regulatory capital regime applicable to large banking firms. 
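As a purely illustrative sketch of the liquidity coverage ratio calibration discussed above: assuming the standard definition of the LCR as high-quality liquid assets divided by projected 30-day net cash outflows, with a 100 percent minimum, and assuming the modified LCR scales outflows by a 0.7 factor (my reading of the 2014 U.S. LCR rule, not a statement from these remarks), the short Python example below shows how the two calibrations treat the same hypothetical balance sheet.
# Stylized comparison of the full and "modified" liquidity coverage ratios (LCR).
# All dollar figures are hypothetical; the 0.7 outflow factor for the modified LCR
# is an assumption drawn from my reading of the 2014 U.S. LCR rule.
def lcr(hqla, net_outflows_30d, outflow_factor=1.0):
    """Return the LCR: high-quality liquid assets over (scaled) 30-day net cash outflows."""
    return hqla / (net_outflows_30d * outflow_factor)
hqla = 80.0          # high-quality liquid assets, $ billions (hypothetical)
outflows = 100.0     # projected 30-day net cash outflows, $ billions (hypothetical)
full_lcr = lcr(hqla, outflows)                          # applies alike to G-SIBs and large non-G-SIBs
modified_lcr = lcr(hqla, outflows, outflow_factor=0.7)  # less strict calibration for smaller firms
print(f"Full LCR:     {full_lcr:.0%} (illustrative minimum: 100%)")
print(f"Modified LCR: {modified_lcr:.0%} (same balance sheet, scaled outflows)")
Under these hypothetical figures, the same balance sheet falls short of a 100 percent full LCR but clears the modified version, which is the sense in which a modified standard is a less strict calibration--the kind of gradation between G-SIBs and large non-G-SIBs contemplated above.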
Although not a post-crisis regulation, the Board's complex and occasionally opaque framework for making determinations of control under the Bank Holding Company Act (BHC Act) is another area that is ripe for re-examination through the lenses of efficiency, transparency, and simplicity. As you know, a determination of control under the BHC Act is significant because even remote entities in a controlled group can be subject to the BHC Act's restrictions on activities and a host of other regulatory requirements. Under the Board's control framework--built up piecemeal over many decades--the practical determinants of when one company is deemed to control another are now quite a bit more ornate than the basic standards set forth in the statute and in some cases cannot be discovered except through supplication to someone who has spent a long apprenticeship in the art of Fed interpretation. The process can be burdensome and time-consuming both for the requester and Federal Reserve staff. We are taking a serious look at rationalizing and recalibrating this framework. Finally, as I mentioned earlier, an enhanced stress testing transparency package was released for public comment last month. I personally believe that our stress testing disclosures can go further. I appreciate the risks to the financial system of the industry converging on the Federal Reserve's stress testing model too completely, so I am hesitant to support complete disclosure of our models for that reason. However, I believe that the disclosure we have provided does not go far enough to provide visibility into the supervisory models that often deliver a firm's binding capital constraint. It is important for any proposal to receive comments, and I can say that I and my colleagues on the Board will be paying particularly close attention to your comments on how we might improve this current proposal. To conclude, I hope that these remarks give you a sense of our approach to analyzing and improving post-crisis regulation. As I mentioned earlier, the areas of core reform--capital, liquidity, stress testing, and resolution--have produced a stronger and more resilient system and should be preserved. We have made great progress, but there is further work to do. Some clear improvements are in the offing in the relatively near future. Other areas will benefit from longer term discussion. I look forward to engaging with you and the public more broadly as I help to chart a course for the important work ahead.
r180213a_FOMC
united states
2018-02-13T00:00:00
Remarks at the Ceremonial Swearing-in
powell
1
It is both humbling and a great privilege to be standing here today. I am particularly honored by the trust and faith that the President has placed in me and by the Senate's quick action in confirming me. There is no greater honor than public service, as Randy, Lael, and all of our colleagues here at the Fed would agree. The Congress has assigned the Federal Reserve the goals of stable prices and maximum employment. Price stability means that businesses and households can make important decisions without concern for high or volatile inflation. Maximum employment means that those who want a job either have one or can find one reasonably quickly. We also have important responsibilities for the stability of the financial system and for the regulation and supervision of financial institutions, including our largest banks. Through regulation that is both effective and efficient, we seek to ensure that credit, which is vital for a healthy economy, will be available to families and businesses throughout the business cycle, so they can invest in a brighter future. These are awesome responsibilities, and the Congress has wisely entrusted us with an important degree of independence so that we can pursue our monetary policy goals without concern for short-term political pressures. As a public institution, we must be transparent about our actions so that the public, through its elected representatives, can hold us accountable. Over the past 25 years, the Fed has been a leader among central banks in improving transparency. Today, we are open and accountable. We strive to explain our actions in a way that enhances the public's understanding of our goals and methods. We will continue to pursue ways to improve transparency both in monetary policy and in regulation. When I joined the Board in 2012, the economy was still recovering from the depths of the financial crisis, and many millions of Americans were still suffering from its ravages. Since then, monetary policy has continued to support a full recovery in labor markets and a return to our inflation target; we have made great progress in moving much closer to those statutory objectives. In addition, the financial system is incomparably stronger and safer, with much higher capital and liquidity, better risk management, and other improvements. Much credit for these results should go to Chairman Bernanke and Chair Yellen. I am grateful for their leadership and for their example and advice as colleagues. But there is more to the story than successful leadership. The success of our institution is really the result of the way all of us carry out our responsibilities. We approach every issue through a rigorous evaluation of the facts, theory, empirical analysis, and relevant research. We consider a range of external and internal views; our unique institutional structure, with a Board of Governors in Washington and 12 Reserve Banks around the country, ensures that we will have a diversity of perspectives at all times. We explain our actions to the public. We listen to feedback and give serious consideration to the possibility that we might be getting something wrong. There is great value in having thoughtful, well-informed critics. While the challenges we face are always evolving, the Fed's approach will remain the same. Today, the global economy is recovering strongly for the first time in a decade. We are in the process of gradually normalizing both interest rate policy and our balance sheet with a view to extending the recovery and sustaining the pursuit of our objectives. 
We will also preserve the essential gains in financial regulation while seeking to ensure that our policies are as efficient as possible. We will remain alert to any developing risks to financial stability. I am deeply grateful for the opportunity to lead the Fed as we face these evolving challenges. I believe that the way we approach our work, the strong values we hold, and the dedication to public service I see throughout the Federal Reserve have been the keys to our success. As Chairman, I will uphold these values and do my very best to further our pursuit of something we all seek--an economy that works for all Americans.
r180222a_FOMC
united states
2018-02-22T00:00:00
The U.S. Economy after the Global Financial Crisis
quarles
0
I am very happy to be participating in this symposium on taking stock of the global economy a decade after the Global Financial Crisis, and I thank Hiroshi Watanabe for the invitation. I have been asked to provide an overview of the U.S. economy since the advent of the crisis in no longer than 10 minutes, so I could either talk very quickly or focus my comments on more recent developments, perhaps throwing in a bit of historical context when appropriate. To cut to the bottom line, the U.S. economy appears to be performing very well and, certainly, is in the best shape that it has been in since the crisis and, by many metrics, since well before the crisis. Recent volatility in equity markets is a reminder that asset prices can move rapidly and unexpectedly. However, it is my assessment that the underlying fundamentals of the U.S. economy are sound and much improved relative to earlier in the decade. One easy and important place to see that improvement is in the labor market. After peaking at 10 percent in October 2009, the unemployment rate fell rather steadily to 4.1 percent in January--the lowest level, outside of a period from 1999 to 2000, since the 1960s. Job gains in recent months have continued at a pace that would be pushing the unemployment rate even lower if the labor force participation rate had not stabilized in recent years, a welcome development and a sign that the strength of the labor market is pulling in or retaining workers who might otherwise be on the sidelines. Broader measures of labor market slack--for example, those that include individuals who are out of the labor force but say they want a job as well as those working part time who would like to work full time--have largely returned to pre-crisis levels. While the labor market has shown steady improvement over the past decade, the post-crisis performance of gross domestic product (GDP) growth has been more disappointing, averaging just 2 percent per year over the past seven years. However, beginning with the second quarter of last year, growth has shown some momentum. Over the final three quarters of 2017, real GDP increased at an average rate of almost 3 percent. While headline growth stepped back a bit in the fourth quarter, largely on account of increased drag from higher imports and lower inventories, underlying final private domestic demand--which is a better indicator of economic momentum--grew at its fastest pace in more than three years. Recent survey data reveal a growing sense of economic optimism. Consumer confidence has returned to pre-crisis levels. Business optimism is also apparent in survey data as well as in the strength of investment. In 2017, investment in capital equipment increased at the fastest pace since 2011, accelerating through the year to a double-digit rate in the second half. It might be early, but it is possible that the investment drought that has afflicted the U.S. economy for the past five years may finally be breaking. The tax and fiscal packages passed in recent months could help sustain the economy's momentum in part by increasing demand, and also possibly by boosting the potential capacity of the economy by encouraging investment and supporting labor force participation. While the recent performance of the economy has been solid relative to much of the pre-crisis period, one area that continues to lag is productivity growth, a condition that has been common across the advanced economies. 
Beginning in 2011, the annual growth rate of labor productivity has averaged only 3/4 percent, compared with an average 2-1/4 percent pace in the two decades leading up to the financial crisis. Why productivity growth has been so weak defies easy explanation. The weak pace of business investment is likely part of the story. In addition, some have argued that there has been a decline in business dynamism following the crisis; others do not link the slowdown to the crisis but rather to an exogenous slowdown in the rate of technological progress; and still others believe that productivity growth has not really slowed much at all and, instead, is just not being measured correctly in the official statistics. Regardless, given the importance of productivity growth for the long-run potential of the economy and living standards, it is vitally important that policymakers pursue policies aimed at boosting the growth rate of productivity. Another aspect of the economy that has attracted a lot of attention is the apparent low level of inflation despite the tightness in labor markets. The 12-month increase in headline personal consumption expenditures (PCE) prices was 1.7 percent in December, a touch below the Fed's 2 percent objective. After assessing the recent data, my take is that the current shortfall in inflation from target is most likely due to transitory factors that will fade through 2018, pushing inflation back up to target. Suffice it to say, a deviation from our target of a few tenths of 1 percentage point, especially one I expect to fade, does not cause me great concern. Against this economic backdrop, with a strong labor market and likely only temporary softness in inflation, I view it as appropriate that monetary policy should continue to be gradually normalized. An important component of this normalization was initiated in October, when we started to gradually scale back our reinvestment of proceeds from maturing Treasury securities and principal payments from agency securities. With the balance sheet normalization plan set to remain on autopilot, barring a material deterioration in the economic outlook, the federal funds rate remains our primary tool for adjusting the stance of monetary policy. At our January meeting, the Federal Open Market Committee decided to maintain its target range for the federal funds rate at 1-1/4 to 1-1/2 percent. In this range, monetary policy remains accommodative. I anticipate that further gradual increases in the policy rate will be appropriate to both sustain a healthy labor market and stabilize inflation around our 2 percent objective. Of course, it should go without saying that I will keep a close eye on economic indicators--and their implications for the outlook for inflation and real activity--and adjust my views on appropriate monetary policy accordingly. I would like to wrap up with a word on the financial sector. The Federal Reserve and our colleagues at other agencies have now spent the better part of the past decade building out and standing up the post-crisis regulatory regime. At this point, we have completed the bulk of the work of post-crisis regulation. As such, now is an eminently natural and expected time to step back and assess those efforts. It is our responsibility to ensure that they are working as intended, and--given the breadth and complexity of this new body of regulation--it is inevitable that we will be able to improve them, especially with the benefit of experience and hindsight.
r180226a_FOMC
united states
2018-02-26T00:00:00
Brief Thoughts on the Financial Regulatory System and Cybersecurity
quarles
0
Thank you very much for having me here at the Financial Services Roundtable's spring meeting. I am pleased to speak with you all about our financial regulatory system: both the broad principles that have been directing my approach to evaluating that system and cybersecurity, which is a topic of great import to financial system participants and their regulators. As I have said before, we have an opportunity to improve the efficiency, transparency, and simplicity of regulation. We have spent the past decade building out and standing up the post-crisis regulatory regime, and as a result we have made critical gains. The financial system is undoubtedly stronger and safer. We have robust capital and liquidity levels, an effective stress testing regime, and improved resolvability of our largest firms. But at the same time, it is our responsibility to ensure that those rules are effective. And if we identify rules that are not working as intended, we should make the necessary changes. With the benefit of hindsight and with the bulk of our work behind us, now is a natural and expected time to evaluate the effectiveness of that regime. Our efforts toward implementing those principles are underway. Federal Reserve Board staff members continue the review that I have previously outlined. The goal is to consider the effect of past regulatory initiatives on the resiliency of our financial system, on credit availability and economic growth, and more broadly, their costs and benefits. I am confident that that review will reveal some clear ways that we can improve the core post-crisis reforms. Let me now turn from regulation to supervision, and more specifically, to the topic of cybersecurity, which continues to be a high priority for the Federal Reserve. The Federal Reserve is committed to strategies that will result in measurable enhancements to the cyber resiliency of the financial sector. Given the dynamic and highly sophisticated nature of cyber risks, collaboration between the public and private sectors in identifying and managing these risks is imperative. While we know that successful cyberattacks are often connected to poor basic information technology hygiene, and firms must continue to devote resources to these basics, we also know that attackers always work to be a step ahead, and we need to prepare for cyber events. Many of you provide services that are critical to maintaining the functionality of the financial system. Those critical services should be highly resilient. But at the same time, some of the solutions in place to improve the resiliency of those critical services may actually contribute to a cyber event. One example would be the replication of bad data across data centers. As the Federal Reserve thinks about its financial stability mandate, this concern will be a particular focus. Solutions will not come easily, but I am confident that with strong public and private efforts, solutions will emerge. The Federal Reserve also focuses on the sharing of threat information and collaborates with a number of partners toward protective mechanisms. We work with other domestic agencies as well as international authorities, and we have partnerships between the public and private sectors to introduce and participate in programs that combat the increasingly frequent and sophisticated cyber threats. 
Specifically, we collaborate with government and industry partners to plan and execute cybersecurity tabletop exercises focused on identifying areas where sector resilience and information sharing can be enhanced. We also participate in community and industry outreach forums and actively share threat intelligence with sector partners, including the Financial Services Information Sharing and Analysis Center (FS-ISAC). We work collectively through arrangements such as the FS-ISAC so that threat information can be shared promptly and effectively. Collaboration among many stakeholders on cybersecurity is critical to progress. The Federal Reserve has been working with, and will continue to work with, other financial regulatory agencies on harmonizing cyber risk-management standards and regulatory expectations across the financial services sector. Specifically, we are focused on aligning our expectations with existing best practices, such as the National Institute of Standards and Technology's Cybersecurity Framework, and identifying opportunities to further coordinate cyber risk supervisory activities for firms subject to the authority of multiple regulators. We support industry efforts to improve harmonization across the sector, which are complementary to achieving our regulatory safety and soundness goals. The Federal Reserve continues to work toward improving both post-crisis regulation and our approach to cybersecurity. I hope that laying out the broad principles guiding us as we move forward has been helpful. And while many of the areas will require additional work and may not yield fast results, the Federal Reserve is committed to getting it right, and I look forward to those efforts.
r180305a_FOMC
united states
2018-03-05T00:00:00
The Federal Reserve’s Regulatory Agenda for Foreign Banking Organizations: What Lies Ahead for Enhanced Prudential Standards and the Volcker Rule
quarles
0
Thank you very much to the Institute of International Bankers for inviting me to speak here today. Among my first areas of focus when I was a very young lawyer starting out in my career well over 30 years ago was providing advice to foreign banks and financial firms operating in the United States, and I learned then just how integral, essential, and welcome a part your firms play in our domestic financial sector. Non-U.S. firms serve as an important source of credit to U.S. households and businesses and contribute materially to the strength and liquidity of U.S. financial markets, so it is critical--not just as a matter of fairness but as a matter of our domestic interest--that we as regulators ensure that they operate in a fair and open financial services sector. I view that as an important part of my job. So today I want to share my perspective on the appropriate regulatory environment for foreign banks operating in the United States, as well as some thoughts on specific elements of that regime. Before doing that, though, we should take stock of the experience of foreign firms operating in the United States before and during the crisis. First, the financial crisis revealed that in times of stress, international banking firms with large and complex local operations can contribute to instability in those local markets and can require extraordinary support from local authorities. Second, a number of foreign financial institutions expanded the size and complexity of their U.S. operations at a rapid pace and scale prior to the crisis, and we did not adjust our local regulatory and supervisory approaches to address the increased risk associated with this expansion. As a result, the difficulties faced by the U.S. operations of non-U.S. banks during the crisis mirrored those of their similarly sized domestic counterparts, underscoring a need for increased resiliency of both domestic firms and the U.S. operations of foreign banks. To bolster that resiliency, the environment for foreign banks operating in the United States underwent a number of changes. While there are important differences, those changes for foreign firms broadly parallel many of the changes instituted for domestic firms. My Federal Reserve colleagues and I have termed these the core post-crisis regulatory reforms: capital, liquidity, stress testing, and resolution planning. Of course, the obvious and most prominent difference for foreign firms--as attendees of this conference certainly know--was the introduction of the intermediate holding company (IHC) structure, to which the post-crisis regulatory reforms apply. In my estimation, these reforms have gone a long way toward meeting our goal of a more resilient financial system. That said, we are now at a point--with ten years of experience in setting up and living with the body of post-crisis regulation--where it is both relevant and timely to examine the post-crisis reforms and identify what is working well and what can be improved. If none of the regulatory measures implemented up to now were capable of improvement, this would be the first project of this scale and complexity to have been done exactly right on the first pass. If there was still work to be done after Hammurabi, there is probably still some work to be done now after Dodd and Frank. In particular, as I have said elsewhere, we should be looking to see where we can achieve our regulatory objectives in ways that maintain our measures' effectiveness, but improve their efficiency, transparency, and simplicity. 
As part of that effort, we will consider additional tailoring and flexibility of our regulations in light of their impact on foreign banking organizations (FBOs) based on lessons learned over the past several years. To illustrate how I am thinking about these issues, I want to focus in my remarks today on two specific regulatory examples. These are, of course, not an exhaustive list of work to be done in the regulation of FBOs, but they tend to be near the top of the feedback list from both the industry and supervisors. First, I will discuss the application of enhanced prudential standards to FBOs, including our flexibility in implementing certain aspects of these standards. I will also offer some initial thoughts on opportunities for further tailoring that regime for FBOs. Second is the Volcker rule. I will provide some of my initial thinking on how we might be able to improve the Volcker rule, both generally and in its application to FBOs in particular. In implementing enhanced prudential standards for foreign banks with a large U.S. presence, we sought to ensure that firms hold sufficient local capital and liquidity--and have a risk management infrastructure--that is commensurate with the risks in their U.S. operations. And in general, that approach is meeting many of the broad goals the Federal Reserve set out to achieve. Today, foreign banks with large U.S. operations are less fragmented, maintain local capital and liquidity buffers that align with the size and riskiness of their U.S. footprint, and operate on equal footing with their domestic counterparts. Our current approach aligns with other jurisdictions that host a large and complex foreign bank presence. For example, the European subsidiaries of U.S. banking firms have long been subject to Basel-based standards imposed by the European Union and the United Kingdom as host regulators. In addition, European regulators are contemplating a holding company structure for the local operations of foreign banks to reduce fragmentation and ensure effective local supervision, similar in many ways to Federal Reserve rules. In adopting the enhanced prudential standards, however, the Board has acknowledged both the uniqueness of FBOs--as the U.S. operations are a small part of a larger firm--and the diversity of foreign bank operations in the United States. The Board contemplated from the outset that circumstances might require application of the rule's requirements to be adjusted in light of an individual firm's structure or risk profile. The Board has exercised this authority in the past, and I want to stress that we will continue to provide flexibility where appropriate to accommodate these differences. For instance, in implementing enhanced risk management standards, we have focused on outcomes--a strong control environment for foreign bank operations in the United States--while providing some flexibility in how those outcomes are achieved. We have allowed the global risk committee to serve as the risk committee for the U.S. operations rather than require the creation of a standalone committee. Further, for foreign banks with large U.S. branches but no IHC, the Board has acknowledged the challenges associated with the location of the risk committee. The Board has accordingly allowed risk committees at U.S. holding companies as well as managerial committees located in the United States, provided that the global board exercises appropriate oversight. We are committed to continuing this outcomes-focused approach and to refining it where needed. 
Further, we recognize that effective stress testing regimes can take many different forms. Specifically, in interpreting the home-country stress testing requirements of the Dodd-Frank Act, we have indicated that a foreign bank's internal capital adequacy assessment process (ICAAP) may meet the minimum standards, provided that the firm's ICAAP is conducted on a consolidated basis and reviewed by the home-country regulator. In addition, while we believe that the IHC requirement serves a valuable role in ensuring consistency of regulation across U.S. operations of an FBO, the Board has reserved authority to approve multiple IHCs if circumstances warrant based on the FBO's activities, scope of operations, structure, home-country regulatory framework, or similar considerations. For example, the Board's enhanced prudential standards rule contemplates allowing multiple IHCs in cases where home-country legal requirements inhibit the combination of certain bank and nonbank operations. In practice, and in several instances, the Board has permitted a foreign bank to maintain certain U.S. subsidiaries outside of its IHC, so long as the foreign bank did not have practical control over those subsidiaries. In addition, the Board recently approved an application by a foreign bank for a second IHC. Part of our rationale for approving the dual IHC structure was the enhancement of recovery and resolution options of the global firm. In granting the exception, the Board applied enhanced prudential standards to the two IHCs in the same manner that would apply to a single IHC, to maintain a level playing field and align incentives for the safe and sound operation of both IHCs. This approach allows us more flexibility in addressing firm-specific structure needs, while maintaining the goals of the enhanced prudential standards more generally. We will continue to consider future applications based on the merits of the case. Finally, to the extent that foreign banks have decided to reduce the scope of their U.S. operations to reduce the application of some of the enhanced prudential standards, the Board has accommodated requests for extended transition periods, so as to avoid unnecessary investments in infrastructure that ultimately would not be required by regulation. We are committed to tailoring our regulatory and supervisory regimes to align with the risk posed by financial institutions to the U.S. financial system. We are also continuing to evaluate whether our rules are sensitive to changes in the risk profile of banking organizations. We want our rules both to increase in stringency as firms' risks grow and, just as important, to decrease in stringency when firms have actively reduced their risk profiles. Let me turn now to the Volcker rule. Not to put too fine a point on it, but I believe the regulation implementing the Volcker rule is an example of a complex regulation that is not working well. The fundamental premise of the Volcker rule is simple: banks with access to the federal safety net--deposit insurance and the discount window--should not engage in risky, speculative trading for their own account. Whatever one's view of this basic premise, it is the law of the land. Taking that premise as a given, we have to ask how to improve the framework of the implementing regulation to make it more workable and less burdensome in practice from both a compliance and supervisory perspective. I think we all can agree that the implementing regulation is exceedingly complex. 
As one example among many, the approach of the statute and the implementing regulation to defining "market making-related activities" rests on a number of complex requirements that are difficult or impossible to verify objectively in real time. As a result, banks spend far too much time and energy contemplating whether particular transactions or positions are consistent with the Volcker rule. Some of you may quite sensibly be asking, "If the deficiencies of the regulation are so apparent, how did we get here?" Despite the best of intentions in crafting the regulations, no one seems to be happy with the complex rule we wound up with. This has a very positive consequence: I have heard nothing but support from all of my regulatory colleagues for the proposition that the regulation is overly complex and would benefit from streamlining and simplifying to improve its workability in practice. We are actively working with our fellow regulators in seeking ways to further tailor and to reduce burden, particularly for firms that do not have large trading operations and do not engage in the sorts of activities that may give rise to proprietary trading. We also appreciate the broad extraterritorial impact of the rule in its current form for foreign banks' operations outside of the United States. To that end, we have, with the full cooperation of all five Volcker regulatory agencies, picked back up the process that was begun last fall to engage in rulemaking subject to the Administrative Procedure Act and develop a proposal for public comment that would make material changes to the Volcker rule regulations. In that process we will take account of our own experience with the regulations since implementation, and we also want to take account of the views of market participants and other interested parties with views on the Volcker rule, including what is working and what is not. We expect this process will proceed with dispatch. We must also work within the confines of the statute. For example, a number of my current and former Federal Reserve Board colleagues have expressed support for Congress providing an exemption from the Volcker rule for community banks, which is something I also support. Short of a statutory exemption, we can only do our best to mitigate burden on community banks that generally do not engage in the types of activities the Volcker rule was intended to cover. Statutory changes likely would make our work of streamlining more straightforward and complete, but we have a fair bit that we can accomplish even absent such changes. What are some of the improvements that we are thinking about that would be possible within the regulation itself? As an initial matter, it should be clearer and more transparent what is subject to the Volcker rule's implementing regulation and what is not. The definition of key terms like "proprietary trading" and "covered fund" should be as simple and clear as possible. It should not be a guessing game or require hours of legal analysis of complex banking and securities regulations to determine if a particular entity is a covered fund. It should not happen--although it has happened--that our supervised firms come to us and ask questions about whether a particular derivative trade is subject to the rule, and we cannot give them our own answer or a consistent answer across the five responsible agencies. Supervisors need to be able to provide clear and transparent guidance on what is covered by the Volcker rule and what is not. 
This would benefit not only the firms, but the supervisors at the agencies as well. Again, a good example is the exemption for market making-related activities, which is one of the key exemptions from the prohibition on proprietary trading. The rule contains a gaggle of complex regulatory requirements, but the statute contains merely one--that the market making-related activities are designed not to exceed the reasonably expected near-term demands of clients, customers, or counterparties, otherwise known as RENT'D. We are considering different ways to use a clearer test for RENT'D. We want banks to be able to engage in market making and provide liquidity to financial markets with less fasting and prayer about their compliance with the Volcker rule. As I noted earlier, we also understand that the Volcker rule has had an extraterritorial impact on FBOs. With respect to foreign banks, there are at least a few places where we would like to revisit the application of the final rule based on concerns raised by market participants and others over the past four years of implementation. In particular, there are certain foreign funds--funds that are organized outside the United States by foreign banks in foreign jurisdictions and offered solely to foreign investors--that are subject to the Volcker rule due to Bank Holding Company Act control principles. Last summer, the banking agencies, in consultation with the Securities and Exchange Commission and the Commodity Futures Trading Commission, issued guidance that effectively stayed enforcement of the Volcker rule to these foreign funds in light of the technical and complex issues they raise. I expect we would continue this period of stay while we continue to consider these important issues. The statute also contains exemptions for FBOs to allow foreign banks to continue trading and engaging in covered fund activities solely outside the United States. The regulation again has a complex series of requirements that a foreign bank must meet to make use of these exemptions. We have heard from a number of foreign banks that complying with these requirements is unworkable in practice, and we are considering ways to address this impracticality. One possibility that has been suggested by market participants is a simple approach that focuses on the risk of the booking location. Of course, we would have to consider whether this is possible in light of the language of the statute and principles of competitive equity, but the suggestion is illustrative of the possibility of a more workable approach. As a final but no less important matter, we are considering broad revisions to the Volcker rule compliance regime. We would like Volcker rule compliance to be similar to compliance in other areas of our supervisory regime. As I noted earlier, we appreciate the broad extraterritorial impact of the rule in its current form on foreign banks' operations outside of the United States. Accordingly, we will be looking for ways to reduce the compliance burden of the Volcker rule for foreign banks with limited U.S. operations and small U.S. trading books. As I have described previously, the Federal Reserve is actively reviewing post-crisis financial reforms in an effort to better understand which reforms are working well and which ones can be improved to reduce regulatory burden and improve the efficiency, transparency, and simplicity of the regulatory framework without compromising a safe and sound financial system. 
In that effort, we recognize the importance of foreign banks to the U.S. economy and have a strong interest in ensuring our regulations are appropriately tailored to their U.S. footprint and risks to U.S. financial stability. Our goal is to maintain a regulatory framework that helps to ensure a strong and stable banking system in an efficient manner that does not result in excessively burdensome costs to the banking industry or the economy as a whole. The areas I have discussed today are important components of the exercise of improving our regulations as they apply to FBOs, and are part of a larger overall agenda to critically evaluate and improve our regulations to promote financial stability while fostering the conditions for solid economic activity. Some of these exercises will require more effort and time than others, but each one of them is a high priority for us at the Federal Reserve. I look forward to hearing your views as we make progress toward these improvements.
r180306a_FOMC
united states
2018-03-06T00:00:00
Navigating Monetary Policy as Headwinds Shift to Tailwinds
brainard
0
I appreciate the invitation from the Money Marketeers to discuss the path ahead for our economy and monetary policy. Many of the forces that acted as headwinds to U.S. growth and weighed on policy in previous years are generating tailwinds currently. Today many economies around the world are experiencing synchronized growth, in contrast to the 2015-16 period when important foreign economies experienced adverse shocks and anemic demand. The International Monetary Fund revised up its outlook for the world economy in January, continuing a recent pattern of upward revisions, in contrast to a string of downward revisions in 2015 and 2016. Stronger economies abroad should increase demand for America's exports and boost the foreign earnings of U.S. companies. The upward revisions to the foreign economic outlook are also pulling forward expectations of monetary policy tightening abroad and contributing to an appreciation of foreign currencies and increases in U.S. import prices. By contrast, foreign currencies weakened in the earlier period, pushing the dollar higher and U.S. import prices lower. Since the end of 2016, a broad index of the exchange value of the dollar has depreciated nearly 8 percent, whereas it appreciated by 25 percent from mid-2014 to 2016. In recent quarters, the combination of higher oil prices and robust global demand has been providing strong support to business investment--in contrast to the sharp pullback from 2015 to 2016. Business spending on fixed investment rose at more than a 6 percent pace in 2017. This rise followed two years of weak growth, dragged down by declines in the drilling and mining sector. Financial conditions are currently supportive of economic growth despite the recent choppiness in financial markets and some tightening since the beginning of the year. Various measures of equity valuations remain elevated relative to historical norms even after recent movements, and corporate bond spreads remain quite compressed. This compares with the period from mid-2014 through the second half of 2016, when equity prices were flat and the dollar rose steeply. The Federal Reserve Bank of Chicago's National Financial Conditions Index provides a useful summary statistic. According to this measure, financial conditions tightened significantly from the middle of 2014 to early 2016. By comparison, financial conditions today remain near the accommodative end of the range since the financial crisis, even with the recent tightening in conditions. The most notable tailwind is the shift in America's fiscal policy stance from restraint to substantial stimulus in an economy close to full employment. In the earlier period, the economy had just weathered a challenging adjustment to a sharp withdrawal of fiscal support. Today, from a position near full employment, the economy is poised to absorb $1-1/2 trillion in personal and corporate tax cuts and a $300 billion increase in federal spending. Estimates suggest December's tax legislation could boost the growth rate of real gross domestic product (GDP) by as much as 1/2 percentage point this year and next. On top of that, the recently agreed-to budget deal is likely to raise federal spending by around 0.4 percent of GDP in each of the next two years. Although the economy is currently around full employment and has been expanding at an above-trend pace, inflation has remained subdued for quite some time. 
Over the past year, overall PCE (personal consumption expenditures) inflation was 1.7 percent, and core PCE inflation was 1.5 percent--not very different from the average level of core inflation over the past five years. The persistence of subdued inflation, despite an unemployment rate that has moved below most estimates of its natural rate, suggests some risk that underlying inflation may have softened. While transitory factors no doubt played a role in last year's step-down in core PCE inflation, various empirical analyses conclude that persistent factors are at play in the stubbornly low level of core inflation. According to a variety of measures, underlying inflation--the slow-moving trend that exerts a pull on wage and price setting--may be running below levels that are consistent with the Federal Open Market Committee's (FOMC) 2 percent objective. For example, some survey measures of longer-run inflation expectations are currently lower than they were before the financial crisis, as are most estimates based on statistical filters. Inflation compensation has moved up recently, but is still running somewhat below levels that prevailed before the crisis. Thus, it is important for monetary policy to ensure that underlying inflation is re-anchored firmly at 2 percent. At the same time, it is important for monetary policy to sustain full employment. It is difficult to know with precision how much slack remains in the labor market. If the unemployment rate were to continue to fall in the coming year at the same pace as in the past couple of years, it would reach levels not seen since the late 1960s. On the other hand, the employment-to-population ratio for prime-age workers remains more than 1 percentage point below its pre-crisis level. If substantially more workers could be drawn into the labor force, it would be possible for the labor market to firm notably further without generating imbalances. But it is an open question what portion of the prime-age Americans who are out of the labor force may prove responsive to tight labor market conditions, because declining labor force participation among prime-age workers predates the crisis, especially for men. In one encouraging development, the strong labor market has pulled some discouraged workers back into the labor force and into productive employment over the past few years. Also encouraging, our Beige Book and workforce surveys indicate that employers are casting a wider net to find job candidates and investing more in on-the-job training. Although last year we faced a disconnect between the continued strengthening in the labor market and the step-down in inflation, mounting tailwinds at a time of full employment and above-trend growth tip the balance of considerations in my view. With greater confidence in achieving the inflation target, continued gradual increases in the federal funds rate are likely to be appropriate. Although experience in other countries suggests it can prove difficult to raise an underlying inflation trend that has been running below policymakers' target for several years, stronger tailwinds may help re-anchor inflation expectations at the symmetric 2 percent objective. Of course, it is conceivable we could see a mild, temporary overshoot of the inflation target over the medium term. If such a mild, temporary overshoot were to occur, it would likely be consistent with the symmetry of the FOMC's target and could help nudge underlying inflation back to our target. 
Recent research has highlighted the downside risks to inflation and to longer-run inflation expectations that are posed by the effective lower bound on nominal interest rates, and it suggests the importance of ensuring underlying inflation does not slip below target in today's new normal. We also seek to sustain full employment, and we will want to be attentive to imbalances that could jeopardize this goal. If the unemployment rate continues to decline on the current trajectory, it could fall to levels that have been rarely seen over the past five decades. Historically, such episodes have tended to see elevated risks of imbalances, whether in the form of high inflation in earlier decades or of financial imbalances in recent decades. One of the striking features of the current recovery has been the absence of an acceleration in inflation as the unemployment rate has declined, a development that is consistent with a flat Phillips curve. Although wage gains have seen some recent improvements, they continue to fall short of the pace seen before the financial crisis. However, we do not have extensive experience with an economy at very low unemployment rates and cannot be sure how it might evolve. In particular, we will want to remain attentive to the risk of financial imbalances. While asset valuations appear to be elevated, overall risks to the financial system remain moderate because household borrowing is moderate, risks associated with liquidity and maturity transformation have declined, and, importantly, the banking system appears to be well capitalized. History suggests, however, that a booming economy can lead to a relaxation in lending standards, and the attendant excessive borrowing can complicate the task of monetary policy. We will need to be vigilant. What do these considerations imply for the path of monetary policy? Continued gradual increases in the federal funds rate are likely to remain appropriate to ensure inflation rises sustainably to our target and to sustain full employment, keeping in mind that interest rate normalization is well under way and balance sheet runoff is set to reach its steady-state pace later this year. Of course, we should be ready to adjust the path of policy in either direction if developments turn out differently than expected. In many respects, the macro environment today is the mirror image of the environment we confronted a couple of years ago. In the earlier period, strong headwinds sapped the momentum of the recovery and weighed down the path of policy. Today, with headwinds shifting to tailwinds, the reverse could hold true.
r180326a_FOMC
united states
2018-03-26T00:00:00
The Roles of Consumer Protection and Small Business Access to Credit in Financial Inclusion
quarles
0
Good evening. Thank you to John Bryant and Operation HOPE for inviting me today. I commend our hosts for bringing together such an impressive group of speakers and attendees, diverse in their perspectives, yet all of whom share the vital goal of advancing economic opportunity and inclusion. It is fitting that this event is being held on the 50th anniversary of several notable events, including the publication of Dr. Martin Luther King's last book, which urged unity in order to create equal opportunity, and the enactment of the Fair Housing Act of 1968, which enshrined our nation's commitment to equal access to housing. These milestones remind us of the progress we have made to date as well as the challenges that remain, and inspire us to persist in our efforts. The conference has a robust agenda ahead. Over the next few days, you will be discussing many issues relating to financial inclusion--such as housing, small businesses, workforce participation, and other topics--to which we at the Federal Reserve share a commitment, and that we view as critical to our mission. Our economy is stronger when everyone has a chance to contribute fully and share in our national prosperity. And I personally believe that financial inclusion helps us realize a founding notion of our country--that this is a place where opportunity, innovation, and productivity are encouraged and rewarded. I'm particularly pleased to see on the conference agenda a number of sessions on issues to which we, at the Federal Reserve, devote considerable attention: access to credit and financial inclusion for consumers and small businesses. Conversations like these are enormously important to understanding the challenges Americans face and why a multitude of voices and approaches are needed to address them. In my remarks today, I'd like to focus on the consumer protection and credit accessibility aspects of financial inclusion, and some of the ways that the Federal Reserve promotes a fair and transparent consumer financial services marketplace. In recent speeches, I have emphasized my view that we should aim for an effective safety and soundness regulatory approach that is as efficient, transparent, and simple as feasible. I consider these principles to apply equally to our consumer protection supervision program; that is, we should also strive to promote consumer protection with as much efficiency, transparency, and simplicity as possible. I believe that a commitment to these principles is not only compatible with financial inclusion, but in fact helps promote it. Regulatory burden can make it harder for institutions to serve their customers and communities. This is especially the case for community banks and minority depository institutions (or MDIs), which play an important role in serving the needs of their local communities, including historically underserved populations. I'd also like to share some thoughts on an area to which I have devoted a good portion of my career: the importance of small business access to credit, which is a critical part of financial inclusion and a catalyst for economic growth in local communities. Let me start with the foundation of why financial inclusion is important. In broad terms, financial inclusion means access to affordable financial products and services that meet the needs of individuals and businesses and that are delivered in a responsible and sustainable way. 
At the Federal Reserve, we recognize the influence that financial inclusion has on the broader economic performance of our country. Inclusion is essential to advancing the Federal Reserve's goal of promoting maximum employment, as well as supporting the stability of the financial system. Likewise, we support and share your goal of ensuring a fair and transparent marketplace for financial products and services--including credit--that can provide a pathway toward economic prosperity for all Americans. Overall, our economy is performing well, and unemployment is low. However, many households and communities continue to face financial challenges. Consider that more than two-thirds of white households own their homes as compared to less than half of African American households. And consider that all-important human asset: education. One-third of white adults have at least a bachelor's degree as compared to one in four African Americans and 17 percent of Hispanics. This matters because the advantages of a college degree for accumulating income and wealth are lifelong and inter-generational. We likewise see evidence of financial disparities in the Federal Reserve's Survey of Household Economics and Decisionmaking. When individuals have unequal or insufficient access to financial products and services, such as credit, they may be deprived of a chance to fund an education, finance a business, or pursue homeownership--opportunities that can provide greater financial security for themselves and their families and future generations. And as a nation, we are deprived of the benefits of their potential contributions to the economy. In the context of my earlier discussion of financial inclusion, federal consumer protection laws are critical to ensuring consumers are treated fairly when offered financial products and services. Discrimination and deception have no place in a fair and transparent marketplace. These practices can close off opportunities and limit consumers' ability to improve their economic circumstances, including through access to homeownership and education. The Federal Reserve's consumer compliance supervisory program reflects our commitment to promoting financial inclusion and ensuring that the financial institutions under our jurisdiction fully comply with applicable federal consumer protection laws and regulations. Let me give two examples involving the Federal Trade Commission Act's prohibition against unfair or deceptive practices in products and services that will undoubtedly be familiar to the audience--student financial aid and mortgage lending. In the last few years, the Federal Reserve has addressed deceptive practices in these areas through public enforcement actions that have collectively benefited hundreds of thousands of consumers and provided millions of dollars in restitution. In the financial aid context, our actions required restitution for students who were not given full information about the potential fees and limitations associated with opening deposit accounts for their financial aid refunds. And in mortgage lending, our action required restitution by a bank that had given borrowers the option to pay an additional amount to purchase discount points to lower their mortgage interest rate, but that did not actually provide the reduced rate to many of those borrowers. As we mark the 50th anniversary of the Fair Housing Act, the fair lending laws remain critical in fostering vibrant communities and a fair and transparent consumer financial services marketplace. 
For all state member banks, we enforce the Fair Housing Act, and for banks of $10 billion or less in assets, we also enforce the Equal Credit Opportunity Act. Our examiners evaluate fair lending risk at every consumer compliance exam. While we find that the vast majority of our institutions comply with the fair lending laws, we are committed to identifying and remedying violations when they occur. Pursuant to the Equal Credit Opportunity Act, if we determine that a bank has engaged in a pattern or practice of discrimination, we refer the matter to the Department of Justice (DOJ), which has brought public actions in critical areas, such as redlining and mortgage-pricing discrimination. For example, in our redlining referrals, the Federal Reserve found that the banks treated majority-minority areas less favorably than non-minority ones, such as through lending patterns, marketing, and Community Reinvestment Act assessment-area delineations. For our mortgage-pricing discrimination referrals, the Federal Reserve found that the banks charged higher prices to African American or Hispanic borrowers than they charged to non-Hispanic white borrowers and that the higher prices could not be explained by legitimate pricing criteria. Consumers deserve to be treated fairly, regardless of the size of the banking institution. Yet, we can achieve this goal and still reduce regulatory burden through a balanced program of tailored and risk-focused supervision. Accordingly, we continue to seek opportunities to promote efficient, simple, and transparent supervision where possible, so that the institutions we supervise can focus on finding solutions that work for all consumers and communities. In an effort to promote consumer compliance, our community bank supervisory program focuses our examinations on the areas of highest consumer risk. This has improved the efficiency and effectiveness of our examinations and reduced regulatory burden for many community banks. Banks and consumers benefit when supervision is timely and effective. Put simply, our role as supervisors should not be to play "gotcha" with our banks, but to support their efforts to manage consumer compliance risk and prevent consumer harm. Guidance that we published in November 2016 is an example of this approach. This guidance provides incentives for institutions to focus on managing their consumer compliance risks, preventing consumer harm, and helping to create a culture that identifies and corrects problems. Another example of how we support our institutions is how we work with MDIs, which, as I noted earlier, play an important role in serving the needs of their local communities. We have dedicated Reserve Bank staff who are in frequent contact with MDI leadership. Based on what we've learned in the course of this outreach, we have expanded our Partnership for Progress--a program for outreach and technical assistance to MDIs--to include staff from our Community Development function, and we have enhanced our MDI-related programming. Our commitment to transparency also includes a robust outreach program for banks. This includes a widely subscribed Federal Reserve System publication focused on consumer compliance issues and its companion webinar series. For example, in 2017, we sponsored an interagency webinar on fair lending supervision with almost 6,000 registrants, a substantial share of which were community banks. At the Federal Reserve, we view small business credit from several perspectives. For the economy, small businesses need adequate and affordable credit in order to form, grow, and succeed; otherwise they may underperform, slowing growth and employment. 
For many small business owners, personal and business finances are intertwined. A well-functioning housing finance market is vital for small business owners who may draw upon the equity in their homes to fund their businesses. Student loans may be needed to help fund the education that is important for both small business owners and their employees to boost profits and productivity. And, short-term credit matters for day-to-day management of cash flow, while longer-term credit is essential for capital investments. So, entrepreneurs--just like consumers--need access to a variety of credit sources. Recently, I had the pleasure of meeting a group of small businesses from across the country. By "small," I mean some of these companies have just a handful of employees. The group included owners of a catering company from Dallas, a mechanical engineering company from Philadelphia, a marketing consulting firm from Cleveland, a combination bar and bakery from Brooklyn, and a gluten-free bakery from my hometown, Salt Lake City. They shared with me the joys of running their own businesses, of creating jobs, and of providing for their families as well as creating opportunities for their employees. They also spoke of their challenges, which I'm sure are familiar to many of you in this room. For one, the owners wear many hats, including CEO, chief operating officer, chief financial officer, head of sales and marketing, and some--quite literally--chief bottle washer. They also spoke of the day-to-day difficulties of finding, training, and retaining employees. And, they raised the challenges they often face in gaining access to working capital, the lifeblood for sustaining and growing their businesses. The anecdotes I heard touched on three related trends we have been observing in the small business credit environment. First, although lending standards have eased since the recession and the financial condition of businesses has improved, some small business credit needs, especially for the smallest of firms and minority small business owners, continue to go unmet. According to the Federal Reserve Banks' Small Business Credit Survey, some small businesses still face persistent credit gaps, even though they often seek credit in small amounts. Of the firms that apply for credit, more than two-thirds apply for less than $100,000, with substantial numbers of these applying for less than $25,000. However, smaller firms often struggle to qualify for bank credit, and among firms that were denied, low credit scores and insufficient credit history were the most frequently cited reasons. In addition, our survey suggests that some low-credit-risk minority- and women-owned firms are less likely than low-risk white-owned and male-owned firms to receive financing; and if they are approved, it is for less than the amount sought. Second, there have been shifts in the composition of commercial bank lending to small businesses. Large banks' share of small business lending has grown, especially among the smallest loans. This represents a change from 20 years ago when small businesses relied more on a relationship with local community banks for access to credit. For loans under $100,000, small banks of less than $1 billion in assets now hold a 19 percent share, down from 60 percent two decades ago. Today, it is the largest banks--those with greater than $50 billion in assets--that account for more than 60 percent. Some of this trend may be due to industry consolidation, which has reduced the number of very small banks. The composition of credit offered also is shifting. 
Loans entail high fixed costs that are roughly the same regardless of whether a loan is for $100,000 or $1 million, reducing the profitability of smaller-dollar loans. Our data suggest that the growing share of small business lending at larger banks may be partly due to their use of automated underwriting for credit cards. By providing credit cards, banks are expanding credit available to small businesses; however, some small business advocates note that this form of credit is generally more expensive and lacks the flexibility of other products. For example, personal or business credit cards may be suitable for purchasing supplies, but not for payroll. In addition, this automated approach may be more "cookie cutter," meaning that firms that don't meet standard lending criteria may not qualify. These developments have created a space for the third trend we have been observing--the emergence of nonbank online alternative lenders that provide small-dollar business credit. For example, some of the large technology firms are providing credit, at a rapidly growing pace, to their built-in customer base of merchants. Several of the businesses I met with mentioned they had turned to nonbank online lenders after being turned down by banks. Some online lenders obtain access to a prospective borrower's accounting software, merchant accounts, shipping, and payroll data in real time in order to underwrite businesses. Business owners can receive funds in a couple of days or even hours. This emergence of online lenders is part of a broader evolution of financial technology-- or "fintech"--as seen in a wide range of products and services for both consumers and small businesses. I'm not surprised to see this important topic on your conference agenda. The use of fintech to expand access to credit has great promise and also associated risks. For example, online origination platforms and more sophisticated algorithms may enable credit to be underwritten and delivered in a manner that is still prudent but with greater efficiency, convenience, and lower processing costs. And as regulators, we do not want to unnecessarily restrict innovations that can benefit consumers and small businesses. At the same time, our interest is in ensuring that banks understand and manage their risks when introducing new technologies or partnering with fintech companies, and that consumers and small businesses remain protected. This is why the Federal Reserve has been engaged in a broad and multidisciplinary effort to develop a robust understanding of the technologies and activities in this space, in order to study fintech's opportunities and risks, and assess policy and supervisory implications. Fintech is also one of the factors driving calls for regulators to modernize the Community Reinvestment Act, or CRA, as technology changes how financial products and services are accessed. As I'm sure most of you know, the CRA is an important law that recognizes banks' affirmative obligation to meet the credit needs of the communities they serve, including low- and moderate-income communities. The CRA promotes financial inclusion by encouraging banks to extend mortgages, small business loans, and other types of credit as well as to provide investments and other services in communities where they take deposits, consistent with safe and sound banking operations. 
The arrival of new financial technologies, along with significant industry consolidation and other structural changes, has changed the way that financial services are delivered to consumers and the ways in which banks lend in communities. We continue to study these shifts, and share the common goal of improving the current supervisory and regulatory framework for CRA to further the statute's core objective of promoting access to credit and financial inclusion. Finally, I'll note that we will continue to learn about issues concerning financial inclusion through our research, such as by collecting new data in our Survey of Household Economics and Decisionmaking and Survey of Consumer Finances that I mentioned earlier. The information we gather helps us better understand factors affecting consumers and households, including low- and moderate-income and historically underserved populations. We also continue to convene experts and practitioners from industry, academia, and community-based organizations to help provide context on what we are seeing in the data, to identify emerging issues, and to consider where there may be data gaps and opportunities for additional research. In addition to yielding important insights that inform our policymaking, we hope these efforts can support the conversations you are having at conferences like this one. Thank you, again, for inviting me to speak at your conference as we mark the notable anniversaries that this year brings. Together, our work clearly has a great deal of synergy, and I thank you for your efforts. Making the economy work for the benefit of all Americans, including lower-income communities, is of the utmost importance.
r180403a_FOMC
united states
2018-04-03T00:00:00
An Update on the Federal Reserve's Financial Stability Agenda
brainard
0
The Federal Reserve's work on financial stability is integral to our dual-mandate objectives of price stability and full employment. As the Global Financial Crisis demonstrated, when severe financial stress triggers a broad pullback from risk, the resulting disruption in financial intermediation can impose deep and lasting damage on American families, workers, and businesses. The primary focus of financial stability policy is tail risk (outcomes that are unlikely but severely damaging) as opposed to the modal outlook (the most likely path of the economy). The objective of financial stability policy is to lessen the likelihood and severity of a financial crisis. Guided by that objective, our financial stability work rests on four interdependent pillars: systematic analysis of financial vulnerabilities; standard prudential policies that safeguard the safety and soundness of individual banking organizations; additional policies, which I will refer to as "macroprudential," that build resilience in the large, interconnected institutions at the core of the system; and countercyclical policies that increase resilience as risks build up cyclically. This work also recognizes the important connections to our monetary policy objectives. The foundation for our financial stability work is our assessment of systemic financial vulnerabilities. Our assessment framework is informed by historical episodes of financial stress here at home and around the world, as well as by a growing body of research on key indicators of building imbalances. Instead of attempting to forecast particular adverse shocks that could buffet the economy, the focus is on vulnerabilities-- that is, on features of the financial system that amplify bad shocks, spreading damage to households and businesses. Each quarter, Federal Reserve Board staff assess a set of vulnerabilities relevant for financial stability: asset valuations and risk appetite, borrowing by the nonfinancial sector (households and nonfinancial businesses), liquidity risks and maturity transformation by the financial system, and leverage in the financial system. It may be illuminating to briefly describe our current assessment in each of these areas. Valuations in a broad set of markets appear elevated relative to historical norms, even after taking into account recent movements. Estimates of risk premiums and spreads in a range of markets remain narrow by historical standards. Corporate bond yields remain low by historical comparison, and spreads of yields on junk bonds above those on comparable-maturity Treasury securities are near the lower-end of their historical range. Spreads on leveraged loans and securitized products backed by those loans remain narrow. Prices of multifamily residential and industrial commercial real estate (CRE) have risen, and capitalization rates--the ratios of operating income relative to the sale price of commercial properties--for these segments have reached historical lows. However, measures of credit conditions suggest that lenders are, to an extent, taking into account the potential for a reversion of valuations. By contrast, prices of single-family homes appear to be closer to historical norms. House prices have risen at a robust pace, and the price-to-rent ratio is high in absolute terms, but it does not appear to be far out of line with its longer-run trend. This broad national trend belies significant variation among local markets, however. 
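As a rough illustration of the capitalization-rate metric just described, the short Python sketch below computes cap rates as operating income divided by sale price. The property figures are purely hypothetical and are not drawn from the speech; the point is only that, for a given income stream, a higher purchase price means a lower cap rate, which is why historically low cap rates signal stretched valuations.

```python
# Hypothetical illustration of the capitalization-rate metric described above.
# A capitalization rate is annual net operating income divided by sale price.

def cap_rate(net_operating_income: float, sale_price: float) -> float:
    """Return the capitalization rate implied by income and price."""
    return net_operating_income / sale_price

# Same income stream at successively higher prices -> lower (more stretched) cap rates.
for price in (10_000_000, 12_500_000, 16_000_000):
    print(f"price ${price:>11,.0f}: cap rate {cap_rate(600_000, price):.2%}")
```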
One area that the Federal Reserve is monitoring is the extreme volatility evidenced by some cryptocurrencies. For instance, Bitcoin rose over 1,000 percent in 2017 and has fallen sharply in recent months. These markets may raise important investor and consumer protection issues, and some appear especially vulnerable to manipulation. As in other highly speculative markets, individual investors should be careful to understand the possible pitfalls of these investments and the potential for losses. But it is less clear how the valuations of cryptocurrencies currently could pose a threat to financial stability. For instance, it is hard to see evidence of substantial leverage used in the purchase of the cryptocurrencies, or a material degree of use in payments, although our assessment of these markets is limited by their opacity. Nonetheless, we will continue to study them. In the assessment of elevated asset valuations, the relatively low level of Treasury yields is a mitigating factor; many asset valuation metrics, such as price-to-earnings ratios, corporate bond yields, and property capitalization rates, appear notably less stretched when judged relative to low Treasury yields. That said, Treasury yields reflect historically low term premiums--the compensation investors demand to hold assets over a longer horizon. This poses the risk that term premiums could rise sharply--for instance, if investor perceptions of inflation risks increased. I will return to this risk later. Although asset valuations are elevated, vulnerabilities due to debt owed outside the financial sector appear to be moderate--in the middle of their historical range. This reflects elevated leverage in the nonfinancial business sector and a moderate pace of borrowing in the household sector. In the nonfinancial business sector, the debt-to-earnings ratio has increased to near the upper end of its historical distribution, and net leverage at speculative-grade firms remains especially elevated. Overall, however, the ratio of nonfinancial-sector borrowing to gross domestic product has been below an estimate of its trend for several years as a result of the deleveraging of the household sector following the crisis. While the sustained period of post-crisis household deleveraging appears to have come to an end and savings rates have recently moved down, overall borrowing has been at a moderate pace and, on net, concentrated among borrowers with high credit scores. Even though the balance sheet of the household sector as a whole appears relatively strong, recent years have seen a rapid rise in student debt as well as rising default rates for borrowers with subprime credit scores on auto loans and, more recently, credit card balances. Beyond the nonfinancial sector, the vulnerabilities associated with maturity and liquidity transformation in the financial system appear to have fallen significantly relative to the levels seen prior to the crisis. The amount of wholesale short-term funding, which proved to be a substantial source of run risk during the crisis, has dropped substantially since its peak in 2008. Money market funds, which had been an area of vulnerability in the crisis, have undergone important reforms, including a move to floating net asset values for prime institutional funds along with the imposition of fees and restrictions on redemption. The anticipation of the enactment of these reforms in October 2016 led to a large decline in the level of assets under management at the affected funds, which has since held steady. 
So far, the growth of alternative short-term investment vehicles that could pose similar risks appears to have been weak. Finally, risks associated with leverage in the financial sector also appear to be subdued by historical standards. Leverage in the banking sector has declined notably since the crisis. Issuance of securitized products remains well below pre-crisis levels for most asset classes, with few signs of securitizations that involve maturity or liquidity transformation and limited issuance of complex securities whose opaque structures can contain significant leverage. And the data that are available suggest that leverage at nonbank financial firms has been stable. That said, there are indications that the use of leverage has been increasing at some institutions; for example, margin credit provided by dealers to equity investors such as hedge funds has expanded. There is an important connection between the robustness of our financial regulatory framework and the assessment of resilience in the financial sector. The subdued level of vulnerabilities from liquidity and maturity transformation and leverage is due centrally to reforms undertaken in response to the financial crisis. The Federal Reserve has implemented a framework of rules and supervision that requires large, interconnected banking organizations to hold substantial capital and liquidity buffers. This framework requires banks to be forward looking in their capital decisions and to be prepared for the possibility of severely stressed conditions occurring. The framework is macroprudential in design so that banks internalize the costs of undertaking activities that pose risks to the system. The core of the framework is the requirement of a substantial stack of common equity to build resilience against shocks and to provide an incentive for prudent risk management. Regulatory capital ratios for the largest banking firms at the core of the system have about doubled since 2007 and are currently at their highest levels in the post-crisis era. U.S. firms have substantially increased their capital since the first round of stress tests led by the Federal Reserve in 2009. The common equity capital ratio--which compares high-quality capital to risk-weighted assets--of the bank holding companies participating in the 2017 Comprehensive Capital Analysis and Review has more than doubled from 5.5 percent in the first quarter of 2009 to 12 percent in the fourth quarter of 2017. Their leverage ratios--defined as Tier 1 capital to total assets--increased from 7.3 percent to 8.6 percent over the same period. There has also been an important shift in the distribution of high-quality capital so that the average ratio of high-quality common equity to risk-weighted assets at the largest banks now exceeds the average for smaller banks. The larger and more complex banking organizations are now holding more capital, commensurate with the greater risks their distress could pose. Reduced vulnerability associated with liquidity and maturity transformation similarly is due importantly to key financial reforms instituted since the crisis. Large financial institutions are required to maintain substantial buffers of high-quality liquid assets (HQLA) calibrated to their funding needs and to their likely run risk in stressed conditions. Similar to the capital buffers, the liquidity buffers are greatest for those financial institutions that pose the greatest risks. 
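The two regulatory capital ratios cited in this passage follow directly from their stated definitions: high-quality common equity relative to risk-weighted assets, and Tier 1 capital relative to total assets. The sketch below is a minimal illustration with made-up balance-sheet figures; it does not use data for any actual bank holding company.

```python
# Minimal sketch of the two capital ratios defined above, using hypothetical
# balance-sheet figures (billions of dollars), not any actual bank's data.

def common_equity_ratio(common_equity_capital: float, risk_weighted_assets: float) -> float:
    """High-quality common equity capital relative to risk-weighted assets."""
    return common_equity_capital / risk_weighted_assets

def leverage_ratio(tier1_capital: float, total_assets: float) -> float:
    """Tier 1 capital relative to total (unweighted) assets."""
    return tier1_capital / total_assets

cet1, tier1, rwa, assets = 120.0, 130.0, 1_000.0, 1_500.0
print(f"common equity ratio: {common_equity_ratio(cet1, rwa):.1%}")  # 12.0%
print(f"leverage ratio:      {leverage_ratio(tier1, assets):.1%}")   # 8.7%
```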
Indeed, banks are holding buffers of HQLA in excess of their liquidity coverage ratio (LCR) requirements. Our largest banking firms have increased their holdings of HQLA from 13 percent of assets in 2011 to 20 percent in 2017 and have reduced their reliance on short-term wholesale funding from 36 percent of liabilities in 2011 to 29 percent in 2017. Just as our strengthened policy framework helps modulate vulnerabilities in the financial sector that could make the economy more vulnerable to shocks, so, too, our quarterly surveillance is intended to identify rising vulnerabilities early enough to be able to act to prevent disruptions that could damage the economy. In particular, the quarterly assessment of financial stability is a critical input into the Board's processes for adjusting the supervisory scenarios used in the stress test and the setting of the countercyclical capital buffer--the two tools that permit the Board to respond to vulnerabilities that build over time. The supervisory stress test is intended to ensure that large banking institutions will be able to continue to function normally even under severely adverse macroeconomic conditions. It also assesses the resilience of the largest trading firms to risks of a large disturbance to global financial markets and the failure of the firms' largest counterparty. These components reflect some of the key linkages through which the distress or failure of one firm could affect others, including direct credit losses as well as the severe financial disruptions that would be expected to accompany fire sales and an increase in risk aversion. By design, the Fed's stress test is intended to incorporate some elements to make the tests more stringent when the economy and financial markets are heating up. These countercyclical features are intended to give the stress tests some utility as a macroprudential tool--that is, to mitigate the financial system's inherent pro-cyclicality. The most prominent countercyclical feature of the stress-test scenario architecture is the setting of the unemployment rate in the severely adverse scenario. The general rule is to increase the unemployment rate by 4 percentage points unless the baseline unemployment rate starts at levels below 6 percent, in which case the ultimate level of the unemployment rate reached in the severely adverse scenario is fixed at 10 percent. In addition, last December, the Board put out a proposal for comment to introduce a systematically countercyclical mechanism in the component of the scenario that shocks house prices. Beyond these systematic elements, the assessment of vulnerabilities is a critical input in the development of scenarios for the stress tests each year to strengthen resilience against vulnerabilities that may be identified. As I noted earlier, recent assessments have noted high levels of valuations across a broad set of asset markets and elevated business leverage in an environment where Treasury yields and term premiums have been relatively low by historical standards. In such circumstances, asset prices might be particularly susceptible to an unexpected development that accentuates downside risks to the macroeconomic outlook. For instance, a sharp increase in concerns about the potential for high inflation or in uncertainty about policy could boost term premiums on Treasury securities, which could trigger declines in asset prices across a range of markets. 
The scenarios for this year's stress tests, which were announced in February, feature material decreases in asset prices--notably including CRE prices--along with a substantial rise in Treasury term premiums. Although the severely adverse scenarios always include severe recessions and sharp declines in asset prices, in past years, they have been accompanied by large declines in Treasury yields, which have resulted in capital gains on these securities. In contrast, in this year's severely adverse scenario, yields on longer-maturity Treasury securities are flat. In this way, this year's severely adverse scenario addresses one of the salient vulnerabilities that have been identified. By encouraging institutions at the core of the system to build resilience against such an eventuality, we seek to lessen the severity of the distress to the overall financial system should asset prices fall and term premiums rise sharply in a challenging macroeconomic environment. Even with these design features, the Fed's stress-testing framework has some limitations in counteracting the inherent pro-cyclicality in the availability of credit. Indeed, the stress tests have become less binding on banks as the recovery has gathered strength. Thus, losses on loans and positions in the severely adverse scenario among participating banks have declined over time as the economy has strengthened. For example, in the 2016 exercise, losses amounted to $526 billion, while in 2017 they had fallen to $493 billion, despite a larger increase in the unemployment rate in the scenario. As economic conditions strengthen, typical measures of underwriting quality look strong, delinquencies fall to low levels, and profits rise consistently, all of which could lead to lower projected stress losses. Of course, these effects tend to reverse during bad economic conditions: Underwriting deficiencies tend to be revealed, delinquencies to rise, and profits to fall. Thus, capital requirements based on stress tests alone are unlikely to completely compensate for the financial system's natural pro-cyclicality. In part for that reason, we also have a specifically countercyclical capital requirement that applies to the largest banks. Countercyclical capital requirements can lean against a dangerous increase in financial vulnerabilities at a time when the degree of monetary tightening that would be needed to achieve the same goal would be inconsistent with the Federal Reserve's dual mandate of full employment and price stability. The reverse is also true. The countercyclical capital buffer (CCyB) is designed to increase the resilience of large banking organizations when there is an elevated risk of above-normal losses, which often follow periods of rapid asset price appreciation or credit growth that are not well supported by underlying economic fundamentals. The CCyB is an additional margin of capital that the nation's largest banks can be asked to build to augment resilience at times of rising cyclical pressures and to release as the economy weakens in order to allow banks to lend more when it is most needed. The CCyB framework, which was finalized in September 2016, requires the Federal Reserve Board to vote at least once per year on the level of the CCyB. Put simply, the criterion for raising the CCyB above its minimum value of zero is that financial risks are assessed to be in the upper one-third of their historical distribution. 
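Two of the mechanical rules described in this passage lend themselves to a compact restatement: the countercyclical unemployment-rate path in the severely adverse scenario (add 4 percentage points to the baseline rate, but take the peak to 10 percent when the baseline starts below 6 percent) and the stated criterion for raising the CCyB above zero (financial risks in the upper one-third of their historical distribution). The sketch below simply encodes those rules as summarized here, with invented inputs; it is not the Board's scenario-design or CCyB methodology.

```python
# Simplified restatement of two rules summarized above, with invented inputs.
# This is not the Board's official scenario-design or CCyB methodology.

def severely_adverse_peak_unemployment(baseline_rate: float) -> float:
    """Peak unemployment rate (percent) under the stated scenario rule."""
    if baseline_rate < 6.0:
        return 10.0                 # peak fixed at 10 percent for low starting rates
    return baseline_rate + 4.0      # otherwise baseline plus 4 percentage points

def ccyb_above_zero(current_reading: float, history: list) -> bool:
    """True when the current vulnerability reading falls in the upper one-third
    of its historical distribution, the stated criterion for a positive CCyB."""
    cutoff = sorted(history)[int(len(history) * 2 / 3)]
    return current_reading >= cutoff

# The lower the starting point, the larger the implied shock -- the countercyclical feature.
print(severely_adverse_peak_unemployment(4.1))   # 10.0, a rise of 5.9 points
print(severely_adverse_peak_unemployment(7.0))   # 11.0, a rise of 4.0 points
print(ccyb_above_zero(0.8, [0.2, 0.3, 0.5, 0.6, 0.7, 0.9]))  # True
```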
Our assessment of financial vulnerabilities is a key input into the Federal Reserve Board's decisions surrounding the setting of the CCyB, along with a variety of other model-based and judgmental criteria. On December 1, 2017, with overall risks assessed as moderate and with other measures that we routinely monitor sending a similar signal, the Board announced its decision to leave the CCyB at its minimum value of zero. Of course, our assessments of financial vulnerabilities are also an important input into monetary policy deliberations, reflecting the important interdependence between financial stability and our monetary policy objectives of full employment and price stability. The Federal Open Market Committee's (FOMC) statement on Longer-Run Goals and Monetary Policy Strategy notes that "the Committee's policy decisions reflect . . . its assessments of the balance of risks, including risks to the financial system that could impede the attainment of the Committee's goals." Generally speaking, lessons from a broad range of countries suggest financial crises occur with substantially lower frequency than business cycles, and there is no settled doctrine to date on the use of the short-term policy rate--the key instrument of monetary policy--to lessen the probability and severity of financial crises. While financial imbalances are an important consideration in monetary policymaking and the expected path of monetary policy can have important implications for financial vulnerabilities, both research and experience suggest there is no simple rule for accomplishing our dual-mandate and financial stability objectives through reliance on a single policy instrument. As I have noted elsewhere, the recently enacted fiscal stimulus should boost the economy at a time when it is close to full employment and growing above trend. It is hard to know with precision how the economy is likely to respond. If unemployment continues to decline at the rate of the past year, it could reach levels not seen in several decades. Historically, such episodes have tended to see a risk of accelerating inflation in earlier decades or a risk of financial imbalances in more recent decades. It is important to be attentive to the emergence of any imbalances, because we do not have much experience with pro-cyclical fiscal stimulus at a time when resource constraints are tightening and growth is above trend. Despite elevated asset valuations, overall risks to the financial system remain moderate in no small part because important financial reforms have encouraged large banking institutions to build strong capital and liquidity buffers. History suggests, however, that a booming economy can lead to a relaxation in lending standards and an attendant increase in risky debt levels. At a time when valuations seem stretched and cyclical pressures are building, I would be reluctant to see our large banking institutions releasing the capital and liquidity buffers that they have built so effectively over the past few years, especially since credit growth and profitability in the U.S. banking system are robust. Of course, if cyclical pressures continue to build and financial vulnerabilities broaden, it may become appropriate to ask the largest banking organizations to build a countercyclical buffer of capital to fortify their resilience and protect against stress. Alternatively, if there were to be a material adjustment to the calibration of the structural buffers held by the large banking institutions, it would be important to make a compensating adjustment to the countercyclical buffer in order to achieve the same overall resilience to financial vulnerabilities. 
As a rough rule of thumb--and as described in the Board's framework for implementing the CCyB, which was finalized in September 2016--the criteria for setting the CCyB are calibrated so that the CCyB will be above its minimum value about one-third of the time, assuming that vulnerabilities evolve as they did pre-crisis. It is worth noting that some other jurisdictions have designed their countercyclical buffer requirement to be above zero roughly half of the time--spanning a greater range of economic conditions than in the United States. This may reflect a difference in the relative emphasis on structural buffers versus cyclically varying buffers. Notably, although U.S. structural buffers are on the stronger end of the range internationally, the U.S. banking system is also among the healthiest and most competitive in the world. Credit growth is robust, and banks are registering strong profitability relative to their international peers. Our financial stability agenda seeks to reduce the likelihood and severity of financial crises. In the wake of the 2007-09 financial crisis and recession, we learned important lessons about the critical necessity of monitoring emerging financial vulnerabilities in a systematic fashion and taking corresponding prudential, macroprudential, and countercyclical policies to build resilience. We undertake systematic assessment of financial vulnerabilities as an important input into our policymaking processes--helping to calibrate the prudential, macroprudential, and countercyclical policies that are our first lines of defense, in addition to informing FOMC deliberations because of the important feedback loops between financial conditions and our dual-mandate goals. This work is complemented by the efforts of our domestic and international partners through the Financial Stability Oversight Council and the Federal Financial Institutions Examination Council here at home and through the Financial Stability Board internationally.
r180406a_FOMC
united states
2018-04-06T00:00:00
The Outlook for the U.S. Economy
powell
1
For more than 90 years, the Economic Club of Chicago has provided a valued forum for current and future leaders to discuss issues of vital interest to this city and our nation. I am honored to have the opportunity to speak to you here today. At the Federal Reserve, we seek to foster a strong economy for the benefit of individuals, families, and businesses throughout our country. In pursuit of that overarching objective, the Congress has assigned us the goals of achieving maximum employment and stable prices, known as the dual mandate. Today I will review recent economic developments, focusing on the labor market and inflation, and then touch briefly on longer-term growth prospects. I will finish with a discussion of monetary policy. Recent Developments and the State of the Economy After what at times has been a slow recovery from the financial crisis and the Great Recession, growth has picked up. Unemployment has fallen from 10 percent at its peak in October 2009 to 4.1 percent, the lowest level in nearly two decades (figure 1). Seventeen million jobs have been created in this expansion, and the monthly pace of job growth remains more than sufficient to employ new entrants to the labor force (figure 2). The labor market has been strong, and my colleagues and I on the Federal Open Market Committee (FOMC) expect it to remain strong. Inflation has continued to run below the FOMC's 2 percent objective, but we expect it to move up in coming months and to stabilize around 2 percent over the medium term. Beyond the labor market, there are other signs of economic strength. Steady income gains, rising household wealth, and elevated consumer confidence continue to support consumer spending, which accounts for about two-thirds of economic output. Business investment improved markedly last year following two subpar years, and both business surveys and profit expectations point to further gains ahead. Fiscal stimulus and continued accommodative financial conditions are supporting both household spending and business investment, while strong global growth has boosted U.S. exports. As many of you know, each quarter FOMC participants--the members of the Board of Governors and the presidents of the Reserve Banks--submit their individual projections for growth, unemployment, and inflation, as well as their forecasts of the appropriate path of the federal funds rate, which the Committee uses as the primary tool of monetary policy. These individual projections are compiled and published in the Summary of Economic Projections, or SEP. FOMC participants submitted their most recent forecasts three weeks ago, and those forecasts show a strengthening in the medium-term economic outlook (table 1). As you can see, participants generally raised their forecasts for growth in inflation-adjusted gross domestic product (GDP) and lowered their forecasts for unemployment. In addition, many participants expressed increased confidence that inflation would move up toward our 2 percent target. The FOMC sees the risks to the economic outlook as roughly balanced. As I mentioned, the headline unemployment rate has declined to levels not seen since 2000. The median projection in the March SEP calls for unemployment to fall well below 4 percent for a sustained period, something that has not happened since the late 1960s. This strong labor market forecast has important implications for the fulfillment of both sides of the dual mandate, and thus for the path of monetary policy. 
So I will spend a few minutes exploring the state of the job market in some detail. A good place to begin is with the term "maximum employment," which the Committee takes to mean the highest utilization of labor resources that is sustainable over time. In the long run, the level of maximum employment is not determined by monetary policy, but rather by factors affecting the structure and dynamics of the labor market. Also, the level of maximum employment is not directly measurable, and it changes over time. Real-time estimates of maximum employment are highly uncertain. Given this uncertainty, the FOMC does not set a fixed goal for maximum employment. Instead, we look at a wide range of indicators to assess how close the economy is to maximum employment. The headline unemployment rate is arguably the best single indicator of labor market conditions. In addition, it is widely known and updated each month. As I noted, the unemployment rate is currently at 4.1 percent, which is a bit below the FOMC's median estimate of the longer-run normal rate of unemployment. However, the unemployment rate does not paint a complete picture. For example, to be counted in the official measure as unemployed, a person must have actively looked for a job in the past four weeks. People who have not looked for work as recently are counted not as unemployed, but as out of the labor force, even though some of them actually want a job and are available to work. Others working part time may want a full-time job. And still others who say that they do not want a job right now might be pulled into the job market if the right opportunity came along. So, in judging tightness in the labor market, we also look at a range of other statistics, including alternative measures of unemployment, as well as measures of vacancies and job flows, surveys of households' and businesses' perceptions of the job market, and, of course, data on wages and prices. Figure 3 shows the headline unemployment rate and two broader measures of unemployment. U-5 includes the unemployed plus people who say they want a job and have looked for one in the past year (though not in the past four weeks). U-6 includes all those counted in U-5 plus people who are working part time but would like full-time work. Like the headline unemployment rate, both U-5 and U-6 have declined significantly in recent years. They are now at levels seen before the financial crisis, though not quite as low as they were in 1999 to 2000, a period of very tight job market conditions. The left panel of the next chart shows that employers are having about as much difficulty now attracting qualified workers as they did 20 years ago (figure 4). Likewise, the job vacancy rate, shown on the right, is close to its all-time high, as is the average number of weeks it takes to fill a job opening. Households also are increasingly reporting that jobs are plentiful (figure 5), which is consistent with the high level of job postings reported by firms. In addition, the proportion of workers quitting their jobs is high, suggesting that workers are being hired away from their current employers and that others are confident enough about their prospects to leave jobs voluntarily--even before they have landed their next job. While the data I have discussed thus far do point to a tight labor market, other data are less definitive. 
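Because U-5 and U-6 are defined above by what they add to the headline count, a small numerical sketch may help fix the definitions. The counts below are hypothetical (millions of people); following the standard construction, the marginally attached workers that U-5 adds to the numerator are also added to the denominator.

```python
# Illustrative computation of the headline and broader unemployment measures
# described above. All counts are hypothetical, in millions of people.

labor_force         = 161.0   # employed plus officially unemployed
unemployed          = 6.6     # looked for work in the past four weeks
marginally_attached = 1.6     # want a job, looked within the past year
part_time_for_econ  = 5.0     # working part time but want full-time work

u3 = unemployed / labor_force
u5 = (unemployed + marginally_attached) / (labor_force + marginally_attached)
u6 = (unemployed + marginally_attached + part_time_for_econ) / (labor_force + marginally_attached)

print(f"headline (U-3): {u3:.1%}")   # about 4.1%
print(f"U-5:            {u5:.1%}")   # about 5.0%
print(f"U-6:            {u6:.1%}")   # about 8.1%
```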
The labor force participation rate, which measures the percentage of working age individuals who are either working or actively looking for a job, has remained steady for about four years (figure 6). This flat performance is actually a sign of improvement, since increased retirements as our population ages have been putting downward pressure on participation and will continue to do so. However, the participation rate of prime-age workers (those between the ages of 25 and 54) has not recovered fully to its pre-recession level, suggesting that there might still be room to pull more people into the labor force (figure 7). Indeed, the strong job market does appear to be drawing back some people who have been out of the labor force for a significant time. For example, the percentage of adults returning to the labor force after previously reporting that they were not working because of a disability has increased over the past couple of years, and anecdotal reports indicate that employers are increasingly willing to take on and train workers they would not have considered in the past. Wage growth has also remained moderate, though it has picked up compared with its pace in the early part of this recovery (figure 8). Weak productivity growth is an important reason why we have not seen larger wage gains in recent years. At the same time, the absence of a sharper acceleration in wages suggests that the labor market is not excessively tight. I will be looking for an additional pickup in wage growth as the labor market strengthens further. Taking all of these measures of labor utilization on board, what can we say about the state of the labor market relative to our statutory goal of maximum employment? While uncertainty around the long run level of these indicators is substantial, many of them suggest a labor market that is in the neighborhood of maximum employment. A few other measures continue to suggest some remaining slack. Assessments of the maximum level of employment are uncertain, however, and subject to revision. As we seek the highest sustainable utilization of labor resources, the Committee will be guided by incoming data across all of these measures. That brings me to inflation--the other leg of our dual mandate. The substantial improvement in the labor market has been accompanied by low inflation. Indeed, inflation has continued to run below our 2 percent longer-run objective (figure 9). Consumer prices, as measured by the price index for personal consumption expenditures, increased 1.8 percent over the 12 months ending in February. The core price index, which excludes the prices of energy and food and is typically a better indicator of future inflation, rose 1.6 percent over the same period. In fact, both of these indexes have been below 2 percent consistently for the past half-dozen years. This persistent shortfall in inflation from our target has led some to question the traditional relationship between inflation and the unemployment rate, also known as the Phillips curve. Given how low the unemployment rate is, why aren't we seeing higher inflation now? As those of you who carefully read the minutes of each FOMC meeting are aware--and I know there are some of you out there--we had a thorough discussion of inflation dynamics at our January meeting. Almost all of the participants in that discussion thought that the Phillips curve remained a useful basis for understanding inflation. 
They also acknowledged, however, that the link between labor market tightness and changes in inflation has become weaker and more difficult to estimate, reflecting in part the extended period of low and stable inflation in the United States and in other advanced economies. Participants also noted that other factors, including inflation expectations and transitory changes in energy and import prices, can affect inflation. My view is that the data continue to show a relationship between the overall state of the labor market and the change in inflation over time. That connection has weakened over the past couple of decades, but it still persists, and I believe it continues to be meaningful for monetary policy. Much of the shortfall in inflation in recent years is well explained by high unemployment during the early years of the recovery and by falling energy prices and the rise in the dollar in 2015 and 2016. But the decline in inflation last year, as labor market conditions improved significantly, was a bit of a surprise. The 2017 shortfall from our 2 percent goal appears to reflect, at least partly, some unusual price declines, such as for mobile phone plans, that occurred nearly a year ago. In fact, monthly inflation readings have been firmer over the past several months, and the 12-month change should move up notably this spring as last spring's soft readings drop out of the 12-month calculation. Consistent with this view, the median of FOMC participants' projections in our March survey shows inflation moving up to 1.9 percent this year and to 2 percent in 2019. Although job creation is strong and unemployment is low, the U.S. economy continues to face some important longer-run challenges. GDP growth has averaged just over 2 percent per year in the current economic expansion, much slower than in previous expansions. Even the higher growth seen in recent quarters remains below the trend before the crisis. Nonetheless, the unemployment rate has come down 6 percentage points during the current expansion, suggesting that the trend growth necessary to keep the unemployment rate unchanged has shifted down materially. The median of FOMC participants' projections of this longer-run trend growth rate is 1.8 percent. The latest estimate from the Blue-Chip consensus of private forecasters is about 2 percent. To unpack this discussion a little further, we can think of output growth as composed of increases in hours worked and in output per hour, also known as productivity growth. Here, a comparison with the 2001-to-2007 expansion is informative. Output growth in that earlier expansion averaged nearly 3 percent per year, well above the pace in the current expansion. Despite the faster output growth, however, average job growth in the early 2000s was 1/2 percentage point per year weaker than in the current expansion. The difference, of course, is productivity, which grew in the early 2000s at more than twice the pace it has in recent years. Taking a longer view, the average pace of labor productivity growth since 2010 is the slowest since World War II and about one-fourth of the average postwar rate (figure 10). Moreover, the productivity growth slowdown seems to be global and is evident even in countries that were little affected by the financial crisis (figure 11). This observation suggests that factors specific to the United States are probably not the main drivers. 
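The decomposition just described can be written as a simple identity. In growth-rate (log-difference) terms,

\[
\Delta \ln Y \;\approx\; \Delta \ln H \;+\; \Delta \ln\!\left(\frac{Y}{H}\right),
\]

where \(Y\) is output, \(H\) is total hours worked, and \(Y/H\) is labor productivity. As a stylized illustration (the figures are rounded for exposition, not precise historical estimates), hours growth of about 1 percent combined with productivity growth of about 2 percent yields output growth near 3 percent, whereas hours growth of about 1-1/2 percent combined with productivity growth of about 1/2 percent yields output growth near 2 percent, which is roughly the contrast between the two expansions described above.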
As shown in figure 12, labor productivity growth can be broken down into the contributions from business investment (or capital deepening), changes in the skills and work experience of the workforce, and a residual component that is attributed to other factors such as technological change and efficiency gains (usually lumped together under the term total factor or multifactor productivity). In the United States and in many other countries, some of the slowdown in labor productivity growth can be traced to weak investment after the crisis. Investment has picked up recently in the United States, however, which suggests that capital deepening may pick up as well. The other big contributor to the slowdown has been in total factor productivity growth. The outlook for this dimension of productivity is considerably more uncertain. Total factor productivity growth is notoriously difficult to predict, and there are sharply different views on where it might be heading. Some argue that the productivity gains from the information technology revolution are largely behind us, and that more-recent technological innovations have less potential to boost productivity. Others argue that a well-documented decline in measures of business dynamism--such as the number of start-ups, the closure of less-productive businesses, and the rates at which workers quit their jobs and move around the country to take a new job--has held back productivity growth, in part by slowing the movement of capital and labor toward their most productive uses. New technological breakthroughs in many areas--robotics, biotech, and artificial intelligence, to name just a few--have led others to take a more optimistic view. These observers point to substantial productivity gains from innovation in areas such as energy production and e-commerce. In addition, the optimists point out that advances in technology often take decades to work their way into the economy before their ultimate effects on productivity are felt. That delay has been observed even for game-changing innovations like the steam engine and electrification, which ultimately produced broad increases in productivity and living standards. In this view, we just need to be patient for new technologies to diffuse through the economy. Only time will tell who has the better view--the record provides little basis to believe that we can accurately forecast the rate of increase in productivity. The other principal contributor to output growth is hours worked. Hours growth, in turn, is largely determined by growth in the labor force, which has averaged just 1/2 percent per year since 2010, well below the average in previous decades (figure 13). One reason for slower growth of the labor force is that baby boomers are aging and retiring, and that trend will continue. But another reason is that labor force participation of people between the ages of 25 and 54--prime-age individuals--declined from 2010 to 2015 and remains low. Indeed, the participation rate for prime-age men has been falling for more than 50 years, while women's participation in this age group rose through the 1990s but then turned downward, and it has fallen for the past 20 years. These trends in participation have been more pronounced in the United States than in other advanced economies. In 1990, the United States had high participation rates for prime-age women relative to other countries and was in the midrange of advanced economies for prime-age men. 
However, we now stand at the low end of participation for both men and women in this age group--just above Italy but well below most other advanced economies. There is no consensus about the reasons for the long-term decline in prime-age participation rates, and a variety of factors could have played a role. For example, while automation and globalization have contributed positively to overall domestic production and growth, adjustment to these developments has resulted in dislocations of many workers without college degrees and those employed in manufacturing. In addition, factors such as the increase in disability rolls in recent decades and the opioid crisis may have reduced the supply of prime-age workers. Given that the declines have been larger here than in other countries, it seems likely that factors specific to the United States have played an important role. As I noted earlier, the strong economy may continue to pull some prime-age individuals back into the labor force and encourage others not to drop out. Research suggests that structurally oriented measures--for example, improving education or fighting the opioid crisis--also will help raise labor force participation in this age group. To summarize this discussion, some of the factors weighing on longer-term growth are likely to be persistent, particularly the slowing in growth of the workforce. Others are hard to predict, such as productivity. But as a nation, we are not bystanders. We can put policies in place that will support labor force participation and give us the best chance to achieve broad and sustained increases in productivity, and thus in living standards. These policies are mostly outside the toolkit of the Federal Reserve, such as those that support investment in education and workers' skills, business investment and research and development, and investment in infrastructure. Let me turn now to monetary policy. In the aftermath of the financial crisis, the FOMC went to extraordinary lengths to promote the recovery, support job growth, and prevent inflation from falling too low. As the recovery advanced, it became appropriate to begin reducing monetary policy support. Since monetary policy affects the economy with a lag, waiting until inflation and employment hit our goals before reducing policy support could have led to a rise in inflation to unwelcome levels. In such circumstances, monetary policy might need to tighten abruptly, which could disrupt the economy or even trigger a recession. As a result, to sustain the expansion, the FOMC adopted a gradual approach to reducing monetary policy support. We began in December 2015 by raising our target for the federal funds rate for the first time in nearly a decade. Since then, with the economy improving but inflation still below target and some slack remaining, the Committee has continued to gradually raise interest rates. This patient approach also reduced the risk that an unforeseen blow to the economy might push the federal funds rate back near zero--its effective lower bound--thus limiting our ability to provide appropriate monetary accommodation. In addition, after careful planning and public communication, last October the FOMC began to gradually and predictably reduce the size of the Fed's balance sheet. Reducing our securities holdings is another way to move the stance of monetary policy toward neutral. The balance sheet reduction process is going smoothly and is expected to contribute over time to a gradual tightening of financial conditions. 
Over the next few years, the size of our balance sheet is expected to shrink significantly. At our meeting last month, the FOMC raised the target range for the federal funds rate, another step in the ongoing process of gradually scaling back monetary policy accommodation. The FOMC's patient approach has paid dividends and contributed to the strong economy we have today. Over the next few years, we will continue to aim for 2 percent inflation and for a sustained economic expansion with a strong labor market. As I mentioned, my FOMC colleagues and I believe that, as long as the economy continues broadly on its current path, further gradual increases in the federal funds rate will best promote these goals. It remains the case that raising rates too slowly would make it necessary for monetary policy to tighten abruptly down the road, which could jeopardize the economic expansion. But raising rates too quickly would increase the risk that inflation would remain persistently below our 2 percent objective. Our path of gradual rate increases is intended to balance these two risks. Of course, our views about appropriate monetary policy in the months and years ahead will be informed by incoming economic data and the evolving outlook. If the outlook changes, so too will monetary policy. Our overarching objective will remain the same: fostering a strong economy for all Americans--one that provides plentiful jobs and low and stable inflation.
r180417a_FOMC
united states
2018-04-17T00:00:00
Community Development in Baltimore and A Few Observations on Community Reinvestment Act Modernization
brainard
0
Good afternoon everyone. I want to take a moment to thank you for giving us such great insights into all the important work that is taking place here in the neighborhoods of Baltimore. Every few months, I try to get out to some of the areas of the country where our Reserve Banks are engaged with community members, community development institutions, businesses, banks, and schools in efforts to create more vibrant communities. I last had the pleasure of visiting Baltimore with community development staff from the Richmond Federal Reserve Bank in 2015. Each visit teaches me so much more about the complexity of the challenges you face, the creativity of the solutions you devise, and ultimately the determination you all demonstrate to lift up the lives of all community members in this city. Collectively, as community development professionals, concerned residents, and community partners, I can see the considerable progress you are making in the neighborhoods of Baltimore, although of course there is still plenty of work to be done. We recognize the unique challenges Baltimore faces, and we are committed to remaining focused on this city and to partnering where that can be helpful. Baltimore and its residents deserve a more prosperous and equitable future. I am pleased that the Federal Reserve's Community Development team has been supporting your work and helping to convene collaborative endeavors, such as the Symposium on Workforce Transportation that occurred in February, and cross-state learning exchanges that connect community development leaders from Maryland and the Carolinas. Perhaps the work that best exemplifies our commitment to understanding these complex challenges is a research initiative under way here. The goal of this research initiative is to further understand the barriers and incentives that individuals living in persistent poverty face in making economic decisions. This multiyear qualitative research project will interview households in the Baltimore metro area to better understand how individuals have been able to overcome intergenerational poverty. I will be very interested in what we will all learn from this work. I would also like to highlight that the Reinventing our Communities conference will be held here in Baltimore from October 1st through 3rd. This is an excellent conference that draws on resources from across the Federal Reserve System, along with Johns Hopkins University, to highlight effective models and emerging strategies for investing capital to foster economic growth by enabling every individual to contribute to and derive benefit from the economy. The Federal Reserve System has a valuable perspective because we are in communities all across this nation and because as a supervisor we engage with banks as they seek to fulfill their affirmative obligations to meet the credit needs of their communities. I have seen the value of the Community Reinvestment Act (CRA) as a vital tool to address the credit needs of low- and moderate-income communities, and I believe the time is ripe for a refresh to make it even more relevant to today's challenges. As you may know, the Treasury Department recently completed an extensive outreach effort related to modernizing the CRA regulations. We look forward to working with the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation, with which we have traditionally issued joint rulemakings on CRA. 
The Federal Reserve is deeply committed to the Community Reinvestment Act's goal of encouraging banks to meet their affirmative obligation to serve their entire community, and in particular the credit needs of low- and moderate-income communities. As we have seen all over the country, when banks are inclusive in their lending, it helps low- and moderate-income communities to thrive by providing opportunities for community members to buy and improve their homes and to start and expand small businesses. We employ an extraordinary group of CRA examiners and community development professionals to make sure that we do our part to help banks meet this obligation, as the statute directs. We are very proud of the role we play in educating banks on the CRA's provisions, introducing them to potential partners in the community, and keeping them informed of best practices in the field. In fact, one of CRA's important achievements has been to help foster the growth of the community development field by building stronger relationships between financial institutions and community and economic development professionals. This development has expanded the number and range of opportunities for banks to lend and invest in safe and sound ways, benefiting both banks and their communities. The current CRA regulations date back to 1995. In the two decades since, there have been substantial changes in the ways that banks serve their customers and in the challenges faced by low- and moderate-income communities. The time is ripe to modernize the CRA regulations to make them more effective in making credit available in low- and moderate-income areas at a time when technological and structural changes in the banking industry allow banks to serve customers outside of the areas with branches that have traditionally defined a bank's community. As we update the rules, it should be possible to achieve better outcomes--both providing banks with the greater clarity and predictability they seek while also facilitating better provision of credit, investments, and banking services in low- and moderate-income areas. There are several outcomes that we will work toward in the interagency rule-writing effort. First, we should seek to modernize the definition of assessment areas in such a way that the core focus remains the credit needs of local communities. The definition in the existing CRA regulation of a bank's assessment area--that is, the area in which we evaluate a bank's CRA performance--reflects a banking environment when interstate banking was not yet allowed, and physical branches were necessary to serve the deposit and lending needs of bank customers. Technological advances and changing consumer preferences have made it possible for banks to serve customers far outside of their physical branches--for example, online and on mobile devices. Clearly, it is time to find a way to expand the area in which the agencies evaluate a bank's CRA activities. At the same time, it is important to retain a focus on place--and in particular the credit needs of local communities. We are confident that there are ways to expand the area where we evaluate a bank's CRA performance without losing the regulation's focus on the unique role banks play in meeting local credit needs and providing services that are only possible by using branches. 
Treasury's recommendation that the agencies revisit the regulations to allow CRA consideration for a bank's activities in its assessment area, as currently delineated around branches and deposit-taking automated teller machines, as well as in low- and moderate- income areas outside that branch footprint, is a reasonable place to start our interagency discussions. Second, banks should be encouraged to seek opportunities in areas that are underserved. Currently, a bank's performance in its major markets is evaluated most closely and weighs most heavily in its CRA rating. This emphasis has resulted in what banks and community organizations refer to as credit "hot spots" where there is a high density of banks relative to investment opportunities. Meanwhile, other areas have a difficult time attracting capital because they are not in a bank's major market, if they are served by a bank at all. Key priorities in any new set of regulations are to eliminate such market distortions and to avoid creating new ones. No matter how we define a bank's assessment area in the future, new regulations need to be designed and implemented in a way that encourages banks to spread their community investment activities across the areas they serve. Third, revised regulations should be tailored, recognizing that banks vary widely in size and business strategy and serve communities with widely varying needs. Banks seek clearer, simpler rules that result in more CRA activity with less burden. We believe this can be done while retaining the flexibility to evaluate a bank's CRA performance in light of its size, business strategy, capacity, and constraints as well as its community's demographics, economic conditions, and credit needs and opportunities. We should not adopt a set of evaluation criteria that would be appropriate for large banks and assume that smaller community banks would be able to meet them without substantial additional burden. We should also be sensitive to the ways in which a bank's business strategy, no matter its size, influences the types of activities it undertakes to meet its CRA obligations. Regulatory revisions that do not contemplate evaluating CRA performance in context risk undermining CRA's greatest attribute--its recognition that banks are uniquely situated to be responsive to the most impactful community and economic development needs in communities. Fourth, we should seek greater consistency in examinations and ratings across the agencies as well as within each agency. Clarity about the activities that qualify for CRA consideration, the area in which those activities will be considered, and the type of demographic and economic information examiners evaluate will go a long way toward promoting consistency. In addition to regulatory revisions, however, the agencies can promote consistency in other, non- regulatory ways. For example, we can improve the way in which examiners present their analysis in written performance evaluations and provide more opportunities for interagency examiner training. Fifth and finally, revised regulations should support CRA's position as one of several mutually reinforcing laws designed to promote an inclusive financial services industry. As banks seek to meet the credit needs of their entire community, it is important to ensure against discriminatory or unfair and deceptive lending practices. In the months ahead, I look forward to receiving input from the many stakeholders about the path forward on modernizing CRA regulations. 
I want to express my commitment to supporting the goals of CRA as we revise the regulations to better align them with current banking practices. In closing, I want to express my appreciation to you for inviting me to join you today. I am very impressed with the important ways different kinds of organizations represented in this room are working in partnership to promote economic development here in your community.
r180419a_FOMC
united states
2018-04-19T00:00:00
Safeguarding Financial Resilience through the Cycle
brainard
0
I am honored to be here today to participate in the Global Finance Forum. It is a good moment to take stock of the cyclical position of the economy and the health of the banking system. In many respects, where we are today is the mirror image of where we were just a decade ago. The job market is strong, household balance sheets have improved, and business activity is solid. Banks are doing well--credit growth is robust, profitability is strong, and capital and liquidity buffers have been fortified. While this progress is heartening, we cannot afford to be complacent. If we have learned anything from the past, it is that we must be especially vigilant about the health of our financial system in good times, when potential vulnerabilities may be building. Safeguarding resilience through the cycle should be a critical consideration in our ongoing evaluation of the regulatory framework. With that in mind, I will spend a few minutes describing current conditions and then outline our ongoing work to ensure the financial system's buffers continue to sustain resilience over the cycle. Cyclical conditions have been strengthening. Our growth here at home has been bolstered by synchronized growth abroad as well as supportive financial conditions. Employment growth has been heartening, and we are seeing the strong labor market continue to draw prime age Americans back into the labor force from the sidelines. Sizable fiscal stimulus is likely to reinforce cyclical pressures at a time of above-trend growth and tightening resource utilization. There are few historical episodes of similar pro-cyclical fiscal stimulus to draw upon as we assess the outlook. But in the few cases where resource utilization has been near the levels we may soon be approaching, there have been heightened risks either of inflation, in earlier decades, or of financial imbalances more recently. Currently, inflation appears to be well-anchored to the upside around our 2 percent target, but there are some signs of financial imbalances. Our scan of financial vulnerabilities suggests elevated risks in two areas: asset valuations and business leverage. First, asset valuations across a range of markets remain elevated relative to a variety of historical norms, even after taking into account recent market volatility. Corporate bond yields remain low by historical comparison, and spreads of yields on junk bonds above those on comparable-maturity Treasury securities are near the lower end of their historical range. Spreads on leveraged loans and securitized products backed by those loans remain narrow. Prices of multifamily residential and industrial commercial real estate (CRE) have risen, while capitalization rates for these segments have reached historical lows. Second, business leverage outside the financial sector has risen to levels that are high relative to historical trends. In the nonfinancial business sector, the debt-to-income ratio has increased to near the upper end of its historical distribution, and net leverage at speculative-grade firms is especially elevated. As we have seen in previous cycles, unexpected negative shocks to earnings in combination with increased interest rates could lead to rising levels of delinquencies among business borrowers and related stresses to some banks' balance sheets. We continue to assess the overall vulnerabilities in the U.S. 
financial system to be moderate by historical standards in great measure because post-crisis reforms have strengthened the regulatory and supervisory framework for the largest U.S. banking firms. The crisis revealed a stark weakness in the capital and liquidity positions of many of our large banking organizations that left many of them incapable of dealing with financial stress and necessitated unprecedented government intervention. A primary focus of post-crisis financial reform has been strengthening capital and liquidity buffers at large banking institutions, which has bolstered the safety and soundness of these institutions and reduced systemic risk more broadly. In terms of liquidity, not only do our largest firms now have the right kind and amount of liquidity calibrated to their funding needs and to their likely run risk in stressed conditions, but they also are required to know where it is at all times and to ensure it is positioned or readily accessible where it is most likely to be needed in resolution. Prior to the crisis, many of the largest firms did not even have a good handle on where their liquidity was positioned. For example, our largest banking firms have increased their holdings of high quality liquid assets from 12 percent of assets in 2011 to 20 percent of assets in 2017, and they have reduced their reliance on short-term wholesale funding from 37 percent of liabilities in 2011 to 25 percent of liabilities in 2017. This, combined with critical reforms to money market funds and other vital short-term funding markets, have reduced the vulnerabilities in the financial system associated with liquidity mismatch and maturity transformation. In terms of capital, the quality of capital has improved with a particular focus on common equity, the most loss-absorbing form of capital. The quantity of capital also has increased through higher minimum requirements and new capital conservation buffers that require banking firms to keep their capital levels well above the minimums in order to maintain full flexibility to allocate profits to capital distributions and employee bonus payments. These buffers increase the ability of banking organizations to absorb losses and continue to lend to households and businesses, including during times of stress. Indeed, the common equity capital to risk-weighted assets ratio of the bank holding companies participating in the Comprehensive Capital Analysis and Review has more than doubled from 5.5 percent in the first quarter of 2009 to 12 percent in the fourth quarter of 2017. We now regularly conduct comprehensive stress tests of the largest banking firms to help ensure that their capital distribution plans are consistent with their ability to lend and withstand severe macroeconomic and financial stress like that observed during the financial crisis. One key benefit of our stress testing program is that it promotes a dynamic forward-looking assessment of a bank's capital adequacy in the face of severe stress. It is critical to maintain a dynamic capital regime that anticipates rapidly changing risks and business conditions. Without such a dynamic focus, there is a risk that regulators and banking institutions end up spending too much time looking in the rear-view mirror and not enough time looking ahead for emerging risks. The stress testing capital regime applied to Fannie and Freddie before the crisis offers a sobering reminder of the dangers of failing to update stress tests in the face of changing market practices and emerging risks. 
The Federal Reserve has also imposed risk-based and leverage capital surcharges on the most systemic banking firms to ensure they internalize the costs their failure would have on the financial system and to provide an incentive to reduce their systemic footprint. We have recently released a proposal for comment to introduce a "stress capital buffer" or SCB that would integrate the forward-looking stress test results into each institution's ongoing capital requirements. Some observers contend that current capital requirements are too onerous and are choking off credit. But the evidence suggests otherwise: U.S. bank lending has been healthy over recent years and profits are strong. By any measure, U.S. banks appear very competitive relative to their international peers. In that regard, the current level of capital is a sign of strength. While there is a natural tendency to question the value of capital buffers when times are good, the severe costs associated with not having enough capital to absorb losses become all too evident in a downturn. By the time losses are rising, it is generally too late to start building buffers, which became all too clear with devastating consequences in some countries during the last crisis. I support efforts to identify improvements that make regulations less burdensome. But it is vital to be prudent regarding any material changes to the core capital and liquidity framework, and not lose sight of the need to safeguard financial resilience through the cycle. Prudence would argue for waiting until we have tested how the new framework performs through a full cycle before we make judgments about its performance. At this point in the cycle, it is premature to revisit the calibration of core capital and liquidity requirements for the large banking institutions. History suggests that a booming economy can lead to a relaxation in lending standards and an attendant increase in risky debt levels. I would be reluctant to see our large banking institutions releasing the capital and liquidity buffers that they have built so effectively over the past few years, at a time when cyclical pressures and vulnerabilities in the broader financial system are building. Indeed, if cyclical pressures continue to build and financial vulnerabilities broaden, it may become appropriate to ask the largest banking organizations to build a countercyclical buffer (CCyB) of capital to maintain an adequate degree of resilience against stress. The CCyB is an additional margin of capital that the nation's largest banks can be asked to build to sustain resilience when there is an elevated risk of above-normal losses, which often follow periods of rapid asset price appreciation or credit growth. This buffer is intended to be released as the economy weakens in order to allow banks to lend more when it is most needed. Countercyclical capital requirements can lean against rising financial vulnerabilities at a time when the degree of monetary tightening that would be needed to achieve the same goal would be inconsistent with the dual mandate goals of full employment and price stability. Moreover, countercyclical capital requirements build resilience, unlike monetary policy. The CCyB framework, which was finalized in September 2016, requires the Reserve Board to vote at least once per year on the level of the CCyB. While other jurisdictions have developed some experience with the use of countercyclical buffers, in the United States, the CCyB has so far not yet been activated. 
The condition set out in September 2016 for raising the CCyB above its minimum value of zero is that financial system vulnerabilities are meaningfully above normal. When the CCyB rule was issued in September 2016, it was calibrated against the backdrop of the established levels of required U.S. structural buffers. Thus, it would be prudent to accompany any consideration of material adjustments to the calibration of the structural buffers held by the large banking institutions with compensating adjustments to the countercyclical buffer in order to achieve the same overall level of resilience through the cycle. While we have made important progress in our regulatory framework, we still have not implemented a few key elements. The list of remaining items is short but important and well anticipated. First, we are close to finalizing the net stable funding ratio, or NSFR. This significant liquidity regulation is important to ensure that large banking firms maintain a stable funding profile over a one-year horizon. It will serve as a natural complement to our existing liquidity coverage ratio, which helps ensure firms can withstand liquidity strains over a 30-day time horizon. And by most estimates, our large complex banking institutions are in a position to meet the expected requirements with little adjustment. Second, we need to finalize Dodd-Frank Act limits on large counterparty exposures. These limits will reduce the chances that outsized exposures, particularly between large financial institutions, could spread financial distress and undermine financial stability as we witnessed during the last financial crisis. Moreover, these large exposure limits will effectively update, for today's challenges, the traditional bank lending limits that proved useful for well over 100 years, by recognizing the many ways in which banks and their affiliates take on credit exposure beyond directly extending loans. I support efforts to improve the efficacy of the Volcker rule while preserving its underlying goal of prohibiting banking firms from engaging in speculative activities for which federal deposit insurance and other safeguards were never intended. The interagency regulation implementing the Volcker rule is not the most effective way of achieving its very laudable and important goal. We are exploring ways to streamline and simplify the regulation to reduce costs without weakening the key objectives. We should be able to provide firms and supervisors with greater clarity about what constitutes permissible market-making. We should also identify ways to further tailor the Volcker compliance regime to focus on firms with large trading operations and reduce the compliance burden for small banking entities with limited trading operations. I also support moving forward with minimum haircuts for securities financing transactions (SFTs) on a marketwide basis to counter the growth of volatile funding structures outside the banking sector. International agreement on a regulatory framework for minimum SFT haircuts was reached by financial regulators in 2015, and it is important to follow through on this work plan. While current market practices in this area may well exhibit much better risk management than pre-crisis, past experience suggests we cannot rely on prudent practices to remain in place as competitive and cyclical pressures build. 
Regulatory minimum haircuts calibrated to be appropriate through the cycle could help ensure that repo, securities lending, and securities margin lending and related markets do not become a source of instability in periods of financial stress through fire sales and run-type behavior. Here, I have focused primarily on the reforms that are most important for the resilience of the large interconnected banking organizations at the core of our system. Outside of this group, I favor better tailoring the regulatory framework for our smaller banking firms so as to decrease regulatory burden. While we have taken some important steps to reduce burden on smaller banking organizations such as streamlining the Call Report for small, less complex community banks, increasing appraisal thresholds for CRE loans, and reducing the frequency of exams in certain circumstances, there is more we should do. History and experience show that stable economic growth is aided by strong regulatory buffers that bolster the resilience of our large banking organizations and help reduce the severity of downturns. At a time when cyclical pressures are building, and asset valuations are stretched, we should be calling for large banking organizations to safeguard the capital and liquidity buffers they have built over the past few years. Maintaining resilience over the cycle can be accomplished through a combination of structural and countercyclical buffers whose calibrations are inherently linked. While we should carefully consider how to make our regulations more effective and better tailored, we must take great care to ensure that we do not inadvertently contribute to pro-cyclicality that would exacerbate financial conditions that are, on some dimensions, somewhat stretched. Although I believe it is too early today to reassess the calibration of existing capital and liquidity buffers because they have yet to be tested through a full economic cycle, I look forward to efforts that are planned in future years in the international standard-setting bodies to assess the framework quantitatively.
r180504a_FOMC
united states
2018-05-04T00:00:00
Liquidity Regulation and the Size of the Fed's Balance Sheet
quarles
0
Thank you very much to the Hoover Institution for hosting this important conference and to John Taylor and John Cochrane for inviting me to participate. In my capacity as both the Vice Chairman for Supervision at the Board of Governors and a member of the Federal Open Market Committee (FOMC), part of my job is to consider the intersection of financial regulatory and monetary policy issues, the subject of my discussion today. This topic is both complex and dynamic, especially as both regulation and the implementation of monetary policy continue to evolve. One important issue for us at the Fed, and the one that I will spend some time reflecting on today, is how post-crisis financial regulation, through its incentives for bank behavior, may influence the size and composition of the Federal Reserve's balance sheet in the long run. Obviously, the whole kaleidoscopic body of financial regulation is difficult to address in the time we have today, so I will focus on a particular component--the Liquidity Coverage Ratio (LCR)--and its link to banks' demand for U.S. central bank reserve balances. Besides illuminating this particular issue, I hope my discussion will help illustrate the complexities associated with the interconnection of regulatory and monetary policy issues in general. Also, let me emphasize at the outset that I will be touching on some issues that the Board and the FOMC are in the process of observing and evaluating and, in some cases, on which we may be far from reaching any final decisions. As such, my thoughts on these issues are my own and are likely to evolve, benefiting from further discussion and our continued monitoring of bank behavior and financial markets over time. Before I delve into the specific, complicated subject of how one type of bank regulation affects the Fed's balance sheet, let me say a few words about financial regulation more generally. As I have said previously, I view promoting the safety, soundness, and efficiency of the financial system as one of the most important roles of the Board. Improving efficiency of the financial system is not an isolated goal. The task is to enhance efficiency while maintaining the system's resiliency. Take, for example, the Board's two most recent and material proposals, the stress capital buffer and the enhanced supplementary leverage ratio (eSLR). The proposal to modify the eSLR, in particular, initially raised questions in the minds of some as to whether it would reduce the ability of the banking system to weather shocks. A closer look at the proposal shows that the opposite is true. The proposed change simply restores the original intent of leverage requirements as a backstop measure to risk-based capital requirements. As we have seen, a leverage requirement that is too high favors high-risk activities and disincentivizes low-risk activities. We had initially calibrated the leverage ratio at a level that caused it to be the binding constraint for a number of our largest banks. As a result, those banks had an incentive to add risk rather than reduce risk in their portfolios because the capital cost of each additional asset was the same whether it was risky or safe, and the riskier assets would produce the higher return. 
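To see the incentive concretely, consider a stylized comparison of the capital charge on two hypothetical assets under a binding leverage ratio versus a risk-based requirement. The ratios and risk weights in the Python sketch below are illustrative assumptions chosen for exposition, not the calibrations in the actual rules.

```python
# Stylized illustration: capital required per $100 of exposure under two regimes.
# All figures are hypothetical assumptions, not regulatory calibrations.

LEVERAGE_RATIO = 0.05     # assumed leverage requirement: 5% of total exposure
RISK_BASED_RATIO = 0.08   # assumed risk-based requirement: 8% of risk-weighted assets

assets = {
    "Treasury security (0% risk weight)": 0.0,
    "Risky corporate loan (100% risk weight)": 1.0,
}

for name, risk_weight in assets.items():
    exposure = 100.0
    leverage_capital = LEVERAGE_RATIO * exposure                     # identical for every asset
    risk_based_capital = RISK_BASED_RATIO * risk_weight * exposure   # scales with riskiness
    print(f"{name}: leverage-based capital = {leverage_capital:.1f}, "
          f"risk-based capital = {risk_based_capital:.1f}")
```

When the leverage ratio is the binding constraint, both assets carry the same $5 capital cost per $100 of exposure, so the higher-yielding, riskier asset earns the better return on required capital; under the risk-based measure, by contrast, the capital cost rises with the risk weight.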
The proposed recalibration eliminates this incentive by returning this leverage ratio to a level that is a backstop rather than the driver of decisions at the margin. Yet, because of the complex way our capital regulations work together--with risk-based constraints and stress tests regulating capital at both the operating and holding company levels--this improvement in incentives is obtained with virtually no change in the overall capital requirements of the affected firms. Federal Reserve staff estimate the proposal would potentially reduce capital requirements across the eight large banks subject to the proposal by $400 million, or 0.04 percent of the $955 billion in capital these banks held as of September 2017. So this recalibration is a win-win: a material realignment of incentives to reduce a regulatory encouragement to take on risk at a time when we want to encourage prudent behavior, without any material capital reduction or cost to the system's resiliency. Taken together, I believe these new rules will maintain the resiliency of the financial system and make our regulation simpler and more risk sensitive. Let me now back up to the time just before the financial crisis and briefly describe the genesis of liquidity regulations for banks. Banking organizations play a vital role in the economy in serving the financial needs of U.S. households and businesses. They perform this function in part through the mechanism of maturity transformation--that is, taking in short-term deposits, thereby making a form of short-term, liquid investments available to households and businesses, while providing longer-term credit to these same entities. This role, however, makes banking firms vulnerable to the potential for rapid, broad-based outflows of their funding (a so-called run), and these institutions must therefore balance the extent of their profitable maturity transformation against the associated liquidity risks. Leading up to the 2007-09 financial crisis, some large firms were overly reliant on certain types of short-term funding and overly confident in their ability to replenish their funding when it came due. Thus, during the crisis, some large banks did not have sufficient liquidity, and liquidity risk management at a broader set of institutions proved inadequate at anticipating and compensating for potential outflows, especially when those outflows occurred rapidly. In the wake of the crisis, central banks and regulators around the world implemented a combination of regulatory reforms and stronger supervision to promote increased resilience in the financial sector. With regard to liquidity, the prudential regulations and supervisory programs of the U.S. banking agencies have resulted in significant increases in the liquidity positions and changes in the risk management of our largest institutions. And, working closely with other jurisdictions, we have also implemented global liquidity standards for the first time. These standards seek to limit the effect of short-term outflows and extended overall funding mismatches, thus improving banks' liquidity resilience. One particular liquidity requirement for large banking organizations is the Liquidity Coverage Ratio, or LCR, which the U.S. federal banking agencies adopted in 2014. The LCR rule requires covered firms to hold sufficient high-quality liquid assets (HQLA)--in terms of both quantity and quality--to cover potential outflows over a 30-day period of liquidity stress. 
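In schematic terms, the requirement just described can be summarized as a simple ratio test (the notation below is an illustrative summary, not the text of the rule):

\[
\text{LCR} \;=\; \frac{\text{stock of high-quality liquid assets (HQLA)}}{\text{total net cash outflows over a 30-day stress period}} \;\geq\; 100\%.
\]

A firm can satisfy the test either by holding more HQLA in the numerator or by reducing its projected stressed outflows in the denominator, a distinction that matters for the discussion of reserve balances that follows.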
The LCR rule allows firms to meet this requirement with a range of cash and securities and does not apply a haircut to reserve balances or Treasury securities based on the estimated liquidity value of those instruments in times of stress. Further, firms are required to demonstrate that they can monetize HQLA in a stress event without adversely affecting the firm's reputation or franchise. The rules have resulted in some changes in the behavior of large banks and in market dynamics. Large banks have adjusted their funding profiles by shifting to more stable funding sources. Indeed, taken together, the covered banks have reduced their reliance on short-term wholesale funding from about 50 percent of total assets in the years before the financial crisis to about 30 percent in recent years, and they have also reduced their reliance on contingent funding sources. Meanwhile, covered banks have also adjusted their asset profiles, materially increasing their holdings of cash and other highly liquid assets. In fact, these banks' holdings of HQLA have increased significantly, from fairly low levels at some firms in the lead-up to the crisis to an average of about 15 to 20 percent of total assets today. A sizable portion of these assets currently consists of U.S. central bank reserve balances, in part because reserve balances, unlike other types of highly liquid assets, do not need to be monetized, but also, importantly, because of the conduct of the Fed's monetary policy, a topic to which I will next turn. With this backdrop, a relevant question for monetary policymakers is, what quantity of central bank reserve balances will banks likely want to hold, and, hence, how might the LCR affect banks' reserve demand and thereby the longer-run size of the Fed's balance sheet? Let me emphasize that policymakers have long been aware of the potential influence that regulations may have on reserve demand and thus the longer-run size of the Fed's balance sheet. And, of course, regulatory influences on banks' behavior, my focus today, is just one of many factors that could affect policymakers' decisions regarding the appropriate long-run size of the Fed's balance sheet. In particular, the FOMC stated that it "currently anticipates reducing the quantity of reserve balances, over time, to a level appreciably below that seen in recent years but larger than before the financial crisis" and went on to note that "the level will reflect the banking system's demand for reserve balances and the Committee's decisions about how to implement monetary policy most efficiently and effectively in the future." With that said, it is useful to begin by examining banks' current reserve holdings. Figure 1 plots the aggregate level of reserve balances in the U.S. banking system, starting well before the financial crisis. As you can see, the current level of reserves--at around $2 trillion--is orders of magnitude higher than the level that prevailed before the financial crisis, a result of the Fed's large-scale asset purchase programs, or "quantitative easing." The vertical lines in the figure show key dates in the implementation of the LCR, including the initial Basel III international introduction of the regulation followed by its two-step introduction in the United States. A key takeaway from this figure is that the Fed was in the process of adding substantial quantities of reserve balances to the banking system while the LCR was being implemented--and these two changes largely happened simultaneously. 
As a result, banks, in aggregate, are currently using reserve balances to meet a significant portion of their LCR requirements. In addition, because these changes happened together, it is reasonable to conclude that the current environment is likely not very informative about banks' underlying demand for reserve balances. But now the situation is changing, albeit very slowly. Last October, the Fed began to gradually and predictably reduce the size of its balance sheet. The Fed is doing so by reinvesting the principal payments it receives on its securities holdings only to the extent that they exceed gradually increasing caps--that is, the Fed is allowing securities to roll off its portfolio each month up to a specific maximum amount. This policy is also reducing reserve balances. So far, after the first seven months of the program, the Fed has shed about $120 billion of its securities holdings, which is a fairly modest amount when compared with the remaining size of its balance sheet. Consequently, the level of reserves in the banking system is still quite abundant. So, how many more reserve balances can be drained, and how small will the Fed's balance sheet get? Let me emphasize that this question is highly speculative--we have not decided ex ante the desired long-run size of the Fed's balance sheet, nor, as I noted earlier, do we have a definitive handle on banks' long-run demand for reserve balances. Indeed, the FOMC has said that it "expects to learn more about the underlying demand for reserves during the process of balance sheet normalization." Nonetheless, let me spend a little time reflecting on this challenging question. How banks respond to the Fed's reduction in reserve balances could, in theory, take a few different forms. One could envision that as the Fed reduces its securities holdings, a large share of which consists of Treasury securities, banks would easily replace any reduction in reserve balances with Treasury holdings, thereby keeping their LCRs roughly unchanged. According to this line of thought, because central bank reserve balances and Treasury securities are treated identically by the LCR, banks should be largely indifferent to holding either asset to meet the regulation. In that case, the reduction in reserves and corresponding increase in Treasury holdings might occur with relatively little adjustment in their relative rates of return. Alternatively, one could argue that banks may have particular preferences about the composition of their liquid assets. And since banks are profit-maximizing entities, they will likely compare rates of return across various HQLA-eligible assets in determining how many reserves to hold. If relative asset returns are a key driver of reserve demand, then interest rates across various types of HQLA will adjust on an ongoing basis until banks are satisfied holding the aggregate quantity of reserves that is available. Recent research by the Board staff shows that banks currently display a significant degree of heterogeneity in their approaches to meeting their LCR requirements, including in their chosen volumes of reserve balances. Figure 2 shows a subset of this research to illustrate this point. The top and bottom panels represent estimates of how two large banks have been meeting their HQLA requirements over time. 
In each panel, the blue portions of the bars denote the share of HQLA met by reserve balances, while the red, yellow, and brown slices of the bars represent the share met by Treasury securities, agency mortgage-backed securities, and other HQLA-eligible assets, respectively. Despite holding roughly similar amounts of HQLA, the two banks exhibit very different HQLA compositions, with the bank depicted in the top panel consistently holding a much larger share of HQLA in the form of reserve balances than the bank shown in the bottom panel. This finding suggests that there likely is no single "representative bank" behavioral model that can capture all we might want to know about banks' demand for central bank reserve balances. Some of the differences we see in bank behavior likely relate to banks' individual liquidity needs and preferences. Indeed, banks manage their balance sheets in part by taking into account their internal liquidity targets, which are determined by the interaction between the specific needs of their various business lines and bank management's preferences. In any case, this picture illustrates the complexities that are inherent in understanding banks' underlying demand for reserve balances, a topic for which more research would be quite valuable to policymakers. So, what does this finding say about the longer-run level of reserve balances demanded by banks? The answer is that there is a large degree of uncertainty. In fact, the Federal Reserve Bank of New York surveyed primary dealers and market participants last December to solicit their views about the level of reserves they expect to prevail in the longer run. A few features of the survey responses stand out. All respondents thought that the longer-run level of reserve balances would be substantially lower than the current level of more than $2 trillion. In addition, there appeared to be a widely held view that the longer-run level of reserves will be significantly above the level that prevailed before the financial crisis. But even so, the respondents did not agree about what that longer-run level will be, with about half expecting a level ranging between $400 billion and $750 billion. It is also important to point out that the Fed's balance sheet will remain larger than it was before the crisis even after abstracting from the issue of banks' longer-run demand for reserve balances. The reason is that the ultimate size of the Fed's balance sheet also depends on developments across a broader set of Fed liabilities. One such liability is the outstanding amount of Federal Reserve notes in circulation--that is, paper money--which has doubled over the past decade to a volume of more than $1.6 trillion, growing at a rate that generally reflects the pace of expansion of economic activity in nominal terms. When I left my position in the Bush Treasury in 2006, by contrast, the total amount of paper currency outstanding was not quite $800 billion. Other nonreserve liabilities have also grown since the crisis, including the Treasury Department's account at the Fed, known as the Treasury's General Account. Recent growth in such items means that the longer-run size of the Fed's balance sheet will be noticeably larger than before the crisis regardless of the volume of reserve balances that might ultimately prevail. Putting the various pieces together, figure 3 illustrates how the overall size of the Fed's balance sheet may evolve. 
Given the uncertainties I have described, I have chosen to show three different scenarios, drawn from the most recent annual report released by the Federal Reserve Bank of New York, which was published last month. scenarios highlight the degree to which the longer-run size of the Fed's domestic securities portfolio--also known as the System Open Market Account, or SOMA, which accounts for the vast majority of the Fed's assets--will be affected by choices about the future level of reserve balances and the evolution of nonreserve liabilities. The assumptions underlying the scenarios are based on the distribution of responses from the surveys I described earlier, as those surveys also asked respondents to forecast the likely longer-run levels of several liabilities on the Fed's balance sheet other than reserves. The "median" scenario, represented by the red (middle) line in the figure, is based on the percentile of survey responses, while the "larger" and the "smaller" scenarios, denoted by the gold dashed (top) and blue dotted (bottom) lines, are based on the 75th and 25th percentiles, respectively. The figure illustrates that the Fed's securities holdings are projected to decline about $400 billion this year and another $460 billion next year as Treasury and agency securities continue to roll off gradually from the Fed's portfolio. The kink in each curve captures what the FOMC has referred to as the point of "normalization" of the size of the Fed's balance sheet--that is, the point at which the balance sheet will begin to expand again to support the underlying growth in liabilities items such as Federal Reserve notes in circulation. All else being equal, greater longer-run demand for currency, reserve balances, or other liabilities implies an earlier timing of balance sheet normalization and a higher longer-run size of the balance sheet, as illustrated by the top line. And the converse--smaller demand for these liabilities and a later timing of normalization, illustrated by the bottom line--is also possible. In the three scenarios shown, the size of the Fed's securities portfolio normalizes sometime between 2020 and 2022. That is quite a range of time, so as the balance sheet normalization program continues, the Fed will be closely monitoring developments for clues about banks' underlying demand for reserves. What will the Fed be monitoring as reserves are drained and the balance sheet shrinks? I would first like to emphasize that the Fed regularly monitors financial markets for a number of reasons, so I do not mean to imply that we will be doing anything that is very much different for our normal practice. As reserves continue to be drained, we will want to gauge how banks are managing their balance sheets in continuing to meet their LCRs, watching in particular how the distribution of reserve balances across the banking system evolves as well as monitoring any large-scale changes in banks' holdings of other HQLA-eligible assets, including Treasury securities and agency mortgage-backed securities. And on the liabilities side of banks' books, we will be keeping our eye on both the volume and the composition of deposits, as there are reasons why banks may take steps, over time, to hold onto certain types of deposits more than others. In particular, retail deposits may be especially desired by banks going forward because they receive the most favorable treatment under the LCR and also tend to be relatively low cost. 
Retail deposits have grown quite a bit since the crisis, especially in light of the prolonged period of broad-based low interest rates and accommodative monetary policy, limiting the need for banks to compete for this most stable form of deposits. However, the combination of rising interest rates and the Fed's shrinking balance sheet, together with banks' ongoing need to meet the LCR, may alter these competitive dynamics. Of course, importantly, deposits will not necessarily decline one-for-one with reserve balances as the Fed's balance sheet shrinks. The overall effects of the decline in the Fed's balance sheet will depend both on who ultimately ends up holding the securities in place of the Fed and on the full range of portfolio adjustments that other economic agents ultimately make as a result. We will also be monitoring movements in interest rates. In part, we will be tracking how the yields and spreads on the various assets that banks use to meet their LCR requirements evolve. For example, to the extent that some banks will wish to keep meeting a significant portion of their LCR requirements with reserves, the reduction in the Fed's balance sheet and the associated drop in aggregate reserves could eventually result in some upward pressure on the effective federal funds rate and on yields of Treasury securities. This situation could occur if some banks eventually find that they are holding fewer reserves than desired at a given constellation of interest rates and, in response, begin to bid for more federal funds while selling Treasury securities or other assets. Interest rates will adjust up until banks are indifferent with regard to holding the relatively smaller volume of reserves available in the banking system. Overall, we will be monitoring to make sure that the level of reserves the Fed supplies to the banking sector, which influences the composition of assets and liabilities on banks' balance sheets as well as market interest rates, provides the desired stance of monetary policy to achieve our dual mandate of maximum employment and stable prices. Of course, we will need to be very careful to understand the precise factors that underlie any significant movements in these areas, because factors that are unrelated to the Fed's balance sheet policies might also cause such adjustments. To conclude, I would like to reemphasize that I have touched on some highly uncertain issues today--issues that, I would like to stress again, have not been decided by the FOMC. One such issue that closely relates to my remarks today, and one I believe the upcoming panel will likely address, is which policy implementation framework the Fed should use in the long run. That is, broadly speaking, should the Fed continue to use an operational framework that is characterized by having relatively abundant reserves and operate in what is termed a "floor regime," or should it use one in which the supply of reserves is managed so that it is much closer to banks' underlying demand for reserves as in a "corridor regime"? Of course, a host of complex issues underlie this decision, so I would just like to emphasize two general points. First, a wide range of quantities of reserve balances--and thus overall sizes of the Fed's balance sheet--could be consistent with either type of framework. Second, while U.S. liquidity regulations likely influence banks' demand for reserves, the Fed is not constrained by such regulations in deciding its operational framework, because U.S. 
banks will be readily able to meet their regulatory liquidity requirements using the range of available high-quality liquid assets, of which reserve balances are one type. Importantly, additional experience with the Federal Reserve's policy of gradually reducing its balance sheet will help inform policymakers' future deliberations regarding issues related to the long-run size of the Fed's balance sheet, issues that will not need to be decided for some time. The final and most general point is simply to underscore the premise with which I began these remarks: Financial regulation and monetary policy are, in important respects, connected. Thus, it will always be important for the Federal Reserve to maintain its integral role in the regulation of the financial system not only for the visibility this provides into the economy, but precisely in order to calibrate the sorts of relationships we have been talking about today.
r180508a_FOMC
united states
2018-05-08T00:00:00
Monetary Policy Influences on Global Financial Conditions and International Capital Flows
powell
1
Thank you for inviting me to join you today as part of this distinguished panel. Our subject is the relationship between "center country" monetary policy and global financial conditions, and the policy implications of that relationship both for the center country and for other countries affected. This broad topic has been the subject of a great deal of research and discussion in recent years. Today I will focus in particular on the role of U.S. monetary policy in driving global financial conditions and capital flows. To preview my conclusions, I will argue that, while global factors play an important role in influencing domestic financial conditions, the role of U.S. monetary policy is often exaggerated. And while financial globalization does pose some challenges for monetary policy, efforts to build stronger and more transparent policy frameworks and a more resilient financial system can reduce the adverse consequences of external shocks. The well-known Mundell-Fleming "trilemma" states that it is not possible to have all three of the following things: free capital mobility, a fixed exchange rate, and the ability to pursue an independent monetary policy. The trilemma does not say that a flexible exchange rate will always fully insulate domestic economic conditions from external shocks. And, indeed, that is not the case. We have seen that integration of global capital markets can make for difficult tradeoffs for some economies, whether they have fixed or floating exchange rate regimes. Since the Fed is the central bank of the world's largest economy and issuer of the world's most widely used reserve currency, it is to be expected that the Fed's policy actions will spill over to other economies. To illustrate this point, the scatterplot on the left side of figure 1 focuses on movements in interest rates and exchange rates following Federal Reserve policy announcements. As you can see, changes in U.S. interest rates after Fed policy actions (shown on the horizontal axis) lead to corresponding changes in the value of the dollar (shown on the vertical axis). And because of the dollar's widespread use around the world, these changes in the dollar affect financial conditions abroad. Fed policy-related movements in U.S. bond yields also tend to spill over to bond yields abroad, such as the German yields shown in the scatterplot on the right. Such spillovers are to be expected in a world of highly integrated financial markets. As figure 2 shows, bond yields (left) and equity prices (right) around the world typically move together fairly closely. But the influence of U.S. monetary policy on global financial conditions should not be overstated. The Federal Reserve is not the only central bank whose actions affect global financial markets. In fact, the United States is the recipient as well as the originator of monetary policy spillovers. For example, as seen in figure 3, changes in German yields after European Central Bank policy decisions also pass through to U.S. yields. More broadly, it is notable that although the Fed has raised its target interest rate six times since December 2015 and has begun to shrink its balance sheet, overall U.S. domestic financial conditions have gotten looser, in part due to improving global conditions and central bank policy abroad. Much of the discussion of the spillovers of U.S. monetary policy focuses on their effects on financial conditions in emerging market economies (EMEs).
Some observers have attributed the movements in international capital flowing to EMEs since the Global Financial Crisis primarily to monetary stimulus by the Fed and other advanced-economy central banks. The data do not seem to me to fit this narrative particularly well. As illustrated by the blue dashed line in the left panel of figure 4, capital flows to EMEs were already very strong before the Global Financial Crisis, when the federal funds rate was comparatively high. The subsequent surge in capital flows in 2009, as the crisis was abating, largely reflects a rebound from the capital flow interruption during the crisis itself, though highly accommodative monetary policies in advanced economies doubtless also played a role. Moreover, capital flows to EMEs started to ease after 2011, a period when the Federal Reserve continued to add accommodation and reduce yields through increases in its balance sheet, as shown in the right panel. And, more recently, capital flows to EMEs have picked up again despite the fact that the Fed has been removing accommodation since 2015. If U.S. monetary policy is not the major determinant, what other factors have been driving EME capital flows? One prominent factor has been growth differentials between EMEs and advanced economies. In figure 5, the left panel shows that capital inflows to EMEs picked up post-crisis, in line with the widening of this growth differential, while the slowdown in inflows after 2011 coincides with its narrowing. Another related determinant has been commodity prices, as shown in the right panel. The pickup in both global growth and commodity prices over the past couple of years explains a good part of the recent recovery of capital flows to EMEs. Monetary stimulus by the Fed and other advanced-economy central banks played a relatively limited role in the surge of capital flows to EMEs in recent years. There is good reason to think that the normalization of monetary policies in advanced economies should continue to prove manageable for EMEs. Fed policy normalization has proceeded without disruption to financial markets, and market participants' expectations for policy (the red symbols in figure 6) seem reasonably well aligned with policymakers' expectations in the Summary of Economic Projections (the black dots), suggesting that markets should not be surprised by our actions if the economy evolves in line with expectations. It also bears emphasizing that the EMEs themselves have made considerable progress in reducing vulnerabilities since the crisis-prone 1980s and 1990s. Many EMEs have substantially improved their fiscal and monetary policy frameworks while adopting more flexible exchange rates, a policy that recent research shows provides better insulation from external financial shocks. Corporate debt at risk--the debt of firms with limited debt service capacity--has been rising in EMEs, as shown in figure 7. But this rise has been relatively limited outside of China and has begun to reverse as stronger global growth has pushed up earnings. All that said, I do not dismiss the prospective risks emanating from global policy normalization. Some investors and institutions may not be well positioned for a rise in interest rates, even one that markets broadly anticipate. And, of course, future economic conditions may surprise us, as they often do. Moreover, the linkages among monetary policy, asset prices, and the mood of global financial markets are not fully understood. Some observers have argued that U.S. 
monetary policy also influences capital flows through its effects on global risk sentiment, with looser policy leading to more positive sentiment in markets and tighter policy depressing sentiment. While those channels may well operate, research at both the Fed and the IMF suggests that actions by major central banks account for only a relatively small fraction of global financial volatility and capital flow movements. Nevertheless, risk sentiment will bear close watching as normalization proceeds around the world. What can the Federal Reserve do to foster continued financial stability and economic growth as normalization proceeds? We will communicate our policy strategy as clearly and transparently as possible to help align expectations and avoid market disruptions. And we will continue to help build resilience in the financial system and support global efforts to do the same.
r180515a_FOMC
united states
2018-05-15T00:00:00
Cryptocurrencies, Digital Currencies, and Distributed Ledger Technologies: What Are We Learning?
brainard
0
It is a pleasure to be here today. What better place to discuss digital currencies than in San Francisco, home to so many technology innovators working on new ways to disrupt various aspects of our daily lives? Because of the transformative potential of digital currency and distributed ledger technologies, the Federal Reserve is actively monitoring digital innovations in the financial system. We have been keenly evaluating developments in fintech and digital currencies through a multidisciplinary lens, combining information technology and policy analysis to study their potential implications for payments policy, supervision and regulation, financial stability, monetary policy, and the provision of financial services. This work draws from expertise throughout the Federal Reserve System and benefits from engagement with our colleagues internationally. The past decade has seen a wave of important new developments in digital technologies for payments, clearing, and settlement. Cryptocurrencies represent the leading edge of this digital wave. And it was the advent a decade ago of Bitcoin, the first cryptocurrency, that first gave shape to the vision of a decentralized digital currency. At the heart of any cryptocurrency is the creation of a new type of asset--the unit of the cryptocurrency itself--that is distinct from any traditional form of money used in routine transactions, such as U.S. currency or checking accounts in commercial banks. A typical cryptocurrency would not be a liability of any individual or institution. There is no trusted institution standing behind it. This is in stark contrast to U.S. currency and reserve balances, which are liabilities of the Federal Reserve Banks, and deposit accounts, which are liabilities of a bank or another regulated depository institution backed by federal insurance up to a specific level. And while a typical cryptocurrency may be used in payments, it is not legal tender, in contrast to U.S. currency. A typical cryptocurrency relies on the use of distributed ledger technology, which provides a new way to keep ownership records and transfer ownership from one user to another, often with little to no information about the identity of the owner. For instance, Bitcoin relies on the blockchain, which is run by anonymous computers all over the world linked together through a ledger of anonymized transactions. Digital currencies use automation via computer processing power, networking via the internet, and cryptography to transfer value from one person to another. What is innovative is that the computer code behind these transactions uses automated checks and balances to validate the sender and receiver, and whether there is enough value in the sender's account to make the payment. Traditionally, this validation would be done by banks and payment networks. Instead, with a cryptocurrency, this validation could be done by anyone with enough computing power and resources to participate. Importantly, this technology is not owned or managed by any entity--regulated or not--that would be responsible for its maintenance, security, and reliability. Rather, its maintenance, security, and reliability are handled by a decentralized developer community, which often lacks strong governance. This combination of a new asset, which is not a liability of any individual or institution, and a new recordkeeping and transfer technology, which is not maintained by any single individual or institution, illustrates the powerful capabilities of today's technologies. 
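To make the automated checks described above concrete, the following is a minimal, purely illustrative sketch--not drawn from any actual cryptocurrency's code--of how a node might validate a transfer before appending it to a shared ledger. The account names and the simplified balance model are hypothetical.

```python
# Toy illustration of automated transaction validation on a shared ledger.
# Hypothetical only: real cryptocurrencies rely on digital signatures and a
# distributed consensus process rather than this simplified account model.

balances = {"alice": 50, "bob": 20}   # hypothetical account balances
ledger = []                            # append-only record of accepted transfers

def validate_and_append(sender: str, receiver: str, amount: int) -> bool:
    """Accept a transfer only if the automated checks pass."""
    if sender not in balances or receiver not in balances:
        return False                   # unknown party: reject
    if amount <= 0 or balances[sender] < amount:
        return False                   # insufficient value in sender's account
    balances[sender] -= amount
    balances[receiver] += amount
    ledger.append({"from": sender, "to": receiver, "amount": amount})
    return True

print(validate_and_append("alice", "bob", 30))   # True: checks pass
print(validate_and_append("bob", "alice", 500))  # False: insufficient balance
```

In an actual system, the identity check would be replaced by verification of a cryptographic signature, and the entry would become final only after the network's consensus process accepted it.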
But there are also serious challenges. For instance, cryptocurrencies have exhibited periods of extreme volatility. If you purchased Bitcoin in December 2017 at a value of over $19,000, your electronic claims would be worth close to half that today. Indeed, Bitcoin's value has been known to fluctuate by one-quarter in one day alone. Such extreme fluctuations limit an asset's ability to fulfill two of the classic functions of money: to act as a stable store of value that people can hold and use predictably in the future, and to serve as a meaningful unit of account that can be used to assign a comparable value of goods and services. In addition to losses, individual investors should be careful to understand the potential for other risks. Cryptocurrencies may raise important investor and consumer protection issues. The lack of strong governance and questions about the applicable legal framework for some cryptocurrencies may make consumers vulnerable to mistakes, thefts, and security breaches without much, or any, recourse. Although the cryptographic technology may be robust to some events, such as the fraudulent double spending of the same units of the cryptocurrency for more than one transaction, the large number of breaches at some cryptocurrency exchanges and wallet providers suggests that significant vulnerabilities may remain with respect to security protections around customers' accounts. These breaches remind us that relying solely on cryptography within the transfer technology is not enough. Ultimately, a more holistic approach to the security of the broader cryptocurrency ecosystem, along with added layers of security on top of cryptography, is likely to be necessary for cryptocurrencies to be widely adopted. Some cryptocurrencies also appear quite vulnerable to money-laundering (Bank Secrecy Act/anti-money-laundering, or BSA/AML) concerns. Because the ledger provides little to no information about the identity of owners of the cryptocurrency, this essentially mimics a bearer instrument--that is, an instrument whereby the holder of the instrument is presumed to be its owner. Further, cryptocurrencies are easy to transfer across borders. Indeed, a cryptocurrency that mimics a bearer instrument and provides significant anonymity in transactions, including across borders, could raise significant concerns regarding the potential to facilitate illicit activities and associated money laundering. For example, electronic instruments can be easily transferred and stored in large amounts, and peer-to-peer transactions outside of the United States could be very hard to prevent and detect. Such instruments appear to have proven susceptible to use in conveying payments to illicit actors--for example, to pay ransoms. Overall, however, the still relatively small scale of cryptocurrencies in relation to our broader financial system and relatively limited connections to our banking sector suggest that they do not currently pose a threat to financial stability. Of course, if cryptocurrencies were to achieve wide-scale use, or their impact were greatly magnified through leverage, the effects could be broader. In particular, adverse developments and shifts in sentiment could cause a global rush to exit this market. As we have seen in other speculative activity in the past, rush-for-the-exits behavior can aggravate price fluctuations, create trading difficulties, and even induce market breakdowns. Thus, we will continue to monitor cryptocurrencies as they evolve, with particular vigilance for any signs of growing materiality to the broader financial system.
Given some of the inherent issues and challenges that cryptocurrencies pose for investor and consumer protection and the prevention of money laundering, some have advocated that central banks should create their own digital forms of currency as more stable and reliable alternatives to cryptocurrencies. After all, a central bank digital currency could overcome the volatility risks associated with an unbacked asset with no intrinsic value by substituting a digital instrument that is the direct liability of the central bank. Moreover, advocates suggest a central bank would be able to develop a transfer mechanism that has robust governance. Even though central bank digital currencies may at first glance appear to address a number of challenges associated with the current crop of cryptocurrencies, this appeal may not withstand closer scrutiny. First, there are serious technical and operational challenges that would need to be overcome, such as the risk of creating a global target for cyberattacks or a ready means of money laundering. For starters, with regard to money laundering risks, unless there is the technological capability for effective identity authentication, a central bank digital currency would provide no improvement over physical notes and could be worse than current noncash funds transfer systems, especially for a digital currency that could circulate worldwide. In addition, putting a central bank currency in digital form could make it a very attractive target for cyberattacks by giving threat actors a prominent platform on which to focus their efforts. Any implementation would need to adequately deal with a variety of cyber threats--especially for a reserve currency like the U.S. dollar. Second, the issuance of central bank digital currency could have implications for retail banking beyond payments. If a successful central bank digital currency were to become widely used, it could become a substitute for retail banking deposits. This could restrict banks' ability to make loans for productive economic activities and have broader macroeconomic consequences. Moreover, the parallel coexistence of central bank digital currency with retail banking deposits could raise the risk of runs on the banking system in times of stress and so have adverse implications for financial stability. Finally, there is no compelling demonstrated need for a Fed-issued digital currency. Most consumers and businesses in the U.S. already make retail payments electronically using debit and credit cards, payment applications, and the automated clearinghouse network. Moreover, people are finding easy ways to make digital payments directly to other people through a variety of mobile apps. New private-sector real-time payments solutions are beginning to gain acceptance in the United States. And the Faster Payments Task Force has laid out a roadmap embraced by a variety of stakeholders for a fast, ubiquitous, and secure payments system to be in place in the United States in the next few years. In short, a multiplicity of mechanisms are likely to be available for American consumers to make payments electronically in real time. As such, it is not obvious what additional value a Fed-issued digital currency would provide over and above these options. It is important for the Fed and other central banks to continue to research these issues as technology evolves, exploring the technical and economic possibilities and limitations of central- bank-issued digital currencies. 
Even though the case for a digital currency for general use may not be compelling, opportunities for more targeted and restricted use may nonetheless prove to have value. The private sector has been exploring a variety of ways of deploying the underlying technologies of digital assets that are native to a particular wholesale platform, to help to facilitate finality of settlement. Such wholesale digital settlement tokens could potentially reduce the time and costs required for wholesale financial transactions. This is being discussed, for instance, for the use cases of interbank payments, securities settlements, and cross-border transactions, where the introduction of a digital token native to a platform may facilitate certain types of settlement. Likewise, it is possible at some point in the future that a limited central bank digital instrument that serves as a settlement asset for wholesale payment and settlement activity may hold some promise. Several central banks have been studying this issue, and we have been actively watching these developments. We are also interested in work that decouples the underlying distributed ledger technology from cryptocurrencies and attempts to build on the benefits of the technology, a topic to which I now turn. Even if cryptocurrencies prove to have a very limited role in the future, the technology behind them is likely to live on and offer improvements in the way we transfer and record more traditional financial assets. Distributed ledger technology could also facilitate other applications that could improve the way we share information, validate possessions, and handle logistics. Recall that distributed ledger technology is the mechanism for recordkeeping and transfer of ownership that underpins cryptocurrencies. Over the past few years, the financial industry has conducted a great deal of research and development on how to adapt the more promising aspects of distributed ledger technology for use with more traditional financial assets. The industry has moved a number of these projects through a series of phases, often developing more incremental changes at first in order to gain confidence in the technology before tackling large projects with significant operational impacts. The industry is making steady progress and some projects could be live in some form this year. Many of the use cases focus on the areas of post-trade clearing and settlement of securities transactions, cross-border payments solutions, and trade finance. The common thread running through these use cases is the presence of operational "pain points" that generate inefficiencies and delays for users. For example, post-trade reconciliation of securities transactions can be a time-consuming and resource-intensive process that involves numerous parties, operational steps, and message flows across the counterparties and their various agents involved in the transactions. Distributed ledger technology has the potential to provide synchronized, real-time views for those counterparties and agents that can speed up the process and reduce errors. For cross-border transactions, the process for sending payments via the existing correspondent banking network can add time and money. Distributed ledger technology could potentially lower the costs and time it takes funds to reach the recipient through more direct connections, reducing the number of intermediaries required to effect the transaction. 
The financial industry has been working on versions of distributed ledger technology that help address a number of concerns, including the loose governance around the maintenance, security, and reliability of the technology for cryptocurrencies. Most projects are organized either as partnerships between technology and financial services firms or through consortia of technology firms, financial firms, and other interested parties. To some degree, these alliances may provide prototype governance arrangements for future technology deployments in financial services. In addition, there are exchanges and clearinghouses that are actively exploring the use of distributed ledger technology, which represent the more traditional model of multilateral organization in the financial markets. Although the governance arrangements may need to evolve over time, one thing that is clear is that strong governance arrangements will be required to provide the coordinated operational and financial risk management for the critical clearing and settlement operations that underpin our financial markets. In addition, the industry continues to make progress on the ability of distributed ledger technology to handle the very large volumes of transactions that take place both in financial markets and in retail payments every day. As I highlighted in 2016, this technical challenge of achieving the necessary scale and throughput is an important hurdle. Much of this challenge has been tied to the time it takes to achieve "consensus" on a distributed ledger. Consensus is the process by which new transactions are broadcast to all the participants, or nodes, in the network and each node accepts those new transactions as valid additions to the ledger. The initial consensus method used by Bitcoin, called "proof of work," is designed to deal with the lack of information and trust among the users of the network by providing tools and incentives to overcome this problem. But it is a highly resource-intensive process that limits the number of transactions that can be processed each second. The proof of work consensus model represents a tradeoff between operational efficiency and scalability, on the one hand, and the ability to operate without sufficient trust or information about the entities in the network, on the other hand. Fundamentally, however, the financial industry does not operate as a trustless network. Rather, the industry has long specialized in the collection and analysis of information about customers and counterparties as a core part of banking operations. Even allowing for the inevitable imperfect information that may result, it would seem natural for the financial industry to be able to leverage institutional information and trust in ways that allow for more efficient methods to achieve consensus than proof of work. Consequently, the industry and the academic community have focused a great deal of attention on various consensus methods that can provide greater scalability either by leveraging trust, which relaxes some operational and incentive constraints, or possibly by devising methods without trust that are much less resource intensive. Some of the technology firms working with the financial industry are taking different approaches in this fast-moving arena. Another important challenge for the industry has been leveraging distributed ledger technology while preserving the confidentiality of transactional information.
At its core, distributed ledger technology is a shared ledger across multiple nodes in a network, likely representing multiple firms and legal entities. Ownership records and transactions flows from accounts on such a ledger are typically copied and stored on all the nodes in the network. The financial industry, however, must develop distributed ledgers that adhere to laws, regulations, and policies that protect important information of the parties and their customers. Clearly, a model where every entity on the network can see everyone else's account holdings and transactions history will not satisfy broad industry confidentiality requirements. In addition, stored data that may be protected cryptographically today may not be protected as the technology continues to advance, which adds even more difficulty and urgency to the work on confidentiality. The industry has been working to develop approaches to preserve confidentiality so that only the authorized parties relevant to a transaction can see the details recorded on the ledger. Some of these approaches involve encrypting data on the ledger so that the ledgers can still be copied across all the nodes in the network, but an entity cannot look at any element of that ledger except for transactions in which it has been involved. Other approaches include so-called zero- knowledge proofs or ring signatures that allow entities to validate transactions without seeing confidential information. Still others are looking at platforms that connect multiple ledgers rather than having one single ledger that is copied across all nodes in the network. While questions remain about the usefulness and viability of each of these approaches, it is important to underscore that preserving confidentiality is an important area of research. Finally, perhaps the biggest potential benefit for payments, clearing, and settlement of distributed ledger technology may be resiliency. Distributed ledger technology may enable a network to continue to operate even if some of the nodes on the network are compromised because of the ability of the other nodes in the network to pick up the slack and continue processing transactions. One challenge going forward will be to understand the implications that the confidentiality tools and different approaches to consensus under consideration may have on the resilience of the distributed ledger. Given that resiliency is a key potential benefit of distributed ledger technology over existing platforms, it is critical to understand the trade-offs between resiliency and a consensus method that focuses on operational speed, or between resilience and confidentiality. It is an exciting time for the financial sector as digital innovations are challenging conventional thinking about currency, money, and payments. Cryptocurrencies are strikingly innovative but also pose challenges associated with speculative dynamics, investor and consumer protections, and money-laundering risks. Although central bank digital currencies may be able to overcome some of the particular vulnerabilities that cryptocurrencies face, they too have significant challenges related to cybersecurity, money laundering, and the retail financial system. Even so, digital tokens for wholesale payments and some aspects of distributed ledger technology--the key technologies underlying cryptocurrencies--may hold promise for strengthening traditional financial instruments and markets. 
I have highlighted a few key areas where the technology is advancing to deal with some important policy, business, and operational challenges. The Federal Reserve is dedicated to continuing to monitor industry developments and conduct research in these vital areas. I remain optimistic that the financial sector will find valuable ways to employ distributed ledger technology in the area of payments, clearing, and settlement in coming years.
r180516a_FOMC
united states
2018-05-16T00:00:00
Trust Everyone--But Brand Your Cattle: Finding the Right Balance in Cross-Border Resolution
quarles
0
Thank you to Professor Scott for inviting me to join this discussion on cross-border resolution and risks of fragmentation. Like many of my international counterparts in the audience, I maintain a deep commitment to cross-border banking and efficient movement of capital and liquidity, which are important contributors to long-term economic growth. And it is with that commitment in mind that I have considered the topic of today's conference: "ring-fencing," beginning with some reflections on what the term means. Many here use the term to describe local capital and liquidity requirements, which are imposed ex ante on local subsidiaries and designed to protect those entities and their creditors from losses. The term is also used to refer to disruptive actions taken by host regulators to seize assets in the moment of crisis. This type of ring-fencing occurs suddenly and unilaterally. Both uses of the term are associated with the risk arising from the stress or failure of a global financial institution; however, whether ring-fencing as I first defined it--prepositioning--is helpful or harmful in minimizing this risk depends on one's perspective. The views of different stakeholders tend to vary depending on whether one is seeking to maximize efficient allocation of resources in good times or minimize losses in stress and, importantly, whether one is a home or host regulator. Before the financial crisis, much of our collective orientation was on maximizing the efficient flow of capital across the globe. This should remain a paramount goal. Yet in the wake of the financial crisis, global regulators have understandably also focused on minimizing the cost of the failure of a global financial institution by mitigating the impediments to cross-border resolution. The single-point-of-entry (SPOE) and bail-in concepts hold particular promise for most large global firms. However, a successful SPOE resolution of a large global firm has not yet been attempted and will require close cooperation among a large number of stakeholders, including both home and host country regulators. This cooperation will be based on an understanding of separate and mutual interests, not on trust alone. So while SPOE creates a potentially workable framework for resolution, setting the conditions for cooperation is critical. I grew up among the ranches of the American West, where we lived by the motto taught to me as a young child: trust everyone, but brand your cattle. This is a theme that will run throughout my remarks today. In addition to setting the stage for effective cooperation, I will focus specifically on the type of ring-fencing that will almost certainly undermine the successful execution of an SPOE resolution--namely the disruptive seizure of assets by host regulators in the moment of stress. As with other elements of our regime, I have been considering whether our current prepositioning requirements for domestic and foreign firms operating in the United States are both minimizing this risk and functioning in an efficient and transparent manner, and I will share some thoughts in that regard. Our vantage point in the United States as a large home and host regulator would counsel that it is sensible to find a middle ground and fine tune our approach as we learn more and global conditions evolve. To enable cooperation and avoid a destabilizing seizure of assets by host regulators, I would submit that all jurisdictions must find a balance of flexibility for the parent bank and certainty for local stakeholders.
Flexibility, or the ability to allocate capital and liquidity to different parts of the group on an as-needed basis, helps to meet unexpected demands on resources and reduces the risk of misallocation and inefficient use of resources. Certainty, or the local prepositioning of capital and liquidity to ensure a firm can satisfy local claimants under stressful conditions, helps to promote cooperation in the context of a cross-border resolution and avoid incentives for more drastic action by host authorities. One's assessment of the optimal balance, as I alluded to earlier, can depend significantly on where one sits in the regulatory constellation. The home regulator, by nature, will logically prefer flexibility in a resolution; consolidated capital and liquidity requirements are most effective if resources can be freely allocated around the consolidated firm where and when they are needed. Flexibility also helps offset the uncertainty in forecasting the location within the consolidated firm where stress may arise. Yet I would also argue that home regulators should recognize host jurisdictions may take action to restrict the flow of resources, or worse yet, demand resources in the moment of crisis, even if the stress does not originally emanate from their location. Such actions tend to both limit the flexibility of the home regulator and undermine cooperation in times of stress. The host regulator, by nature, will prefer the certainty that resources will be available to satisfy local customers and counterparties under stressful conditions. This is particularly the case if the risk of default or the potential local loss given default of a foreign firm is high, and the tolerance for a government-assisted intervention for foreign banks is low. However, the host regulator should also recognize that it is ultimately in its interest for the SPOE resolution of the foreign bank to be successful and, given the uncertainty of the circumstances or location of losses that emerge in an actual stress, adequate flexibility for the parent to deploy resources where needed is likewise in the host regulator's interest. A global bank has other stakeholders who have preferences regarding flexibility or certainty and can take actions that can potentially destabilize the firm. For instance, parent company debt and equity holders may prefer less prepositioning, while local creditors and financial market utilities may prefer more. These other stakeholders can act in ways that can be destabilizing. Building a system that is transparent and is perceived by stakeholders as allocating losses fairly is key in this regard. Finally, I would like to note that the considerations of policymakers in determining the right balance of flexibility and certainty may differ depending on whether the resource at issue is capital or liquidity. In resolution, the most important difference between capital and liquidity is the speed with which financial stress can appear. Liquidity needs are sudden and tend to manifest in all areas of the organization, and the consequences of not meeting liquidity demand-- an immediate default on an obligation--can be grave. Capital needs, however, may be more localized and slower to evolve but are foundational to the execution of an SPOE resolution. It is unlikely that host regulators would be comfortable cooperating in an SPOE resolution strategy without some confidence in the viability of the entities in their jurisdictions. 
Historically, the United States and the United Kingdom have been in a unique position of having large interests as both home country and host country regulators of internationally active banks. Soon, the European Union is likely to assume this privilege as well. We understand that any requirements we impose on foreign banks operating in the United States may well be imposed on U.S. firms operating abroad. In addition, we are operating under a veil of ignorance, as we don't know whether the next firm in distress will be a U.S. firm operating globally or a foreign firm with U.S. operations. This provides us with strong incentives to view the risks from both sides. As the home regulator to U.S. global systemically important banking organizations (G-SIBs), we have used the resolution planning process to set the expectation that a firm appropriately balance prepositioned and centrally managed resources. This expectation is based on the premise that the optimal balance will depend on factors such as the firm's structure and the host jurisdictions in which the firm operates. As such, the Board and the FDIC have asked U.S. G-SIBs to analyze and anticipate capital and liquidity resources needed to ensure the continued operation of material entities in resolution. Regarding capital, the positioning of a U.S. G-SIB's internal total loss absorbing capacity (TLAC) should reflect a balance of certainty--prepositioning internal TLAC directly at material entities--and flexibility--holding recapitalization resources at the parent, known as contributable resources--to meet unanticipated losses at material entities. Regarding liquidity, firms are expected to estimate and maintain sufficient liquidity for material entities, an expectation known as Resolution Liquidity Adequacy and Positioning (RLAP). RLAP expectations are intended to be designed so that liquidity is not "double counted" among home and host jurisdictions, to provide transparency into the location of liquidity across the firm's material entities, and to ensure that liquidity can flow where needed with minimal potential disruption. The RLAP approach is aimed at ensuring that surpluses in one host jurisdiction generally are not relied upon to meet deficits in another host jurisdiction, given the confusion and vulnerabilities such reliance can cause in an actual stress. Specifically, a firm should be able to measure the stand-alone liquidity position of each material entity and ensure that liquidity is readily available either at the parent or at that entity to meet any deficits. As with capital, firms are expected to have a balance of prepositioned and centrally managed liquidity--specifically, by balancing the certainty associated with holding liquidity directly at material entities against the flexibility provided by holding high-quality liquid assets at the parent available to meet unanticipated outflows at material entities. As a host regulator, our approach to local capital and liquidity regulations of foreign banks with large U.S. operations is motivated by the lessons of the recent financial crisis, where many foreign banks operating in the United States suffered severe stress and survived only with extraordinary support from the United States and their home country governments. In addition to increasing the resiliency of the U.S. operations of foreign banks, our approach also reflects both resolution and competitive equity considerations.
From a resolution perspective, our rules seek to ensure that there are sufficient resources in the United States today to ensure that we, as a host regulator, are well positioned to cooperate with a home country authority in the event a firm experiences material stress or failure. Our rules also ensure that we do not have a strong incentive to limit flows or seek additional resources in the moment of crisis, which could be highly destabilizing in a stress event. From a competitive equity standpoint, we believe that U.S. subsidiaries of foreign banks should operate on a level playing field with their domestic counterparts. This is generally consistent with the long-standing treatment of large and complex bank and nonbank subsidiaries around the world. As such, our rules subject a large foreign bank to the same capital and liquidity requirements as domestic bank holding companies by requiring the foreign bank to hold its U.S. subsidiaries through a U.S. intermediate holding company (IHC) and imposing capital and liquidity requirements to the IHC. At the same time, we have adjusted our approach for the U.S. branches of a foreign bank with a large U.S. presence in recognition that branches are subject to a narrower set of permissible activities and operate as a direct extension of the parent bank. For these U.S. branches, we have imposed local liquidity requirements in light of the liquidity vulnerabilities that many U.S. branches of foreign banks experienced in the crisis, but no separate capital requirements. For IHCs that are subsidiaries of foreign-owned G-SIBs, the Federal Reserve requires such a firm to issue a minimum amount of loss-absorbing instruments to its foreign parent, known as internal TLAC, including a minimum amount of unsecured long-term debt. In the event that an IHC was experiencing significant financial distress, the internal TLAC could be used to replenish the IHC's equity and maintain its solvency. The U.S. implementation of internal TLAC is modeled on the internal TLAC framework developed by the Financial Stability Board (FSB), which includes a calibration of the amount of loss-absorbing resources that should be prepositioned in a given jurisdiction. Specifically, the FSB contemplates that internal TLAC requirements of a subsidiary of a foreign bank expected to be resolved through SPOE would be calibrated at 75 to 90 percent of the external TLAC requirement that would apply to the subsidiary if it were to be separately resolved. In implementing the TLAC standards in the United States, the Board calibrated the internal TLAC requirement for IHCs of foreign-owned G-SIBs at the high end of the FSB range, at around 90 percent. There are two principal points I have been making today. The first is that some amount of local capital and liquidity prepositioning can reduce the incentives for damaging and unpredictable seizures of resources by local regulators during times of stress--thus actually reducing the likelihood that improvised, beggar-thy-neighbor ring-fencing would frustrate completion of a successful SPOE resolution in the future. As we learned long ago out West, the branding of cattle creates the possibility of trust. The second point, however, is equally important: the best prepositioning structure is not an eternal verity mathematically deducible from first principles, but it is instead a practical balance designed to promote cooperation among humans, and any such balance is likely to be improvable with experience, reflection, and debate. 
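As a purely arithmetic illustration of the calibration just described (the dollar amount below is hypothetical and not drawn from any actual firm), the internal TLAC requirement scales the requirement that would apply if the U.S. intermediate holding company were resolved on a stand-alone basis:

```python
# Hypothetical illustration of the internal TLAC scaling described above.
# The $10 billion stand-alone figure is invented purely for arithmetic.
standalone_requirement = 10.0   # $ billions if the IHC were separately resolved
fsb_low, fsb_high = 0.75, 0.90  # FSB internal TLAC scaling range
us_scaling = 0.90               # U.S. calibration at the high end of that range

print(f"FSB range: ${standalone_requirement * fsb_low:.1f} to "
      f"${standalone_requirement * fsb_high:.1f} billion")              # $7.5 to $9.0 billion
print(f"U.S. calibration: ${standalone_requirement * us_scaling:.1f} billion")  # $9.0 billion
```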
We are interested in views from the firms and the public on how the regimes can be improved, and we expect to invite public comment on our living will guidance for U.S. and foreign firms in the near future. In addition, we are currently weighing the costs and benefits of our current approach of directing firms to determine the appropriate amount of prepositioned capital and liquidity. We are also considering whether formalizing resolution capital and liquidity requirements through a rulemaking process would improve the predictability and transparency of our approach. We continue to believe that the IHC and attendant requirements are appropriate for foreign banks with large U.S. operations. However, in light of our experience with these structures, I believe we should consider whether the internal TLAC calibration for IHCs could be adjusted to reflect the practice of other regulators without adversely affecting resolvability and U.S. financial stability. The current calibration is at the top end of the scale set forth by the FSB, and willingness by the United States to reconsider its calibration may prompt other jurisdictions to do the same, which could better the prospects of successful resolution for both foreign G-SIBs operating in the United States, and for U.S. G-SIBs operating abroad. Alternatively, it may be possible to streamline the elements of our resolution loss absorbency regime, which include both TLAC and long-term debt requirements. I will be recommending to my colleagues that we look closely at these possibilities in the coming weeks and seek comment on ways to further improve this framework. We are committed to working with other jurisdictions to continue to build the foundation of the SPOE resolution framework. In addition to finding the appropriate balance of flexibility and certainty that I have discussed, we continue to advocate for increasing the standardization in the global implementation of the regulatory capital rules, improving host supervisors' transparency into the global liquidity and capital positions of a G-SIB on a consolidated and deconsolidated basis, and addressing impediments to a successful SPOE resolution. As with all regulations, we will be open to considering adjustments that would improve transparency and efficiency and will continue to reassess our regime as we make advancements in developing the cross-border resolution framework.
r180518a_FOMC
united states
2018-05-18T00:00:00
Keeping Community at the Heart of the Community Reinvestment Act
brainard
0
I want to thank Ben Dulchin and the Association of Neighborhood and Housing Development (ANHD) for inviting me to be here with you today. While New York is known globally as a cultural hub that attracts the world to its doorstep, it is known locally as a city of neighborhoods. Like other cities across this country, New York's future is bound to the vitality of its neighborhoods as places to live, work, learn, play, worship, and invest. Neighborhoods have been the focus of ANHD's work for over 40 years. The organization's success is a function of its unwavering focus on preserving and strengthening the quality of life in low- and moderate-income neighborhoods across the five boroughs. The mission of each of the members of ANHD is to strengthen these communities. Together, your organizations are providing capacity building, advocacy, and services for the benefit of your communities and the city as a whole. Like many of the nation's metropolitan areas, New York City has rebounded since the recession and is thriving. That, however, has not been the experience for many. Many of the city's residents have not fully recovered, and some have fallen further behind. Here, as elsewhere in the country, there remain important gaps in economic opportunity. Powerful research now demonstrates that persistent gaps in economic opportunity are connected to the health of neighborhoods. The effects of place on opportunity can stretch from one generation to the next. Raj Chetty and others have shown that upward mobility can vary immensely by neighborhood even within the same metropolitan area. The longer a child lives in a neighborhood of opportunity--a neighborhood that is racially integrated and has a strong middle class, strong family structures, more social capital, and better schools--the more likely that a child will do better than their parents economically in adulthood and navigating a path of upward mobility. As I have noted previously, these disparities matter for growth and prosperity. If there are large disparities in opportunity, such that enterprise, exertion, and investments reap lower returns in some communities than others, then families and small businesses in these communities will invest less in the future, and potential growth will fall short. Although there are important implications at the national level for growth and prosperity, to the extent that the roots of the disparities in opportunity and mobility lie in local communities, an important part of the solution is likely to be investments in those communities. That one powerful insight--the importance of investment in communities--lies at the heart has provided for improving investment and development in lower-income communities. Implementing this law effectively is one of the important responsibilities of the banking agencies in promoting strong outcomes locally that reverberate nationally. So let me turn to the role of the CRA in supporting local efforts to strengthen low- and moderate-income neighborhoods and offer my preliminary thoughts on the opportunity before us to make the regulations even more effective in this regard. The CRA was enacted in 1977 to combat the legacy of redlining--literally, the demarcation in red ink of neighborhoods deemed too risky for lending. Federal lending agencies used these redlined maps in deciding where to guarantee mortgage loans. Not surprisingly, the resulting deprivation of credit stifled opportunity for the people living in these areas. 
Through the CRA, Congress requires the federal banking agencies to encourage banks and thrifts to help meet the credit needs of the communities they are chartered to serve, including low- and moderate-income neighborhoods, consistent with safety and soundness. It requires the Federal Reserve and other agencies to evaluate how well banking institutions help meet those needs and to assign ratings to their performance. The CRA further requires the agencies to make public both the ratings and their written evaluations of the banks' performance. This transparency provides an incentive for banks to work with their communities to meet the needs of low- and moderate-income members and provides important information to enable community members to engage meaningfully with banks. Research has demonstrated that the CRA has had a positive effect on low- and moderate- income neighborhoods. The CRA is unique in that it puts decisionmaking about the community's needs and priorities in the hands of local stakeholders: financial institutions that lend and invest, community organizations that deliver services and develop real estate, and state and local governments that direct incentives and subsidies. The enactment of other laws around the same time supported the CRA's success, including the Home Mortgage Disclosure Act, which requires lenders to report the location of their home mortgage lending and the race and income of borrowers. Much has changed in the years since the CRA was enacted, including technology-driven changes in the delivery of financial services, and it is high time to consider corresponding improvements to the regulations. Both banks and community organizations have offered a variety of suggestions for improving the CRA regulations and their implementation in the last several years. We are undertaking discussions with the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation, with which we have traditionally issued joint rulemakings on the CRA. The Federal Reserve has long been interested in engaging in an interagency review of the CRA regulations, so we are pleased to participate in this process. In addition, the Treasury Department recently completed an extensive outreach effort related to modernizing the CRA regulations. CRA is a vital tool to address the credit needs of low- and moderate-income communities, and I believe the time is ripe for a refresh to make it even more relevant to today's challenges. There are five principles that can help guide our efforts. First, we should update the area in which the agencies assess a bank's CRA activities while retaining the core focus on place. This is the most important aspect of refreshing the CRA, and also the one that will require the most care. As our Federal Advisory Council noted, "...a bank's ability to engage in meaningful community development efforts continues to require a meaningful knowledge of and presence in local communities." A significant strength of CRA evaluations is that a bank's performance is evaluated taking into account the demographics of its communities, the types of housing and businesses they serve, and the other financial institutions serving those communities. The current regulations use branches and deposit-taking ATMs as a proxy for the communities the bank serves because these were the primary mechanisms for delivering services when the CRA was enacted and for many years subsequently. 
This reflected an environment where interstate banking was not allowed, and physical branches were necessary for deposit taking and lending. Today, changes in technology and consumer preferences have made it possible for banks to serve customers far beyond their physical branches and deposit-taking ATMs--via online and mobile platforms. Even so, as much as technology has made banking transactions more convenient for customers, it has not eliminated the need for branches. For large parts of the country, branches and deposit-taking ATMs remain an important way that banks engage with a community. Branches and ATMs are still necessary for depositing and withdrawing cash. Branches also provide personal service and assistance to consumers and business customers. They provide a presence for lenders to get to know borrowers and the communities in which they live, lend, and invest. As our Community Advisory Council recently emphasized, "[f]or many rural and LMI populations, bank branches remain critical for the provision of bank loans, investments, and services." Recent studies measuring the impact of branch closures on credit availability in neighborhoods demonstrate that branches still matter, particularly with respect to accessing small business credit. The Federal Reserve Bank of New York found that access to small business credit declines and the rates for small business loans increase as the distance between the bank and the borrower grows. Similarly, the large majority of mortgage lending continues to be located in one or more of a bank's delineated assessment areas--that is, near a physical branch. This research also found that people in low-income census tracts were more than twice as likely to live in a banking desert--an area without a branch within 10 miles--as those in higher-income tracts. When they are cut off from mainstream banking institutions, some consumers rely on more expensive alternatives that include payday loans, auto title loans, pawn shops, and check cashing services. Banks vary widely in terms of the types of services they offer and the mechanisms they use for reaching their customers. On one end of the spectrum, many community banks still rely primarily on branches and ATMs to reach their customers, with mobile functionality offered primarily to enhance the customer experience. In the middle of the spectrum, regional and national banking organizations are using online and mobile app capabilities to attract and retain customers well beyond their physical footprint. At the other end of the spectrum, online banks employ business models that reach customers across much or all of America without relying on branches. Although these online banks may provide the same retail products and services as a traditional bank with a branch network, because they have only one location, they are evaluated for CRA purposes in the area around that location, rather than in the many areas where they have customers. Still other banks offer a limited set of credit products or wholesale services. Take, for example, industrial loan companies (ILCs), a type of state-chartered insured depository institution. ILCs are owned by corporations rather than bank holding companies and are typically chartered to provide credit cards or other forms of financing related to their parent company's business. Many large ILCs are chartered in Utah and are evaluated on their CRA performance in Salt Lake City, even though they lend nationally. The result is a saturated market for community development lending and investment in Salt Lake City.
At the same time, other areas in Utah and beyond have important community development opportunities that may go unmet. At a minimum, revised CRA regulations should allow banks with this type of business model to expand CRA activities beyond the area surrounding its branch so that the community and economic development needs of more underserved communities can be met. Branches are important vehicles for reaching small business customers and low-income consumers, but they are not the only way. In considering ways to revise the CRA regulations, the agencies should be thoughtful about how to make the areas in which we evaluate CRA performance more meaningful to both banks and low- and moderate-income communities. For community banks that rely on branches to serve their customers, the current assessment area approach may need only small adjustments. For banks that serve their customers through a variety of approaches, assessing their broader deposit-taking and loan-making footprint might make sense. For large wholesale banks, it might make sense to evaluate CRA activity in a broader area and to encourage them to spread their investments and services to underserved areas. The CRA regulations should be updated in a way that is appropriate for different business models. To the extent that banks are able to serve the needs in low- and moderate-income communities through additional channels, it is important they receive due consideration, while CRA revisions should also continue to recognize the importance of sustaining branches in communities where they are needed. The second, and related, principle guiding our CRA refresh is that the regulations should encourage banks to seek out opportunities in underserved areas. As I noted earlier, the CRA recognizes that banks make a unique contribution. Banks are able to make targeted and valuable investments because of their stake in the local community and their knowledge of it. As a long- term stakeholder in a community, a bank's efforts to finance housing, small businesses, and community services are not just good business short-term, but also good investments long-term as residents improve their economic standing and use more banking services. Streamlining the CRA regulations and clarifying the performance measures could create stronger incentives for banks to pursue the less obvious, but potentially more impactful, projects that low- and moderate-income neighborhoods need the most to become areas of opportunity. We will want to update the CRA in ways that reduce the distortions that lead to some areas becoming credit "hot spots" and others credit deserts. Where there is a high density of banks relative to investment opportunities, the result of too many banks competing for too few CRA-qualified investments can be declining returns. Meanwhile, other areas have a difficult time attracting capital not because the social return on investment is low, but rather because they are not in a bank's major market, if they are served by a bank at all. However we define a bank's assessment areas in the future, the regulations need to be designed and implemented in a way that encourages banks to direct their community investment activities productively. The third guiding principle is that the CRA regulations should be tailored to banks of different sizes and business models. We should set standards that are flexible enough to evaluate the CRA performance of a $100 million bank no less effectively than a $2 trillion bank. 
Banks seek clearer, simpler rules that result in more CRA activity with less burden. We believe this can be done while retaining the flexibility to evaluate a bank's CRA performance in light of its size, business model, capacity, and constraints as well as its community's demographics, economic conditions, and credit needs and opportunities. Regulatory revisions that do not contemplate evaluating CRA performance in context could arguably undermine the CRA's greatest attribute-- its recognition that banks are uniquely situated to be responsive to the most important community and economic development needs in their communities. The typical small community bank focuses on serving its community through deposit and credit products and may not have the capacity to finance a major community development initiative. But as banks grow in size or specialize in different types of lending, they may have greater capacity to invest through additional channels. We should be sensitive to the ways in which a bank's business model, in addition to its size, influences the types of activities it undertakes to meet its CRA obligations. We will want to maintain the flexibility to ensure that, no matter the business line, a bank can meet its CRA obligation by doing what it has the expertise to do well. And as we look to improve our CRA evaluations, the agencies will need to determine what kind of data will be necessary to evaluate a bank's CRA performance based on the activities it chooses and which banks should collect and report that data based on their scale and business model. The fourth principle is that the revised regulations should promote greater consistency and predictability in evaluations and ratings, both within and across the agencies. The members of our Federal Advisory Council have recommended that the regulations "need to be consistent across the [a]gencies and provide for all regulated financial institutions to be subject to the same CRA 'crediting', examination, and remedial standards." Banks have expressed understandable concern regarding the variation they see in evaluations. Banks seek greater clarity in advance about what activities will qualify for CRA consideration. They want to understand how the qualitative criteria--those measuring the impact and responsiveness of loans, investments, and services--will be factored into their ratings. This concern is not limited to banks. The community organizations and local governments that are trying to attract bank financing to their projects need this clarity, too. Regulatory streamlining can help to promote consistency, as would regular examiner training. The fifth and final principle for CRA modernization is to ensure that revised CRA regulations support its position as one of several mutually reinforcing laws designed to promote an inclusive financial services industry. The central thrust of the CRA is to encourage banks to ensure that all creditworthy borrowers have fair access to credit. For banks to be successful in meeting the credit needs of their entire community, it follows that they must guard against discriminatory or unfair and deceptive lending practices. Ensuring fair access to credit is difficult and requires ongoing vigilance. For this reason, taking a holistic view of closely related issues is likely to be the best way to fulfill the purpose of the CRA, as one of several important laws intended to promote fair financial access. 
I want to briefly connect the dots between the high priority we put on strengthening the CRA and the pressing challenge of affordable housing that is a key focus of the work that you all do. Access to affordable housing connects families concretely to place and can be a source of strength or fragility. Pew just released a study showing that 38 percent of renter households in America are spending more than 30 percent of their pretax income on rent, representing a 19 percent increase from 2001. The percent of renter households that are spending half or more of their pretax income on housing increased by 42 percent during the same period. Households that spend a high fraction of their income on rent often may find themselves unable to pursue proven strategies to achieve financial security and invest in their family's future. They may need to turn to costly short-term sources of credit to cover emergency expenses. Some may experience eviction if there is an unexpected expense or loss of income. To manage the high cost of housing, workers may be compelled to live far from where they work, rent substandard units, or resort to overcrowding. Research shows that households that live far from work often pay a large share of their budget in commuting costs. As a society, we need to do better in ensuring that affordable housing is available where it is needed. The supply of affordable housing is a good example of a problem of national scope whose solution must be tailored to local needs and economic conditions. This audience knows better than most about the complex dynamics that affect housing at the local level. Local decisions on land use, zoning, taxes, and leveraging of federal funding, and whether community needs are prioritized, are at the heart of whether neighborhoods thrive. I'm not going to suggest that there are any easy answers to housing affordability. Still, the CRA is one of the important policy levers that can make a difference by encouraging solutions that are tailored to local needs and circumstances. It does this by encouraging banks to provide affordable and sustainable mortgage products to qualified low- and moderate-income families. This helps those families have a chance to purchase homes sustainably and build equity. In addition, by encouraging banks to work with the community to identify tailored rental and homeownership investment opportunities, the CRA helps encourage the construction of affordable housing where it is needed most. In our effort to refresh the CRA regulations, we will continue to honor the purpose of the CRA by encouraging banks to engage in local community and economic development initiatives. I am confident that there are ways to update the areas where we evaluate a bank's CRA performance without losing the core focus on place. We should do more to encourage banks to offer deposit and credit products designed to help rent-burdened customers save for homeownership and build strong credit scores that will enable them to succeed in obtaining mortgage credit on favorable terms. We should do more to encourage banks to lend to the underserved entrepreneurs and small businesses that hold the promise of providing jobs and growing local economies. Even as the economy looks strong overall, significant challenges remain for low- and moderate-income areas, making the CRA and its focus on local credit needs more important than ever. I look forward to hearing from you over the next several months to help inform our interagency effort to refresh the CRA regulations. 
This conference is an opportunity for ANHD members to envision the future for New York's neighborhoods and discuss critical actions to achieve that vision. I know you have a full agenda today, replete with pressing conversations on the compelling challenges that face your neighborhoods. I want to applaud your work to ensure that all of New York's neighborhoods are communities of opportunity. Thank you.
r180525a_FOMC
united states
2018-05-25T00:00:00
Financial Stability and Central Bank Transparency
powell
1
Thank you for inviting me here to celebrate this important milestone. Today is a special day for all of us, since the founding of the Riksbank 350 years ago marked the beginning of central banking. As we meet to discuss the challenges and opportunities the future may hold, it is worth pausing to note that the three and a half centuries since the Riksbank's founding have seen economic growth and dynamism the breadth and duration of which have been unprecedented in world history. The Swedish innovation we celebrate today, I believe, is a vital part of the financial foundations that support the continuation of rising prosperity. In my comments today, I will explore the road ahead for public transparency and accountability of central banks in a time of intense scrutiny and declining trust in public institutions in many places around the world. As you know, the importance of transparency and accountability to monetary policymaking was recognized and became firmly entrenched in practice over the past few decades. The Riksbank has been a leader in this transparency revolution. Today I will focus on the less-often emphasized but critically important role transparency and accountability play in regulatory and financial stability policies. To preview my conclusions, public transparency and accountability around both financial stability and monetary policy have become all the more important in light of the extraordinary actions taken by central banks in response to the Global Financial Crisis. Financial stability policymaking has evolved from managing individual crises as they arise to establishing a policy framework that emphasizes prevention. This framework now includes measures to increase the resiliency of the financial system; enhanced monitoring of financial institutions and of building risks to the system; and measures, such as resolution planning, that require firms to take steps today to better prepare for future episodes of stress. These innovations have placed special demands on transparency and accountability, and we have worked hard to explain them to the public. The framework is still evolving, and we will need to be open to making changes and to new ways to enhance transparency and accountability. This is a challenging moment for central banking. Opinion polls show that trust in government and public institutions is at historic lows. In this environment, central banks cannot take our measure of independence for granted. For monetary policy, the case for central bank independence rests on the demonstrated benefits of insulating monetary policy decisions from shorter-term political considerations. But for a quarter century, inflation has been low and inflation expectations anchored. We must not forget the lessons of the past, when a lack of central bank independence led to episodes of runaway inflation and subsequent economic contractions. As for financial stability, the crisis and the severe recession that followed revealed serious flaws at many private and public institutions, including shortcomings in supervision and regulation. The crisis and its aftermath led central banks to take extraordinary actions, actions that challenged the ingenuity of experts in the field and were understandably difficult to explain and justify to a skeptical public. While these actions were authorized by law and on the whole necessary to avert the complete collapse of the financial system's ability to service households and businesses, they may have also contributed to the erosion of public trust. 
Central banks are assigned narrow but important mandates. For monetary policy, the Fed's mandate is to keep inflation low and stable and to achieve maximum employment. For financial sector supervision and regulation, part of our mandate is to foster the safety and soundness of individual institutions. In addition, we have a responsibility, shared with other government agencies, to promote financial stability. I view this responsibility as being highly complementary to other aspects of our mission: Financial stability promotes sustainable economic growth, and a stable, well-functioning financial system is an effective transmission channel for monetary policy. Indeed, there can be no macroeconomic stability without financial stability. Within our narrow mandates, to safeguard against political interference, central banks are afforded instrument independence--that is, we are given considerable freedom to choose the means to achieve legislatively-assigned goals. While the focus is often on monetary policy independence, research suggests that a degree of independence in regulatory and financial stability matters improves the stability of the banking system and leads to better outcomes. For this reason, governments in many countries, including the United States, have granted some institutional and budgetary independence to their financial regulators. In a democratic system, any degree of independence brings with it the obligation to provide appropriate transparency. In turn, transparency provides an essential basis for accountability and democratic legitimacy by enabling effective legislative oversight and keeping the public informed. Of course, central banks also need to stick closely to our mandates; the case for independence weakens to the extent that central banks stray into issues that the legislature has not assigned to us. There is also an important policy effectiveness argument in favor of transparency. In the financial stability arena, there is no better example of this than the role that the first round of stress tests played during the crisis in restoring confidence in the U.S. banking system. So in the financial stability realm, the case for enhanced transparency is not just about being accountable; it is also about providing credible information that can help restore and sustain public confidence in the financial system. The post-crisis regulatory system recognizes the importance of enhanced transparency, both about financial institutions themselves and about the processes and expectations of regulators and supervisors. Before the crisis, supervision focused on the safety and soundness of individual institutions and was insufficiently attentive to risk in the financial system as a whole. Supervisory judgments about firms were shared with the public only in rare and exceptional circumstances. Financial stability tools were deployed after the fact, to address specific events that emerged to threaten stability. It is an understatement to say that this approach proved inadequate in the crisis. The post-crisis regime has shifted to implementing preventive policies well in advance of any crisis. Newly established ex ante policies include building the resilience of institutions by requiring more and higher-quality capital and liquidity buffers; a regime of stress tests undertaken by supervisors; and resolution planning, which requires firms to analyze their own potential for distress or failure and create a plan to be used in the event of bankruptcy. 
These post-crisis policies have benefited from public solicitation of feedback and in many cases from consideration in open meetings of the Board of Governors. Transparency and incorporation of public feedback in these areas have produced more effective supervision and regulation. For example, transparent and clearly communicated policies make it easier for regulated entities to know what is expected of them and how best to comply. Of course, as with any large-scale, complex undertaking, the standards adopted over the past decade can undoubtedly be improved. At the Fed, we are committed to transparency as we assess the efficacy and efficiency of post-crisis reforms. In a sense, stress testing is itself a step forward in transparency. Pre-crisis, supervisors' views of the risks facing our most systemically important firms--and the firms' ability to understand and survive these risks--were shrouded in secrecy. Post-crisis, as part of our stress-testing regime, these supervisory views and expectations are transparent. We expect that these firms will have capital, liquidity, and risk-management capabilities that are adequate for the firms not only to survive, but to continue to perform their key functions even in the event of truly severe stress, akin to the global financial crisis. We make a great deal of information regarding the stress tests public, including the scenarios we use, portfolio-level projected losses for participating firms, and, of course, the results. We have also proposed for public comment a range of ways to further enhance the transparency of the supervisory stress tests. This detailed disclosure provides the public with a wealth of information on how these institutions would perform under severe stress. And this transparency both enhances public confidence and holds banking regulators accountable for their judgments. At the Federal Reserve we use a variety of additional means to enhance public understanding of our supervisory and financial stability efforts and judgments. The Board's Vice Chairman for Supervision testifies before the Congress twice a year. The Board staff's assessment of financial stability is discussed four times a year at Federal Open Market Committee meetings, and these discussions are summarized in the meeting's published minutes. And, since 2013, the semiannual Monetary Policy Report to the Congress has contained a review of financial stability conditions. The post-crisis framework remains novel and unfamiliar. Some of these new policies, such as stress testing and resolution planning, are inherently complex and challenging for all involved. As a result, transparency and accountability around financial stability tools present particular challenges. We will continue to strive to find better ways to enhance transparency around our approach to preserving financial stability. Efforts to engage with the public--including consumer groups, academics, and the financial sector--are likely to lead to improved policies. Moreover, ongoing dialogue will work to enhance public trust, as well as our ability to adapt to new threats as they emerge. There is every reason to expect that technology and communications will continue to rapidly evolve, and to affect the financial system and financial stability in ways that we cannot fully anticipate. While future innovations may well improve the delivery of financial services and make the system stronger, they may also contain the seeds of potential future systemic vulnerabilities.
We will need to keep up with the pace of innovation, which will doubtless require changes to our approach to financial stability. As we consider such changes, it will remain critically important to provide transparency and accountability. By doing so, we strengthen the foundation of democratic legitimacy that enables central banks to serve the needs of our citizens, in the long and proud tradition of the Riksbank.
r180531a_FOMC
united states
2018-05-31T00:00:00
Sustaining Full Employment and Inflation around Target
brainard
0
I appreciate the opportunity to join the Forecasters Club to discuss the path ahead for our economy and monetary policy. In the months ahead, I expect to see tightening resource utilization in the U.S. economy as rising fiscal stimulus reinforces above-trend growth. Continued gradual increases in the federal funds rate are likely to be consistent with sustaining strong labor market conditions and inflation around target, with the balance sheet running off gradually and predictably in the background. This outlook suggests a policy path that moves gradually from modestly accommodative today to neutral--and, after some time, modestly beyond neutral--against the backdrop of a longer-run neutral rate that is likely to remain low by historical standards. Let me consider each element in turn. Although indicators of economic activity were on the soft side earlier in the year, the outlook for the remainder of 2018 remains quite positive, supported by sizable fiscal stimulus as well as still-accommodative financial conditions. In the latest report, real gross domestic product (GDP) increased 2.2 percent at an annual rate in the first quarter of 2018, a slowdown from the 3 percent pace in the final three quarters of 2017. The first-quarter slowdown was especially noticeable in consumer spending, which increased at only a 1 percent pace last quarter, compared with 2-3/4 percent in 2017. By contrast, business fixed investment increased 9 percent at an annual rate last quarter, surpassing its robust 2017 pace. I expect real GDP growth to pick up in the next few quarters. In particular, the fundamentals for consumer spending are favorable: Income gains have been strong, consumer confidence remains solid, and employment prospects remain bright. And business investment should remain solid, with drilling and mining bolstered by increased oil prices. Moreover, the sizable fiscal stimulus that is in train is likely to provide a tailwind to growth in the second half of the year and beyond. From a position of full employment, the economy will likely receive a substantial boost from $1.5 trillion in personal and corporate tax cuts and a $300 billion increase in federal spending, with estimates suggesting a boost to the growth rate of real GDP of about 3/4 percent this year and next. In short, with a tightening labor market and inflation near target, fiscal stimulus in the pipeline suggests some risk to the upside. By contrast, recent developments abroad suggest some risk to the downside. Global growth has been synchronized over the past year, but recent developments pose some risk. Political developments in Italy have reintroduced some risk, and financial conditions in the euro area have worsened somewhat in response. With some uptick in political uncertainty, and inflation still below target in the euro area and Japan, monetary policies among the advanced economies look likely to be divergent for some time. In addition, some emerging markets may find conditions more challenging. An environment with a strengthening dollar, rising energy prices, and the possibility of rising rates raises the risks of capital flow reversals in some emerging markets that have seen increased borrowing from abroad. Although stresses have been contained to a few vulnerable countries so far, the risk of a broader pullback bears watching. In addition, uncertainty over trade clouds the horizon. An escalation in measures and countermeasures--although an outside risk--could prove disruptive at home and abroad.
Here at home, the labor market is strong. So far this year, payroll gains have averaged 200,000 per month, sufficient to put further downward pressure on unemployment. Indeed, the unemployment rate moved down to 3.9 percent in April following six consecutive months at 4.1 percent. The unemployment rate for African Americans dropped in April to 6.6 percent, which is the lowest level recorded since this series began in 1972 but still high relative to other groups. It is difficult to know how much slack remains. April's 3.9 percent unemployment rate was the lowest reading since December 2000. If the unemployment rate falls another couple of tenths--which seems likely, based on recent trends--it will be at its lowest level since 1969. Although the late 1960s marked the beginning of what is now called the Great Inflation, it is worth keeping in mind that there have been important shifts in the labor market since that time. For example, educational attainment is much higher today than it was in the 1960s, and college degree holders tend to have much lower unemployment rates, on average, than those with a high school degree or less. While the unemployment rate is now lower than before the financial crisis, the employment-to-population ratio for prime-age workers remains about 1 percentage point below its pre-crisis level. It is an open question what portion of the prime-age Americans who are out of the labor force may prove responsive to tight labor market conditions. While it is difficult to know with precision how much slack still remains, I am seeing more evidence that labor markets are tightening, and wages are accelerating, although at a measured pace. The 12-month change in the employment cost index (ECI) for private industry workers in the first quarter was 2.8 percent, up from 2.3 percent in the year-earlier period. By way of comparison, in the period from 2005 to 2007, just before the financial crisis, the ECI rose a bit more than 3 percent at an annual rate. I have also been hearing anecdotes of labor market shortages in particular occupations and sectors, echoing a theme in our recent Beige Book. Going forward, I will be looking for confirmation in other measures of wages that labor market tightness is feeding through to wage gains. Turning to the second leg of our dual mandate, in the most recent data, the trailing 12-month change in core personal consumption expenditures (PCE) prices was 1.8 percent, up from a year earlier, when core PCE prices increased only 1.6 percent. Overall PCE prices, which include the volatile food and energy sectors, increased 2.0 percent, largely reflecting the recent run-up in crude oil prices. While the recent core PCE data are somewhat encouraging, we will want to see inflation coming in around target on a sustained basis after seven years of below-target readings. As I have noted before, the persistence of subdued inflation, despite an unemployment rate that has moved below most estimates of its natural rate, suggests some risk that underlying inflation--the slow-moving trend that exerts a pull on wage and price setting--may have softened. For example, some survey measures of longer-run inflation expectations are currently lower than they were before the financial crisis, as are most estimates based on statistical filters. Inflation compensation has moved up recently but is still running somewhat below levels that prevailed before the crisis. Re-anchoring underlying inflation at the Federal Open Market Committee's (FOMC) 2 percent objective is an important goal.
Recent research has highlighted the downside risks to inflation and inflation expectations that are posed by the effective lower bound on nominal interest rates, and it underscores the importance of ensuring underlying inflation does not slip below target in today's new normal. In that regard, if we were to see a mild, temporary overshoot of the inflation target, this could well be consistent with the symmetry of the FOMC's target and may help nudge underlying inflation back to target. In short, it is reassuring to see core PCE inflation moving up, along with market- based measures of inflation compensation retracing earlier declines. After seven years of below-target inflation, it will be important to see inflation coming in around target on a sustained basis to be confident that underlying trend inflation is running at 2 percent. Even though longer-term Treasury yields have moved up, on net, since the beginning of the year, there has been growing attention of late to the possibility of an inversion of the yield curve--that is, circumstances in which short-term interest rates exceed long-term interest rates on Treasury securities. Historically, yield curve inversions have had a reliable track record of predicting recessions in the United States. Since 1960, there has only been one case where the 3-month Treasury yield has moved above the 10-year Treasury yield and a recession has not followed--in 1966. This correlation between yield curve inversions and recessions might arise for a variety of reasons. First, let us take a case where short-term rates rise relative to long- term rates. When the FOMC is undertaking a deliberate tightening in policy, short-term interest rates typically rise, as do expectations of short-term interest rates in the medium term, while interest rates in the distant future may be less affected. For example, if short- term interest rates were raised to stabilize temporary swings in the economy, the logic of the expectations hypothesis would suggest that long rates would not rise as much. And if tighter monetary policy were to weaken the economy with a lag, this would lead to long rates not rising by as much or at all. Second, let us take a case where long-term rates decline relative to short-term rates, perhaps reflecting a flight to safety. If market participants become concerned about a future macroeconomic risk that could lead to a weaker economy, this concern would tend to lower expected longer-term interest rates, both because monetary policy would be expected to become more accommodative in the future and because market participants may increase their relative holdings of safe assets, such as Treasury securities. In this case, longer-dated Treasury yields may fall, and if short-term interest rates do not adjust commensurately, the yield curve will invert ahead of a weaker economy. Turning to current conditions, the spread between the 10-year and 3-month Treasury yields has declined from 375 basis points in early 2010 to about 125 basis points in the first quarter of this year. While that represents a considerable flattening, the current spread between the 10-year and 3-month yields is only about 20 basis points narrower than the average over the 45 years before the financial crisis. As we try to assess the implications of this flattening of the yield curve, it is important to take into account the very low level of the current 10-year yield by historical standards. 
For the 20 years before the crisis, the 10-year Treasury yield averaged about 6-1/4 percent, compared with recent readings around 3 percent. One reason the 10-year Treasury yield may be unusually low is that market expectations of interest rates in the longer run may be unusually low. A second reason may be that the term premium--the extra compensation an investor would demand for investing in a 10-year bond rather than rolling over a shorter-dated instrument repeatedly over a 10-year period--has fallen to levels that are very low by historical standards. According to one estimate from Federal Reserve Board staff, the term premium has tended to be slightly negative in recent years. By contrast, when the spread between the 10-year and 3-month Treasury yields was at its peak in early 2010, this measure of the term premium was close to 100 basis points. Other things being equal, a smaller term premium will make the yield curve flatter by lowering the long end of the curve. With the term premium today very low by historical standards, this may temper somewhat the conclusions that we can draw from a pattern that we have seen historically in periods with a higher term premium. With a very low term premium, any given amount of monetary policy tightening will lead to an inversion sooner so that even a modest tightening that might not have led to an inversion in the past could do so today. There are a number of possible explanations of the low level of the term premium. The asset purchases of the Federal Reserve and other central banks may be contributing factors. The goal of these policies was to lower longer-term interest rates--and in many cases, expressly by lowering term premiums. A number of studies suggest that these policies have indeed been successful in lowering term premiums. A second reason the term premium may be lower than in the past is the changing correlation between stock and bond returns, likely caused by changes in expected inflation outcomes. While in the 1970s and 1980s stock and bond returns tended to be positively correlated, more recently the correlation has tended to be negative. With an inverse correlation, bonds recently have been a good hedge for stocks, and that correlation may have contributed to lower bond term premiums by increasing the demand for bonds as an instrument for hedging portfolio risks. This changed correlation between stock and bond returns in turn may be related to better anchored inflation expectations following a long period of low and stable inflation. Looking ahead, it seems likely the term premium will increase somewhat, although perhaps not to the levels seen historically. On the one hand, a continued gradual runoff of the balance sheet of the Federal Reserve and reduced bond buying by other central banks will tend to put upward pressure on the term premium. On the other hand, the FOMC's demonstrated commitment to maintaining low and stable inflation makes it unlikely that expectations of high inflation will reemerge. Thus, on balance, while term premiums may recover somewhat from their recent depressed values, it is unlikely they will return to the high levels of earlier decades. In the FOMC's most recent Summary of Economic Projections (SEP), the federal funds rate is projected to reach its longer-run value by 2019 and exceed it in 2020. If the 10-year term premium were to stay very low, that path would likely imply a yield curve inversion. But for the reasons I just noted, if the term premium remains low by historical standards, there would probably be less adverse signal from any given yield curve spread.
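To make the term-premium logic concrete, a stylized decomposition that is standard in the term structure literature (the notation here is illustrative and is not drawn from the speech) writes an n-year yield as the average of expected future short rates plus a term premium: $$ y_t^{(n)} = \frac{1}{n}\sum_{i=0}^{n-1} E_t\!\left[ r_{t+i} \right] + TP_t^{(n)}, \qquad \text{slope}_t = y_t^{(10)} - y_t^{(0.25)}. $$ Under this decomposition, a lower $TP_t^{(10)}$ compresses the slope for any given expected path of short rates, which is why a given amount of policy tightening can invert the curve sooner when the term premium is unusually low.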
It is important to emphasize that the flattening yield curve suggested by the SEP median is associated with a policy path calibrated to sustain full employment and inflation around target. So while I will keep a close watch on the yield curve as an important signal on how tight financial conditions are becoming, I consider it as just one among several important indicators. Yield curve movements will need to be interpreted within the broader context of financial conditions and the outlook and will be one of many considerations informing my assessment of appropriate policy. As suggested by the SEP median path, I believe that the forward-guidance language in the Committee statement that was introduced a few years ago that "the federal funds rate is likely to remain, for some time, below levels that are expected to prevail in the longer run" is growing stale and may no longer serve its original purpose. For purposes of comparison, in March 2016, the median of SEP projections for the federal funds rate path had the funds rate rising to 3.0 percent and remaining below the longer run value that was projected to be 3.3 percent. A year later, the median projection of the longer-run federal funds rate fell. In the March 2018 SEP, the median projection of the federal funds rate peaks at 3.4 percent in 2020--1/2 percentage point above the median projection of its longer-run value of 2.9 percent. It is worth noting that this progression reflects a decrease in the long-run federal funds rate as much as an increase in the medium-run federal funds rate. In an environment of tightening resource utilization and above-trend growth, with sizable fiscal stimulus likely to provide a boost to demand in the near-to-medium term that should fade somewhat further out, it seems likely that the neutral rate could rise in the medium term above its longer-run value. I expect current tailwinds to boost the neutral rate gradually over the medium term but leave little imprint on the long-run neutral rate. The short-run level of the neutral rate should rise gradually because the forces that are moving the economy from headwinds to tailwinds are likely to play out gradually. Although the tax cuts are already in place, their effects may not be fully felt for a few years, and the spend-out from the recent budget agreement may occur with some delay. A gradual pace is also warranted in light of the long period of undershooting the inflation target. I would not underestimate the challenge of calibrating monetary policy to sustain full employment and re-anchor trend inflation around 2 percent, while adjusting to sizable stimulus at a time when resource constraints are tightening and the economy is growing above trend. I continue to view gradual increases in the federal funds rate as the appropriate path, although I will remain vigilant for the emergence of risks and prepared to adjust if conditions change.
r180620a_FOMC
united states
2018-06-20T00:00:00
Monetary Policy at a Time of Uncertainty and Tight Labor Markets
powell
1
Nine years into an expansion that has sometimes proceeded slowly, the U.S. economy is performing very well. Growth is meaningfully above most estimates of its long-term trend--though admittedly, that trend is not as strong as we would like it to be. The labor market is particularly robust, with unemployment at its lowest level since April 2000. Inflation has moved up close to our 2 percent objective, although we have yet to see it remain near that objective on a sustained basis. Today, most Americans who want jobs can find them. High demand for workers should support wage growth and labor force participation--the latter a measure on which the United States now lags most other advanced economies. A tight labor market may also lead businesses to invest more in technology and training, which should support productivity growth. And groups such as some racial and ethnic minorities that still have higher unemployment and lower participation rates could see increasing benefits from a tight labor market. In short, there is a lot to like about low unemployment. Achieving our statutory goal of maximum employment in a context of price stability and financial stability is both our responsibility and our challenge. Earlier in the expansion, as the economy recovered, the need for highly accommodative monetary policy was clear. But with unemployment low and expected to decline further, inflation close to our objective, and the risks to the outlook roughly balanced, the case for continued gradual increases in the federal funds rate is strong. At 3.8 percent, the unemployment rate is below most estimates of its long-run level, which are now clustered in the mid-4s. Many other labor market indicators also suggest an economy near full employment. To name just a few, these indicators include an elevated level of job vacancies. For the first time since the Labor Department began collecting the data in 2000, there are now more job vacancies than there are people counted as unemployed. In addition, the rate at which workers are quitting their jobs is elevated, a sign that workers are able to find another job when they seek one. And surveys show that businesses are finding it difficult to fill vacancies, and that households perceive jobs as plentiful. Some other indicators are less clear. The labor force participation rate of prime-age workers has moved up in recent years but remains below pre-crisis levels. In addition, wage growth has been moderate, consistent with low productivity growth but also an indication that the labor market is not excessively tight. Looking ahead, the job market is likely to strengthen further. Real gross domestic product in the United States is now reported to have risen 2-3/4 percent over the past four quarters, well above most estimates of its long-run trend. Expansionary fiscal policy is expected to add to aggregate demand over the next few years. Many forecasters expect the unemployment rate to fall into the mid-3s and to remain there for an extended period. If that comes to pass, it will mean the lowest unemployment in the United States since the late 1960s. Because we have so little experience with very low unemployment, it is interesting to compare today's labor market with that earlier period. Unemployment was below 4 percent from February 1966 through January 1970. During that time, inflation as measured by the price index for personal consumption expenditures increased from below 2 percent in 1965 to about 5 percent in 1970.
In hindsight, unemployment is now widely thought to have been unsustainably low at that time and to have contributed to escalating inflation. But how significant is this precedent for today? The U.S. economy has changed in many ways over the past 50 years. By some estimates, the natural rate of unemployment is substantially lower now. The Congressional Budget Office now estimates that the natural rate was about 5-3/4 percent (and rising) in the late 1960s. Rising education levels do point to a decline in the natural rate since the 1960s, because more highly educated people are less likely to be unemployed. The share of the population with a college degree has risen from less than 15 percent in the late 1960s to nearly 40 percent now, and the share with less than a high school degree has declined from 45 percent to about 5 percent. Another important difference from the 1960s is that inflation has been low and stable for an extended period, which has better anchored inflation expectations. Today policymakers have a greater appreciation of the role expectations play in inflation dynamics and a clearer commitment to maintaining low and stable inflation. Unfortunately, with the passage of a half-century and important changes in the structure of our economy and in central bank practices, in my view the historical comparison does not shed as much light as we might have hoped. The lack of useful historical precedent leaves us with some uncertainty about the answers to several important and challenging questions. First, estimates of the natural rate by FOMC participants and others have drifted lower as unemployment has declined without much apparent reaction from inflation. How reliable are these estimates? Natural rate estimates have always been uncertain, and may be even more so now as inflation has become less responsive to the unemployment rate. The anchoring of expectations is a welcome development and has likely played a role in flattening the Phillips curve. But a flatter Phillips curve makes it harder to assess whether movements in inflation reflect the cyclical position of the economy or other influences. Second, what would be the consequences for inflation if unemployment were to run well below the natural rate for an extended period? The flat Phillips curve suggests that the implications for inflation might not be large, although a very tight labor market could lead to larger, nonlinear effects. Research on this question is ambiguous, again reflecting the limited historical experience. We should also remember that where inflation expectations are well anchored, it is likely because central banks have kept inflation under control. If central banks were instead to try to exploit the nonresponsiveness of inflation to low unemployment and push resource utilization significantly and persistently past sustainable levels, the public might begin to question our commitment to low inflation, and expectations could come under upward pressure. So far, we see no signs of this. If anything, some measures of longer-term inflation expectations in the United States have edged lower in recent years. Third, can persistently strong economic conditions pose financial stability risks? Of course, strong economic conditions are a good thing! Such conditions can make the financial system better able to absorb shocks through strong balance sheets and investor confidence. But we have often seen confidence become overconfidence and lead to excessive borrowing and risk-taking, leaving the financial system more vulnerable.
Indeed, the fact that the two most recent U.S. recessions stemmed principally from financial imbalances, not high inflation, highlights the importance of closely monitoring financial conditions. Today I see U.S. financial stability vulnerabilities as moderate and broadly in line with their long-run averages. While some asset prices are high by historical standards, I do not see broad signs of excessive borrowing or leverage. In addition, banks have far greater levels of capital and liquidity than before the crisis. Fourth, while persistently strong economic conditions can pose risks to inflation and perhaps financial stability, we can also ask whether there may be lasting benefits. As I mentioned, a tight labor market could draw more people into the labor force. In fact, as the labor market has tightened, more workers have been moving back to work and off disability rolls. There could also be benefits to productivity and potential growth. All told, though, the persistence of any such "positive hysteresis" benefits is uncertain, since, again, the historical evidence is sparse and inconclusive. As is often the case, in the current environment, significant uncertainty attends the process of making monetary policy. Today, with the economy strong and risks to the outlook balanced, the case for continued gradual increases in the federal funds rate remains strong and broadly supported among FOMC participants.
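To illustrate why a flatter Phillips curve makes it harder to read the economy's cyclical position from inflation, consider a stylized expectations-augmented relation (the notation and parameters are illustrative and are not drawn from the speech): $$ \pi_t = \pi_t^{e} - \kappa\,(u_t - u_t^{*}) + \varepsilon_t, $$ where $\pi_t^{e}$ is expected inflation, $u_t^{*}$ is the natural rate of unemployment, and $\kappa$ is the slope. When $\kappa$ is small, even a sizable unemployment gap moves inflation very little, so observed inflation provides only a weak signal about how far unemployment is from $u_t^{*}$, and estimates of the natural rate inferred from inflation become correspondingly less reliable.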
r180627a_FOMC
united states
2018-06-27T00:00:00
America's Vital Interest in Global Efforts to Promote Financial Stability
quarles
0
Good afternoon. It is a particular honor for me to be here to address the Utah Bankers Association, which is like coming home in two ways: First, to Sun Valley, which is our family's deeply rooted second home--my wife's great grandfather established the first sawmill in the Wood River Valley near Hailey almost 150 years ago--and second, to Utah, the place that, like all of you, I dearly love and where I have always lived, despite a career that seems determined to keep taking me elsewhere. One of those elsewheres is Washington, and as a Utahan who has spent most of his career in the private sector advising and investing in the banking industry, I think I have a pretty good idea of how things look, from your vantage point, when someone from Washington shows up to give a speech. Since the financial crisis, bankers have had to adjust to challenging and evolving economic conditions and to many new regulations. At times, smaller and regional banks have been left wondering how actions in Washington focused on systemic vulnerabilities and the largest institutions were relevant to how they fund their businesses and in turn finance the aspirations of families, farmers, ranchers, and other entrepreneurs. I'll start today by trying to address those questions and provide a brief update on steps the Congress and the Federal Reserve are taking related to financial regulation. A decade after the crisis, implementation of the major post-crisis reforms is largely complete, and we have entered a new phase that is aimed at reviewing and improving regulations to ensure that they are achieving their aims in the most effective and efficient manner. I will explain more about that approach, and, relevant to your businesses, efforts to tailor regulation so it is appropriate for the size and business model of different institutions. I will also briefly describe pending regulatory changes passed last month by Congress, which I believe will further tailor regulations for banks, with particular benefit to community and regional banks. But I want to devote much of my time today to a broader message about the connection between these improvements in post-crisis regulation and the fundamental purpose of those regulations: to do what must be done to protect our economy from another severe financial crisis. Banks of all sizes have a shared interest in ensuring that regulation is efficient and appropriately tailored to promote a strong, fair, and competitive market for financial services. Likewise, banks have a shared interest in ensuring that regulation overall promotes a strong and stable financial system that keeps credit flowing to households and businesses in the communities you serve. Among the truths revealed by the financial crisis, one of the most important was the recognition that the vulnerabilities that had developed in the financial system were global in nature and that the problems our institutions and markets faced in the United States were inextricably connected to conditions and decisions outside our borders. Other governments likewise found that problems in the United States spilled over to their financial systems and economies. To cite just one example, it is well known now that the rapid growth of securitization of residential mortgages in the United States was a prominent factor driving up home lending and driving down lending standards. I think it is not as well known that a large share of those securities were being created, traded, and held by entities outside the United States.
Some of the most important steps taken since the crisis to make our financial system more resilient have involved collecting information, identifying and monitoring stresses in the global financial system, and establishing and raising international standards. As I have noted, the improvements the Federal Reserve is making to financial regulation here in America, including tailoring, will help level the playing field for banks and help ensure you are able to continue to compete and serve your customers. The benefits of this for Utah banks are clear. But banks in Utah and elsewhere also benefit from a strong and stable global financial system, and as history has demonstrated, this in turn depends on strong international standards that help level the playing field. A strong and stable financial system depends also on transparency that helps both the private sector and regulators detect and deter vulnerabilities that could harm the U.S. economy. So I'd also like to talk to you today about one of the important international bodies created since the crisis to promote global financial stability, the Financial Stability Board (FSB), and tell you why I believe America's active participation in the FSB is important to our nation, and even, as remote as it might seem, relevant to your businesses. But let me begin with a topic of more immediate interest and offer a brief overview of legislation and regulatory action by the Federal Reserve that I know is important to you and your institutions. First, a little context: like our economy, the condition of the U.S. banking industry is strong. First quarter profits for all banks hit a new record of $56 billion. Banks are well capitalized and positioned to increase lending to finance investment in a strengthening economy. Community banks are also doing well. According to Federal Reserve data on more than 5,000 community-based holding companies, community banks reported net income of $20.6 billion during 2017, up 4 percent from the year before. As with larger banks, this result was the product of particularly strong loan activity, with recent year-over-year loan growth of 7.7 percent, substantially above the increase last year in the banking industry as a whole. Turning to recent regulatory developments, the big news, of course, is the Crapo bill, passed by Congress at the end of May and signed by the President. Before I get to that, let me briefly mention some things the Federal Reserve and other agencies have done--in some cases presaging steps taken in the new legislation--to reduce the regulatory burden on community and regional banks. One supervisory improvement is a Federal Reserve program called Bank Exams Tailored to Risk, or the BETR program. It uses financial metrics to differentiate the level of risk between banks before examinations and assist examiners in tailoring examination procedures to minimize the regulatory burden for firms that engage in low-risk activities, while subjecting higher-risk activities to more testing and review. Another initiative has been to shift a significant amount of the Federal Reserve's examination activity offsite. Additionally, the Federal Reserve, along with other agencies, took action to simplify the reporting responsibilities of smaller banks with a new streamlined Call Report form in 2017. 
Based on feedback from community banks, we and other regulators also increased the threshold for requiring an appraisal on commercial real estate loans. We are also reviewing our approach to determining "control" under the Bank Holding Company Act, a change that could help banks raise capital and facilitate nonbank investments. I will now discuss the new law, which preserves the most important post-crisis reforms for the largest firms while directing the Federal Reserve and other agencies to make numerous changes that should reduce the regulatory burden for community and regional banks. On the Volcker rule, the legislation calls for exempting the vast majority of banks with $10 billion or less in assets from reporting requirements, which the Federal Reserve supported, given how little trading activity community banks engage in. This overtook efforts by the Fed and other regulatory agencies to refine the Volcker rule, but the bottom line is that this broad exemption is law and in the process of being implemented. Another change in the new legislation raises the asset threshold for bank holding companies covered by the Small Bank Holding Company Policy Statement from $1 billion to $3 billion. The law also exempts bank holding companies with $50 billion to $100 billion in assets from enhanced prudential standards and exempts banks with less than $100 billion in assets from future stress testing. The lifting of this threshold importantly allows the Federal Reserve to tailor its rules for these firms moving forward while retaining the ability to protect the safety and soundness of the system. I mentioned steps related to Call Report streamlining, and the legislation addresses this topic also, allowing reduced Call Report requirements for certain banks with less than $5 billion in assets. For banks that are well managed and well capitalized, the asset threshold for a longer, 18-month examination cycle was raised from $1 billion to $3 billion. The legislation also exempts, under certain circumstances, rural properties securing loans of less than $400,000 from appraisal requirements. A common theme in the legislation and the Fed's steps to improve our regulation and supervision is tailoring. As the Fed continues to evaluate the effectiveness and efficiency of regulations, I expect tailoring will be a guiding principle. Let me now address international efforts to promote financial stability, specifically those centered in the Financial Stability Board. In the run-up to the crisis, as I'm sure you all know, decades of relative stability in the United States had left both the financial industry and government agencies complacent about potential threats. And even though financial crises had occurred during that time in some advanced economies, it is fair to say that the United States and other nations did not place a high probability on a crisis that could be global in nature. As a result, international coordination and collaboration on financial stability was limited, and there was a shortage of detailed and standardized information about financial conditions and vulnerabilities in different countries. As the crisis descended and the global nature of the problems became clear, the United States and other major economies, working through the Group of Twenty nations, created the Financial Stability Board to coordinate their efforts to stabilize the global financial system, reform international financial regulation, and share information. 
The FSB includes central banks, finance ministries, and regulators from 24 nations, the European Union, and also international organizations such as the International Monetary Fund and important global financial standard-setting bodies. Unlike other global organizations, the FSB includes multiple agencies from each government in recognition of the fact that financial stability is a responsibility shared across many parts of any government; from the United States, the Federal Reserve, the Treasury Department, and the Securities and Exchange Commission are members. Some of you may reasonably be wondering, at this point of the speech, how we got from rural appraisals in Utah to the Financial Stability Board in Switzerland. How are the conditions in 2008 and 2009 that led to creation of the FSB relevant to community banking? Let us remind ourselves how that global financial crisis and ensuing recession looked to communities in Utah and the bankers who serve them. Community banks, as we all know, engaged in little of the risky activity that was the basis of the crisis. But few community banks, I think, were unaffected by the competitive forces that were unleashed in the years leading up to 2008. When short-term wholesale funding froze up, and securitizing loans became impossible, and Fannie Mae and Freddie Mac effectively failed, community banks were affected. And when your customers were hit hard by the crisis, community banks were affected too. In two years, from 2007 to 2009, the unemployment rate in Utah more than tripled. As it usually does, Utah weathered the Great Recession better than most places, but it was still the toughest economic period our state has faced in many decades, and of course, this profoundly affected banks and their customers. While that was occurring, the Federal Reserve and governments in other countries affected by the crisis were tackling several challenges in trying to strengthen financial regulation and oversight. One fundamental problem was information, specifically the lack of information about risks and vulnerabilities both within and across jurisdictions. The Federal Reserve and other U.S. agencies had some tools to help assess prudential risks for U.S.-based firms when the crisis hit. Information sharing about systemic financial vulnerabilities was more limited, particularly for conditions outside the United States. For one, we did not understand the importance of some financial vulnerabilities or had only limited information on them, such as interconnectedness across financial firms, and therefore we were unable to share information. We also failed to appreciate the ways in which the shadow banking system that had grown up outside the institutions we oversaw had become interconnected with those institutions. The existing global forums for discussion of these issues were considered less important or were focused on just one financial sector, and membership was often limited to a handful of industrial countries. We now understand the importance of taking a global view on financial vulnerabilities, and we are learning from each other about how to fill the gaps in understanding and data that exist. An additional challenge that the United States faced in responding to the crisis and establishing more effective oversight and higher standards was the inability to enforce such rules in a global financial system without common, more uniform standards. 
If some of the activities threatening financial stability occurred outside the United States and in jurisdictions with lower standards, raising standards in the United States would be ineffective in fully stabilizing the financial system and could put U.S. firms at a competitive disadvantage, which would only add to the disincentive to embrace effective standards. Of course, every nation seeking to make its financial system more resilient faced these same challenges and disincentives--an example of the problem of collective action that points nations toward international cooperation. If the FSB had been in place before the crisis and working on identifying and assessing vulnerabilities to financial stability, that may have allowed us to take action at an earlier stage, frame our response with more information, and possibly mitigate some of the devastating consequences. I can attest to the FSB's improvement over the pre-crisis discussions that took place internationally because during the first Bush Administration I was a delegate to the informal and more limited group that preceded the FSB. An important part of the FSB's work is to endorse minimum standards in different areas; for example, identifying the key attributes of effective resolution planning for systemically important firms. In addition, the FSB is in the early stages of some critical work that examines the effects that reforms and standards are having. Are they doing what we intended them to do? Have there been unintended consequences? Can we make the reforms more efficient--that is, can we achieve the same effects while lowering the burden on institutions and supervisors? Once again, you might be wondering why something like resolution regime planning should matter to community bankers. You might be hoping that I get back to the good news I delivered earlier, about steps being taken in Washington to tailor regulation and reduce the regulatory burden on community banks. But, of course, these are two sides of the same coin. Appropriately reducing the regulatory burden for community banks is possible when we can get an accurate picture of the risks and vulnerabilities in the broader financial system, which Utah's banks are part of and depend on. Tailoring does not mean abandoning our responsibility to promote a stable financial system, but embracing it, assisted by FSB efforts to ensure that reforms are having the intended effects and supported by the global standards that the FSB and other international standard-setting bodies are able to establish and promote. In closing, I want to address an issue relevant to any international organization, which is sovereignty. More specifically, we sometimes hear concerns that international bodies such as the FSB threaten our sovereignty by imposing rules on the United States, which would indeed be a concern if it were true. Let me be clear: the FSB has no enforcement powers, no legal authority to command its members to do anything, and not even authority, as in some international organizations, to induce action based on contractual obligations. The FSB does not impose obligations; it addresses problems--problems that are of great importance to the United States and which, because of the global nature of the financial system, we cannot address alone. The United States and other governments created the FSB and participate in it because it is in our national interests to do so, and that is really the basis of its effectiveness. 
The United States is not made weaker or less independent by participating in the FSB or other standard-setting bodies. On the contrary, when rightly structured, our participation in these groups makes our financial system significantly stronger by ensuring that the U.S. perspective is part of the discussions and reflected in the standards agreed to. Our consumers and businesses are more secure and prosperous because the FSB helps make sure that all countries are doing their share in promoting financial stability and not gaining an unfair advantage. Like some other effective organizations, the FSB draws a source of its power from the fact that it functions by consensus. That can make reaching decisions more difficult, but it also yields decisions that can be truly effective solutions because all participants feel a stake in them. Consensus is especially useful when the credibility of decisions and the commitment to them matter most, such as when my Fed colleagues and I set monetary policy. At the FSB, relying on consensus helps 68 agencies and other members from two dozen countries with different perspectives and agendas come together around our shared interest in a stable global financial system. International negotiations and standard setting are not the best approach to all problems, but in my past experience as a Treasury Department official, they are often the best way to tackle problems that are global in scope. By actively participating in the FSB and engaging with its members at a high level, the United States is supporting high international standards that are equal to those in the United States. Our standards will be most effective when other major economies embrace them in a consistent manner. The goal is to limit the risks of another financial crisis and do what we can to promote prosperity and a bright future for the people you serve so faithfully in the great state of Utah.
r180718a_FOMC
united states
2018-07-18T00:00:00
Getting It Right: Factors for Tailoring Supervision and Regulation of Large Financial Institutions
quarles
0
I want to thank the American Bankers Association for inviting me to speak. This is an era of relatively rapid evolution in banking regulation, an area of human endeavor that is not commonly known for its speedy metamorphoses. But in the time since I became the Vice Chairman for Supervision at the Federal Reserve, we have seen agreement on the final pieces of the international framework for post-crisis regulation--the so-called Basel III endgame. The Federal Reserve has issued a number of proposed rule changes that would improve our capital and stress testing regime. And in late May, the Congress passed the Economic Growth, Regulatory Relief, and Consumer Protection Act (EGRRCPA), which, among other things, directs us to further tailor our supervision and regulation of large banks with more than $100 billion in assets. In other words, the Congress wants to see action and has, to a certain degree, specified some of the steps we need to take. How we respond to that task, especially as it applies to large banks, will be the focus of my remarks today. But before I delve into those details, I want to provide an overview of what I hope you'll take away. First, I want to underscore that I believe tailoring of financial regulation is good public policy. The Federal Reserve Board is a firm adherent of the recent legislation's underlying principle that regulation should be tailored to risk. To an extent, as I will later describe, this principle is already embedded in several aspects of our supervisory framework. Second, the Federal Reserve will need to revise its framework to allow for a greater differentiation in the supervision and regulation of large firms. To date, our tailoring of regulations has been based largely--but not exclusively--on asset size, which reflects an unduly one-dimensional approach. We have been evaluating additional criteria that may provide for greater regulatory differentiation across large banks, and the recent legislation is consistent with the goals of that initiative. Specifically, the legislation recognizes that large banks have a variety of business models and risk profiles; supervision should be flexible enough to incorporate this heterogeneity. Third, I believe we already have a good start on a path forward for tailoring regulation. In my remarks today, I will touch upon some potential factors we have identified for tailoring the supervision and regulation of large banks. I think everyone in this room would agree that--while there are important ways it can be improved--the body of post-crisis regulation adopted by the Federal Reserve and its fellow banking agencies has, taken as a whole, clearly made the U.S. financial system stronger and more resilient. In implementing these reforms, the Fed sought to achieve two parallel goals: (1) promoting the safety and soundness of individual banking organizations and (2) enhancing the stability of the broader U.S. financial system. A certain amount of tailoring was reflected from the beginning in how the Fed sought to achieve these goals. As the Board built its post-crisis framework, supervision and regulation were designed to increase in stringency in tandem with a firm's size and systemic footprint. This can be seen in stricter requirements in various elements of the regulatory capital framework that apply only to larger or more complex banks, including certain buffers and surcharges, the application of the supplementary leverage ratio, and the application of the qualitative objection as part of the Board's capital planning framework, among others. 
In April, the Board proposed the stress capital buffer, which would simplify its regulatory capital requirements for the largest banks by integrating the stress test results with the Board's non-stress capital requirements. Further, the Board recognized that the failure of one of the largest banking organizations could create spillovers that would undermine U.S. financial stability and harm consumers and the broader economy. To offset this risk, the Board has required these firms to internalize the cost of their potential failure in a tailored manner that corresponds with their importance to the U.S. financial system. These efforts include the Board's capital surcharge for global systemically important banks (G-SIBs) and total loss-absorbing capacity requirements. But while the concept of tailoring is inherent in how the Board thinks about supervision and regulation, reasonable people can disagree on the sufficiency of tailoring to date. In my view, we've made a good start in improving the efficiency of our regulatory regime. But we still have more to do to streamline our framework in a manner that more directly addresses firm-specific risks. The recent legislation requires us to reevaluate how we regulate banks that have between $100 billion and $250 billion in total assets. In particular, we need to make a tailoring-related decision in the near term: How will we decide which enhanced prudential standards should apply to which firms with total assets between $100 billion and $250 billion? In applying enhanced prudential standards for firms with total assets of more than $100 billion, the Congress requires the Board to consider not only size but also capital structure, riskiness, complexity, financial activities, and any other factors the Board deems relevant. While we use similar factors to calibrate the largest firms' G-SIB surcharges, we have not used them more holistically to tailor the overall supervision and regulation of large banks that do not qualify as G-SIBs. Further, consistent with the legislation's tailoring requirements, the Board must proactively consider how firms with more than $250 billion in assets that do not qualify as G-SIBs may be more efficiently regulated by applying more tailored standards. In conjunction with changing regulations, we also need to consider how such changes would be reflected in supervisory programs, guidance, and regulatory reporting. As supervisors, we need to balance providing appropriate relief to firms with ensuring that firms are maintaining resources and risk-management practices so they can be resilient under a range of conditions. We must also ensure we receive the right information in a timely manner so we can identify emerging risks. I want to spend the balance of my time focusing on the question I previously posed: On what basis will we decide to apply enhanced prudential standards to firms with total assets in the $100-billion to $250-billion range? The recent legislation directs the Board to consider factors other than size for differentiating supervision and regulation of large banking organizations. Before I talk about the "other" factors, let me acknowledge the merits of size as one relevant factor to include on the list. We know that the effect of a large bank's failure on the economy is greater than when a smaller bank fails, even though the two banks might be engaged in similar business lines. 
The recent financial crisis in 2008-09 saw a much more severe recession than other financial crises, such as the savings and loan crisis in the late 1980s that preceded the relatively mild recession of 1990-91, and this appears at least in part related to the size of the affected institutions. In fact, empirical research done at the Federal Reserve shows that stress among larger banks does more significant harm to the economy than stress at smaller banks, even after controlling for the aggregate size of bank failures. We also know that larger banks are more operationally complex--even when not engaged in complex business lines--with a broader geographic scope and more layers of management than smaller banks. Therefore, it seems appropriate to me that a path forward for tailoring supervision and regulation of large banks should not ignore size, but consider it as one factor--although only one factor--along with other factors. Before the enactment of the recent legislation, we had begun work on considering additional factors that capture large firms' degree of complexity and interconnectedness that--in conjunction with size--may provide a better basis for tailoring supervision and regulation than size alone. The systemic effect of a bank's failure or distress is positively correlated with that organization's business, operational, and structural complexity. Generally, the more complex a banking organization is, the greater the expense and time necessary to resolve it. Similarly, financial institutions may be interconnected in many ways, as large banks commonly engage in transactions with other financial institutions that give rise to a wide range of contractual obligations. Financial distress at a large bank may materially raise the likelihood of distress at other firms given the network of contractual obligations throughout the financial system. So how do we gauge the degree of complexity and interconnectedness of large firms? Rather than formulating new and untested measures of these factors, I believe we would be well served to begin by looking to our body of post-crisis regulation as a source. I will highlight a few factors that already reside in various areas of our regulatory framework that I am considering. The G-SIB surcharge indicators are likely to be a helpful source in this effort. For example, one factor from that framework that we might consider for purposes of tailoring is cross-border activity. This would measure assets and liabilities related to transactions with foreign banks, individuals, and companies, among others. This factor measures both a firm's complexity and resolvability, as foreign operations add operational complexity in normal times and complicate the ability of the firm to undergo an orderly resolution in times of stress. Another factor from the G-SIB surcharge framework that could be useful is a firm's use of short-term wholesale funding, which may serve as a proxy for liquidity vulnerability. Historically, reliance on short-term, uninsured funding from sophisticated funding sources has created vulnerability to large-scale funding runs and increased risks related to financial sector interconnectedness. Specifically, this can lead to "fire sale" effects that may affect broader financial stability--which occurs when banks that fund long-term or illiquid assets with short-term deposits from financial intermediaries like pension funds and money market mutual funds need to rapidly sell less-liquid assets to maintain their operations. 
Outside of the G-SIB surcharge framework, the Board has employed a measure of nonbank activities in certain rulemakings. These activities, which are conducted outside of a regulated depository institution, represent another source of complexity for large banks. For example, some nonbank entities engage in complex trading that is not permitted in depository institutions because of their risk. In addition to thinking about how we will tailor our regulation and supervisory programs for firms with assets between $100 billion and $250 billion, I believe the Board should also review the requirements applicable to the firms with more than $250 billion in total assets but below the G-SIB threshold. This review should ensure that our regulations continue to appropriately increase in stringency as the risk profiles of firms increase, consistent with our previously stated tailoring goals and the new legislation. The supervision and regulatory framework for these firms should reflect that there are material differences between those firms that qualify as U.S. G-SIBs and those that do not. For example, we know that non-G-SIBs with more than $250 billion in assets are generally less complex and less interconnected than U.S. G-SIBs and thus pose relatively less risk to financial stability. G-SIBs, on the other hand, have more complex activities, are more interconnected, and pose a far greater risk to financial stability should they fail. Yet at the moment, many aspects of our regulatory regime treat any bank with more than $250 billion in assets with the same stringency as a G-SIB. I believe there should be a clear differentiation. In my view, the Board should make it a near-term priority to issue a proposed rule concerning tailoring of enhanced prudential standards for large banking firms. This proposal, of course subject to notice and comment, would address our statutory obligations under the recent legislation by proposing to tailor enhanced prudential standards in a manner that recognizes relative complexity and interconnectedness among large banks. The statute sets an 18-month deadline for this regulatory process, but we can and will move much more rapidly than this. In terms of capital requirements, both risk-based and leverage capital requirements should remain core components of regulation for large firms with more than $100 billion in total assets. Stress testing should continue to play an important role in assessing potential losses that large firms would suffer under a severely adverse economic scenario; the recent legislation recognized the importance of stress testing by requiring a supervisory stress test for these large firms. Therefore, the Board's proposed stress capital buffer, if finalized, would be critical for these firms. However, we could consider a number of changes for less complex and less interconnected firms related to their capital requirements. For example, such firms, even if above $250 billion in assets, could have less frequent company-run stress tests. For those below $250 billion in assets, the statute requires supervisory stress tests to be conducted "periodically", which suggests the legislature wanted us to at least consider a rhythm other than annually. Additionally, less complex and less interconnected firms could be exempted from requirements to calculate risk-weighted assets under the models- based advanced approaches to capital. I continue to strongly believe that liquidity regulation should be a primary component of supervision and regulation of large banks. 
We all saw the central role that liquidity risk played in the recent financial crisis. Minimum standardized liquidity measures and internal liquidity stress tests remain critical at firms with more than $100 billion in total assets. However, for less complex and less interconnected firms with assets greater than $100 billion, there may be opportunities to modify aspects of the standardized liquidity requirements as well as expectations around internal liquidity stress tests and liquidity risk management. Similarly, banks with more than $250 billion in assets that are not G-SIBs currently face largely the same liquidity regulation as G-SIBs. As I've said previously, I believe it would make sense to calibrate the liquidity requirements differently for these firms relative to their G-SIB counterparts. Resolution planning is especially critical to ensure that the largest banking firms structure their operations in ways that make it more possible for them to be resolved upon failure without causing systemic risks for the broader economy. But most firms with total assets between $100 billion and $250 billion do not pose a high degree of resolvability risk, especially if they are less complex and less interconnected. Therefore, we should consider scaling back or removing entirely resolution planning requirements for most of the firms in that asset range. Further, we should consider limiting the scope of application of resolution planning requirements to only the largest, most complex, and most interconnected banking firms because their failure poses the greatest spillover risks to the broader economy. For firms that would still be subject to resolution planning requirements, we could reduce the frequency and burden of such requirements, perhaps by requiring more-targeted resolution plans. In conclusion, I believe we have a unique opportunity to further tailor our supervision and regulation framework for large banks in a manner that allows us to be more risk-sensitive while still meeting our core goals of promoting safety and soundness and enhancing financial stability. The recent legislation requires the Board to tailor its framework of supervision and regulation of large firms in a manner that continues to recognize size as one risk factor, but also more holistically incorporates other risk categories. In implementing this legislation, we should consider tailoring regulation further to take into account large banks' complexity and interconnectedness. Of course, the details of how we implement a tailored framework will be subject to debate, and you can expect the Federal Reserve to be highly engaged in the public comment process. Implementation of the Economic Growth, Regulatory Relief, and Consumer Protection Act is a high priority for the Board, and we look forward to hearing the range of views as we make progress.
r180719a_FOMC
united states
2018-07-19T00:00:00
Introductory Remarks
quarles
0
Good morning. I am sorry that I cannot be with you in person for this third roundtable, but I want to offer a few remarks to make clear the Federal Reserve's full commitment to mitigating the risk to financial stability should a key reference rate cease to be available. We support the Alternative Reference Rates Committee (ARRC) and its work. And support for the ARRC is not limited to the Federal Reserve System: the Bureau of Consumer Financial Protection, the Treasury Department, and other agencies are all ex officio members of the ARRC. Since many have only recently begun to pay more attention to these issues, let me remind you of our reasons for convening the ARRC four years ago. The Federal Reserve took on a role co-chairing the Financial Stability Board's working group on interest rate benchmarks and joined in international efforts to strengthen LIBOR following reports that employees at several banks sought to manipulate these rates. We have also worked with ICE Benchmark Administration (IBA) in developing the reforms set out in IBA's Roadmap for LIBOR. Thanks in part to this work on LIBOR, the safeguards against that type of manipulation happening again have been greatly and appropriately tightened. But as part of this process, we also examined data that we had begun to collect on the markets that underlie LIBOR and found that those markets had become extraordinarily thin. On most days, the banks that submit to LIBOR have been forced to rely on their own judgment and models in submitting to LIBOR rather than actual transactions. Many of them have become justifiably uncomfortable with a system that pins hundreds of trillions of dollars' worth of financial contracts to that type of judgment call. People may have some general sense of this, but because IBA does not release data on the transactions that actually underlie LIBOR, many may not be aware of how truly thin these markets have become. At the last roundtable, Jay Powell showed data on the volume of borrowing in wholesale unsecured U.S. dollar markets. That chart can be found in the ARRC's Second Report, which estimated that on a typical day the volume of three-month funding transactions was about $500 million. On many days there is much less. By way of comparison, we estimate that there are roughly $200 trillion of financial securities referencing U.S. dollar LIBOR. To help provide some further transparency, I will show data that we have on the number of transactions involved. These data are based on the information available to the Federal Reserve, which is fairly comprehensive but still may differ in some respects from that available to IBA, so I should caution that this is merely an informed estimate of the number of transactions underlying U.S. dollar LIBOR and may not be exact. As you can see in the slide, there are a few more transactions at the shorter LIBOR maturities. On average, we observe six or seven transactions per day at market rates that could underpin one- and three-month LIBOR across all of the panel banks. The longer maturities have even fewer transactions. There are two to three transactions each day for six-month LIBOR. On average, there is only one transaction that we see underlying one-year LIBOR, and many days there are no transactions at all. The secured overnight financing rate (SOFR) has only been in existence for three months, and SOFR futures have only been trading for two months, but on a daily basis there are already more transactions underlying them than there are underpinning LIBOR. And SOFR itself reflects over $700 billion in overnight repurchase agreement (repo) transactions every day. 
One of the many benefits of using a rate so firmly anchored in a market of this depth is that no one can question whether SOFR is representative. It clearly is. With LIBOR reliant on expert judgment rather than direct transactions, many banks increasingly uncomfortable providing that judgment, and the official sector unable to compel them to do so indefinitely, it was obvious to us that this structure--which bases so many trillions of financial instruments on such a small number of underlying transactions--was potentially unstable. It was clear that the market needed to develop alternatives in case the worst happened, and this was the reason that we convened the ARRC four years ago. We asked the ARRC to identify a robust alternative to U.S. dollar LIBOR and to develop a plan to promote its use. Sandie O'Connor will speak shortly about the ARRC's work, but I want to emphasize that this has been a model of cooperative effort between the private sector and the public sector. When the ARRC started, the interest rate benchmarks that it would eventually narrow its choice down to did not even exist. Those rates have not been easy to create, and as the ARRC expressed interest in a Treasury repo rate benchmark that would span the widest possible scope of the market, the Federal Reserve Bank of New York put great effort into working with the Federal Reserve Board and Office of Financial Research to create SOFR. The result, the rate the ARRC has chosen, reflects the largest and deepest rates market in the world and is a huge accomplishment for all of us. That accomplishment is only the first of what will be many. As I noted, just three months after SOFR began production, we have already seen the introduction of futures markets on the Chicago Mercantile Exchange. LCH has now begun to offer clearing of SOFR overnight index and basis swaps, and CME Group will begin to do so within a few months. These are steps that the private sector must lead, and they have, but it is important that the public sector encourage the development of SOFR markets where it can. The ARRC has issued a first letter to U.S. regulators asking us to consider exemptions for legacy swaps that are amended to incorporate the International Swaps and Derivatives Association's protocol or to move from LIBOR to SOFR, and these are issues that we should look at seriously; we should avoid placing unintended hurdles in the way of those who may seek to transition from LIBOR. Here is something you don't hear often in the context of any large, complex undertaking involving input from a large number of stakeholders: The effort to implement SOFR is ahead of schedule. The ARRC has recognized that we have to make transitioning to SOFR as easy as it can be. That is the reason that it has added the creation of a forward-looking term rate as the final step in its Paced Transition Plan. As the ARRC has noted and the Financial Stability Board has now reiterated, this kind of forward-looking term rate will be useful in facilitating a transition away from LIBOR in some cash markets, such as corporate loans, but it is not primarily intended for use in derivatives markets. In fact, we have to encourage use of SOFR in derivatives markets to the fullest extent possible if a robust forward-looking term rate is to be created. For that reason, we also have to find ways to encourage uses of SOFR in those cash markets where it is appropriate. 
The European Investment Bank's recent announcement that it would issue a floating-rate note paying a compound average of the Sterling Overnight Index Average (SONIA) shows that this can be done. In the spirit of encouraging this type of use of SOFR, I think it is appropriate for the Federal Reserve to consider publishing a compound average of SOFR that market participants could then use. It has been suggested that we could call it SAFR, for secured average financing rate, and this is something that we are encouraging our staff to explore. Publishing a compound average rate that encourages broader use of SOFR would help make our financial system more resilient. Sandie will talk more about the creation of the forward-looking term version of SOFR in her remarks. SAFR would not be a competitor to this forward-looking rate; it would be in alliance with it. If there were a large volume of products referencing the compound average rate, there would likely be related demand for futures contracts to hedge those positions, helping to make the forward rate more robust. It is important that we find ways to make it as easy as possible to use SOFR because the risks to LIBOR are, at this stage, quite considerable. Even as the ARRC and similar currency groups in other jurisdictions were being formed, the U.K. Financial Conduct Authority (FCA) was exerting considerable effort to convince banks to continue submitting to LIBOR. We have to be aware that two banks left the U.S. dollar panels despite this encouragement, and that the agreement that the FCA reached with the remaining banks to continue submitting voluntarily through the end of 2021 now has just three-and-a-half years left. And as Andrew Bailey noted last week, there is also the prospect that LIBOR could be judged to be nonrepresentative under the European Union's Benchmarks Regulation, which would severely curtail the liquidity of products referencing LIBOR. Apart from the questions as to whether LIBOR will continue or whether IBA or the FCA may judge that it is not representative, market participants should consider whether a rate with so few transactions underlying it is really their best option. Do we really need or want to use other tenors of LIBOR in municipal or financial floating-rate notes? For some, the answer to this may still be yes, but it is then imperative that they work to incorporate safer fallback language into their contracts as quickly as they can. This is something that the ARRC has been working on intensively this year and that much of today's roundtable will discuss. But for others, the safest thing you can do would be to move away from using LIBOR. Regardless of your answer to this question, we all have a stake in the ARRC's work, both in promoting SOFR and in promoting better contract language. I want to thank you all for coming to this roundtable, and want to thank Sandie and the ARRC for all the work that has made it possible.
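As an illustrative aside on the compound average of SOFR discussed in these remarks, the LaTeX sketch below shows the generic money-market convention for compounding an overnight rate over an interest period; the notation is generic and assumed for illustration, not a description of any official publication methodology.

% Generic compounded average of an overnight rate (e.g., SOFR) over one interest period.
% r_i : overnight rate on business day i, as a decimal
% n_i : number of calendar days to which r_i applies (1 on most days, 3 over a weekend)
% d_b : number of business days in the period;  d_c : total calendar days in the period
\[
  R_{\mathrm{compound}}
    \;=\; \left[\,\prod_{i=1}^{d_b}\left(1 + \frac{r_i\,n_i}{360}\right) - 1\right]
          \times \frac{360}{d_c}
\]
% The actual/360 day count is the usual U.S. money-market convention; R_compound is the
% annualized rate that an instrument referencing a compounded average would pay over the period.

A backward-looking average of this kind is known only at the end of the interest period, which is why the remarks above distinguish it from a forward-looking term rate.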
r180824a_FOMC
united states
2018-08-24T00:00:00
Monetary Policy in a Changing Economy
powell
1
Thank you for the opportunity to speak here today. Fifteen years ago, during the period now referred to as the Great Moderation, the topic of this symposium was "Adapting to a Changing Economy." In opening the proceedings, then-Chairman Alan Greenspan famously declared that "uncertainty is not just an important feature of the monetary policy landscape; it is the defining characteristic of that landscape." On the doorstep of the period now referred to as the Global Financial Crisis, surely few, if any, at that symposium would have imagined how shockingly different the next 15 years would be from the 15 years that preceded it. Over the course of a long recovery, the U.S. economy has strengthened substantially. The unemployment rate has declined steadily for almost nine years and, at 3.9 percent, is now near a 20-year low. Most people who want jobs can find them. Inflation has moved up and is now near the Federal Open Market Committee's (FOMC) objective of 2 percent after running generally below that level for six years. With solid household and business confidence, healthy levels of job creation, rising incomes, and fiscal stimulus arriving, there is good reason to expect that this strong performance will continue. As the economy has strengthened, the FOMC has gradually raised the federal funds rate from its crisis-era low near zero toward more normal levels. We are also allowing our securities holdings--assets acquired to support the economy during the deep recession and the long recovery--to decline gradually as these securities are paid off. I will explain today why the Committee's consensus view is that this gradual process of normalization remains appropriate. As always, there are risk factors abroad and at home that, in time, could demand a different policy response, but today I will step back from these. In keeping with the spirit of this year's symposium topic--the changing structures of the economy--I would also note briefly that the U.S. economy faces a number of longer-term structural challenges that are mostly beyond the reach of monetary policy. For example, real wages, particularly for medium- and low-income workers, have grown quite slowly in recent decades. Economic mobility in the United States has declined and is now lower than in most other advanced economies. Addressing the federal budget deficit, which has long been on an unsustainable path, becomes increasingly important as a larger share of the population retires. Finally, it is difficult to say when or whether the economy will break out of its low-productivity mode of the past decade or more, as it must if incomes are to rise meaningfully over time. My FOMC colleagues and I believe that we can best support progress on these longer-term issues by pursuing the Federal Reserve's mandate and supporting continued economic growth, a strong labor market, and inflation near 2 percent. The topic of managing uncertainty in policymaking remains particularly salient. I will focus today on one of the many facets of uncertainty discussed at the 2003 symposium--uncertainty around the location of important macroeconomic variables such as the natural rate of unemployment. A good place to start is with two opposing questions that regularly arise in discussions of monetary policy both inside and outside the Fed: 1. With the unemployment rate well below estimates of its longer-term normal level, why isn't the FOMC tightening monetary policy more sharply to head off overheating and inflation? 2. 
With no clear sign of an inflation problem, why is the FOMC tightening policy at all, at the risk of choking off job growth and continued expansion? These questions strike me as representing the two errors that the Committee is always seeking to avoid as expansions continue--moving too fast and needlessly shortening the expansion, versus moving too slowly and risking a destabilizing overheating. As I will discuss, the job of avoiding these errors is made challenging today because the economy has been changing in ways that are difficult to detect and measure in real time. I will first lay out a standard view of a handful of basic relationships that are thought to reflect key aspects of the underlying structure of the economy. I will then use that framework to explain the role that structural change plays in our current policy deliberations, focusing on how that role has been shaped by two historical episodes. In conventional models of the economy, major economic quantities such as inflation, unemployment, and the growth rate of gross domestic product (GDP) fluctuate around values that are considered normal, natural, or desired. The FOMC has chosen a 2 percent inflation objective as one of these desired values. The other values are not directly observed, nor can they be chosen by anyone. Instead, these values result from myriad interactions throughout the economy. In the FOMC's quarterly Summary of Economic Projections (SEP), participants state their individual views on the longer-run normal values for the growth rate of GDP, the unemployment rate, and the federal funds rate. These fundamental structural features of the economy are also known by more familiar names such as the "natural rate of unemployment" and "potential output growth." The longer-run federal funds rate minus long-run inflation is the "neutral real interest rate." At the Fed and elsewhere, analysts talk about these values so often that they have acquired shorthand names. For example, u* (pronounced "u star") is the natural rate of unemployment, r* ("r star") is the neutral real rate of interest, and p* ("pi star") is the inflation objective. According to the conventional thinking, policymakers should navigate by these stars. In that sense, they are very much akin to celestial stars. For example, the famous Taylor rule calls for setting the federal funds rate based on where inflation and unemployment stand in relation to the stars. If inflation is higher than p*, raise the real federal funds rate relative to r*. The higher real interest rate will, through various channels, tend to moderate spending by businesses and households, which will reduce upward pressure on prices and wages as the economy cools off. In contrast, if the unemployment rate is above u*, lower the real federal funds rate relative to r*, which will stimulate spending and raise employment. Navigating by the stars can sound straightforward. Guiding policy by the stars in practice, however, has been quite challenging of late because our best assessments of the location of the stars have been changing significantly. In December 2013, the FOMC began winding down the final crisis-era asset purchase program. Asset purchases declined to zero over 2014, and in December 2015, the FOMC began the gradual normalization of interest rates that continues to this day. As normalization has proceeded, FOMC participants and many other private- and public- sector analysts regularly adjusted their assessments of the stars (figure 1). 
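For concreteness, the LaTeX sketch below writes out a minimal Taylor-type rule of the kind described above, using an unemployment gap in place of the output gap of Taylor's original 1993 formulation; the 0.5 response coefficients are illustrative assumptions, not a description of how the FOMC actually sets policy.

% Illustrative Taylor-type rule in the "stars" notation used in the speech.
% i_t  : prescribed nominal federal funds rate      pi_t : current inflation
% pi*  : the 2 percent inflation objective          r*   : neutral real interest rate
% u_t  : unemployment rate                          u*   : natural rate of unemployment
% The 0.5 coefficients follow Taylor's 1993 choice and are purely illustrative; Taylor's
% original rule uses an output gap rather than the unemployment gap (u* - u_t) shown here.
\[
  i_t \;=\; r^{*} + \pi_t \;+\; 0.5\,(\pi_t - \pi^{*}) \;+\; 0.5\,(u^{*} - u_t)
\]
% Reading the rule: inflation above pi* pushes the prescribed real rate (i_t - pi_t) above r*,
% while unemployment above u* pulls the prescribed rate down, matching the verbal description.

Under this sketch, with inflation at the 2 percent objective and unemployment at the natural rate, the rule simply prescribes the neutral nominal setting, r* plus 2 percent.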
Many projections of the natural rate of unemployment fell roughly 1 full percentage point, as did assessments of the neutral interest rate. Estimates of the potential growth rate of GDP slipped about 1/2 percentage point. These changing assessments have big implications. For example, the 1 percentage point fall in the neutral interest rate implies that the federal funds rate was considerably closer to its longer-run normal and, hence, that policy was less accommodative than thought at the beginning of normalization. The 1 percentage point fall in the natural rate of unemployment implies at present that about 1.6 million more people would have jobs when unemployment is at its longer-run level. These shifts in the stars generally reflect analysts' attempts to square their estimates with arriving macroeconomic data. For example, as the unemployment rate fell toward, and then below, estimates of its natural rate, many expected inflation to move up. When inflation instead moved sideways, a reasonable inference was that the natural rate was lower than previously thought. Further, over this period, GDP growth was slower than one might have expected based on the rapid decline in unemployment and the well-known relationship between output and unemployment known as Okun's law. Put another way, labor productivity growth consistently disappointed, which raised the question of whether that shortfall was temporary--perhaps due to headwinds from the crisis--or was part of a new normal. These assessments of the values of the stars are imprecise and subject to further revision. To return to the nautical metaphor, the FOMC has been navigating between the shoals of overheating and premature tightening with only a hazy view of what seem to be shifting navigational guides. Our approach to this challenge has been shaped by two much discussed historical episodes--the Great Inflation of the 1960s and 1970s and the "new economy" period of the late 1990s. While the crisis and its aftermath have been extraordinary in many ways, the shifting of the stars is not one of them. Figure 2 illustrates the Congressional Budget Office's (CBO) current estimate of movements in the natural rate of unemployment and potential GDP growth from 1960 to 2000. Viewed against the ups and downs observed over these four decades, the recent shifts in longer-run values are not all that dramatic. Of course, these CBO estimates benefit from many years of hindsight, whereas monetary policy must be based on assessments made in real time. The Great Inflation period vividly illustrates the difficulties this difference raises. Around 1965, the United States entered a period of high and volatile inflation that ended with inflation in double digits in the early 1980s. Multiple factors, including monetary policy errors, contributed to the Great Inflation. Many researchers have concluded that a key mistake was that monetary policymakers placed too much emphasis on imprecise--and, as it turns out, overly optimistic--real-time estimates of the natural rate of unemployment. Figure 3 compares the CBO's current view of the natural rate of unemployment in that era with an estimate by Athanasios Orphanides and John Williams of the rate as policymakers perceived it in real time. From 1965 to the early 1980s, this real-time estimate of u* was well below where hindsight now places it. 
The unemployment rate over this period was generally well above the real-time natural rate, and contemporary documents reveal that policymakers were wary of pushing the unemployment rate even further With the benefit of hindsight, we now think that, except for a few years in the mid-1970s, the labor market was tight and contributing to inflation's rise (figure 4, lower panel). It is now clear that the FOMC had placed too much emphasis on its imprecise estimates of u* and too little emphasis on evidence of rising inflation expectations. The Great Inflation did, however, prompt an "expectations revolution" in macroeconomic thinking, with one overwhelmingly important lesson for monetary policymakers: Anchoring longer-term inflation expectations is a vital precondition for reaching all other monetary policy goals. When longer-term inflation expectations are anchored, unanticipated developments may push inflation up or down, but people expect that inflation will return fairly promptly to the desired value. This is the key insight at the heart of the widespread adoption of inflation targeting by central banks in the wake of the Great Inflation. Anchored expectations give a central bank greater flexibility to stabilize both unemployment and inflation. When a central bank acts to stimulate the economy to bring down unemployment, inflation might push above the bank's inflation target. With expectations anchored, people expect the central bank to pursue policies that bring inflation back down, and longer-term inflation expectations do not rise. Thus, policy can be a bit more accommodative than if policymakers had to offset a rise in longer-term expectations. The second half of the 1990s confronted policymakers with a situation that was in some ways the flipside of that in the Great Inflation. In mid-1996, the unemployment rate was below the natural rate as perceived in real time, and many FOMC participants and others were forecasting growth above the economy's potential. Sentiment was building on the FOMC to raise the federal funds rate to head off the risk of rising inflation. But Chairman Greenspan had a hunch that the United States was experiencing the wonders of a "new economy" in which improved productivity growth would allow faster output growth and lower unemployment, without serious inflation risks. Greenspan argued that the FOMC should hold off on rate increases. Over the next two years, thanks to his considerable fortitude, Greenspan prevailed, and the FOMC raised the federal funds rate only once from mid-1996 through late 1998. Starting in 1996, the economy boomed and the unemployment rate fell, but, contrary to conventional wisdom at the time, inflation fell. Once again, shifting stars help explain the performance of inflation, which many had seen as a puzzle. Whereas during the Great Inflation period the real-time natural rate of unemployment had been well below our current-day assessment, in the new-economy period, this relation was reversed (figure 3). The labor market looked to be tight and getting tighter in real time, but in retrospect, we estimate that there was slack in the labor market in 1996 and early 1997, and the labor market only tightened appreciably through 1998 (figure 4). Greenspan was also right that the potential growth rate had shifted up. With hindsight, we recognize today that higher potential growth could accommodate the very strong growth that actually materialized, let alone the moderate growth policymakers were forecasting. 
The FOMC thus avoided the Great-Inflation-era mistake of overemphasizing imprecise estimates of the stars. Under Chairman Greenspan's leadership, the Committee converged on a risk-management strategy that can be distilled into a simple request: Let's wait one more meeting; if there are clearer signs of inflation, we will commence tightening. Meeting after meeting, the Committee held off on rate increases while believing that signs of rising inflation would soon appear. And meeting after meeting, inflation gradually declined. In retrospect, it may seem odd that it took great fortitude to defend "let's wait one more meeting," given that inflation was low and falling. Conventional wisdom at the time, however, still urged policymakers to respond preemptively to inflation risk--even when that risk was gleaned mainly from hazy, real-time assessments of the stars. With the experience in the new-economy period, policymakers were beginning to appreciate that, with inflation expectations much better anchored than before, there was a smaller risk that an inflation uptick under Greenspan's "wait and see" approach would become a significant problem. Given what the economy has shown us over the past 15 years, the need for the sort of risk-management approach that originated in the new-economy era is clearer than ever before. That approach continues to evolve based on experience and the growing literature on monetary policy and structural uncertainty. Experience has revealed two realities about the relation between inflation and unemployment, and these bear directly on the two questions I started with. First, the stars are sometimes far from where we perceive them to be. In particular, we now know that the level of the unemployment rate relative to our real-time estimate of u* will sometimes be a misleading indicator of the state of the economy or of future inflation. Second, the reverse also seems to be true: Inflation may no longer be the first or best indicator of a tight labor market and rising pressures on resource utilization. Part of the reason inflation sends a weaker signal is undoubtedly the achievement of anchored inflation expectations and the related flattening of the Phillips curve. Whatever the cause, in the run-up to the past two recessions, destabilizing excesses appeared mainly in financial markets rather than in inflation. Thus, risk management suggests looking beyond inflation for signs of excesses. These two realities present challenges. The literature on uncertainty reviewed at the 2003 symposium--and much refined since then--provides important advice for how policy should respond, although not yet, in my view, an explicit recipe or rule that a prudent central bank should follow. The literature on robust rules, such as so-called difference rules, supports the idea of putting less emphasis on the level of unemployment relative to u*. The FOMC's practice of looking at a broad range of indicators when assessing the state of the labor market has explicitly been part of the Committee's Statement on Longer-Run Goals and Monetary Policy Strategy since its inception in 2012. We have greatly expanded the scope of our surveillance for signs of labor market tightness and of destabilizing excesses more generally. The risks from misperceiving the stars also now play a prominent role in the FOMC's deliberations. A paper by Federal Reserve Board staff is a recent example of a range of research that helps FOMC participants visualize and manage these risks. 
research reports simulations of the economic outcomes that might result under various policy rules and policymaker misperceptions about the economy. One general finding is that no single, simple approach to monetary policy is likely to be appropriate across a broad range of plausible scenarios. More concretely, simulations like these inform our risk management by assessing the likelihood that misperception would lead to adverse outcomes, such as inflation falling below zero or rising above 5 percent. Finally, the literature on structural uncertainty suggests some broader insights. This literature started with the work of William Brainard and the well-known Brainard principle, which recommends that when you are uncertain about the effects of your actions, you should move conservatively. In other words, when unsure of the potency of a medicine, start with a somewhat smaller dose. As Brainard made clear, this is not a universal truth, and recent research highlights two particularly important cases in which doing too little comes with higher costs than doing too much. The first case is when attempting to avoid severely adverse events such as a financial crisis or an extended period with interest rates at the effective lower bound. In such situations, the famous words "We will do whatever it takes" will likely be more effective than "We will take cautious steps toward doing whatever it takes." The second case is when inflation expectations threaten to become unanchored. If expectations were to begin to drift, the reality or expectation of a weak initial response could exacerbate the problem. I am confident that the FOMC would resolutely "do whatever it takes" should inflation expectations drift materially up or down or should crisis again threaten. In addition, a decade of regulatory reforms and private-sector advances have greatly increased the strength and resilience of the financial system, with the aim of reducing the likelihood that the inevitable financial shocks will become crises. Let me conclude by returning to the matter of navigating between the two risks I identified--moving too fast and needlessly shortening the expansion, versus moving too slowly and risking a destabilizing overheating. Readers of the minutes of FOMC meetings and other communications will know that our discussions focus keenly on the relative salience of these risks. The diversity of views on the FOMC is one of the great virtues of our system. Despite differing views on these questions and others, we have a long institutional tradition of finding common ground in coalescing around a policy stance. I see the current path of gradually raising interest rates as the FOMC's approach to taking seriously both of these risks. While the unemployment rate is below the Committee's estimate of the longer-run natural rate, estimates of this rate are quite uncertain. The same is true of estimates of the neutral interest rate. We therefore refer to many indicators when judging the degree of slack in the economy or the degree of accommodation in the current policy stance. We are also aware that, over time, inflation has become much less responsive to changes in resource utilization. While inflation has recently moved up near 2 percent, we have seen no clear sign of an acceleration above 2 percent, and there does not seem to be an elevated risk of overheating. 
This is good news, and we believe that this good news results in part from the ongoing normalization process, which has moved the stance of policy gradually closer to the FOMC's rough assessment of neutral as the expansion has continued. As the most recent FOMC statement indicates, if the strong growth in income and jobs continues, further gradual increases in the target range for the federal funds rate will likely be appropriate. The economy is strong. Inflation is near our 2 percent objective, and most people who want a job are finding one. My colleagues and I are carefully monitoring incoming data, and we are setting policy to do what monetary policy can do to support continued growth, a strong labor market, and inflation near 2 percent. 
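As an illustrative aside on the "difference rules" mentioned earlier in this speech, the sketch below contrasts a generic level rule with a generic first-difference rule when the policymaker's estimate of u* is wrong. The rule forms, coefficients, and numbers are textbook-style assumptions chosen only for illustration; they are not the FOMC's rules, the staff paper's specifications, or Federal Reserve estimates.

```python
# Illustrative sketch (not an FOMC rule or the Board staff model): compare a
# Taylor-type "level" rule with a first-difference rule when the policymaker's
# estimate of the natural rate u* is wrong. All numbers are hypothetical.

def level_rule(r_star, pi, pi_target, u, u_star_est):
    # Level rule: responds to the estimated unemployment gap (u - u*).
    return r_star + pi + 0.5 * (pi - pi_target) - 1.0 * (u - u_star_est)

def difference_rule(i_prev, pi, pi_target, u, u_prev):
    # First-difference rule: responds to the change in unemployment,
    # so a mismeasured level of u* drops out entirely.
    return i_prev + 0.5 * (pi - pi_target) - 1.0 * (u - u_prev)

pi, pi_target, r_star = 2.0, 2.0, 1.0   # inflation at target; assumed real neutral rate
u, u_prev, i_prev = 4.5, 4.6, 2.9       # unemployment now, last period, and last period's rate

print(level_rule(r_star, pi, pi_target, u, u_star_est=4.5))  # 3.0 with a correct u* estimate
print(level_rule(r_star, pi, pi_target, u, u_star_est=6.0))  # 4.5: a 1.5 pp u* error shifts the prescription 1.5 pp
print(difference_rule(i_prev, pi, pi_target, u, u_prev))     # 3.0 regardless of the u* estimate
```

The only point of the contrast is that the difference rule's prescription does not depend on the unobservable level of u*, which is why such rules are described in the literature as relatively robust to misperceiving the stars.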
r180912a_FOMC
united states
2018-09-12T00:00:00
What Do We Mean by Neutral and What Role Does It Play in Monetary Policy?
brainard
0
It is a pleasure to be in Detroit. I started my career working here in the Motor City, and I have followed the fortunes of this area with interest ever since. A few years ago, I visited some of Detroit's neighborhoods with our local officials at a time when damage from the crisis was still pervasive. While challenges remain in many of the city's neighborhoods, since that time the metropolitan area overall has seen signs of a rebound in business activity and investment, and the unemployment rate has continued to trend downward, recently falling to 4.3 percent. This is similar to the nation's economy more broadly. While challenges remain for many, aggregate growth is strong, and the economy is meeting our full employment and inflation objectives. Given the outlook, it comes as no surprise that the Federal Open Market Committee (FOMC) has been gradually raising interest rates from crisis-era lows and sees further gradual increases as likely to be appropriate in its most recent statement. Before discussing the outlook, it might be useful to first explore some concepts that are important in informing the path of rates. In thinking about how we should set the federal funds rate, many policymakers and economists find the concept of the neutral rate of interest to be a useful frame of reference. So, what does the neutral rate mean? Intuitively, I think of the nominal neutral interest rate as the level of the federal funds rate that keeps output growing around its potential rate in an environment of full employment and stable inflation. Focusing first on the "shorter-run" neutral rate, this rate does not stay fixed, but rather fluctuates along with important changes in economic conditions. For instance, legislation that increases the budget deficit through tax cuts and spending increases can be expected to generate tailwinds to domestic demand and thus to push up the shorter-run neutral interest rate. Heightened risk appetite among investors similarly can be expected to push up the shorter-run neutral rate. Conversely, many of the forces that contributed to the financial crisis--such as fear and uncertainty on the part of businesses and households--can be expected to lower the neutral rate of interest, as can declines in foreign demand for U.S. goods and services. In many circumstances, monetary policy can help keep the economy on its sustainable path at full employment by adjusting the policy rate to reflect movements in the shorter-run neutral rate. In this context, the appropriate reference for assessing the stance of monetary policy is the gap between the policy rate and the nominal shorter-run neutral rate. So far, I have been focusing on the shorter-run neutral rate of interest that is responsive to headwinds or tailwinds to demand. The longer-run equilibrium rate is a related concept. The underlying concept of the "longer run" generally refers to output growing at its longer-run trend, after transitory forces reflecting headwinds or tailwinds have played out, in an environment of full employment and inflation running at the FOMC objective. The longer-run federal funds rate estimated by FOMC participants in their Summary of Economic Projections (SEP) is one prominent estimate of this longer-run equilibrium rate of interest. It is worth highlighting that the longer-run federal funds rate is the only neutral interest rate reported in the FOMC projections. But the shorter-run neutral rate, rather than the longer-run federal funds rate, is the relevant benchmark for assessing the near-term path of monetary policy in the presence of headwinds or tailwinds. 
Similar to other equilibrium macroeconomic concepts such as potential gross domestic product (GDP) and the natural rate of unemployment, the shorter- and longer- run levels of the neutral rate are not directly observable, so they must be estimated or inferred from the movements of variables that are observed, such as market interest rates, inflation, the unemployment rate, and GDP. In recent years, considerable work has derived estimates of the longer-run equilibrium rate, in some cases using statistical techniques that can be thought of as capturing the highly persistent component of the neutral rate. The central tendency of those estimates suggests that the longer-run trend rate is in the range of 2.5 to 3.5 percent in nominal terms. This range lines up well with the most recent median estimate of the longer-run federal funds rate in the FOMC SEP, ; which is just below 3 percent. By these estimates, the longer-run neutral rate has fallen considerably from the estimated range in earlier decades of 4 to 5 percent. Turning to the shorter-run neutral rate, although the estimates are model dependent and uncertain, we can make some general inferences about its recent evolution that are largely independent of the details of specific models. Estimates suggest the shorter-run neutral rate tends to be cyclical, falling in recessions and rising during expansions, and our current expansion appears to be no exception. Last year, the unemployment rate returned to pre-crisis levels, which required real interest rates that were below zero for nearly 10 years. This year, the unemployment rate has fallen further, and job market gains have gathered strength, at the same time that the federal funds rate has increased. This combination suggests that the short-run neutral interest rate likely has also increased. If, instead, the neutral rate had remained constant as the federal funds rate increased, we would have expected to see labor market gains slow. That inference is consistent with the formal model estimates, which indicate that the shorter-run neutral rate has gone up as the expansion has advanced. This is also suggested by the observation that overall financial conditions, as measured by a variety of indexes, have remained quite accommodative during a period when the federal funds rate has been moving higher. In the latest FOMC SEP median path, by the end of next year, the federal funds rate is projected to rise to a level that exceeds the longer-run federal funds rate during a time when real GDP growth is projected to exceed its longer-run pace and unemployment continues to fall. The shift from headwinds to tailwinds may be expected to push the shorter-run neutral rate above its longer-run trend in the next year or two, just as it fell below the longer-run equilibrium rate following the financial crisis. Notably, the sizable fiscal stimulus in the pipeline is likely to continue to bolster the short-run neutral rate over the next two years. The relatively rich level of current asset valuations relative to historical levels is another factor that could push the short-run neutral rate above its longer-run value. As was noted in the recent FOMC minutes, corporate credit spreads are very narrow, and equity valuations are elevated relative to historical patterns, even after taking into account the low level of interest rates. Business and consumer confidence is high, which is also consistent with a higher shorter-run neutral rate of interest. 
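To make the "gap between the policy rate and the nominal shorter-run neutral rate" concrete, here is a minimal sketch of the accounting, assuming hypothetical values for the funds rate, the real shorter-run neutral rate, and expected inflation; none of these figures are Federal Reserve estimates.

```python
# Minimal sketch of the accounting behind "the gap between the policy rate and
# the nominal shorter-run neutral rate." All figures are hypothetical
# placeholders, not Federal Reserve estimates.

def policy_gap(fed_funds_rate, real_neutral_rate, expected_inflation):
    """Policy rate minus the nominal shorter-run neutral rate.
    Negative means accommodative; positive means restrictive."""
    nominal_neutral = real_neutral_rate + expected_inflation
    return fed_funds_rate - nominal_neutral

# Hypothetical snapshots: the real shorter-run neutral rate rising as headwinds
# fade, while the federal funds rate is raised gradually.
snapshots = [
    ("early in the expansion", 0.4, -0.5, 1.5),  # (label, funds rate, real neutral rate, expected inflation)
    ("mid-expansion",          1.4,  0.0, 1.8),
    ("recent",                 2.0,  0.5, 2.0),
]
for label, ffr, r_star, pi_e in snapshots:
    print(f"{label}: stance gap = {policy_gap(ffr, r_star, pi_e):+.1f} percentage points")
```

On these made-up numbers the stance stays mildly accommodative even as the funds rate rises, because the shorter-run neutral rate is assumed to rise alongside it, which is the same qualitative point made above about financial conditions remaining accommodative during the tightening.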
Having provided some context for how we might assess policy, I will turn to some observations on the outlook. By any measure, overall growth in the second quarter was strong. Real GDP increased at a 4-1/4 percent annual rate, a very rapid pace nine years into the expansion. Looking ahead, it seems likely that growth will remain solid. Confidence is high, private domestic demand momentum is solid, and recent fiscal stimulus will continue to work its way through the economy, at least for the next year or so. The labor market is also strong. So far this year, payroll gains have averaged more than 200,000 per month, a step-up from the 2017 pace and well above estimates of the pace necessary to absorb new entrants into the labor force. Among prime-age workers, the employment-to-population ratio is 79.3 percent, up almost 1 percentage point over the past year. These developments are heartening, suggesting the tight labor market is providing employment opportunities to more Americans. Nonetheless, this is still about 1 percentage point below its previous cyclical peak, suggesting there may be some room for further gains. In another encouraging development, wage gains in the August report reached their highest level since the depth of the financial crisis, although wage growth remains moderate by historical standards. While a variety of wage measures have accelerated over the past year and there is anecdotal evidence of worker shortages in some sectors and regions, there is no evidence of rapid acceleration in the aggregate wage indicators. At 3.9 percent, the August unemployment rate was about 1/2 percentage point lower than the previous year. If unemployment continues to decline at the same rate as we have seen over the past year, we will soon see unemployment rates not seen since the 1960s. Historically, the few periods when resource utilization has been at similarly tight levels have tended to see elevated risks of either accelerating inflation or financial imbalances. For instance, the inflation process may change in unexpected ways. So far, the data on inflation remain encouraging, providing little signal of an outbreak of inflation to the upside, on the one hand, and some reassurance that underlying trend inflation may be moving closer to 2 percent, on the other. Core personal consumption expenditures (PCE) prices have increased 2 percent over the past 12 months, consistent with the FOMC's objective. Survey measures of inflation expectations remain stable in the lower end of the historical ranges, while market-based measures of inflation compensation remain stable at levels above the lows seen in 2016. With various measures of underlying trend inflation having come in below our 2 percent objective over a sustained period, it is important to sustainably achieve inflation around 2 percent to prevent an erosion of underlying trend inflation the next time the economy faces a downturn and the federal funds rate hits its lower limit. The past few times unemployment fell to levels as low as those projected over the next year, signs of overheating showed up in financial-sector imbalances rather than in accelerating inflation. The Federal Reserve's assessment suggests that financial vulnerabilities are building, which might be expected after a long period of economic expansion and very low interest rates. 
Rising risks are notable in the corporate sector, where low spreads and loosening credit terms are mirrored by rising indebtedness among corporations that could be vulnerable to downgrades in the event of unexpected adverse developments. Leveraged lending is again on the rise; spreads on leveraged loans and the securitized products backed by those loans are low, and the Board's Senior Loan Officer Opinion Survey on Bank Lending Practices suggests that underwriting standards for leveraged loans may be declining to levels not seen since 2005. While tightening resource utilization and loose financial conditions present upside risks, recent foreign developments present downside risks. Trade policy has introduced uncertainty. Growth in Europe and Japan has moderated from its strong pace of last year, and political risks have reemerged in countries such as Italy. China is contending with deleveraging and deceleration as well as a challenging trade environment. As U.S. growth has pulled away from foreign growth, in part reflecting fiscal policy divergence, expectations of monetary policy divergence have strengthened, contributing to upward pressure on the dollar earlier this year. The resulting currency adjustments are compounding challenges faced by some emerging market economies, along with a complicated and unpredictable trade environment and gradually increasing interest rates. Although capital flow reversals have been contained to several notably vulnerable countries so far, I am attentive to the risk that a pullback from emerging markets could broaden. What are the implications for policy? Over the next year or two, barring unexpected developments, continued gradual increases in the federal funds rate are likely to be appropriate to sustain full employment and inflation near its objective. With government stimulus in the pipeline providing tailwinds to demand over the next two years, it appears reasonable to expect the shorter-run neutral rate to rise somewhat higher than the longer-run neutral rate. Further out, the policy path will depend on how the economy evolves. These developments raise the prospect that, at some point, the Committee's setting of the federal funds rate will exceed current estimates of the longer-run federal funds rate. Indeed, the median projection in the SEP has this property. This raises the possibility of a flattening or inversion of the yield curve in the event that term premiums do not rise from their currently very low levels. Like many of you, I am attentive to the historical observation that inversions of the yield curve between the 3-month and 10-year Treasury rates have had a relatively reliable track record of preceding recessions in the United States. But unlike in those historical episodes, the 10-year yield today is very low, at around 3 percent, which is well below the average of 6-1/4 percent during the decades before the crisis. Part of the reason the 10-year Treasury yield is unusually low is that market expectations of interest rates in the longer run are themselves quite low, as discussed earlier. Another important reason the 10-year Treasury yield is very low is that the term premium has fallen to levels that are very low by historical standards. According to one estimate from Federal Reserve staff, the term premium was slightly negative until very recently and remains very low. By contrast, it was close to 100 basis points when the spread between the 3-month and 10-year Treasury yields was at its peak of 325 basis points in early 2010. 
This may temper somewhat the conclusions that we can draw from historical yield curve relationships characterized by a substantially higher term premium. If the term premium remains very low, any given amount of monetary policy tightening will lead to an inversion sooner, so that even a modest tightening that might not have led to an inversion historically could do so today. One reason the term premium may be lower than in the past is the changed correlation between stock and bond returns, likely associated with changes in expected inflation outcomes. The other driver of the low level of the term premium globally is the asset purchases of central banks in several major economies. In this case, if the term premium rises as the effect of asset purchase programs diminishes, the effect may be to forestall an inversion of the long-dated yield curve. It is worth highlighting that the flattening yield curve suggested by the SEP median is associated with a policy path calibrated to sustain full employment and inflation around target. So, while I will keep a close watch on the yield curve as an important signal on financial conditions, I will want to interpret yield curve movements as one of several considerations informing appropriate policy. Indeed, the possibility that the projected policy path may have unintended consequences is one of the compelling reasons for raising interest rates gradually. The gradual pace of interest rate increases anticipated in the SEP median path should give us some time to assess the effects of our policies as we proceed. While the information available to us today suggests that a gradual path is appropriate, we would not hesitate to act decisively if circumstances were to change. If, for example, underlying inflation were to move abruptly and unexpectedly higher, it might be appropriate to depart from the gradual path. Stable inflation expectations are one of the key achievements of central banks in the past several decades, and we would defend that achievement vigorously. Our challenge is to sustain full employment and inflation at 2 percent, which is likely to warrant continued gradual increases in the federal funds rate. With fiscal stimulus in the pipeline and financial conditions supportive of growth, the shorter-run neutral interest rate is likely to move up somewhat further, and it may well surpass the longer-run equilibrium rate for some period. Beyond the near term, how much the neutral rate is likely to rise and whether it flattens or moderates further out will depend on a variety of developments--such as whether fiscal stimulus is extended or expires, whether foreign and trade risks grow or recede, and whether financial system vulnerabilities extend. As such, the gradual pace of rate increases implicit in the SEP's median policy path incorporates a degree of caution, which is appropriate, in my view.
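As a back-of-the-envelope illustration of the term-premium point above, the sketch below uses the standard decomposition of a long-term yield into the average expected short rate plus a term premium. The specific rates are invented for illustration, not staff estimates.

```python
# Back-of-the-envelope illustration of why a very low term premium makes yield
# curve inversion arrive sooner for a given amount of tightening. The rates are
# stylized assumptions, not staff estimates.

def ten_year_yield(avg_expected_short_rate, term_premium):
    # Standard decomposition: long yield ~ average expected short rate + term premium.
    return avg_expected_short_rate + term_premium

short_rate = 2.25        # hypothetical current 3-month rate, percent
avg_expected = 2.75      # hypothetical average expected short rate over the next 10 years

for tp in (1.0, 0.0):    # term premium near an older norm vs. near zero
    spread = ten_year_yield(avg_expected, tp) - short_rate
    print(f"term premium {tp:.1f} pp: 10-year/3-month spread = {spread:+.2f} pp")

# With a 1.0 pp term premium there is 1.50 pp of cushion before the curve inverts;
# with a term premium near zero, the same expectations leave only 0.50 pp.
```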
r180927a_FOMC
united states
2018-09-27T00:00:00
Brief Remarks on the U.S. Economy
powell
1
Good afternoon. Thank you, Senator Reed, for the kind words and the opportunity to be a part of the annual Rhode Island Business Leaders Day. And thank you, all, for sticking with the program for the last speaker of the day. The Federal Open Market Committee, the body within the Federal Reserve that sets monetary policy, just concluded a meeting yesterday. I plan to talk briefly about how my colleagues and I see the economy evolving and our role in keeping it healthy. Importantly, I want to hear from you. I very much appreciate your views, as business people, of economic conditions where you live and work. And, of course, I will be happy to respond to questions. Our economy is strong. Growth is running at a healthy clip. Unemployment is low, the number of people working is rising steadily, and wages are up. Inflation is low and stable. All of these developments are very good signs. Of course, that is not to say that everything is perfect. The benefits of this strong economy have not reached all Americans. Many of our country's economic challenges are beyond the scope of the Fed, but we are doing all we can to keep the economy strong and moving forward. That is the best way we can promote an environment in which every American has the opportunity to succeed. Each time we meet, we face the same question: How can we set monetary policy to best support job growth and low, steady inflation? For many years, this question called for very low interest rates to help an economy that had been damaged by the deep financial crisis that gripped the world 10 years ago. As the economy has steadily gained strength, the Fed has been gradually returning interest rates closer to the levels that are normal in a healthy economy. We took another step on that path yesterday, with a quarter-point increase in short-term interest rates. These rates remain low, and my colleagues and I believe that this gradual return to normal is helping to sustain this strong economy for the longer-run benefit of all Americans. As I mentioned, 10 years have now passed since the depths of the financial crisis--a painful part of our history that cost many Americans their jobs, their homes, and, for some, their hopes and dreams. In addition to holding interest rates low to support the recovery, we have taken many steps to make the financial system safer. In particular, we are holding the largest banks to much higher standards in the amount of capital and liquidity they hold and in the ways they assess and manage the risks they take. I am confident that the system today is stronger and in a far better position to support the financial needs of households and businesses through good times and bad. We continue to work to sustain these fundamental improvements while also ensuring that regulation is both effective and efficient. Thank you. I am happy to take your questions.
r181002a_FOMC
united states
2018-10-02T00:00:00
Monetary Policy and Risk Management at a Time of Low Inflation and Low Unemployment
powell
1
It is a pleasure and an honor to speak here today at this 60th Annual Meeting. This association has long promoted the use of economics in the workplace and advanced the worthy purpose of ensuring that leading American businesses benefit from the insights of economists. Today I will focus on the Federal Reserve's ongoing efforts to promote maximum employment and stable prices. I am pleased to say that, by these measures, the economy looks very good. The unemployment rate stands at 3.9 percent, near a 20-year low, and inflation is running close to our objective of 2 percent. While these two top-line statistics do not always present an accurate picture of overall economic conditions, a wide range of data on jobs and prices supports a positive view. In addition, many forecasters are predicting that these favorable conditions are likely to continue. For example, the medians of the most recent projections from FOMC participants and the Survey of Professional Forecasters, as well as the most recent Congressional Budget Office (CBO) forecast, all have the unemployment rate remaining below 4 percent through the end of 2020, with inflation staying very near 2 percent over the same period. From the standpoint of our dual mandate, this is a remarkably positive outlook. Indeed, I was asked at last week's press conference whether these forecasts are too good to be true--a reasonable question! Since 1950, the U.S. economy has experienced periods of low, stable inflation and periods of very low unemployment, but never both for such an extended time as is seen in these forecasts. Standard economic thinking has long offered an explanation for this: If unemployment were to remain this low for this long, employers would be pushing up wages as they compete for scarce workers, and rising labor costs would feed into more-rapid price inflation faced by consumers. This dynamic between unemployment and inflation is known as a Phillips curve relationship, and at times it can pose a fundamental tension between the two sides of the Fed's mandate to promote maximum employment and price stability. Recent low inflation and unemployment have some analysts asking, "Is the Phillips curve dead?" Others argue that the Phillips curve still lurks in the background and could reemerge at any time to exact revenge for low unemployment in the form of high inflation. My comments today have two main objectives. The first is to explain how changes in the Phillips curve help account for the somewhat surprising but broadly shared current forecasts of continued very low unemployment with inflation near 2 percent. At the risk of spoiling the surprise, I do not see it as likely that the Phillips curve is dead, or that it will soon exact revenge. What is more likely, in my view, is that many factors, including better conduct of monetary policy over the past few decades, have greatly reduced, but not eliminated, the effects that tight labor markets have on inflation. However, no one fully understands the nature of these changes or the role they play in the current context. Common sense suggests we should beware when forecasts predict events seldom before observed in the economy. Thus, my second objective today is to explain, given this uncertainty about the unemployment-inflation relationship, the important role that risk management plays in setting monetary policy. I will explore the FOMC's monitoring and balancing of risks as well as our contingency planning for cases when risk becomes reality. Let us start with a look at the modern history of jobs and inflation in the United States. 
Figure 1 shows headline inflation and unemployment from 1960 to today and extended through 2020 using the average of median projections from both FOMC participants and the Survey of Professional Forecasters, and the CBO projections. As the figure makes clear, a multiyear period with unemployment below 4 percent and stable inflation would, if realized, be unique in modern U.S. data. To understand the basis for these forecasts, it is useful to contrast two very different periods included in figure 1: From 1960 to 1985, and the period from 1995 to today. The first period includes the Great Inflation, and the latter includes both the Great Moderation and the distinctly immoderate period of the Global Financial Crisis and its aftermath. Figure 2 shows unemployment and core, rather than headline, inflation in these two periods. While our inflation objective concerns headline inflation, switching to core inflation makes some relationships clearer by removing a good deal of variability due to food and energy prices, variability that is not primarily driven by labor market conditions or monetary policy. There is a dramatic difference in the unemployment-inflation relationship across these two periods. During the Great Inflation, unemployment fluctuated between roughly 4 percent and 10 percent, and inflation moved over a similar range. In the recent period, the unemployment rate also fluctuated between roughly 4 percent and 10 percent, but inflation has been relatively tame, averaging 1.7 percent and never declining below 1 percent or rising to 2.5 percent. Even during the financial crisis, core inflation barely budged. As a thought experiment, look at the right panel and imagine that you could see only the red line (inflation), and not the blue line (unemployment). Nothing in the red line hints at a major economic event, let alone the immense upheaval around the time of the global financial crisis. Notice that, in each period, there is only one episode in which unemployment drops below 4 percent. In the late 1960s, unemployment remained at or below 4 percent for four years, and during that time inflation rose steadily from under 2 percent to almost 5 percent. By contrast, the late 1990s episode of below-4-percent unemployment was quite brief, and during the episode and surrounding quarters inflation was reasonably stable and remained below 2 percent. To explore the Phillips curve relationship in these two periods more closely, we need to bring in the concept of the natural rate of unemployment. In standard economic thinking, an unemployment rate above the natural rate indicates slack in the labor market and tends to be associated with downward pressure on inflation; unemployment below the natural rate represents a tight labor market and is associated with upward inflation pressure. Figure 3 repeats figure 2 but replaces unemployment with labor market slack as measured by unemployment minus the CBO's current estimate of the natural rate of unemployment at each point in time. Periods of tight labor markets are shaded. During the Great Inflation, inflation generally rose in the tight, shaded periods and fell in the unshaded ones, just as conventional Phillips curve reasoning predicts. From 1995 to today, the large and persistent swings in the gap between unemployment and the natural rate were associated with, at most, a move of a few tenths in the inflation rate. 
Comparing the shaded and unshaded regions, you might see some association between slack and the minor ups and downs in inflation, but the pattern is not at all consistent. It is evidence like this that fuels speculation about the Phillips curve's demise. Whether dead, sick, or merely resting, many of the questions about the Phillips curve come down to figuring out what changed between these two periods, and why. Let us turn to a conceptual framework for examining these questions more systematically. A natural starting point is the simplest form of a Phillips curve equation, which posits that inflation this year is determined by some combination of current labor market slack, inflation last year, and some other factors that I will leave aside for this discussion: π_t = -β(u_t - u*_t) + γ π_{t-1} + (other factors). The value of β is often referred to as the slope of the Phillips curve. With a larger value of β, any change in labor market slack translates into a bigger change in inflation. As we say, as β increases, the Phillips curve steepens. The value of γ determines inflation's persistence--that is, how long any given change in inflation tends to linger. As the value of γ increases, higher inflation this year translates more into higher inflation next year. A particularly nasty case arises when β and γ are both large. In this case, slack has a large effect on inflation, and that effect tends to be very persistent. One implication of a large γ is that, if a boom drives inflation up, it will tend to stay up unless offset by a subsequent bust. Figure 5 shows regression estimates of β and γ computed over 20-year samples starting with the sample from 1965 to 1984 and including each 20-year sample through 2017. During the Great Inflation samples, the value of γ is near 1, meaning that higher inflation one year tended to translate almost one-for-one into higher inflation the next. The Phillips curve is also relatively steep in the Great Inflation samples, with 1 extra percentage point of lower unemployment converting into roughly 1/2 percentage point of higher inflation. Thus, the Great Inflation presented that nasty case just described. Fortunately, things changed. The estimates of both β and γ fall in value as the estimation sample shifts forward in time. In the most recent samples, the Phillips curve is nearly flat, with β very near zero, and γ is about 0.25, meaning that roughly one fourth of any rise or fall in inflation carries forward. These results give numerical form to what we see in the right-hand panel of figure 3, covering the recent period: Large and persistent moves in the unemployment gap were associated with, at most, modest transitory moves in inflation. These developments amount to a better world for households and businesses, which no longer experience or even fear the scourge of high and volatile inflation. To provide a sound basis for monetary policy, it is important to understand what happened and why, so we can avoid a return to the bad old days of the 1970s. Like many, I believe better monetary policy has played a central role. To understand the mechanism, let us ask how central banks could, presumably inadvertently, amplify and extend the duration of inflation's response to labor market tightness. To do so, the central bank could persistently ease the stance of monetary policy in response to an uptick in inflation. No responsible central banker today would intentionally do this, but much research suggests that during the Great Inflation, misunderstandings about how the economy worked led the Fed effectively to behave in this manner. 
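For readers who want to see the mechanics behind rolling-window estimates of slope and persistence coefficients like those reported in figure 5, here is a minimal sketch run on synthetic data. The data-generating process, dates, and coefficient values are invented, and the column construction is an assumption; this is not the estimation underlying the figure.

```python
# Sketch of a rolling-window Phillips curve regression of the kind described
# above, run on synthetic data. The data-generating process, dates, and
# coefficients are invented; this is not the estimation behind figure 5.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1960, 2018)
slack = rng.normal(0.0, 1.5, len(years))   # synthetic unemployment gap, u_t - u*_t

# Inflation with a steep, persistent Phillips curve early in the sample and a
# flat, less persistent one later.
infl = np.zeros(len(years))
for t in range(1, len(years)):
    beta, gamma = (0.5, 0.9) if years[t] < 1985 else (0.05, 0.25)
    infl[t] = 2.0 * (1 - gamma) + gamma * infl[t - 1] - beta * slack[t] + rng.normal(0, 0.3)

# Estimate the slope (beta) and persistence (gamma) over rolling 20-year windows.
for start in range(1, len(years) - 19):
    w = slice(start, start + 20)                                      # current observations
    X = sm.add_constant(np.column_stack([-slack[w], infl[start - 1:start + 19]]))
    fit = sm.OLS(infl[w], X).fit()
    beta_hat, gamma_hat = fit.params[1], fit.params[2]
    print(f"{years[start]}-{years[start + 19]}: slope={beta_hat:.2f}, persistence={gamma_hat:.2f}")
```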
Some policymakers may have believed the misguided notion that accommodating permanently higher inflation could purchase permanently higher employment. Other policymakers misperceived the level of the natural rate of unemployment, which we now believe had shifted up markedly in the 1960s. With the higher natural rate, the labor market was much tighter and provided much greater upward pressure on inflation than policymakers realized in real time. As a result, they were continually "behind the curve." The channel through which monetary policy can amplify and extend inflation's response to shocks becomes even stronger when we take account of expectations. If people come to expect that upward blips in inflation will result in ongoing higher inflation, they will build that view into wage and price decisions. In this case, people's expectations become a force adding momentum to inflation, and breaking inflation's momentum can require convincing people to change their minds and behavior--never an easy task. Arguably, this is why a federal funds rate near 20 percent--roughly 10 percent in real terms--was required in the early 1980s to turn the tide on high inflation. The cost, in the form of very high unemployment, is clear in the Great Inflation figures. The Great Inflation taught us that a main task of monetary policy is to keep inflation expectations anchored at some low level. This idea is behind the adoption in recent decades of inflation targets, such as the Fed's 2 percent objective, by central banks around the world. When monetary policy tends to offset shocks to inflation, rather than amplifying and extending them, and when people come to expect this policy response, a surprise rise or fall in labor market tightness will naturally have smaller and less persistent effects on inflation. Research suggests that this reasoning can account for a good deal of the change in the Phillips curve relationship. It is also likely that many other factors have contributed to changes in inflation dynamics over recent decades. We do not fully understand the causes and implications of these changes, which raises risk management issues that I will take up now. To set the stage, let us return to the situation facing the FOMC. The baseline forecasts of most FOMC participants and a broad range of others show unemployment remaining below 4 percent for an extended period, with inflation steady near 2 percent. I have made the case that this forecast is not too good to be true and does not signal the death of the Phillips curve. Instead, the outlook is consistent with evidence of a very flat Phillips curve and inflation expectations anchored near 2 percent. But we still must face the cautionary advice to beware when forecasts point to rarely seen outcomes. As a way of heeding this advice, the Committee takes a risk management approach, which has three important parts: monitoring risks; balancing risks, both upside and downside; and contingency planning for surprises. Let me describe a few of the risks and how we are thinking about them. First is the risk that inflation expectations might lose their anchor. We attribute a great deal of the stability of inflation in recent years to the anchoring of longer-term inflation expectations. And we are aware that it could be very costly if those expectations were to drift materially. 
As you probably know from our public communications, we carefully monitor survey- and market-based proxies for expectations, and we do not see evidence of a material shift in longer-term expectations (figure 6). The survey measures have been particularly steady for some time. The financial market-based measures include both an expectations component and a volatile inflation premium component, so they tend to move around much more than the surveys, but we see no evidence of a material change in these measures, either. The risks to inflation expectations are, of course, two sided. Until this summer, inflation had remained stubbornly below 2 percent for several years. And major economies in much of the world have been struggling mainly with disinflationary forces. Thus, we have been and will remain alert for possible downward drift in expectations. Some argue the contrary case--that by only gradually removing accommodation as the unemployment rate has fallen, the FOMC may have fallen behind the curve, thereby risking an upward drift in expectations. From the standpoint of contingency planning, our course is clear: Resolutely conduct policy consistent with the FOMC's symmetric 2 percent inflation objective, and stand ready to act with authority if expectations drift materially up or down. A second risk is that labor market tightness or tightness in other parts of the production chain might lead to higher inflation pressure than expected--the "revenge of the Phillips curve" scenario. As I mentioned, the FOMC carefully monitors a wide array of early indicators of inflation pressure to evaluate this risk. Wages and compensation data are one important source of information. These measures have picked up some recently, but in a way that is quite welcome. Specifically, the rise in wages is broadly consistent with observed rates of price inflation and labor productivity growth and therefore does not point to an overheating labor market. Further, higher wage growth alone need not be inflationary. The late 1990s episode of low unemployment saw wages rise faster than inflation plus productivity growth without an appreciable rise in inflation. Despite what shows up in the aggregate wage and compensation data, however, I am sure that, like us, many of you are hearing widespread anecdotes about labor shortages and increasing bottlenecks in production. For example, as shown in figure 7, the words "shortage" and "bottleneck" are increasingly appearing in the Beige Book, the Federal Reserve's report summarizing discussions with our business contacts around the country. The message we are hearing in our conversations is supported by a wide range of more conventional measures. For example, the survey of members of the National Federation of Independent Business finds firms increasingly reporting that job openings are hard to fill (figure 8). Further, these businesses now list "quality of labor" as their most important problem, as opposed to the more typical report of "poor sales." We review a wide variety of measures of this type, and these indicators show what I think most business people see: an economy operating with limited slack. Notice, however, that these measures are near levels that prevailed in the late 1990s or early 2000s, a period when core inflation remained under 2 percent. While the late 1990s case proves that elevated values of these tightness measures do not automatically translate into rising inflation, a single episode provides only limited reassurance. 
Thus, the FOMC takes seriously the possibility that tight markets for labor or other inputs could provide greater upward pressure on inflation than in the baseline outlook. Our best estimates, however, suggest that so long as inflation expectations remain anchored, a modest steepening of the Phillips curve would be unlikely to cause a significant rise in inflation or demand a disruptive policy tightening. Once again, the key is the anchored expectations. A third risk--in this case an upside risk--is that the natural rate of unemployment could be even lower than current estimates. Some have argued that the Fed should be removing policy accommodation much more slowly, pushing the economy to see if the natural rate of unemployment is lower still. Advocates of this view note that over the past several years of policy normalization, the economy has continued to strengthen and unemployment has fallen, but inflation has remained quiet. As I discussed in a recent speech, many analysts have accounted for the lack of rising inflation pressure by lowering their estimate of the natural rate. For example, since the start of 2016, the unemployment rate has fallen about 1 percentage point, and estimates of the natural rate from four well-known sources have also moved lower over that period. If the natural rate is now materially lower than we believe, that would imply less upward pressure on inflation--the flip side of the "revenge of the Phillips curve" risk. Our policy of gradual interest rate normalization represents the FOMC's attempt to take both of these risks seriously. Removing accommodation too quickly could needlessly foreshorten the expansion. Moving too slowly could risk rising inflation and inflation expectations. Our path of gradually removing accommodation, while closely monitoring the economy, is designed to balance these risks. In wrapping up this discussion of risks to the favorable outlook, I should emphasize that I have chosen to focus on three risks that are all associated with the Phillips curve. There are, of course, myriad other risks. To name just a few, we must consider the strength of economies abroad, the effects of ongoing trade disputes, and financial stability issues. I hope my discussion of three particular risks gives a sense of how we approach these issues. Many of us have been looking back recently on the decade that has passed since the depths of the financial crisis. In light of that experience, I am glad to be able to stand here and say that the economy is strong, unemployment is near 50-year lows, and inflation is roughly at our 2 percent objective. The baseline outlook of forecasters inside and outside the Fed is for more of the same. This historically rare pairing of steady, low inflation and very low unemployment is testament to the fact that we remain in extraordinary times. Our ongoing policy of gradual interest rate normalization reflects our efforts to balance the inevitable risks that come with extraordinary times, so as to extend the current expansion, while maintaining maximum employment and low and stable inflation.
r181003a_FOMC
united states
2018-10-03T00:00:00
Supporting Fast Payments for All
brainard
0
It is a pleasure to be here in Chicago today to talk about payments. The confidence that payments will be timely and dependable is a cornerstone of America's dynamic and resilient economy. Today, Americans take the reliability and safety of their payments for granted. But that was not always the case. The nation's first 150 years were frequently disrupted by panics in the financial system that extended into the payment system, which was highly fragmented and inefficient. Before the Federal Reserve took on a role in the payment system, a check recipient could not count on receiving the full value written on the check and faced long and unpredictable delays in getting access to the funds. As a result of banks' efforts to avoid excessive clearance fees, it could take days, weeks, or more to clear checks. Chairman, William P.G. Harding, described the circuitous, opaque, and costly route one payment made: "...a bank in Rochester, N.Y., sent a check drawn on a Birmingham, Ala., account to a correspondent bank in New York, which sent it along to a bank in Jacksonville, Fla. From there, it traveled to Philadelphia, Baltimore and Cincinnati before it finally reached the originating bank in Birmingham The fragmentation of the payment system imposed costs on American merchants, households, community banks, and, ultimately, on the overall U.S. economy. Ensuring a reliable nationwide payment infrastructure was one of the motivations that led the Congress to create the Federal Reserve after the severe financial panic of 1907. Fostering a safe, efficient, and widely accessible payment infrastructure has been a crucial aspect of the Fed's mission from its founding in 1913. By creating a new core infrastructure for clearing and settling checks, the Fed was able to boost confidence in banks and America's payment system, ensure Americans received the full value of their checks, and speed up payments. That was the first, but not the last, time that the Fed played a central role in transforming America's retail payment system. By the 1970s, the payment system was staggering under the described the exponential growth in check volume and the time-consuming and expensive process to clear paper checks as a "time bomb." The Federal Reserve and payment system stakeholders faced a choice: continue making incremental changes to manage the growing avalanche of checks, or look to technology to facilitate transformational change. Working with the private sector, the Federal Reserve chose transformation, and the effect was dramatic. In partnership with the private sector, the Federal Reserve supported the implementation of the automated clearinghouse, or ACH system, to process payroll payments for individuals, bill payments for households, and vendor payments for businesses. Before the advent of the ACH, it would typically take a week for an employee to be able to access the funds from a paycheck. It would require first a trip to the bank during business hours to deposit the paycheck and then several business days for that bank to complete the time-consuming process of collecting the funds from the employer's bank. With the advent of ACH, an employee could expect to access their pay from their bank on the same day it was deposited electronically by their employer. Today, the ACH operates with important roles for the private sector and the Fed, is available nationwide, and constitutes a vital piece of infrastructure supporting earlier access to funds and reliable settlement of payments. 
Today, our payment system is again at a crossroads. There is a growing gap between the transaction capabilities we need and expect in the digital economy--fast, convenient, and accessible to all--and the underlying settlement capabilities. Consumers and businesses increasingly expect to complete transactions with a simple keystroke, swipe, or tap. If I want to split a restaurant tab with my friends, I can use an app on my smartphone. But my friends have to be signed up for the same app to receive the payment, and they may have to wait for confirmation that the funds have moved from the app into their bank accounts before they can use the funds outside the app. Similarly, if I want to make a purchase from a vendor online, all I have to do is upload my payment information and touch a screen to complete the purchase immediately. But that payment in turn may not be immediately available to the seller. While we are seeing a growing demand for payments to be as instantaneous as the apps on our smartphones, in reality, under the hood, these payments currently rely on a patchwork of systems that can result in inefficiencies and delays, as well as uneven access. To meet the expectations of our 24/7 app economy, there is a growing demand for broadly and nationally accessible faster payments that make funds available immediately. Faster payments would allow consumers and businesses to send and immediately receive payments at any time of the day, any day of the year, and provide recipients the ability to use their funds anywhere they choose. Nascent faster payment services are emerging to address this demand from individuals and businesses for the capability to manage their finances more efficiently in real time. These faster payment innovations are striving to keep up with this demand, but gaps in the underlying infrastructure pose challenges associated with safety, efficiency, and accessibility. In many circumstances, the underlying infrastructure in place today cannot ensure that a fast payment is fully complete before the recipient seeks to use the funds. To complete a payment, the banks behind the transaction need to transfer funds between each other. Until this happens, the payment between them is like an "IOU." Today's systems that transfer funds between banks are not set up to work in a 24/7, real- time world. Instead, most faster payments settle funds between banks on a deferred basis. Deferred settlement entails a buildup of obligations--like IOUs between banks--that could present real risks to the financial system in times of stress. Although faster payment systems that rely on deferred settlements can incorporate certain measures to mitigate these risks, these measures may be appropriate for a nascent faster payment market only for a limited time. As we saw with the avalanche of paper checks prior to ACH, as the volume and value of faster payments grow over time, the potential risks of deferred settlement to the financial system are also likely to grow. To fully deliver on the promise of faster payments into the future, we need an infrastructure that can support continued growth and innovation, with a goal of settlement on a 24/7 basis in real time. To ensure the robustness of the payment system into the future, banks and other providers acting as their agents should have access to a settlement system that operates 24/7 and settles each payment as soon as an individual sends it. 
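A toy sketch of the settlement mechanics just described may help: under deferred settlement, interbank obligations accumulate until the next settlement cycle, whereas under RTGS each payment moves central bank balances immediately. The banks, balances, and payments below are invented, and the code is a simplification, not a description of any actual system's design.

```python
# Toy illustration (not any actual payment system's design): why deferred
# settlement builds up interbank "IOUs" while real-time gross settlement (RTGS)
# settles each payment as it happens. Banks, balances, and payments are invented.
from collections import defaultdict

payments = [("BankA", "BankB", 40), ("BankB", "BankC", 25), ("BankA", "BankC", 30)]
balances = {"BankA": 100, "BankB": 50, "BankC": 20}   # hypothetical central bank balances

# Deferred settlement: obligations accumulate until the next settlement cycle,
# leaving recipients exposed to the sending bank in the meantime.
iou = defaultdict(float)
for sender, receiver, amount in payments:
    iou[(sender, receiver)] += amount
print("unsettled obligations awaiting the settlement cycle:", sum(iou.values()))   # 95

# RTGS: each payment moves central bank balances the moment it is sent, so no
# interbank credit exposure accumulates.
for sender, receiver, amount in payments:
    assert balances[sender] >= amount   # requires intraday liquidity, hence the idea of a 24/7 liquidity tool
    balances[sender] -= amount
    balances[receiver] += amount
print("unsettled obligations under RTGS:", 0)
print("balances after RTGS:", balances)
```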
While everyone stands to benefit from faster payments, the benefits could be especially important for households and small business owners who face cash flow constraints. We know from the Fed's Survey of Household Economics and Decisionmaking that four in 10 adults say they would need to cover an unexpected expense of $400 by borrowing or selling something, or simply be unable to pay. A forthcoming Fed research note estimates that a quarter of households have less than $400 combined in their bank accounts. For these households, the difference between waiting for a payment to clear and receiving a payment in real time is not merely an inconvenience; it could tip the balance toward overdraft fees, bounced checks, or collections fees. To be clear, faster payments would not address the root causes of these households' financial fragility. Even so, faster payments could help reduce the strain on some. Similarly, many small businesses cite immediate access to working capital to finance inventories or pay employees as their number one constraint on growth. Smaller businesses and merchants would benefit from faster payments because of their need to tightly manage how much of their capital is tied up in unused material or inventory. If a small business could count on its customers' payments being immediately accessible in its bank account, it could reduce its need for short-term financing to cover the costs of ordering materials and goods well in advance. So what is our role at this moment of opportunity for the payment system? In keeping with the Fed's historical role in payments, five years ago we launched a collaborative effort with a broad array of stakeholders to catalyze a safe, efficient, and accessible faster payment system for the United States. As part of that effort, 300 organizations representing consumers, businesses, banks, card companies, and technology providers, including many represented here today, came together to define expectations for the future of the retail payment system and to make recommendations on how to implement those expectations. The members of the task force told us that the Federal Reserve needed to develop a 24/7 settlement system. echoed this theme in a recent report. Over the past year, we have undertaken an assessment of what the Federal Reserve could do to modernize its infrastructure to support interbank settlement of faster payments. That assessment found that 24/7 payment-by-payment interbank settlement in real-time--what we refer to as real-time gross settlement (RTGS)--offers clear benefits in minimizing risk and settlement, and RTGS is the way to achieve this. That is where we believe that the Federal Reserve and the private sector together need to make investments for the future. In this regard, the U.S. retail payment system lags behind some other countries: the Reserve Bank of Australia and the European Central Bank have already implemented or are on the cusp of implementing RTGS systems to support private-sector faster payment services. The Federal Reserve and payment system stakeholders have an opportunity to upgrade America's payment system to meet the needs of households, businesses, and banks in the app economy. Today, we are publishing a notice that seeks public comment on potential steps the Federal Reserve could take to support the vision of RTGS of faster payments. 
The Reserve Banks could develop a service for RTGS that is available on a 24/7 basis to provide payment-by-payment interbank settlement in real time and at any time, on any day, including weekends and holidays. The Reserve Banks currently provide payment services to more than significantly improve the prospect that banks of all sizes will have equitable access to a real-time interbank settlement infrastructure for faster payments in the long term. This common infrastructure would support connections across banks, and faster payment service providers acting as their agents, with the potential to weave together the current patchwork of systems. As a result, we would also expect the overall safety of faster payments to increase. The capability to finalize interbank settlement before funds are made available to the recipient would avoid an undesirable buildup of risk in the system. The more banks that have access to real-time as opposed to deferred settlement mechanisms, the lower the risk would be from deferring settlement to the payment system as a whole. Although RTGS may be operationally demanding, it offers clear benefits from a risk and efficiency perspective over the long term. The development of a nationwide real-time interbank settlement infrastructure could also support the development of private-sector faster payment services, thereby increasing innovation and choice in the market. Banks and technology providers of all sizes may be able to develop new services or enhance existing services by capitalizing on the underlying interbank settlement infrastructure. This could ultimately benefit all consumers by lowering costs, increasing choice, and improving quality. In that regard, we are also seeking comment on whether the Reserve Banks should consider developing a liquidity management tool that would operate 24/7 in support of services for real-time interbank settlement of faster payments. The tool could support settlement services provided by the private sector or the Reserve Banks. The Federal Reserve has a responsibility to promote a payment system that serves the evolving needs of the public. We hope to solicit as wide a range of views as possible as we consider these and possibly other options to support prompt, secure, and resilient settlement in a 24/7 world. To that end, the notice poses a series of questions related to whether an RTGS is the appropriate strategic foundation for interbank settlement of faster payments. We also solicit views on the appropriate role for the Federal Reserve in such a system. We look forward to hearing from, and learning from, a broad array of stakeholders, including those who are actively involved in the provision of payment and banking services, as well as representatives of the consumers, households, and businesses who rely on the ability to make and receive payments every day. The future of payments will be determined by the actions we take today. We can wait and watch how these issues evolve on their own. But this will likely result in a fragmented patchwork of systems that entails inefficiencies and risks and could leave behind many households, small businesses, and smaller banks. Alternatively, we can work with other stakeholders to embrace innovation and design a faster payments infrastructure for the future to promote broad access and resilience. 
As technological change continues to drive payments innovation, we continue to focus on the same basic objective that motivated our initial engagement in the payment system a century ago: promoting a safe, efficient, and accessible payment system that serves the interests of all Americans. The Fed has the unique ability to provide the infrastructure to reliably settle obligations between banks using balances at the central bank. As such, we have a responsibility to serve the broad public interest by providing a flexible and robust payment infrastructure on which the private sector can innovate. We look forward to hearing from you on the important choices before us as we work to safeguard the integrity and efficiency of our payment system.
r181004a_FOMC
united states
2018-10-04T00:00:00
Trends in Urban and Rural Community Banks
quarles
0
Good morning. I want to thank the conference organizers for inviting me to speak to you today. It is an honor and a pleasure to be part of this unique annual event that brings together bankers, bank supervisors, and researchers to discuss the latest community banking research, recent trends in community banking practices, and policy issues that are on the minds of conference attendees. I want to start, first, by conveying my own perspective on the importance of community banks. Community banks have a long history of providing essential financial services to households, small businesses, and small farms in communities across the United States. Their ability to effectively provide these services speaks to the strength of the community banking business model--that is, establishing and maintaining local relationships, and offering customers a face-to-face interaction with a local banker. And it is something I have observed firsthand, especially for community banks in rural communities. Growing up in rural Colorado and Utah, I saw the importance of community bankers having local knowledge and being personally invested in the same communities that they serve. That local knowledge, and the relationship-based lending that is the hallmark of community banking, can stem losses during downturns, as community banks may be able to work with borrowers to avoid losses. Indeed, research has shown that small business lending at small banks declined less severely than at large banks during the last recession. At the same time, I have seen the challenges that many community banks face. I want to be careful not to overstate those challenges--to paraphrase Mark Twain, the reports of your demise are greatly exaggerated--and I believe that the community bank model has many advantages and will continue to play an integral role in our financial system. These sorts of dynamics are one reason that community banks are an important topic for research. As you probably know, this is the sixth annual community banking conference cosponsored by the Federal Reserve and the Conference of State Bank Supervisors (CSBS), and it is the first conference for which the Federal Deposit Insurance Corporation is joining as a cosponsor. The organizers of the inaugural conference decided that, rather than holding a traditional academic-style conference, they would invite bankers and bank supervisors to hear what the researchers had to say and to share their real-world experience with the researchers. The hope was that these interactions would prove beneficial to all three groups. The positive feedback that we have received from conference attendees over the past five years strongly supports the wisdom of the organizers' decision. Over time, the conference has evolved, with some new features introduced each year. The Case Study Competition, which is sponsored by the CSBS, introduces undergraduates to community banking and some of the challenges that community bankers face. The Emerging Scholars Program was also added to the conference a few years ago. This program is intended to support Ph.D. students who are considering or working on a dissertation on a banking-related topic and encourage them to develop a research agenda that focuses on community banking issues.
I would like to congratulate this year's winning case study team, Eastern Kentucky University, and this year's emerging scholars. Turning to the topic of today's speech, it occurs to me that we often speak of community banks as though they are all pretty much the same. But, in reality, there is considerable heterogeneity within the group of firms that are commonly considered to be community banks. One important dimension of diversity is size, which can range anywhere from less than $100 million in assets to around $10 billion in assets. As noted by Chairman Powell when he spoke at this conference two years ago, looking at community banks as a monolithic group masks some important differences between the smallest and largest community banking organizations. For example, essentially all of the decline in the number of community banking organizations over the past two decades has taken place among those with assets less than $100 million. And these smallest banking organizations have consistently had a lower average rate of return on assets than their larger peers. Another significant aspect of diversity among community banks is the type of market served--urban versus rural. These two types of areas differ in many respects, including the age distribution of the population, the share of the population with a college degree, homeownership rates, poverty rates, and the share of the population with internet access. And while the national population has been growing over the past 20 to 30 years, many rural areas have experienced population declines, and the share of the population living in rural areas has been falling. Furthermore, since 2008, most job growth in the United States has occurred in urban areas. Given these differing circumstances, it is not surprising that community banks operating in rural and urban areas tend to face different challenges. For example, the number of competitors faced by a banking organization tends to be larger in urban banking markets than in rural markets, while hiring and retaining high-quality employees can be more difficult in rural areas. And some observers have expressed concern about the implications of bank consolidation over the past two or three decades for access to banking services in rural areas while also wondering about the future viability of rural community banks. The number of community banks has been declining over the last 20 years, but community banks still account for more than 95 percent of banks operating in the United States. The decline has been roughly similar for urban and rural community banks, leaving the share of community banks that operate primarily in rural markets quite stable at just over 50 percent. While urban community banks are quite a bit larger than rural community banks on average, over the last 20 years, rural community banks have consistently earned higher rates of return on assets and rates of return on equity than their urban peers despite a more challenging economic environment. While the data present a compelling high-level picture, they do not tell the whole story. For example, averages across a large number of markets do not tell us what is going on in any individual market. In addition, much of the data that I am presenting today is aggregated to the county level, which obscures community-level dynamics: Some communities within a county may have lost banks or bank branches, while others may have gained. In the rural Mountain West, where I grew up, a single county can be physically larger than some Eastern states.
And the demographics of the communities--for example, high- or low-income--that have lost or gained are also not visible, but important. Federal Reserve staff are engaged in efforts to further our understanding of the effects of losses of banks or bank branches on the people who live and work in the affected communities. The number of banks in the United States fell by almost half over the past 20 years, and most of that decrease was accounted for by community banks. Looking at the trend in the number of urban and rural community banks (figure 1), we see that the number of banks in both of these categories has been falling over time. The rate of decline was steeper for rural banks than for urban banks before the financial crisis but has reversed in the post-crisis period. This reversal may be due to the fact that, as we will see momentarily, urban community banks suffered more severe losses in the immediate post-crisis period than did rural community banks. And the share of community banks that operate primarily in rural markets has increased slightly, from 53 percent to 54 percent. While there are more rural community banks than urban community banks, the latter consistently account for a larger volume of deposits, loans, and offices than the former. This difference is due, in part, to the average size of an urban community bank, in terms of total assets, being about two and a half to three and a half times that of the average rural community bank (figure 2). As community banks have increased in asset size, they have also grown their branch networks. The average number of branches for an urban bank is about 1.7 to 2 times that of a rural bank (figure 3). Looking next at the total amount of deposits held by all urban community banks and all rural community banks, we can see that both have been trending upward over time (figure 4). Growth in total loans outstanding was strong for both urban and rural community banks before the financial crisis, but the declines in lending between the crisis and 2011 were more severe for urban community banks than rural ones. Coming out of the recent recession, rural community banks have seen quite modest loan growth since 2011, while the pace of growth in urban community bank lending has been strong since 2013. This divergence in recent growth rates may be due to the fact that the recovery from the recent recession has been much more robust in urban areas than in rural areas of the country. When it comes to performance measures, rural community banks consistently outperform urban community banks with regard to return on assets (figure 6) and return on equity (figure 7). This difference was particularly pronounced during the financial crisis, when profitability fell much more sharply at urban community banks than at rural community banks. Looking at charge-off rates (figure 8), we see that they have been quite similar for the two types of community banks over most of the past 20 years, except for the period from 2008 to 2013, when rural community banks had lower charge-off rates than urban community banks. These data suggest that, despite facing a more challenging economic environment, rural community banks appear to be holding their own relative to urban community banks. Now I would like to shift my focus from the banks themselves to the communities they serve by exploring whether access to banking services--provided by community or larger banks--has been declining in urban or rural areas of the country.
As of 2017, the average urban market was home to 18 community banks and just over 8 large banks, which represents a change from 21 community banks and 6 large banks in 1997 (figure 9). The average number of community banks per rural market has been remarkably stable over the past 20 years at right around 4, while the average number of large banks per rural market has increased from just under 1 to 1.4 (figure 10). These statistics indicate that, perhaps surprisingly, the average number of banks in rural markets has actually increased in the past 20 years. If we look at the number of bank branches rather than the number of competitors, we see a significant increase in the number of branches in the average urban market, with the entire increase coming from branches of large banks (figure 11). Over the same period, there was essentially no change in the number of branches in rural markets, with a slight shift upward in the share of branches accounted for by large banks (figure 12). Of course, there is substantial variation in the experiences of individual markets, as some local rural and urban markets gained and others lost bank branches. As the share of branches in the average banking market operated by community banks has declined, so, too, has the share of deposits held at community banks. This shift in deposit shares away from community banks, similar to the shift in branch shares, has been substantial in urban markets but only marginal in rural markets (figure 13). Community banks held almost half of all deposits at urban bank branches in 1997, but just over one-third in 2017. In rural markets, community banks collectively had a deposit market share of 80 percent in 1997, declining moderately to 77 percent in 2017. Despite the decline in the overall share of deposits held by community banks in urban and rural markets over the past 20 years, the average individual community bank operating in each type of market has seen almost no change in its deposit market share (figure 14). In other words, the decline in the share of market deposits held in aggregate by community banks is due to fewer community banks. The fact that the average individual community bank has maintained or increased its deposit market share since 2008 suggests that community banks have been able to compete quite successfully with larger banks in both urban and rural markets during and since the recent recession. As I have mentioned, the average numbers of banks and bank branches have increased or remained constant in rural markets in recent years despite a wave of mergers that has greatly reduced the number of U.S. banking organizations. Industry consolidation has led to fewer banks but maintained most of the branches of the acquired banks. In addition, most mergers and acquisitions have involved expansion into new markets by the acquiring bank rather than acquisitions of local competitors, which has allowed local communities to continue to enjoy a variety of potential providers of banking services despite the industry consolidation. One unavoidable aspect of consolidation is a loss in the number of bank headquarters offices. The number of bank headquarters located in urban markets has fallen by half over the past 20 years, while the number in rural markets has fallen by 45 percent. Consolidation has led to a doubling in the number of banking markets-- almost all of which are rural--in which no banks are headquartered (figure 16). 
We hear anecdotally that banks are more attuned to the needs of the communities in which they are headquartered, so the significance of this loss could have an effect on the local markets. As I previewed at the beginning of my remarks, data and averages often do not tell the whole story. At the end of the day, we care about bank branch locations because we care about the people and communities that they are serving or, in some cases, not serving adequately. So even if the data tell us that most rural markets are well served, we need to focus our attention on those markets that may not be as well served and how that is affecting the people who bank, or cannot bank, there. That is why the Federal Reserve System's Community Development function has undertaken a national series of listening sessions to assess the real effects of bank closures on rural communities. Reserve Bank staff members are conducting the sessions around the country to gather information from consumers and small business owners in rural communities that have been directly affected by bank closures. To identify where to conduct these sessions, we used data to identify rural towns that have experienced bank branch closures; in some cases, these towns lost the only bank in town and now have no remaining banks. Then we convened local residents and small business owners to ask them about what the loss of a bank meant to them and their community. For some residents, the closure has not been much of a problem. Most, if not all, of the listening session participants in Clark, South Dakota, noted that they do most of their routine banking online. Some in Clark even noted that they did not think ATMs (automated teller machines) were necessary, as most of the retail businesses in town offer cash back with debit card purchases. Nonetheless, residents of Clark spoke positively about the importance of the personal touch a local bank can provide. Residents liked familiar faces at the teller window and loan officers who understand the local economy when making small business loan decisions. However, online banking is not necessarily an opportunity for everyone. Indeed, even for community banks themselves, technology may be perceived as a threat or an opportunity, depending for example on whether the necessary infrastructure is available. In Nicholas County, Kentucky, which is located in Appalachia, many residents do not have access to high-speed internet; this lack of access has led most residents to travel outside of the county to conduct their banking needs. This situation is certainly not optimal for anyone, but it presents a particular challenge to the elderly, those without a car, and busy small business owners who do not have time to travel 25 miles each way to make change and deposit checks. In Centre County, Pennsylvania, we heard that the transportation challenge is particularly acute for the Amish--10 miles is a long way to travel by horse and buggy. The loss of a bank branch also has a ripple effect on a community as a whole. In the village of Brushton, New York, which lost its only bank branch in 2014, small business owners commented that when residents must leave the village to access banking services, they are more likely to shop, eat, and pay for services in other towns, which creates additional hardship for the small businesses of Brushton. Additionally, as the ability of community members to access cash has decreased, credit card use has increased. 
Small business owners in Brushton say that this growth in credit card usage has significantly increased the cost of doing business. Lastly, one theme that we heard loud and clear across the country was that the loss of a local bank meant the loss of an important civic institution. Banks do not just cash checks and make loans--they also place ads in small town newspapers, donate to local nonprofits, and sponsor local Little League teams. As towns lose banks and bankers, they also lose important local leaders. We will continue to conduct these listening sessions across the country throughout the fall. In fact, one of these listening sessions will take place next week just two hours down the road in Reynolds County, Missouri. We look forward to sharing the collective results of our efforts in a report that should be published in early 2019. To sum up, the numbers of urban and rural community banks have been declining over the past 20 years, but community banks continue to play an important role in both types of markets. Urban and rural community banks face different challenges, but, on average, both seem to be faring well in the post-crisis period. And the average rural banking market has not seen any decline in the number of banks or bank branches over recent years. For the local areas where the availability of banking services has declined, we are in the process of assessing the effects of this decline on the people who live and work in those communities. I look forward to continuing to engage in this area and monitor the developments in this most vital part of the banking ecosystem.
r181015a_FOMC
united states
2018-10-15T00:00:00
Community Investment in Denver
brainard
0
Good afternoon. I am delighted to be here in Denver for a few days to visit with community leaders working to support affordable housing, workforce development, and small businesses across the Mile High City. So far, I have had a chance to hear firsthand about the housing affordability challenges facing many Denver families, which are requiring some families to make difficult choices between paying rent and paying for other necessities, and the associated challenges of homelessness. I have also had the opportunity to see the incredible work underway in Denver to ensure that individuals and families in need have access to safe, affordable shelter and the healthcare and case management services they need to get back on their feet. Not only is Denver's approach to service delivery a national model, but your use of Social Impact Bonds to fund those programs is also at the forefront of innovation in the community development sector. I can see there is a lot of pride in what you are accomplishing locally, and I look forward to seeing more groundbreaking work during the remainder of my time here. Every few months, I try to make a visit such as this one to better understand how different communities across the country are experiencing the economy. Promoting community development is one of the key purposes and functions of the Federal Reserve, and it requires that we have a strong understanding of the impacts of financial policies and practices in communities in all of our Districts. I have seen the important role that the Community Reinvestment Act (CRA) has played in strengthening community investment, so I am happy to have this opportunity to learn about your perspectives, as bankers here in Denver, on opportunities for further improvement. This roundtable is the first of several that the Federal Reserve System will host in the next several months to hear directly from bankers, community members, practitioners, researchers, and others about improving the CRA's effectiveness in making credit available to lower-income areas. We are inviting representatives of other banking regulatory agencies to join us so that we can all benefit from learning from different local experiences with the CRA. Our hosts at the Reserve Banks will be taking notes, and a summary of the findings from this outreach initiative will be made public. Before we get started, I want to say a few words about the interagency process for revising CRA regulations and the Federal Reserve's priorities. As you know, the Office of the Comptroller of the Currency recently published an Advance Notice of Proposed Rulemaking (ANPR) to solicit public comment on a variety of questions related to revising CRA regulations. Comments are due by November 19. Last week, the Board heard from members of the Fed's Community Advisory Council that there may be some confusion about commenting on the ANPR because it was not published on an interagency basis. Even though the Federal Reserve did not join in the publication of the ANPR, we will be reading the comment letters in anticipation of working with the Comptroller and Federal Deposit Insurance Corporation on a joint proposal. We understand the importance of having the agencies work toward one set of CRA regulations that are clear and consistently applied and will do everything we can to make that possible. CRA regulations are complicated, and the regulators will benefit from perspectives from a variety of channels, so I encourage you all to submit comments.
The CRA is too important to the financial well-being of communities across this country for banks and community members to disengage in any part of this process. The CRA has not only made more credit available in lower-income areas, but it has also helped to create a valuable community and economic development infrastructure. Revising the regulations will require careful consideration in order to strengthen that infrastructure. All of the agencies share the stated purpose of revising the regulations, which is to promote more, not less, CRA activity in underserved areas, as the Comptroller stated in recent testimony. Properly balancing the goals of simplifying and clarifying the regulations, with the goal of promoting more CRA activity through stronger local community engagement, will require careful consideration. I want to emphasize the Board's commitment to regulatory revisions that strengthen the CRA's purpose, which is to encourage banks to help meet the credit needs of the communities they are chartered to serve, including low- and moderate-income neighborhoods. In that regard, there are a few principles that will guide our work. First, we should update the area in which the agencies assess a bank's CRA activities while retaining the core focus on place. I look forward to hearing your suggestions about how we can broaden our evaluation of banks' CRA performances to take into account the technological advances that have made it possible for banks to serve customers remotely, while retaining the focus on local credit needs that vary from place to place. The Federal Reserve's research, surveys, and outreach point to an economy that is very strong on an aggregate level but significantly more varied at the community level, with many neighborhoods still struggling. Second, the CRA regulations should be tailored to banks of different sizes and business models. Currently, the evaluation methods are tailored to banks of different sizes and business models. We should consider whether assessment areas also should be tailored to the business models employed by banks. Third, any redesign of CRA regulations should be done with the goal of encouraging banks to seek out opportunities in underserved areas. This is not simply a question of expanding a bank's assessment area, but of providing more incentives for banks to more effectively address the needs in neighborhoods that they may already be serving with branches. For example, there are concerns that distortions may lead some areas to become credit hot spots, while others become credit deserts. Investments, such as Low-Income Housing Tax Credits, are in such high demand in some areas that there is little return on investment, while it is difficult to find investors in other areas. We believe that it is important for revisions to CRA regulations to address this type of market distortion to promote more lending and investment in smaller markets within a bank's footprint. Fourth, the Federal Reserve is interested in promoting greater consistency and predictability in CRA evaluations and ratings. We see the value of clearer definitions and metrics that use publicly available information to identify local credit needs and opportunities. Finally, it is important to recognize that the CRA is one of several mutually reinforcing laws. The central thrust of the CRA is to encourage banks to ensure that all creditworthy borrowers have fair access to credit. 
For banks to be successful in meeting the credit needs of their entire community, it has long been recognized that they must guard against discriminatory or unfair and deceptive lending practices. With these principles in mind, I look forward to hearing your ideas on how we can make the CRA a more effective tool for bringing credit and capital to underserved communities.
r181017a_FOMC
united states
2018-10-17T00:00:00
FinTech and the Search for Full Stack Financial Inclusion
brainard
0
Thank you for inviting me to join today's discussion. Like many of you, I have long been interested in the potential for innovation to improve financial access for families and small businesses who are underserved. The combination of smartphone apps, big data, artificial intelligence, and cloud technology holds out intriguing possibilities in financial services. But no single app is likely to be a silver bullet for the complex challenges faced by underserved households and small businesses. Achieving inclusion will require a holistic understanding of the challenges faced by underserved groups in order to develop full stack solutions to address them. Fortunately, an emerging generation of metrics may offer a more complete picture of consumers' financial needs. In addition, technological infrastructure is developing, such as faster payment systems, along with the potential for more transparent and simpler product offerings enabled by richer data and lower-cost processing. These new building blocks may make a difference on their own--and, more importantly, may be combined in powerful ways to bring end-to-end solutions to financial inclusion. I will briefly discuss each of these developments in turn. Let's start with the basic question of how to measure financial inclusion in order to evaluate the impact of financial innovations and inclusion policies. The World Bank's Global Findex Database starts with a seemingly reasonable proxy for financial inclusion: access to financial accounts. Having access to basic transaction and savings accounts has been shown to be an important step to financial inclusion, particularly in developing countries. For instance, a 2013 experiment provided savings accounts to a random sample of market vendors in Kenya, most of whom were women. The vendors with accounts saved at a higher rate and invested 60 percent more in their businesses relative to those who did not have accounts. Similarly, women-headed households in Nepal ultimately spent more on nutritious foods and education when they received free savings accounts relative to those who did not. Using account access as a proxy for financial inclusion suggests substantial progress has been made. According to the Findex Database, over two-thirds of adults around the globe now have some form of financial account, up from roughly half just seven years ago. This represents an increase of 800 million people. Here in the United States, the FDIC's survey of unbanked and underbanked households found that 7 percent of U.S. households in 2015 were unbanked. The unbanked rate was nearly a percentage point lower than in the prior survey in 2013 and is lower than in earlier years. Even with this modest improvement, however, lower-income and minority households have substantially higher rates of being unbanked. The widespread adoption of mobile phones and the data they generate are expected to extend these gains even further. The World Bank noted, for instance, that about 1.1 billion unbanked people, about two-thirds of the unbanked population, have a mobile phone. Survey evidence for the United States similarly indicates that a large majority of the adult population has a smart (internet-enabled) phone, and that rates of mobile banking usage are higher among minorities. And even amongst the unbanked population, 40 percent of adults had access to a smartphone in their households; the same was true for 70 percent of underbanked adults. Another proxy used to gauge financial inclusion is access to credit. When I worked on microcredit 3 decades ago, it was very difficult to scale up the provision of loans to small businesses.
Reaching small enterprises and evaluating their creditworthiness was very expensive, especially in small cities and rural areas, and it was difficult to secure loans with collateral. The high transaction costs were reflected in interest rates many times greater than those available for established businesses in urban areas. Today, new technologies are lowering transaction costs by automating the customer interface and underwriting processes. A recent analysis by staff at the Federal Reserve Bank of Atlanta notes that automated fintech platforms have lower operating costs relative to storefront payday lenders. A recent study by staff at the Federal Reserve Bank of New York finds that fintech lenders process mortgages on average 20 percent faster than traditional lenders. Of course, these differences focus on costs and do not address loan pricing. But alongside these promising developments, we have also learned some cautionary tales from the early wave of fintech. In particular, while access to accounts and to credit may be beneficial, they are by no means sufficient to ensure financial resilience on their own. Not surprisingly, access to an account is only a small part of achieving financial health. The Findex database itself notes that a quarter of all accounts worldwide are "inactive," meaning that there were no deposits or withdrawals made over the prior 12 months. In some cases, customers who are provided financial accounts quickly return to the cash economy. For example, in 2014, India launched an ambitious financial inclusion program aimed at connecting every citizen to a basic bank account; the program facilitated the opening of 240 million accounts. Before too long, however, it became evident that over a quarter of the new accounts held balances of one rupee or less. As for credit, fintech lending has moved beyond niche to the mainstream. In 2010, fintech lenders made only 1 percent of personal loan originations in the United States. By mid-2017, fintech lenders--often in conjunction with bank partners--were responsible for nearly a third of the personal loan market. It is not clear how much of this fintech lending is making a significant dent in financial inclusion, as opposed to serving prime and near-prime consumers in the United States. A 2017 study by TransUnion found that fintech lenders focused 59 percent of their originations in the near prime and prime risk tiers by the end of 2016--up marginally over the previous two years. Other research suggests that at least some fintech lenders were able to slot "some borrowers who would be classified as subprime by traditional criteria" into better loan grades. But the differences from existing channels may not be large: TransUnion found that around 10 percent of loans originated by fintech lenders were to subprime consumers, as compared to 14 percent for the overall market for personal loans. Account access and credit may be helpful and possibly even necessary components of a solution. But they are unlikely to provide a complete solution on their own. Continued progress on financial inclusion is likely to require solutions that are designed with an understanding of the issues that underserved communities face. In particular, it appears that many unbanked or underbanked people in the United States are intentionally choosing not to maintain a bank account, which may hold clues to what underserved families and small businesses actually need.
The data show that nearly half of the unbanked households actually had bank accounts in the past, based on a 2015 FDIC survey. A third of these previously banked households explained that they currently did not have bank accounts because of high or unpredictable fees, as did roughly a fifth of unbanked households that previously did not have accounts. The single most cited reason for not having accounts was not having "enough money." More than 10 percent of the unbanked explained that they simply "don't trust banks." Those findings were echoed at an international level by the Findex database, where roughly a fifth of adults without a financial account cited a lack of trust in the financial system. To explore why consumers would choose to use alternative financial services over traditional bank accounts, Lisa Servon of the University of Pennsylvania worked for both a check-cashing firm and a payday lender. Servon observed that check-cashing firms used simple fee structures that were transparent and prominently displayed, similar to the overhead menu of prices at a fast-food restaurant. As a result, individuals could understand clearly what fees they will have to pay up front. In contrast, checking accounts can be more unpredictable. The funds from a deposited check may not be immediately available. By the same token, it may not be clear exactly when the funds associated with a payment by check are likely to be deducted from the account. That lack of predictability can be extremely important to a family that is living paycheck to paycheck. Bouncing a rent check, phone payment, or utility payment can have a destabilizing impact on day-to-day life with further knock-on effects and costs. When a check for a utility payment is rejected due to insufficient funds, for instance, the account holder may have to make time-consuming calls to get the lights turned back on, while also struggling to pay the associated overdraft fee. That may, in turn, lead to a vexing chain of additional late payments and fees. A 2013 study by the Bureau of Consumer Financial Protection found that more than a quarter of checking accounts in the study experienced at least one overdraft fee in 2011. The average fee total for overdrawn accounts was $225, but varied as much as $200 between banks. Among bank customers who bounced checks, a relatively small minority bore the brunt of the fees: A quarter of accounts with overdrafts incurred in 2011 had more than 10 overdraft fees during that year. It is increasingly clear that financial inclusion is less about account access and more about families' financial resilience in the face of volatile income and expenses. It is common to assess households' earnings and living expenses in terms of annual totals. But what keeps many people up at night is their ability to pay this week's bills, especially when their paycheck is not coming until next week. The families studied in the U.S. Financial Diaries project experienced fluctuations in income either 25 percent above or below average for five months over the course of a year. Nearly a third of households whose annual incomes were twice the supplemental poverty measure dipped below the poverty line for at least one month during the year. In survey data, many adults self-reported that they struggled to pay their bills at least once in the past year due to volatile income. Jonathan Morduch and Rachel Schneider, the principal investigators of the Financial Diaries, argue that access to steady, predictable cash flows is an important source of financial inequality.
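The mismatch Morduch and Schneider describe is easy to see in a minimal sketch. The monthly figures below are hypothetical, chosen only to illustrate how a household whose annual income covers its annual expenses can still come up short in particular months when income dips and expense spikes fail to line up.

```python
# Minimal sketch with hypothetical monthly figures: annual income covers annual
# expenses, yet mismatched timing of income dips and expense spikes still produces
# months in which the household cannot pay its bills from current income and savings.

income   = [3000, 3000, 2200, 3000, 3600, 3000, 2300, 3000, 3000, 3700, 3000, 3200]
expenses = [2800, 2800, 2800, 3900, 2800, 2800, 2800, 3600, 2800, 2800, 2800, 2800]

assert sum(income) >= sum(expenses)   # the annual totals look fine

buffer = 400          # starting savings, roughly the cushion discussed in the text
short_months = []
for month, (inc, exp) in enumerate(zip(income, expenses), start=1):
    buffer += inc - exp
    if buffer < 0:
        short_months.append(month)    # bills exceed cash on hand this month
        buffer = 0                    # assume the gap is bridged by fees or borrowing

print(f"Annual income {sum(income):,} vs. annual expenses {sum(expenses):,}")
print(f"Months with a cash shortfall: {short_months}")
```

Even with annual income above annual expenses and a small savings buffer, the timing mismatch alone produces months in which bills can be met only through fees or borrowing.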
Expenses can be just as unpredictable as income, and financial fragility results from the mismatch between the two. The Pew Charitable Trusts showed that 60 percent of households self-reported that over the prior 12 months, they had experienced at least one financial shock, like a job change, divorce, major illness, or breakdown of a car or major household appliance. The median household spent half-a-month's income on the most expensive shock (the median expense being $2,000). Further complicating things, expense shocks rarely occur on their own, but instead can cascade into additional expenses and stress for families on the margins. Pew showed that a third of households they studied self-reported two or more different financial shocks over the course of a year. Morduch and Schneider found that 65 percent of spending spikes involved increases in two categories of expenses, such as transportation and health care. In more than half of the spending spikes they studied, spending was well above average in three categories. Moreover, the spikes in expenses tended not to coincide with corresponding spikes in income. Financially vulnerable families typically do not have sufficient savings to smooth through the income and expense volatility mismatches they experience. Households in the Financial Diaries expected to spend more than 80 percent of their savings within the year. Nearly half of households studied by the Pew Charitable Trusts had not recovered from financial shocks at least six months afterward. The Federal Reserve's Survey of Household Economics and Decisionmaking (SHED) finds that 40 percent of adults report they would have difficulty covering an unexpected expense of $400. This inability to plan for the future dramatically affects individuals' ability to engage in the investments--such as cutting back on work hours to pursue education and training--and the risk-taking that are necessary to improve their financial lives. Access to credit can be an important part of weathering these shocks. The Federal Reserve SHED finds that families with less access to credit are more likely to self-report financial hardship due to income volatility. But credit is only useful if consumers have the means to repay the debt in a timely manner and stabilize their financial lives. Research increasingly shows that borrowers often choose lenders based on their perceived chance of being approved for funding rather than on the cost or their ability to repay. Survey evidence suggests that this is one of the most appealing features of online lenders for small business borrowers. With regard to consumers, a 2014 report by the Bureau of Consumer Financial Protection found that 80 percent of payday loans are rolled over or followed by another loan within 14 days. The Bureau found that many loan sequences end quickly, but 15 percent of new loans are followed by others--and half of all loans were part of a sequence of at least 10 loans. A sustainable solution is likely to require a comprehensive understanding of the needs of financially underserved families and small businesses. Accordingly, policymakers and financial services providers are beginning to assess financial inclusion in more nuanced ways. And, in parallel, new technological building blocks increasingly can be used to build more full-stack approaches to financial inclusion. When considering how to think about progress, we are seeing a move away from basic financial inclusion to a more holistic understanding of financial health.
Financial health is more difficult to measure than bank accounts, and policymakers and researchers are learning how to better work with that complexity. For instance, when the Bureau of Consumer Financial Protection released the findings of its research on financial well-being, it also released a questionnaire tool designed to help an individual think comprehensively about his or her financial life. The questionnaire asks consumers to rate statements such as "I can enjoy life because of the way I'm managing my money" and "I am securing my financial future." Researchers have also announced the launch of a new data set that uses subjective consumer responses to survey questions and pairs that information with actual data on their financial transactions. The regularly refreshed data is designed to give industry, researchers, and policymakers better insight into consumers' financial lives by providing more accurate metrics for assessing changes over time. The new initiative, which they call "Financial Health Pulse," aims to bring the siloes of a consumer's financial life into one comprehensive picture, encompassing "income, spending, savings, debt, retirement, and credit scores." Just as we are learning how to assess financial health more effectively, we are also seeing progress on tools that are specifically designed to address the challenges underlying financial inclusion. Moreover, these tools can be built upon a new generation of platforms and other basic building blocks, which can be used in combination to develop more effective full-stack solutions. Within the Federal Reserve System, we recognize that we have a role and, potentially, a responsibility to help create an infrastructure that facilitates safe, innovative, and ubiquitous faster payment services. Earlier this month, we announced that we are seeking public comment on whether the Reserve Banks should take a more active role in modernizing our infrastructure to support interbank settlement of faster payments in real time. For households living paycheck to paycheck, the difference between waiting for a payment to clear and receiving a payment in real time is not merely an inconvenience; it could tip the balance toward overdraft fees, bounced checks, or collection fees. Of course, faster payments would not address the root causes of financial fragility, but they could help reduce the strain on some. In addition, many small businesses cite immediate access to working capital to finance inventories or pay employees as their number one constraint on growth. If a small business could count on its customers' payments being immediately accessible in its bank account, it could reduce the need for short-term financing to cover the costs of ordering materials and goods well in advance. This could also make a difference for the many hourly wage employees who receive their pay in the form of paper checks. Electronic payments currently can take multiple days to process, so if an employer seeks to ensure that an employee receives their pay on the last day of a pay period, the employer usually has to release the funds a few days in advance. With hourly employees, it can be hard to know how much pay is owed until the hours are actually worked. So employers may wait until the last day of the pay period and then release a paper check at the end of the day, potentially leading to delays for the employee in receiving the funds or a fee to obtain immediate availability. Faster payment systems can change that. Employers would be able to push funds to employees shortly after their shifts have ended.
Indeed, some "gig economy" employers have begun to offer "instant pay" options that allow contractors to cash out their earnings as frequently as five times a day. But when these options recently stopped working for about a week, news reports cited comments from drivers who were hindered in their ability to refuel, pay rent, or buy groceries because they were temporarily unable to access their pay as they earned it. We are similarly seeing infrastructure-level innovations in the basic accounts offered to consumers. Many of the recent mobile apps provided by bank and fintech providers, separately or in partnership, enable consumers to check balances, pay bills, and deposit checks around the clock and every day of the week via their phones. Some banks have introduced innovative online-only accounts that are fee-free and require no minimum deposits, while offering phone-based deposits, account interfaces, and bill payment. Some banks offer no-fee, phone-based accounts that incorporate savings and budgeting tools that look and feel like nonbank fintech apps. Consumers can set goals and ask that the account begin "automatically" saving toward those objectives. For example, consumers might be able to swipe to see how much they can safely spend on a purchase without falling behind on their financial goals or missing scheduled payments. With other tools, using just their phone, a customer can set rules for their bank account, ranging from declining transactions that would overdraw the account, to automatically transferring funds from a related account, to opening an overdraft line of credit, or being charged a fixed-dollar overdraft fee, with a one-day grace period to avoid the fee. New platforms like faster payment systems have the potential to combine with other technological improvements, like cheap access to cloud computing and an open-source approach to artificial intelligence, to create more full-stack approaches to financial inclusion. A new generation of offerings experiments with using machine-learning tools and data aggregation to study consumers' expense and income flows in order to offer credit to consumers with little to no traditional credit histories. As mentioned earlier, other apps are using faster payments and cheap accounts to offer consumers tools to smooth volatility in their incomes. Still other products are using behavioral economics-based "nudges" to help consumers grow their savings. It is still very early, and many of these products have difficult issues to work through with respect to consumer data security and privacy, which may have important implications for pricing of services. Again and again, we are reminded that when products are free, the consumers themselves may be the product. For instance, many financial apps that provide "free" services earn revenue by being paid for lead generation. Other apps may sell consumer data in ways that consumers may not be aware of. For some workers, their employers may be able to be part of the solution, by subsidizing some or all of the costs of using these new tools. For example, some employers are partnering with fintechs to offer their employees better savings tools and the ability to draw emergency funds as an advance on their paycheck. While financial innovation may hold great promise, a lot of work is needed to ensure it will be able to reach communities that lack infrastructure for digital service delivery.
In the colonias area on the Texas-Mexico border, I met with families that are unbanked, students that could not complete their homework, and businesses that could not serve customers outside their local area because they lacked the internet connectivity that many of us take for granted. The Federal Reserve, and other federal banking agencies, view access to technology as increasingly essential to households and small businesses in underserved low- and moderate-income communities. That is why we have clarified that efforts to provide communications infrastructure, such as broadband internet service, may be viewed favorably under the Community Reinvestment Act or CRA. Expanding access to financial services is most effective when consumers and small businesses are equipped with the ability and information to determine which financial products are suitable for their needs. Financial literacy and consumer protections are critically important regardless of whether financial services are delivered through traditional means or smartphone apps. Here too, digital delivery can expand the reach of traditional financial education systems by providing consumers with online and mobile education, just-in-time information, and interactive financial tools to evaluate their options. Financial services providers have an affirmative obligation to deliver clear and transparent products and services and to protect the personal information and financial assets of the customers they serve. Our challenge as regulators is to ensure trust in financial products and services by maintaining the focus on consumer protection, while supporting responsible innovation that provides social benefits. It is too early to tell if many of these innovations will ultimately make good on their promise to help underserved consumers navigate their complex financial lives. And, of course, no app can solve a persistent gap between living expenses and real wages in some occupations, sectors, or counties. Still, I am cautiously optimistic. Together, we are developing a more holistic understanding of the financial needs of underserved households and small businesses. We are seeing the development of powerful new technologies. There is reason to hope these new technologies will be combined in ways that move the needle on financial inclusion.
r181018a_FOMC
united states
2018-10-18T00:00:00
Don’t Chase the Needles: An Optimistic Assessment of the Economic Outlook and Monetary Policy
quarles
0
Thank you for having me. I very much appreciate the opportunity to speak to this distinguished group and look forward to my discussion with Greg Ip. Now almost exactly a year into my appointment as Vice Chairman for Supervision, I have, as might be expected, spoken publicly most often about banking and the financial system more generally. However, supervision and regulation are not all that I do at the Federal Reserve, and I welcome this opportunity to speak to another part of my day job as a member of the Federal Open Market Committee (FOMC). Today, I will offer my take on the economic outlook, which is optimistic, and explain how I view my optimism as consistent with the continued gradual pace of policy tightening that many Committee participants have projected. In particular, I will explain how my views on potential growth help shape my outlook, both for the economy and for the appropriate path of monetary policy. Relatedly, I will discuss the uncertainties that arise when a central element of the outlook--in this case, the potential capacity of the economy--is unknown and largely unobservable. Such uncertainty can complicate policymaking in even what appears to be a very healthy economy, providing a further argument for gradualism. In previous remarks on the economic outlook, delivered at a National Association for Business Economics conference in February, I characterized the U.S. economy as being in a "good spot" and asked if the economy had reached a positive turning point following an extended period of post-crisis slow growth. I argued that while it might be too soon to call a turning point, there was a definite possibility of an upside surprise. So now that we are fairly deep into 2018, where do we stand overall? My view has not changed all that much from February. While many other forecasters had to revise up their forecasts over the course of the year, my own outlook is basically unchanged, because the economy is evolving essentially as I expected at the outset of the year. The economy remains in a good spot. Gross domestic product increased a robust 3-1/4 percent in the first half of the year, and indicators suggest continued strong growth through the summer. Economic conditions are as close to meeting the Federal Reserve's dual mandate for monetary policy--maximum sustainable employment and price stability--as they have been in a long time. Inflation is in line with the Committee's 2 percent objective, and the unemployment rate is at nearly a 50-year low. How long can this strong growth be sustained? The answer depends largely on what form growth takes. Growth that is supported by increases in the productive capacity of the economy should be durable. However, if growth primarily reflects strong demand that stretches production beyond its sustainable capacity, the economy will run into constraints that will result in slower growth, higher prices, or a potentially destabilizing buildup of financial imbalances. So, which is it? Unfortunately, it is very difficult to tell. I will return to that question shortly. That said, I see many reasons to be optimistic about the growth of the potential capacity of the economy over the next few years. In part, my optimism is rooted in the view that many of the factors that have been weighing on potential growth since the financial crisis could be lifting. So, have we reached the turning point? While I believe the issue remains unresolved, the recent evidence is encouraging. Why am I optimistic about the economy's supply potential?
The growth of potential can, at the most basic level, be broken down into two factors: the supply of labor and the productivity of that labor. Productivity, in turn, is importantly affected by changes in the stock of capital--that is, machines and factories--as well as technological advances and improved production methods. I see reasons to be hopeful about both factors. Let us start with labor. For some time, the contribution of the labor force to potential growth has been held down by the predictable drag of baby boomers moving into retirement. However, the decline in labor force participation following the financial crisis exceeded even what might be expected given this long-standing downward trend-- in particular, as the participation of prime-age workers (those between the ages of 25 and 54) fell and teenagers exited the workforce in droves. The reasons behind the fall in participation among non-retirement-age workers have been the subject of much debate. However, I see little reason to assume that it will be permanent, and we have already seen some signs of a turnaround. Thus, I think there is some potential for labor force participation to move up, perhaps as workers respond to the incentives of plentiful job opportunities and higher wages, thereby adding to the productive capacity of the economy and pushing back the constraints on growth. Some recent labor force trends have been promising. For example, the two- decade-long trend increase in the population not in the labor force on account of disability peaked in 2014 and has started to move down quite rapidly--again, for reasons that defy easy explanation but may reflect the general improvement in labor market opportunities. I will now turn to productivity. Labor productivity has averaged an annual growth rate of only 3/4 percent since 2011, far below the 2-1/4 percent pace that prevailed in the two decades before the financial crisis. Although there are competing theories, the productivity decline is not well understood, and a consensus explanation has yet to emerge. As such, the slowdown could reverse unpredictably as well. The most recent data have been moving in the right direction, but only haltingly, with labor productivity increasing about 1-1/4 percent over the past year. There are reasons that productivity growth could shake off some of its recent torpidity. I would like to start with the capital stock. After a few years of abysmal business-sector investment spending, it appears as though the drought has broken. After picking up in 2017, business fixed investment climbed a robust 10 percent at an annual rate in the first half of this year, likely supported by lower corporate tax rates and other incentives in last year's tax bill. Also, indicators for investment, including orders and shipments of nondefense capital goods, point toward continued strength, and survey evidence points to high business optimism and solid capital expenditure plans. More capital should allow labor to be more productive. I am also a bit of a techno-enthusiast. We are in the early stages of a more widespread application of a wave of new technologies, such as 5G communications, AI (artificial intelligence) and machine learning, and 3-D printing. It might be that the productivity gains associated with these and other new technologies are embodied in new capital equipment and will only now start to become apparent as the investment drought of recent years comes to an end. 
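A rough growth-accounting sketch makes the decomposition described above concrete: potential output growth is approximately the growth of labor input plus the growth of labor productivity. The figures below are illustrative only; the 3/4 percent productivity pace is the post-2011 average cited above, while the other numbers are hypothetical and are not FOMC estimates.

```python
# Simple growth-accounting sketch of the decomposition described in the text:
# potential output growth is roughly the growth of labor input plus the growth
# of output per hour (labor productivity). Numbers are illustrative only.

def potential_growth(labor_input_growth, productivity_growth):
    """Approximate potential GDP growth, in percent per year."""
    return labor_input_growth + productivity_growth

# Recent-trend style combination: slow labor input growth, roughly 3/4 percent productivity.
slow = potential_growth(labor_input_growth=0.5, productivity_growth=0.75)

# A more optimistic combination: participation firms up and productivity recovers
# partway toward its pre-crisis pace.
optimistic = potential_growth(labor_input_growth=0.75, productivity_growth=1.5)

print(f"Potential growth, recent-trend assumptions: {slow:.2f} percent")
print(f"Potential growth, more optimistic assumptions: {optimistic:.2f} percent")
# The faster potential grows, the more demand the economy can absorb without
# overheating, and the more gradual the removal of accommodation can be.
```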
A tighter economy could also create incentives for firms to revamp their production methods to save on scarce labor resources. To summarize, the economy has been doing very well. Whether this performance is sustained will be importantly determined by whether growth is supported by increases in the economy's potential. I am hopeful that potential growth, and particularly productivity, could accelerate from its relatively anemic pace of late, sustaining growth without overheating the economy. The more the economy's potential growth increases, the more gradual we can be in our removal of monetary policy accommodation. Thus, an assessment of the pace of potential growth will be an important input into what I view as the appropriate path of policy to achieve our objectives of maximum sustainable employment and price stability. The tricky thing, as I pointed out earlier, is that potential output is unobserved and can only be inferred from the behavior of other measured economic indicators. Traditionally, as taught in Econ 101, inflation provides a signal on whether the economy is operating above or below its potential level. If inflation moves up in a sustained manner, not just because of temporary shocks, then the economy is likely operating above its productive capacity, as firms have the leeway to raise prices given the strength of demand. Likewise, if inflation moves down persistently, then the economy is likely operating with some slack, as firms restrain prices to sell their products in the face of weak demand. If inflation is the primary indicator of the economy's position relative to potential, how confident can we be in the quality of the signal? It has been noted--quite frequently, I might add--that the relationship between inflation and the tightness of the economy has gotten weaker, which is to say that inflation appears to be less affected by movements in economic slack or tightness, traditionally measured by the unemployment rate, than in the past. As the role of slack in explaining inflation has diminished, inflation expectations have assumed greater importance. However, it is reasonable to ask, if inflation is, in fact, now largely a reflection of inflation expectations, is inflation still a good indicator of the cyclical state of the economy? Or, more directly, can we count on inflation to warn us in time if the economy is overheating? To be a little controversial, perhaps what we are witnessing with inflation is an application of what has been called Goodhart's law, named after Charles Goodhart, the distinguished scholar of central banking at the London School of Economics. The law can be summarized as the idea that if an indicator becomes a target of policy, that indicator loses its value as a gauge of the state of the economy. Rather, the indicator becomes a signal of the public's belief in the competence and commitment of the government agency that is targeting the indicator. Something along these lines could be happening to inflation, especially given the important role of expected inflation in the behavior of actual inflation. Perhaps inflation is just sending a signal of people's trust in the Fed's ability to meet its inflation objective. If so, no complaints here. That is a good thing. However, a problem does arise if the Fed remains reliant on inflation as our only gauge of the economy's position relative to its potential. There are risks in pushing the economy into a place it does not want to go if we limit ourselves to navigating by what might be a faulty indicator. 
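The point can be made concrete with a textbook expectations-augmented Phillips curve--a stylized sketch, not a description of any model the Committee relies on:

\[
\pi_t \;=\; \pi^{e}_t \;+\; \kappa\,(y_t - y^{pot}_t) \;+\; \varepsilon_t,
\]

where \(\pi_t\) is inflation, \(\pi^{e}_t\) is expected inflation, \(y_t - y^{pot}_t\) is the gap between output and its potential, \(\kappa\) is the slope of the relationship, and \(\varepsilon_t\) captures temporary shocks. With \(\kappa\) small and \(\pi^{e}\) firmly anchored at 2 percent, measured inflation stays close to 2 percent almost regardless of the size of the gap, so inflation mostly reveals the public's confidence in the target rather than the economy's position relative to potential--which is exactly the Goodhart's law concern just described.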
Anchored inflation expectations might mask the inflation signal coming from an overheated economy for a period, but I have no doubt that prices would eventually move up in response to resource constraints. The ultimate price, from the perspective of the dual mandate, would be an unanchoring of inflation expectations. Of course, I view this more as a risk than my baseline expectation. As I have said, I am optimistic about potential growth, and I expect that even relatively strong growth can be met without running into economic constraints. However, I also think that we should pay attention to other indicators of tightness and overheating in addition to inflation. There are other signals of the economy's position relative to its potential besides inflation, including, but not limited to, direct measures of labor utilization or indications of shortages and bottlenecks in production. How should these thoughts affect monetary policy? I began my remarks by noting that there may be reason to think that the productive capacity of our economy could be accelerating, which would allow a more gradual withdrawal of accommodation without overheating. But I have noted as well that there may be reason to think that resource constraints could be more binding than current inflation measures would traditionally indicate, which would call for a more athletic response. Moreover, there is today a higher degree of uncertainty about many of these factors--measures of labor slack, the relation between labor slack and inflation, the sensitivity of current inflation measures to actual resource constraints, and the future growth of productivity, to name a few--than there has been for many years. In such an environment, some have argued that this greater uncertainty leaves policymakers without a clear guide and market participants without a firm anchor, meaning policy itself could drift--perhaps dangerously. I do not think that is the case at all. Instead, I think this situation reinforces and supports the importance of a clear, steady strategy and a gradual, predictable approach to the removal of accommodation as we continue to monitor the data. The analogy I frequently use is the old pilot's adage of "Don't chase the needles." The control panel of any airplane has an instrument to guide your course: a circular gauge with a vertical bar, or "needle," running through the center of it. If the bar moves sideways to the left, you are drifting off course and should change course to the left until the bar comes back into the center of the circle; if it moves to the right, you change course to the right. Today these instruments get their inputs from the plane's GPS (Global Positioning System) and are fairly sensitive and accurate, but decades ago, when I was a young man first learning to fly in the clouds, they got their information from radio beacons stationed on the ground and were quite squirrelly. The bar could wander from side to side for a while for any of a number of reasons--distance from the beacon, the angle you were approaching it from, interference from other instruments on the panel, sunspots, or rain--and because of this uncertainty, the first rule taught to us as young pilots was, "Don't chase the needles." 
Precisely because of the uncertainty around the course inputs, the right strategy was to set a course based on your knowledge of the destination, winds, and performance of your plane; communicate that course clearly to air traffic control so everyone knew what you were doing; and then stick to that course steadily even as the course needle might waver from side to side across your instrument. If the needle moved to one side substantially and stayed there pretty consistently, then you would make a small, firm correction--but even then, only gradually and with clear communication about what you were doing. In a world where you had great confidence in the sensitivity of your instrument, such as in today's GPS-based avionics, you could respond immediately to moment-to-moment changes in your course readings, but in the world of radio beacons and sunspots, "chasing the needles" would at best lead to inefficient fishtailing across the sky, and at worst to a substantial deviation from your destination. Today uncertainty around many of the macroeconomic inputs to monetary policy decisions argues for just the same approach to navigation. Rather than meaning that policy will drift because of this uncertainty, it means that policymakers should chart a course that is stable, gradual, and predictable; communicate it clearly; and then follow that course through the temporarily shifting and sometimes conflicting signs from the economy unless some strong and steady signal requires a firm but moderate correction. Given that the economy has performed fundamentally as I expected at the outset of this year, the right strategy is to maintain the gradual course that I have thought appropriate for some time now. Put another way, while I think that there is enough reason to think that the productive capacity of our economy might be increasing so that we should not feel compelled to accelerate our pace, I also think there is enough doubt about current inflation as an infallibly reliable measure of current resource constraints that the continued gradual removal of accommodation is appropriate. Like pilots back in the days of radio beacons, don't chase the needles.
r181025a_FOMC
united states
2018-10-25T00:00:00
Outlook for the U.S. Economy and Monetary Policy
clarida
0
I see some familiar faces in the audience, and I am delighted to be at Peterson today to offer my first public remarks since being sworn in last month as Vice Chairman of the Federal Reserve Board. As some of you know, I have been a student of U.S. monetary policy for more than 30 years. So, for me, personally, it is a distinct honor and real privilege to have the opportunity to serve with my colleagues on the Board of Governors and, along with the 12 Reserve Bank presidents, on the Federal Open Market Committee (FOMC). Now, of course, I fully realize that I have participated in just one FOMC meeting to date, so my remarks today will not come with a patina earned from long experience as a monetary policymaker. That said, I thought it might be of interest to share my thinking on the current state of the U.S. economy, to explain how it informed my support for the FOMC's policy decision last month, and to discuss my views on the way forward for monetary policy. The U.S. economic expansion, now in its 10th year, is marked by strong growth in the gross domestic product (GDP) and a job market that has been surprising on the upside for nearly two years. It is impossible today to know with much precision how much of the pickup in growth and the decline in unemployment that we have seen over the past two years is structural and how much is cyclical. Most likely, both factors are at work. That said, based on my reading of the accumulating evidence, I believe that trend growth in the economy may well be faster and the structural rate of unemployment lower than I would have thought several years ago. Let me elaborate. First, let's look at the demand side of the economy. Consider the benchmark revisions to the household saving rate. Recently revised Commerce Department data now show that the aggregate household saving rate is running at 6.7 percent of disposable income. This revised estimate is nearly double the previous estimate. The higher level suggests that, in contrast to the previous economic expansion from 2001 to 2007, when households were borrowing to maintain consumption while income growth slowed, households today, at least in the aggregate, are well positioned to maintain or even increase consumption relative to gains in income. To me, at this stage in the business cycle, a historically high household saving rate is a tailwind for the economy, not a headwind. And, of course, recent reductions in personal income tax rates are also a tailwind for the economy. Productivity and investment data provide another vantage point from which to assess both the demand and supply sides of the current expansion. Over the past few years, we have seen some pickup in productivity growth, albeit from a very depressed pace. By contrast, at a comparable stage in both the 2001-07 and 1982-90 economic expansions, productivity growth (as measured by an eight-quarter moving average) was actually slowing relative to its contemporaneous peak-to-present pace. I should also note that this recent pickup in productivity has coincided with a rebound in business investment, and that this increase in capital spending has been evident in both the equipment and intellectual property categories; it is not just an "oil patch" story. Business investment is being supported by recent changes in the tax code that lower the cost of capital as well as by continued strong profitability of U.S. companies. 
While capital investment is one important source of productivity growth and recent data on this front are encouraging, predicting future--or even identifying past--inflection points in productivity growth is notoriously difficult. Although it may be tempting simply to extrapolate a decade of disappointing productivity data into a distant future, a pickup in trend productivity growth is a possibility that deserves close monitoring. Let me now turn to the job market and inflation outlook. Average monthly job gains continue to outpace the increase needed to provide jobs for new entrants to the labor force over the longer run. At 3.7 percent in September, the unemployment rate has not been this low since 1969. In addition, after remaining stubbornly sluggish throughout much of the expansion, wage growth is picking up. A sustained rise in inflation-adjusted, or "real," wages at or above the pace of productivity growth is typical in an economy operating in the vicinity of full employment, and we are starting to see some evidence of this. I certainly hope it continues. Now, some might see a rise in wages as leading to upside inflationary pressures, but here, again, the experience of earlier cycles is instructive. In the past two U.S. expansions, gains in real wages in excess of productivity growth were not accompanied by a material rise in price inflation. Of course, this time may be different, and as with growth, the job market could perform better or worse than the baseline outlook. However, for now, the increase in wages has been broadly consistent with the pickup in productivity growth that I have just discussed, and a rise in the still-low rate of labor force participation among the prime-age population provides scope for the job market to strengthen further without generating inflationary pressures. This outlook for the labor market also reflects my view that the structural, or longer-run, rate of unemployment--that is, the unemployment rate consistent with stable inflation over the longer run--may be somewhat lower than I would have thought several years ago. What this means is that, even with today's very low unemployment rate, the labor market might not be as tight--and inflationary pressures not as strong--as I once would have thought. I am certainly not alone in this thinking. Over the past several years, most FOMC participants have been reducing their individual estimates of the longer-run level of the unemployment rate. The median assessment of FOMC participants fell from around 5-1/2 percent five years ago to 4-1/2 percent in the projections published last month. Outside estimates show a similar pattern of downward revisions. This makes sense. With unemployment falling and wage gains thus far in line with productivity and expected inflation, the traditional indicators of cost-push price pressure are not flashing red right now. Both total and core personal consumption expenditure inflation are now running close to the FOMC's 2 percent objective. When thinking about the inflation outlook, I pay attention to market-based measures of inflation compensation derived from the market for Treasury Inflation-Protected Securities (TIPS). These "breakeven inflation rates" are simply the difference between yields on traditional Treasury securities and those on TIPS with comparable maturities. 
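As a purely illustrative sketch of that arithmetic (the yields below are hypothetical, and no adjustment for liquidity or term premiums is applied at this step):

```python
def breakeven_inflation(nominal_yield: float, tips_yield: float) -> float:
    """Breakeven inflation rate: the yield on a nominal Treasury security
    minus the yield on a TIPS of comparable maturity, in percentage points."""
    return nominal_yield - tips_yield

# Hypothetical 10-year yields, in percent (assumptions for illustration only)
nominal_10y = 3.10   # nominal Treasury yield
tips_10y = 1.05      # TIPS real yield

print(breakeven_inflation(nominal_10y, tips_10y))  # 2.05 percent
```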
While these market-based measures are not perfect and need to be adjusted for liquidity and term premium factors, they can provide a useful signal about market inflation expectations, which can be combined with signals from surveys of expected inflation to get a read on inflation expectations. Breakeven inflation rates have only recently risen to a range that is in line--but just barely--with the expectation that inflation will remain close to our 2 percent inflation goal over the medium-to-longer run. Survey-based measures of inflation expectations also appear consistent with the Fed's inflation goal. In short, the labor market today is robust, and inflation is at or close to the Fed's 2 percent inflation goal. Thus, the economy is as near as it has been in a decade to meeting both of the Fed's dual-mandate objectives, which suggests to me that monetary policy at this stage of the economic expansion should be aimed at sustaining growth and employment at levels consistent with keeping inflation at or close to the 2 percent rate consistent with price stability. By contrast, until this year, the appropriate focus of policy had been to return employment and inflation to levels consistent with our dual-mandate objectives. With the economy now operating at or close to mandate-consistent levels for inflation and unemployment, the risks that monetary policy must balance are now more symmetric and less skewed to the downside. I supported the FOMC's decision last month to raise the target for the federal funds rate to a range of 2 to 2-1/4 percent. With the economy growing briskly, the labor market operating in the vicinity of full employment, and inflation running close to 2 percent, I saw our decision as another step in removing the extraordinary degree of accommodation put in place in the aftermath of the Global Financial Crisis. However, even after our September decision, I believe U.S. monetary policy remains accommodative. The funds rate is just now--for the first time in a decade--above the Fed's inflation objective, but the inflation-adjusted real funds rate remains below the range of estimates for the longer-run neutral real rate, often referred to as r*, computed from the projections submitted by Board members and the Reserve Bank presidents. This longer-run r*, like the natural rate of unemployment, is both unobserved and time varying--and thus must be inferred as a signal extracted from noisy macro and financial data. That said, and notwithstanding the imprecision with which r* is estimated, it remains to me a relevant consideration as I assess the current stance and best path forward for policy. The reason for this is because, as Milton Friedman argued in his classic American Economic Association presidential address, a central bank that seeks to consistently keep real interest rates below r* will eventually face rising inflation and inflation expectations, while a central bank that seeks to keep real interest rates above r* will eventually face falling inflation and inflation expectations. My own and others' research suggests that the failure of the Fed to respect this principle contributed to the Great Inflation of the 1970s, while the incorporation of this principle into Fed policy in the 1990s and 2000s contributed to the achievement of stable and low inflation during and since those years. So, even though estimates of r* are imprecise, I do not believe they should be ignored. 
Instead, when thinking about monetary policy, I believe it is best not to ignore entirely an admittedly imprecise estimate of r* today, but instead to update that estimate as new data on inflation, inflation expectations, employment, growth, and productivity arrive. Moreover, because monetary policy operates with a lag, and with inflation presently close to the 2 percent goal, it will be especially important to monitor inflation expectations closely--using both surveys and financial market data--to best calibrate the pace and destination for policy normalization. It will also be important to monitor both model-based and financial-market-based estimates of expected future inflation-indexed real interest rates (for example, 5-year real rates 5 years forward)--suitably adjusted for term premium and liquidity effects--as one indicator of longer-run r*. Before the financial crisis, these 5-year real rates 5 years forward averaged around 2 percent after a term premium and liquidity adjustment. Since 2015, they have averaged about 0.50 percent but recently have approached 0.75 percent, also after a term and liquidity premium adjustment. Given that real interest rates and economic growth tend to move together over the longer run, one possible source of these upward revisions in forward real rates could be that financial market participants may have become more optimistic about the growth potential of our economy. Evidence also suggests that the term premium that investors require to hold longer-maturity bonds has risen as well. If the data come in as I expect, I believe that some further gradual adjustment in the federal funds rate will be appropriate. As I mentioned earlier, I believe monetary policy today remains accommodative, and that, with the economy now operating at or close to mandate-consistent levels for inflation and unemployment, the risks that monetary policy must balance are now more symmetric and less skewed to the downside. Raising rates too quickly could unnecessarily shorten the economic expansion while moving too slowly could result in rising inflation and inflation expectations down the road that could be costly to reverse. As I calibrate, in the months ahead, the pace and ultimate destination for monetary policy adjustments that will best allow the Fed to achieve its dual-mandate objectives, it will be important to me to evaluate a wide range of economic and financial market indicators to complement the predictions yielded by model-based scenarios. As I look ahead, if strong growth and robust employment gains were to continue into 2019 and be accompanied by a material rise in actual and expected inflation, that circumstance would indicate to me that additional policy normalization might well be required beyond what I currently expect. By contrast, if strong growth and employment gains were to continue and be accompanied by stable inflation, inflation expectations, and expectations for Fed policy, that situation, to me, would argue against raising short-term interest rates by more than I currently expect. In closing, with the economy operating as close as it has in a decade to the Federal Reserve's dual-mandate objectives of price stability and maximum employment, I believe monetary policy at this stage of the economic expansion should be aimed at sustaining growth and maximum employment at levels consistent with keeping inflation at or close to the 2 percent objective. 
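For those interested in the mechanics behind that market-based indicator, here is a minimal sketch, using hypothetical TIPS yields, of the standard approximation of a 5-year real rate 5 years forward, before any term premium or liquidity adjustment:

```python
def forward_real_rate_5y5y(real_yield_5y: float, real_yield_10y: float) -> float:
    """Approximate 5-year forward real rate, 5 years ahead: treating the
    10-year real yield as the average of the first and second 5-year
    segments, the forward segment is backed out as (10*y10 - 5*y5) / 5."""
    return (10 * real_yield_10y - 5 * real_yield_5y) / 5

# Hypothetical TIPS real yields, in percent (assumptions for illustration only)
real_5y, real_10y = 0.90, 1.05
print(forward_real_rate_5y5y(real_5y, real_10y))  # 1.20 percent
```

In practice, as noted above, such estimates are adjusted for term premium and liquidity effects before being read as a signal about longer-run r*.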
Even after our most recent policy decision to raise the range for the federal funds rate by 1/4 percentage point, monetary policy remains accommodative, and I believe some further gradual adjustment in the policy rate range will likely be appropriate. That said, at this stage in the business cycle, I believe it will be especially important to monitor a wide range of data to continually assess and calibrate the level of the policy rate that is consistent with meeting our objectives on a sustained basis.
r181109a_FOMC
united states
2018-11-09T00:00:00
A New Chapter in Stress Testing
quarles
0
Thank you to Brookings for inviting me to speak today. It is an honor and pleasure to talk to you about the next chapter in stress testing, particularly before this extremely distinguished audience of Brookings scholars that includes former Chairs, Vice Chairs, and senior leaders of the Federal Reserve who did the hard foundational work of developing and implementing this new approach to the assessment of bank resiliency. In these circumstances, my discussing changes to our stress testing regime could sound uncomfortably close to the serene arrogance of Alfonso X of Castile, who famously said that "Had I been present at the creation, I would have given some useful hints for the better ordering of the universe." My thoughts today, however, are not a call to rewrite Genesis, but rather a recognition that our stress testing regime--like the banking and financial system that it evaluates--will and should evolve as we continue to learn from experience in the management of this tool. In the best traditions of the Federal Reserve, this evolution should be grounded in rigorous analysis of the facts and a commitment to continual improvement of our methods. In my remarks today, I will begin with what has been successful about the stress tests and why those elements should remain. I will then spend some time discussing some of the changes we have proposed to our stress testing program, and how we are now thinking about moving forward with those changes. After that, I will discuss the tension between, on the one hand, providing additional information to the firms, while, on the other hand, ensuring that the tests remain effective. And last, I will close with some thoughts on the "qualitative" element of our stress testing program. The adjustments I will discuss are intended to increase both the transparency and the efficiency of the stress testing regime. Enhanced transparency goes to the very core of democratic accountability and the rights of all U.S. citizens--including the management and shareholders of the institutions that are subject to the stress tests--to understand the requirements to which they are subject. It also helps ensure the continued credibility of our regime. Let me begin with a short discussion of the core elements of our stress testing program. As you know, one of the most visible aspects of our stress testing program is that the firms and the public receive an independent view of the capital adequacy of the largest banks. The results are based on our own models and provide a consistent yardstick to measure resiliency across the banking system. But the stress test conducted by the Federal Reserve is only one part of our stress testing regime. Just as important, we require each firm to run its own stress test, using its own models and a stressful scenario that reflects the firm's assessment of its idiosyncratic risks and key vulnerabilities. Underpinning a firm's stress test is the firm's ability to identify and measure risks under normal and stressful conditions and the strength of the firm's internal processes, and we use the supervisory process to ensure that the firms' stress testing practices employ sound methodologies. The combination of the Federal Reserve's common yardstick, the firm's own stress tests, and supervisory oversight over the firms' practices has resulted in a meaningful increase in the post-stress resiliency of large financial institutions. All of these core components will remain in place. 
Further, the changes I'll speak about today are not intended to alter materially the overall level of capital in the system or the stringency of the regime. A healthy U.S. economy relies on a strong, well-capitalized banking system that can weather stressful events and continue lending to households and businesses. The U.S. banking system is far better capitalized than it was a decade ago: the firms that participate in the Comprehensive Capital Analysis and Review (CCAR)--the Fed's evaluation of capital adequacy for large holding companies--have increased the dollar amount of their common equity tier 1 (CET1) capital from around $500 billion in 2009 to more than $1.2 trillion as of the second quarter of this year, and have more than doubled CET1 risk-based capital ratios from approximately 5 percent to over 12 percent over the same period. While the regime has been successful overall, I believe it is prudent to review all our practices to ensure that they are as efficient and transparent as possible and that they remain appropriate in light of changes in the industry that have been achieved. For instance, as firms become more resilient, they may no longer need to build capital to support their current level of risk taking, but rather move into the mode of retaining the capital they have already built. Firms have also significantly improved their risk management, providing room to adjust our approach to assessing capital planning practices. Finally, as we make changes to our regime, the issue of volatility in the stress test results becomes more pronounced. As I'll discuss, we are considering how to balance the need to preserve the dynamism in stress testing with the need to ensure that firms have sufficient notice regarding the capital requirements to which they are held. Many of you are familiar with the Federal Reserve's proposal to integrate the stress test with the regulatory capital rule--known as the stress capital buffer (SCB). I believe the SCB proposal represents an important milestone as we enter the next chapter of our stress testing regime. For those who are not familiar with the SCB, let me provide a little background about how our capital rule currently works, and how it would be modified by the SCB. As devoted readers of our capital rules may know, our regulatory capital rule includes both minimum capital requirements and a buffer that sits on top of those minimum requirements. The buffer serves as an early warning to a firm and to supervisors, and it requires the firm to reduce its capital distributions as the firm approaches the minimum requirements. Under the current capital rule, all firms are subject to a fixed buffer requirement of 2.5 percent of risk-weighted assets--the largest firms are also subject to a global systemically important bank surcharge and potential countercyclical capital buffer. For large firms, the SCB would replace the fixed 2.5 percent risk-based buffer with a firm-specific buffer the size of which would be based on the firm's stress test results. In this way, we are integrating the automatic restrictions on capital distributions in the current capital rule with the output of the most dynamic tool we have for assessing risk--the stress test--to create a more robust and dynamic regulatory capital regime. The SCB would also result in a more transparent and simplified system of regulatory capital requirements, because a firm will be held to a single, integrated capital regime. When we made our SCB proposal last April, we had aimed to make the SCB final for the 2019 stress test cycle. 
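To make the mechanics concrete, here is a stylized sketch of how a firm-specific buffer could be derived from stress test results; the formula and the figures are illustrative assumptions for exposition, not the terms of the proposal or of any rule:

```python
def stress_capital_buffer(start_cet1_ratio: float,
                          min_stress_cet1_ratio: float,
                          planned_dividends: float,
                          floor: float = 2.5) -> float:
    """Illustrative stress capital buffer (all figures in percent of
    risk-weighted assets): the decline in the CET1 ratio under the
    supervisory stress scenario, plus planned common dividends, with the
    current fixed 2.5 percent buffer serving as a floor."""
    decline = start_cet1_ratio - min_stress_cet1_ratio
    return max(decline + planned_dividends, floor)

def required_cet1_ratio(scb: float,
                        gsib_surcharge: float = 0.0,
                        regulatory_minimum: float = 4.5) -> float:
    """CET1 ratio below which distribution limits would begin to bind:
    the regulatory minimum plus the firm-specific buffer plus any
    G-SIB surcharge."""
    return regulatory_minimum + scb + gsib_surcharge

# Hypothetical firm: CET1 ratio falls from 12.0% to 8.5% under stress,
# with planned dividends equal to 0.4% of risk-weighted assets.
scb = stress_capital_buffer(12.0, 8.5, 0.4)
print(scb)                                           # 3.9
print(required_cet1_ratio(scb, gsib_surcharge=1.5))  # 9.9
```

Under a calculation along these lines, a firm's buffer requirement would move with its stress test results from year to year.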
However, the comments we received have been extensive and thoughtful, and have raised issues that require a carefully considered response. While I don't believe these issues will prevent us from ultimately implementing the SCB, they have flagged certain elements of the regime that could benefit from further refinement. Accordingly, I expect we will adopt a final rule in the near future that will settle the basic framework of the SCB, but re-propose certain elements. To enable this process to run its course, I expect that the first SCB would not go into effect before 2020. For 2019, I expect CCAR will remain in place for firms with over $250 billion in assets or that are otherwise complex; however, we will consider whether we can move forward with any aspects of the SCB proposal for CCAR 2019, such as assumptions related to balance sheet growth. I will also ask the Board to exempt firms with less than $250 billion in assets from the CCAR quantitative assessment and supervisory stress testing in 2019 in light of the every-other-year cycle contemplated in the tailoring proposal that the Board approved two weeks ago. Returning to the SCB proposal, I would like to give you a sense of the comments we received and our approach to addressing those concerns. The issue foremost on my mind is the volatility of the stress test results. One concern frequently expressed is that the results of the supervisory stress test can lead to capital requirements that change significantly from year to year, which limits a firm's ability to manage its capital effectively. Some amount of volatility is necessary to preserve the dynamism of the stress test--by nature, the stress test will differ year-over-year based on macroeconomic conditions and contemporary understanding of salient risks in the economy. In addition, the stress test results are sensitive to changes in a firm's balance sheet, which means that a firm's capital requirements will evolve as the firm's activities and exposure evolve. However, I do think there is an important balance to be struck between preserving this dynamism and ensuring that firms have sufficient notice regarding the capital requirements to which they are held. In the first years after the financial crisis, as the banking system was dramatically ramping up the total amount of capital in the system, this volatility was less of a management problem: until we reached reasonably full capitalization, each year every bank needed to increase its capital. If one year that increment was a little higher or a little lower than the previous year, that was simply a modest difference of velocity, not direction. At the current juncture, however, both our system as a whole and each of the largest banks in that system are fully meeting their capital requirements. In these circumstances, having a highly variable capital requirement presents a significant management challenge. We are considering ways of preserving the dynamism of stress testing while reducing its volatility, and plan to seek comment on a proposal in this area in the not-too-distant future. In addition, we are exploring ways of improving our approach to measuring risks in the trading book. Firms' trading books are dynamic and complex, as firms hold both long and short positions. Many have noted that a single market shock does not adequately capture risks in firms' trading books, and we agree with those comments. 
We are exploring ways to incorporate multiple market shocks in our stress test without adding volatility to the results and without increasing the compliance burden. We are also considering adjusting another element of the SCB proposal in order to provide more notice to firms. Currently, and under the SCB proposal, a firm must decide whether to increase or decrease its planned dividends and share repurchases for the upcoming year without knowledge of a key constraint: the results of the stress test. In other words, we require a firm to give us a formal plan for dividends and stock repurchases without knowing what its effective capital requirement is. If it guesses wrong, it could be publicly shamed for failing the stress test (if its dividends are too high relative to capital), or penalized in the markets for inadequate distribution of income (if its dividends are too low relative to capital). Now, while this might at first blush appear to be pointless and obdurate cruelty, the reasoning behind the practice was initially perfectly sensible: it reflected the view that firms should think rigorously about their capital uses and needs in developing their capital distribution plans, rather than rely primarily on the results of the supervisory stress test to guide those plans. Now that we all have several years' experience with this system, however, firms have told us that they would be able to engage in more thoughtful capital planning if they had knowledge of that year's stress test results before finalizing their distribution plans for the upcoming year. I am sympathetic to their concerns, and will ask the Board to adjust the operation of the rule so that firms know their SCB before they decide on their planned distributions for the coming year. This adjustment in sequence will also help firms manage volatility in the SCB. We expect firms to continue to maintain robust stress testing practices and use those results to inform their capital distribution plans, and we will continue to use the supervisory process to reinforce this expectation. The comments also highlighted an issue with how the capital buffers operate today, which is amplified by the inherent volatility of the SCB. As I noted earlier, a firm operating in its capital buffer is required to reduce its capital distributions so that it can build capital over time. By design, the buffer was intended to apply increasingly stringent limitations as the firm's capital ratios decline. But, in our current world in which a healthy and profitable banking system is seeking to maintain its capital levels rather than continue to increase them, a bank will appropriately and safely tend to distribute much or all of its income in any given year. In that case, the operation of the buffer would not result in a gradual, proportional restriction on the distribution of income as the firm's capital declines, but could instead be a sudden cessation of all dividends if the firm dips into the buffer by even a small amount, even if the changes in its capital levels are quite minor. We are considering adjustments that would make the rules more consistent with the graduated intent. We plan to work with the other banking agencies to consider how best to effect this change. There are two additional elements of the SCB proposal that I believe would benefit from modification. First, the SCB proposal would have included four quarters of dividends in a firm's SCB, in recognition of the fact that firms experience market pressure to hold dividends constant, even under stress. 
In my view, there may be ways of encouraging greater reliance on less sticky repurchases while providing more flexibility in the regime, and we are exploring alternatives. Last, the SCB proposal would have included a post-stress leverage requirement. As the Federal Reserve has long maintained, leverage requirements are intended to serve as a backstop to the risk-based capital requirements. By definition, they are not intended to be risk-sensitive. Thus, I am concerned that explicitly assigning a leverage buffer requirement to a firm on the basis of risk-sensitive post-stress estimates runs afoul of the intellectual underpinnings of the leverage ratio, and I would advocate removing this element of the stress capital buffer regime. Of course, leverage ratios, including the enhanced supplementary leverage requirements, would remain a critical part of our regulatory capital regime, and we will maintain the supervisory expectation that firms have sufficient capital to meet all minimum regulatory requirements. Together, the adjustments we are contemplating to the SCB offer promise in improving the efficiency, coherence, and transparency of the regulatory capital regime while maintaining the overall level of capital in the system and the core principles of the regime that have proven successful. That being said, we welcome a continued dialogue with the public as we implement these changes, and through that implementation, we will ensure that we maintain the incentives for effective stress testing practices that exist today. Transparency of the stress test and its inputs and outputs is key to the credibility of the stress test, and there are several initiatives underway to provide additional transparency regarding the supervisory stress test models and scenario design process. We believe that the adjustments under development strike the right balance of advancing transparency, while maintaining incentives for firms to think critically about their own risks. I expect that soon you will see the Federal Reserve issue a policy statement describing governing principles around the supervisory stress testing process. As a part of that statement, I would expect a commitment to disclose additional detail about supervisory stress test models and results and to publish portfolios of hypothetical loans and associated loss rates. I expect that we will begin providing some of this additional detail starting in early 2019, and that these changes will allow firms to benchmark the results of their own models against those of the supervisory models. However, our commitment to transparency does not end there. We are currently considering options to provide additional transparency regarding scenarios and scenario design, and I expect that the Board will seek comment on the advisability of, and possible approaches to, gathering the public's input on scenarios and salient risks facing the banking system each year. Such a proposal may also provide additional details about the scenario design features that underpin each year's scenarios, and a range of other enhancements. Additional transparency regarding the stress test scenarios serves multiple purposes: it both provides additional due process to affected participants, and provides additional sources of insight--including from academics and thought leaders like many of you in this room--that could be used to inform a given year's scenario. 
Increased transparency could also enable us to be more nimble in our scenario design to the extent we uncover through external input new salient risks that we have not previously considered. As we develop proposals, a key consideration is to ensure that we maintain incentives for firms to conduct their own stress tests rigorously and thoughtfully. Firms have indicated that additional disclosure about models would not affect their own stress tests. We expect them to make good on that representation, as the Federal Reserve's stress test is not, and cannot be, a full picture of a firm's resiliency in light of its idiosyncratic risks. For example, there is some risk that firms will use knowledge from additional transparency about the stress test to engage in transactions that are solely designed to reduce losses in the test, but that do not truly reduce risk in their portfolios. As supervisors, however, this is something that we can guard against through the regular examination process. We will closely monitor changes in firms' portfolios and take appropriate actions to ensure firms are holding sufficient capital, and have sufficient controls and governance, in light of the risk characteristics of their activities. Before I close, I would like to say a few words about the role of the qualitative objection in our new chapter of stress testing. As originally conceived, CCAR had both a quantitative component--based on the supervisory stress test--and a qualitative component--based on the Federal Reserve's assessment of a firm's stress testing practices. In 2017, the Federal Reserve eliminated the qualitative objection as part of CCAR for large and noncomplex firms, in part because of improvements in risk management at these firms. By incorporating supervisory stress testing into the regulatory capital regime and introducing an automatic penalty if a firm falls below its SCB-derived capital requirement, the SCB proposal will eliminate the quantitative objection from CCAR. The natural next question is: has the CCAR qualitative objection for the largest firms also run its course? In my view, the time has come to normalize the CCAR qualitative assessment by removing the public objection tool, and continuing to evaluate firms' stress testing practices through normal supervision. In such an environment, firms would remain subject to the same supervisory expectations, and examiners would continue to conduct rigorous horizontal and firm-specific assessments of a firm's capital positions and capital planning, tailored to the risk profile of the firm. While much of the examination work would center on a firm's capital plan submissions, examination work would continue on a year-round basis, taking into account the firm's management of other financial risks. The evaluation of the firm's capital position and capital planning would culminate in a rating of the firm's capital position and planning. Firms with deficient practices would receive supervisory findings through the examination process, and would be at risk of a ratings downgrade or enforcement action if those deficiencies were sufficiently material. As we begin the next chapter in stress testing, my objective is to ensure the continued credibility of the program by increasing its transparency, simplicity, and stability while maintaining the strength of the supervisory and internal stress testing elements that are central to the program today. 
These adjustments will be coupled with our continued commitment to strong supervision and our expectation that financial institutions ensure they are managing their risks and holding sufficient capital to continue operations through times of stress.
r181113a_FOMC
united states
2018-11-13T00:00:00
What Are We Learning about Artificial Intelligence in Financial Services?
brainard
0
Although it is still early days, it is already evident that the application of artificial intelligence (AI) in financial services is potentially quite important and merits our attention. Through our Fintech working group, we are working across the Federal Reserve System to take a deliberate approach to understanding the potential implications of AI for financial services, particularly as they relate to our responsibilities. In light of the potential importance of AI, we are seeking to learn from industry, banks, consumer advocates, researchers, and others, including through today's conference. I am pleased to take part in this timely discussion of how technology is changing the financial landscape. My focus today is the branch of artificial intelligence known as machine learning, which is the basis of many recent advances and commercial applications. Modern machine learning applies and refines, or "trains," a series of algorithms on a large data set by optimizing iteratively as it learns in order to identify patterns and make predictions for new data. Machine learning essentially imposes much less structure on how data is interpreted compared to conventional approaches in which programmers impose ex ante rule sets to make decisions. The three key components of AI--algorithms, processing power, and big data--are all increasingly accessible. Due to an early commitment to open-source principles, AI algorithms from some of the largest companies are available to even nascent startups. As for processing power, continuing innovation by public cloud providers means that with only a laptop and a credit card, it is possible to tap into some of the world's most powerful computing systems by paying only for usage time, without having to build out substantial hardware infrastructure. Vendors have made it easy to use these tools for even small businesses and non-technology firms, including in the financial sector. Public cloud companies provide access to pre-trained AI models via developer-friendly application programming interfaces or even "drop and drag" tools for creating sophisticated AI models. Most notably, the world is creating data to feed those models at an ever-increasing rate. Whereas in 2013 it was estimated that 90 percent of the world's data had been created in the prior two years, by 2016, IBM estimated that 90 percent of global data had been created in the prior year alone. The pace and ubiquity of AI innovation have surprised even experts. The best AI result on a popular image recognition challenge improved from a 26 percent error rate to 3.5 percent in just four years. That is lower than the human error rate of 5 percent. In one study, a combination AI-human approach brought the error rate down even further--to 0.5 percent. So it is no surprise that many financial services firms are devoting so much money, attention, and time to developing and using AI approaches. Broadly, there is particular interest in at least five capabilities. First, firms view AI approaches as potentially having superior ability for pattern recognition, such as identifying relationships among variables that are not intuitive or not revealed by more traditional modeling. Second, firms see potential cost efficiencies where AI approaches may be able to arrive at outcomes more cheaply with no reduction in performance. Third, AI approaches might have greater accuracy in processing because of their greater automation compared to approaches that have more human input and higher "operator error." 
Fourth, firms may see better predictive power with AI compared to more traditional approaches--for instance, in improving investment performance or expanding credit access. Finally, AI approaches are better than conventional approaches at accommodating very large and less-structured data sets and processing those data more efficiently and effectively. Some machine learning approaches can be "let loose" on data sets to identify patterns or develop predictions without the need to specify a functional form ex ante. What do those capabilities mean in terms of how we bank? The Financial Stability Board highlighted four areas where AI could impact banking. First, customer-facing uses could combine expanded consumer data sets with new algorithms to assess credit quality or price insurance policies. And chatbots could provide help and even financial advice to consumers, saving them the waiting time to speak with a live operator. Second, there is the potential for strengthening back-office operations, such as advanced models for capital optimization, model risk management, stress testing, and market impact analysis. Third, AI approaches could be applied to trading and investment strategies, from identifying new signals on price movements to using past trading behavior to anticipate a client's next order. Finally, there are likely to be AI advancements in compliance and risk mitigation by banks. AI solutions are already being used by some firms in areas like fraud detection, capital optimization, and portfolio management. The potential breadth and power of these new AI applications inevitably raise questions about potential risks to bank safety and soundness, consumer protection, or the financial system. The question, then, is how should we approach regulation and supervision? It is incumbent on regulators to review the potential consequences of AI, including the possible risks, and take a balanced view about its use by supervised firms. Regulation and supervision need to be thoughtfully designed so that they ensure risks are appropriately mitigated but do not stand in the way of responsible innovations that might expand access and convenience for consumers and small businesses or bring greater efficiency, risk detection, and accuracy. Likewise, it is important not to drive responsible innovation away from supervised institutions and toward less regulated and more opaque spaces in the financial system. Our existing regulatory and supervisory guardrails are a good place to start as we assess the appropriate approach for AI processes. The National Science and Technology Council, in an extensive study addressing regulatory activity generally, concludes that if an AI-related risk "falls within the bounds of an existing regulatory regime, . . . the policy discussion should start by considering whether the existing regulations already adequately address the risk, or whether they need to be adapted to the addition of AI." A recent report by the U.S. Department of the Treasury reaches a similar conclusion with regard to financial services. With respect to banking services, a few generally applicable laws, regulations, guidance, and supervisory approaches appear particularly relevant to the use of AI tools. First, the Federal Reserve's supervisory guidance on model risk management highlights the importance to safety and soundness of embedding critical analysis throughout the development, implementation, and use of models, which include complex algorithms like AI. 
It also underscores "effective challenge" of models by a "second set of eyes"--unbiased, qualified individuals separated from the model's development, implementation, and use. It describes supervisory expectations for sound independent review of a firm's own models to confirm they are fit for purpose and functioning as intended. If the reviewers are unable to evaluate a model in full or if they identify issues, they might recommend the model be used with greater caution or with compensating controls. Similarly, when our own examiners evaluate model risk, they generally begin with an evaluation of the processes firms have for developing and reviewing models, as well as the response to any shortcomings in a model or the ability to review it. Importantly, the guidance recognizes that not all aspects of a model may be fully transparent, as with proprietary vendor models, for instance. Banks can use such models, but the guidance highlights the importance of using other tools to cabin or otherwise mitigate the risk of an unexplained or opaque model. Risks may be offset by mitigating external controls like "circuit breakers" or other mechanisms. And importantly, models should always be interpreted in context. Second, the Federal Reserve's guidance on managing outsourcing risk, along with the prudential regulators' guidance on technology service providers, highlights considerations firms should weigh when outsourcing business functions or activities--and could be expected to apply as well to AI-based tools or services that are externally sourced. The vast majority of the banks that we supervise will have to rely on the expertise, data, and off-the-shelf AI tools of nonbank vendors to take advantage of AI-powered processes. Whether these tools are chatbots, anti-money-laundering/know your customer compliance products, or new credit evaluation tools, it seems likely that they would be classified as services to the bank. The vendor risk-management guidance discusses best practices for supervised firms regarding due diligence, selection, and contracting processes in selecting an outside vendor. It also describes ways that firms can provide oversight and monitoring throughout the relationship with the vendor, and considerations about business continuity and contingencies for a firm to consider before the termination of any such relationship. Third, it is important to emphasize that guidance has to be read in the context of the relative risk and importance of the specific use-case in question. We have long taken a risk-focused supervisory approach--the level of scrutiny should be commensurate with the potential risk posed by the approach, tool, model, or process used. That principle also applies generally to the attention that supervised firms devote to the different approaches they use: firms should apply more care and caution to a tool they use for major decisions or that could have a material impact on consumers, compliance, or safety and soundness. For its part, AI is likely to present some challenges in the areas of opacity and explainability. Recognizing that there are likely to be circumstances when using an AI tool is beneficial even though it may be unexplainable or opaque, such a tool should be subject to appropriate controls, as with any other tool or process, including controls over how the AI tool is used in practice and not just how it is built. This is especially true for any new application that has not been fully tested in a variety of conditions. 
Given the large data sets involved with most AI approaches, it is vital to have controls around the various aspects of data--including data quality as well as data suitability. Just as with conventional models, problems with the input data can lead to cascading problems down the line. Accordingly, we would expect firms to apply robust analysis and prudent risk management and controls to AI tools, as they do in other areas, as well as to monitor potential changes and ongoing developments. For example, let's take the areas of fraud prevention and cybersecurity, where supervised institutions may need their own AI tools to identify and combat outside AI-powered threats. The wide availability of AI's building blocks means that phishers and fraudsters have access to best-in-class technologies to build AI tools that are powerful and adaptable. Supervised institutions will likely need tools that are just as powerful and adaptable as the threats that they are designed to face, which likely entails some degree of opacity. While so far, most phishing attacks against consumers have relied on standard-form emails, likely due to the high cost of personalization, in the future, AI tools could be used to make internet fraud and phishing highly personalized. By accessing data sets with consumers' personally identifiable information and applying open-source AI tools, a phisher may be able to churn out highly targeted emails to millions of consumers at relatively low cost, containing personalized information such as their bank account number and logo, along with past transactions. In cases such as this, where large data sets and AI tools may be used for malevolent purposes, it may be that AI is the best tool to fight AI. Let's turn to the related issue of the proverbial "black box"--the potential lack of explainability associated with some AI approaches. In the banking sector, it is not uncommon for there to be questions as to what level of understanding a bank should have of its vendors' models, due to the balancing of risk management, on the one hand, and protection of proprietary information, on the other. To some degree, the opacity of AI products can be seen as an extension of this balancing. But AI can introduce additional complexity because many AI tools and models develop analysis, arrive at conclusions, or recommend decisions that may be hard to explain. For instance, some AI approaches are able to identify patterns that were previously unidentified and are intuitively quite hard to grasp. Depending on what algorithms are used, it is possible that no one, including the algorithm's creators, can easily explain why the model generated the results that it did. The challenge of explainability can translate into a higher level of uncertainty about the suitability of an AI approach, all else equal. So how does, or even can, a firm assess the use of an approach it might not fully understand? To a large degree, this will depend on the capacity in which AI is used and the risks presented. One area where the risks may be particularly acute is the consumer space generally, and consumer lending in particular, where transparency is integral to avoiding discrimination and other unfair outcomes, as well as meeting disclosure obligations. Let me turn briefly to this topic. The potential for the application of AI tools to result in new benefits to consumers is garnering a lot of attention. The opportunity to access services through innovative channels or processes can be a potent way to advance financial inclusion. 
Consider, for instance, consumer credit scoring. There are longstanding and well-documented concerns that many consumers are burdened by material errors on their credit reports, lack sufficient credit reporting information necessary for a score, or have credit reports that are unscorable. As noted earlier, banks and other financial service providers are using AI to develop credit-scoring models that take into account factors beyond the usual metrics. There is substantial interest in the potential for those new models to allow more consumers on the margins of the current credit system to improve their credit standing, at potentially lower cost. As noted earlier, AI also has the potential to allow creditors to more accurately model and price risk, and to bring greater speed to decisions. AI may offer new consumer benefits, but it is not immune from fair lending and other consumer protection risks, and compliance with fair lending and other consumer protection laws is important. Of course, it should not be assumed that AI approaches are free of bias simply because they are automated and rely less on direct human intervention. Algorithms and models reflect the goals and perspectives of those who develop them as well as the data that train them, and, as a result, AI tools can reflect or "learn" the biases of the society in which they were created. A 2016 Treasury Department report noted that while "data-driven algorithms may expedite credit assessments and reduce costs, they also carry the risk of disparate impact in credit outcomes and the potential for fair lending violations." A recent example illustrates the risk of unwittingly introducing bias into an AI model. It was recently reported that a large employer developed, and later abandoned, an AI hiring tool for software developers that was trained on a data set of the resumes of past successful hires. Because the pool of previously hired software developers in the training data set was overwhelmingly male, the AI developed a bias against female applicants, going so far as to exclude resumes of graduates from two women's colleges. Consumer protection laws such as the Equal Credit Opportunity Act and the Fair Credit Reporting Act (FCRA) include requirements for creditors to provide notice of the factors involved in taking actions that are adverse or unfavorable for the consumer. These requirements help provide transparency in the underwriting process, promote fair lending by requiring creditors to explain why they reached their decisions, and provide consumers with actionable information to improve their credit standing. Compliance with these requirements implies finding a way to explain AI decisions. However, the opacity of some AI tools may make it challenging to explain credit decisions to consumers, which would make it harder for consumers to improve their credit score by changing their behavior. Fortunately, AI itself may play a role in the solution: The AI community is responding with important advances in developing "explainable" AI tools with a focus on expanding consumer access to credit. I am pleased that this is one of the topics on your agenda today. Perhaps one of the most important early lessons is that not all potential consequences are knowable now--firms should be continually vigilant for new issues in the rapidly evolving area of AI. Throughout the history of banking, new products and processes have been an area where problems can arise. Further, firms should not assume that AI approaches are less susceptible to problems because they are purported to be able to "learn" or to be less prone to human error. 
There are plenty of examples of AI approaches not functioning as expected--a reminder that things can go wrong. It is important for firms to recognize the possible pitfalls and employ sound controls now to prevent and mitigate possible future problems. For our part, we are still learning how AI tools can be used in the banking sector. We welcome discussion about what use cases banks and other financial services firms are exploring with AI approaches and other innovations, and how our existing laws, regulations, guidance, and policy interests may intersect with these new approaches. When considering financial innovation of any type, our task is to facilitate an environment in which socially beneficial, responsible innovation can progress with appropriate mitigation of risk and consistent with applicable statutes and regulations. As with other technological advances, AI presents regulators with a responsibility to act with thoughtfulness and perspective in carrying out their mandates, learning from the experience in other areas. As we move ahead in exploring the policy and regulatory issues related to artificial intelligence, we look forward to collaborating with a broad array of stakeholders.
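The adverse action notice requirements discussed above are one place where "explainable" AI tools come into play: a lender must be able to say which factors principally drove an unfavorable decision. The following is a minimal, purely hypothetical sketch of one such technique, in which an opaque model is fit to synthetic credit data and an individual applicant's score is attributed to features by comparing against a baseline; the data, feature names, and attribution method are assumptions for illustration, not a compliant reason-code methodology or a description of any supervised firm's practice.

    # Illustrative only: crude per-applicant attribution for an opaque credit model.
    # Features, data, and method are hypothetical; not a compliant reason-code process.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(1)
    X = np.column_stack([
        rng.normal(0.35, 0.15, 4000),   # credit utilization (share of limit)
        rng.normal(8.0, 4.0, 4000),     # years of credit history
        rng.poisson(1, 4000),           # recent delinquencies
    ])
    y = ((X[:, 0] > 0.5) | (X[:, 2] > 1)).astype(int)      # 1 = higher risk (synthetic rule)
    model = GradientBoostingClassifier().fit(X, y)          # opaque ensemble model

    applicant = np.array([[0.72, 3.0, 2.0]])
    baseline = X.mean(axis=0)
    score = model.predict_proba(applicant)[0, 1]

    names = ["utilization", "history_years", "recent_delinquencies"]
    effects = []
    for j, name in enumerate(names):
        tweaked = applicant.copy()
        tweaked[0, j] = baseline[j]                          # swap one feature to its average
        effects.append((name, score - model.predict_proba(tweaked)[0, 1]))
    effects.sort(key=lambda pair: -pair[1])
    print(f"risk score {score:.2f}; principal factors: {effects[:2]}")

In practice, firms and researchers use a range of more careful attribution and surrogate-model techniques; the point here is only that some explanation of an otherwise opaque decision is often feasible.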
r181116a_FOMC
united states
2018-11-16T00:00:00
Beginning Stress Testing’s New Chapter
quarles
0
Professor Scott, and our hosts from Harvard and the Program on International Financial Systems, thank you for the chance to participate in today's meeting. Looking around the room, I see a mix of past and current colleagues, from academics, to supervisors and central bankers, to researchers and practitioners in industry. All of you have seen, felt, and lived different aspects of the transition to the post-crisis regulatory framework, and I am grateful to hear your perspectives on such a critical aspect of it. In the depths of the financial crisis, the first regulatory stress tests were designed under intense scrutiny with high-stakes consequences. Their contribution--an independent public view of the capital adequacy of the largest firms--helped reinforce the banking system at a critical juncture. Since then, stress testing has meaningfully increased the post-stress resiliency of large financial institutions, and become a critical tool in keeping the system strong. Those accomplishments are real, and we should aim to do more than simply preserve them. Now is a prudent time to consolidate the gains we have made, and to promote the efficiency and transparency of our processes. Today, I will review some of our efforts along those lines, focusing on proposed changes to our stress-testing program. These changes, which I described in more detail in remarks last week, are intended to improve the program, maintaining its dynamism and flexibility while providing adequate notice to regulated firms, without altering materially the stringency of the tests or the overall level of capital in the system. I share these views with a deep appreciation of the decades of international experience represented in this room. The crisis came with a reminder that the financial system is global, that risks in one country can quickly spread to another, and that in keeping the system and the economy safe, we have no choice but to work together. I look forward to hearing your thoughts on the changes I outline, and on how to improve our stress testing processes in the years ahead. Many of you are familiar with the Federal Reserve's proposed stress capital buffer (SCB), which would replace the current fixed buffer requirement of 2.5 percent of risk-weighted assets with one based on each firm's stress test results. I believe the proposal represents an important milestone in crafting an integrated capital regime, and in keeping with its importance, we have received extensive and thoughtful public comments, identifying elements of it that could benefit from further refinement. I described several of these elements last week, including my views on some areas which I believe we should revisit: improving measurement of risks in the trading book; encouraging less sticky forms of capital distribution without requiring dividend pre-funding; and reevaluating the interaction of the capital buffer with capital distributions. Today, I want to highlight three elements in particular. Foremost among these is the volatility of stress test results. Some volatility in annual results is necessary to preserve the dynamism of the stress test, and to reflect changes in macroeconomic conditions, salient economic risks, and the composition of firm balance sheets. However, when the largest banks in the system are fully meeting their capital requirements, a highly variable capital requirement from year to year can present a significant management challenge. 
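For readers unfamiliar with the mechanics, the proposal would size the buffer, roughly, as the peak-to-trough decline in a firm's common equity tier 1 (CET1) capital ratio in the supervisory stress test plus four quarters of planned common stock dividends, subject to a 2.5 percent floor. The sketch below illustrates that arithmetic with hypothetical figures and omits many details of the proposed rule; because the stress decline is recomputed from each year's scenario and balance sheet, the resulting requirement can move from year to year, which is the volatility just described.

    # Simplified, hypothetical sketch of a stress capital buffer (SCB) calculation.
    # The actual proposal contains many elements not modeled here.
    def stress_capital_buffer(start_cet1_ratio, min_stressed_cet1_ratio,
                              planned_dividends_4q, risk_weighted_assets):
        decline = start_cet1_ratio - min_stressed_cet1_ratio          # peak-to-trough, in percentage points
        dividend_add_on = 100.0 * planned_dividends_4q / risk_weighted_assets
        return max(2.5, decline + dividend_add_on)                    # floored at 2.5% of RWA

    # Hypothetical firm: CET1 ratio falls from 12.0% to 8.2% under stress, with
    # $3 billion of planned dividends over four quarters against $400 billion of RWA.
    scb = stress_capital_buffer(12.0, 8.2, 3e9, 400e9)
    print(f"stress capital buffer: {scb:.2f}% of risk-weighted assets")   # 3.8 + 0.75 = 4.55 in this example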
I believe there is an important balance to strike in this area, which will let us preserve dynamism while reducing volatility, and we plan to seek comment on a relevant proposal in the not-too-distant future. The second is the sequencing of stress test results with capital plan submissions. Currently, and under the SCB proposal, a firm must decide whether to increase or decrease its planned dividends and share repurchases for the upcoming year without knowledge of a key constraint: the results of the stress test. Initially, this phasing reflected the view that firms should think rigorously about their capital uses and needs, rather than relying primarily on the results of the supervisory stress test to guide those plans. However, now that we all have several years' experience with this system, firms have told us that they would be able to engage in more thoughtful capital planning if they had knowledge of that year's stress test results before finalizing their distribution plans for the upcoming year. I am sympathetic to their concerns, and will ask the Board to adjust the operation of the rule, so that firms know their SCB before they decide on their planned distributions for the coming year. Of course, we expect firms to continue to maintain robust stress testing practices and use those results to inform their capital distribution plans, and we will continue to use the supervisory process to reinforce this expectation. The third is the post-stress leverage requirement. As the Federal Reserve has long maintained, leverage requirements are intended to serve as a backstop to the risk-based capital requirements. By definition, they are not intended to be risk-sensitive. Thus, I am concerned that explicitly assigning a leverage buffer requirement to a firm on the basis of risk-sensitive post-stress estimates runs afoul of the intellectual underpinnings of the leverage ratio, and I would advocate removing this element of the stress capital buffer regime. Of course, leverage ratios, including the enhanced supplementary leverage requirements, would remain a critical part of our regulatory capital regime, and we will maintain the supervisory expectation that firms have sufficient capital to meet all minimum regulatory requirements. To give these issues the careful consideration they deserve, I expect we will adopt a final rule in the near future, settling the basic SCB framework while re-proposing certain elements. I expect that the first SCB would not go into effect before 2020, and that CCAR will remain in place in 2019 for firms with over $250 billion in assets or that are otherwise complex. However, we will consider whether we can move forward with any aspects of the SCB proposal for CCAR 2019, such as assumptions related to balance sheet growth, and I will ask the Board to exempt firms with less than $250 billion in assets from the CCAR quantitative assessment and supervisory stress testing in 2019. In the meantime, several initiatives are also underway to provide additional transparency into stress testing. I expect you will soon see the Federal Reserve issue a policy statement describing governing principles around the supervisory stress testing process--and with it, a commitment to disclosing additional detail about supervisory stress test models and results, along with portfolios of hypothetical loans and associated loss rates. I expect we will begin providing some of this additional detail starting in early 2019. 
I also expect the Board will seek comment on the advisability of, and possible approaches to, gathering public input on scenarios and salient risks facing the banking system each year. Transparency matters not only because it provides additional due process to affected participants; it also creates an opportunity for broader, more insightful comments from the public. As a result, it can allow us to be more nimble and better informed in our scenario design. However, we want to maintain incentives for firms to conduct their own stress tests rigorously and thoughtfully, and avoid the risk that firms will use this new information to engage in transactions that are solely designed to reduce losses in the test without reducing actual risk. Firms have indicated that additional disclosure about models would not affect their own stress tests. We expect them to make good on that representation, as the Federal Reserve's stress test is not, and cannot be, a full picture of a firm's resiliency in light of its idiosyncratic risks. We are confident that we can address these concerns through the regular examination process, by closely monitoring changes in firms' portfolios and ensuring sufficient capital, controls, and governance in light of the risk characteristics of their activities. I also want to reiterate a point regarding the role of the qualitative objection. The Federal Reserve eliminated this element of CCAR for large and noncomplex firms in 2017, in part because of improvements in risk management at those firms. In my view, the time has come to normalize the CCAR qualitative assessment by removing the public objection tool, and continuing to evaluate firms' stress testing practices through normal supervision. While supervisory assessments would continue to center on a firm's capital plan submissions, examination work would continue on a year-round basis, taking into account the firm's management of other financial risks, and culminating in a rating of the firm's capital position and planning. Firms with deficient practices would receive supervisory findings through the examination process, and would be at risk of a ratings downgrade or enforcement action if those deficiencies were sufficiently material. These changes are aimed at preserving the foundation laid over nearly a decade of stress testing experience, including by many of the people in this room. Our goal is to bolster the program's credibility by increasing its transparency, simplicity, and stability, while maintaining the strength of the supervisory and internal stress testing elements that are central to the program today. These adjustments will be coupled with our continued commitment to strong supervision, and our expectation that financial institutions manage their risks and hold sufficient capital to continue operations through times of stress. I look forward to hearing your insights into these changes, and I thank you for your time.
r181127a_FOMC
united states
2018-11-27T00:00:00
Data Dependence and U.S. Monetary Policy
clarida
0
I am delighted to be speaking at this annual conference of the Clearing House and the Bank Policy Institute. Today I will discuss recent economic developments and the economic outlook before going on to outline my thinking about the connections between data dependence and monetary policy. I will close with some observations on the implications for U.S. monetary policy that flow from this perspective. U.S. economic fundamentals are robust, as indicated by strong growth in gross domestic product (GDP) and a job market that has been surprising on the upside for nearly two years. Smoothing across the first three quarters of this year, real, or inflation-adjusted, GDP growth is averaging an annual rate of 3.3 percent. Private-sector forecasts for the full year--that is, on a fourth-quarter-over-fourth-quarter basis--suggest that growth is likely to equal, or perhaps slightly exceed, 3 percent. If this occurs, GDP growth in 2018 will be the fastest recorded so far during the current expansion, which in July entered its 10th year. If, as I expect, the economic expansion continues in 2019, this will become the longest U.S. expansion in recorded history. Likewise, the labor market remains healthy. Average monthly job gains continue to outpace the increase needed to provide jobs for new entrants to the labor force over the longer run, with payrolls rising by 250,000 in October. And, at 3.7 percent, the unemployment rate is the lowest it has been since 1969. In addition, after remaining stubbornly sluggish throughout much of the expansion, nominal wage growth is picking up, with various measures now running in the neighborhood of 3 percent on an annual basis. The inflation data in the year to date for the price index for personal consumption expenditures (PCE) have been running at or close to our 2 percent objective, including on a core basis--that is, excluding volatile food and energy prices. While my base case is for this pattern to continue, it is important to monitor measures of inflation expectations to confirm that households and businesses expect price stability to be maintained. The median of expected inflation 5-to-10 years in the future from the University of Michigan Surveys of Consumers is within--but I believe at the lower end of--the range consistent with price stability. Likewise, inflation readings from the TIPS (Treasury Inflation-Protected Securities) market indicate to me that financial markets expect consumer price index (CPI) inflation of about 2 percent to be maintained. That said, historically, PCE inflation has averaged about 0.3 percent less than CPI inflation, and if this were to continue, the readings from the TIPS market would indicate that expected PCE inflation is running at somewhat less than 2 percent. What might explain why inflation is running at or close to the Federal Reserve's long-run objective of 2 percent, and not well above it, when growth is strong and the labor market robust? According to the Bureau of Labor Statistics, productivity growth in the business sector, as measured by output per hour, is averaging 2 percent at an annualized rate this year, while aggregate hours worked in the business sector have risen at an average annual rate of 1.8 percent through the third quarter. 
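In symbols, this supply-side accounting is simply the identity that output equals hours worked times output per hour, so that, in log differences, output growth is the sum of the growth of hours and the growth of productivity. Plugging in the figures just cited gives a rough, purely illustrative gauge of business-sector output growth implied by the two components:

    \[
      \Delta \ln Y \;=\; \Delta \ln H \;+\; \Delta \ln\!\left(\frac{Y}{H}\right)
      \;\approx\; 1.8\% \;+\; 2.0\% \;\approx\; 3.8\%,
    \]

where Y denotes business-sector output and H aggregate hours worked; the 3.8 percent figure is back-of-the-envelope arithmetic from the numbers above, not an official estimate.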
This decomposition--in which the growth in output is broken down into two measures of aggregate supply, the growth of aggregate hours and the growth of output per hour--suggests that the growth rates of productivity and hours worked in 2018 each have been exceeding their respective longer-run rates as estimated by the Congressional Budget Office. In other words, while growth in aggregate demand in 2018 has been above the expected long-run growth rate in aggregate supply, it has not been exceeding this year's growth in actual aggregate supply. Ultimately, hours growth will likely converge to a slower pace because of demographic factors. But how rapidly this happens will depend in part on the behavior of labor force participation. And recent years' developments suggest there may still be some further room for participation in the job market--especially in the prime-age group of 25-to-54-year-olds--to rise. Labor participation by prime-age women has increased around 2 percentage points in the past three years and is now at its highest level in a decade. That said, it is still 1-1/2 percentage points below the peak level reached in 2000. Labor force participation among 25- to 54-year-old men has risen by roughly 1 percentage point in the past several years. But it is still 2 percentage points below levels seen a decade ago, and it is 3 percentage points below the levels that prevailed in the late 1990s. As for productivity growth, there is considerable uncertainty about how much of the rebound in productivity growth that we have seen in recent quarters is cyclical and how much is structural. I believe both factors are at work. The structural, or trend, component of productivity growth is a function of capital deepening through business investment as well as a multifactor component sometimes referred to as the "Solow residual." Initial estimates from the recent GDP release indicate that equipment and software investment in the third quarter moderated from the rapid pace recorded in the first half of the year. One data point does not make a trend, but an improvement in business investment will be important if the pickup in productivity growth that we have seen in recent quarters is to be sustained. As for the economic outlook, in the most recent Summary of Economic Projections (SEP) released in September, participants had a median projection for real GDP growth that remains above its estimated longer-run pace, and the unemployment rate was expected to decline to 3-1/2 percent next year. And, for total PCE inflation, the median projection remains near 2 percent. With a robust labor market and inflation at or close to our 2 percent inflation goal and based on the baseline economic outlook for 2019 I have just laid out, I believe monetary policy at this stage of the economic expansion should be aimed at sustaining growth and maximum employment at levels consistent with our inflation objective. At this stage of the interest rate cycle, I believe it will be especially important to monitor a wide range of data as we continually assess and calibrate whether the path for the policy rate is consistent with meeting our dual-mandate objectives on a sustained basis. Economic research suggests that monetary policy should be "data dependent." And, indeed, central banks around the world, including the Federal Reserve, often describe their policies in this way. I would now like to discuss how I think about two distinct roles that data dependence should play in the formulation and communication of monetary policy. 
It is important to state up-front that data dependence is not, in and of itself, a monetary policy strategy. A monetary policy strategy must find a way to combine incoming data and a model of the economy with a healthy dose of judgment--and humility!--to formulate, and then communicate, a path for the policy rate most consistent with our policy objectives. In the case of the Fed, those objectives are assigned to us by the Congress, and they are to achieve maximum employment and price stability. Importantly, because households and firms must make long-term saving and investment decisions and because these decisions--directly or indirectly--depend on the expected future path for the policy rate, the central bank should find a way to communicate and explain how incoming data are or are not changing the expected path for the policy rate consistent with best meeting its objectives. Absent such communication, inefficient divergences between public expectations and central bank intentions for the policy rate path can emerge and persist in ways that are costly to the economy when reversed. Within this general framework, let me now consider two distinct ways in which I think that the path for the federal funds rate should be data dependent. U.S. monetary policy has for some time been, and will, I believe, continue to be, data dependent in the sense that incoming data reveal, at the time of each Federal Open Market Committee (FOMC) meeting, where the economy is relative to the goals of monetary policy. This information on where the economy is relative to the goals of monetary policy is an important input into the policy decision. If, for example, incoming data in the months ahead were to reveal that inflation and inflation expectations are running higher than projected at present and in ways that are inconsistent with our 2 percent objective, then I would be receptive to increasing the policy rate by more than I currently expect will be necessary. Data dependence in this sense is easy to understand, as it is of the type implied by a large family of policy rules in which the parameters of the economy are known. But what if key parameters that describe the long-run destination of the economy are unknown? This is indeed the relevant case that the FOMC and other monetary policymakers face in practice. The two most important unknown parameters needed to conduct--and communicate--monetary policy are the rate of unemployment consistent with maximum employment, u*, and the riskless real rate of interest consistent with price stability, r*. As a result, in the real world, monetary policy should, I believe, be data dependent in a second sense: that incoming data can reveal at each FOMC meeting signals that will enable the Committee to update its estimates of r* and u* in order to obtain its best estimate of where the economy is heading. And, indeed, as indicated by the SEP, FOMC participants have, over the past nearly seven years, revised their estimates of both u* and r* substantially lower as unemployment fell and real interest rates remained well below prior estimates of neutral without the rise in inflation or inflation expectations those earlier estimates would have predicted. And these revisions to u* and r* almost certainly did have an important influence on the path for the policy rate that was actually realized in recent years. 
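One simple member of the family of policy rules mentioned above is a Taylor-type rule in which the prescribed federal funds rate depends on observed inflation and unemployment together with assumed values of r* and u*. The coefficients and parameter values in the sketch below are purely illustrative, chosen for exposition rather than to represent any rule the FOMC follows.

    # Illustrative Taylor-type rule in which r_star and u_star are treated as known.
    # Coefficients and parameter values are expository choices, not an official rule.
    def policy_rate(inflation, unemployment, r_star=1.0, u_star=4.5,
                    pi_target=2.0, a=0.5, b=1.0):
        return (r_star + inflation
                + a * (inflation - pi_target)       # respond to the inflation gap
                + b * (u_star - unemployment))      # respond to labor market slack or tightness

    # With inflation near 2.0% and unemployment at 3.7% (figures cited in these remarks),
    # and illustrative values r_star = 1.0% and u_star = 4.5%:
    print(f"{policy_rate(2.0, 3.7):.2f}")           # about 3.80% in this example

Note that the prescription moves one-for-one with the assumed r* and, scaled by its coefficient, with the assumed u*, which is why revisions to those estimates of the kind just described matter for the realized path of policy.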
I would expect to revise my estimates of r* and u* as appropriate if incoming data on future inflation and unemployment diverge materially and persistently from my baseline projections today. What does this mean for the conduct of monetary policy? As the economy has moved to a neighborhood consistent with the Fed's dual-mandate objectives, risks have become more symmetric and less skewed to the downside than when the current rate cycle began three years ago. Raising rates too quickly could unnecessarily shorten the economic expansion, while moving too slowly could result in rising inflation and inflation expectations down the road that could be costly to reverse, as well as potentially pose financial stability risks. Although the real federal funds rate today is just below the range of longer-run estimates presented in the September SEP, it is much closer to the vicinity of r* than it was when the FOMC started to remove accommodation in December 2015. How close is a matter of judgment, and there is a range of views on the FOMC. As I have already stressed, r* and u* are uncertain, and I believe we should continue to update our estimates of them as new data arrive. This process of learning about r* and u* as new data arrive supports the case for gradual policy normalization, as it will allow the Fed to accumulate more information from the data about the ultimate destination for the policy rate and the unemployment rate at a time when inflation is close to our 2 percent objective.
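The second sense of data dependence, learning about u* (and, analogously, r*) from incoming data, can be illustrated with a deliberately simple updating scheme: if inflation repeatedly comes in below what a Phillips-curve relationship would predict at the current estimate of u*, the estimate is revised down, and vice versa. The relationship, its slope, and the learning gain below are hypothetical and serve only to show the direction of the updating.

    # Toy illustration of revising an estimate of u_star from inflation surprises.
    # The Phillips-curve slope (kappa) and learning gain are hypothetical.
    def update_u_star(u_star_est, unemployment, inflation, expected_inflation=2.0,
                      kappa=0.3, gain=0.5):
        predicted = expected_inflation + kappa * (u_star_est - unemployment)
        surprise = inflation - predicted            # inflation below prediction -> u_star estimate too high
        return u_star_est + gain * surprise / kappa

    u_star = 5.0
    observations = [(4.6, 1.9), (4.3, 1.9), (4.0, 2.0), (3.8, 2.0)]   # (unemployment, inflation), hypothetical
    for u, pi in observations:
        u_star = update_u_star(u_star, u, pi)
        print(f"unemployment {u:.1f}, inflation {pi:.1f} -> revised u_star {u_star:.2f}")

In this contrived example, persistently subdued inflation alongside falling unemployment pulls the estimate of u* down over time, which is qualitatively the pattern of downward revisions to u* and r* described above.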
r181128a_FOMC
united states
2018-11-28T00:00:00
The Federal Reserve's Framework for Monitoring Financial Stability
powell
1
It is a pleasure to be back at the Economic Club of New York. I will begin by briefly reviewing the outlook for the economy, and then turn to a discussion of financial stability. My main subject today will be the profound transformation since the Global Financial Crisis in the Federal Reserve's approach to monitoring and addressing financial stability. Today marks the publication of the Board of Governors' first Financial Stability Report. Earlier this month, we published our first Supervision and Regulation Report. Together, these reports contain a wealth of information on our approach to financial stability and to financial regulation more broadly. By clearly and transparently explaining our policies, we aim to strengthen the foundation of democratic legitimacy that enables the Fed to serve the needs of the American public. Congress assigned the Federal Reserve the job of promoting maximum employment and price stability. I am pleased to say that our economy is now close to both of those objectives. The unemployment rate is 3.7 percent, a 49-year low, and many other measures of labor market strength are at or near historic bests. Inflation is near our 2 percent target. The economy is growing at an annual rate of about 3 percent, well above most estimates of its longer-run trend. For seven years during the crisis and its painful aftermath, the Federal Open Market Committee (FOMC) kept our policy interest rate unprecedentedly low--in fact, near zero--to support the economy as it struggled to recover. The health of the economy gradually but steadily improved, and about three years ago the FOMC judged that the interests of households and businesses, of savers and borrowers, were no longer best served by such extraordinarily low rates. We therefore began to raise our policy rate gradually toward levels that are more normal in a healthy economy. Interest rates are still low by historical standards, and they remain just below the broad range of estimates of the level that would be neutral for the economy--that is, neither speeding up nor slowing down growth. My FOMC colleagues and I, as well as many private-sector economists, are forecasting continued solid growth, low unemployment, and inflation near 2 percent. There is a great deal to like about this outlook. But we know that things often turn out to be quite different from even the most careful forecasts. For this reason, sound policymaking is as much about managing risks as it is about responding to the baseline forecast. Our gradual pace of raising interest rates has been an exercise in balancing risks. We know that moving too fast would risk shortening the expansion. We also know that moving too slowly--keeping interest rates too low for too long--could risk other distortions in the form of higher inflation or destabilizing financial imbalances. Our path of gradual increases has been designed to balance these two risks, both of which we must take seriously. We also know that the economic effects of our gradual rate increases are uncertain, and may take a year or more to be fully realized. While FOMC participants' projections are based on our best assessments of the outlook, there is no preset policy path. We will be paying very close attention to what incoming economic and financial data are telling us. As always, our decisions on monetary policy will be designed to keep the economy on track in light of the changing outlook for jobs and inflation. Under the dual mandate, jobs and inflation are the Fed's meat and potatoes. 
In the rest of my comments, I will focus on financial stability--a topic that has always been on the menu, but that, since the crisis, has become a more integral part of the meal. The term "financial stability" has a particular meaning in this context. A stable financial system is one that continues to function effectively even in severely adverse conditions. A stable system meets the borrowing and investment needs of households and businesses despite economic turbulence. An unstable system, in contrast, may amplify turbulence and prolong economic hardship in the face of stress by failing to provide these essential services when they are needed most. For Economic Club of New York trivia buffs, I will note that the second ever presentation to this club by a Federal Reserve official was about this very topic. The date was March 18, 1929. Weeks before, the Fed had issued a public statement of concern over stock market speculation, and had provided guidance frowning on bank funding of such speculation. William Harding, a former Fed Chair and then president of the Federal Reserve Bank of Boston, defended the Fed's actions in his talk. He argued that, while the Fed should not act as the arbiter of correct asset prices, it did have a primary responsibility to protect the banking system's capacity to meet the credit needs of households and businesses. At the meeting, critics argued that public statements about inflated asset prices were "fraught with danger;" that the nation's banks were so well managed that they should not "face public admonition"; and, more generally, that the Fed was "out of its sphere." Of course, Harding spoke just a few months before the 1929 stock market crash, which signaled the onset of the Great Depression. Fast forwarding, a host of Depression-era reforms helped avoid, for the next three-quarters of a century, a systemic financial crisis and the associated severe economic dislocation--the longest such period in American history. Those decades saw many advances in monetary policy and in bank regulatory policy, but the appropriate role for government in managing threats to the broader financial system remained unresolved. Periodic bouts of financial stress during this period--such as the Latin American debt crisis, the savings and loan crisis, and the Russian debt default--were met with improvised responses. Policymakers conjured fixes from a mixture of private-sector rescues, emergency liquidity, occasional implicit or explicit bailouts, and monetary accommodation. Outside of these crisis responses, however, systemic issues were not a central focus of policy. The Global Financial Crisis demonstrated, in the clearest way, the limits of this approach. Highly inventive and courageous improvisation amid scenes of great drama helped avoid another Great Depression, but failed to prevent the most severe recession in 75 years. The crisis made clear that there can be no macroeconomic stability without financial stability, and that systemic stability risks often take root and blossom in good times. Thus, as the emergency phase of the crisis subsided, Congress, the Fed, and the other financial regulators began developing a fundamentally different approach to financial stability. Instead of relying on improvised responses after crises strike, policymakers now constantly monitor vulnerabilities and require firms to plan in advance for financial distress, in a framework that lays out solutions in advance during good times. This new approach can be divided into three parts. 
First, build up the strength and resilience of the financial system. Second, develop and apply a broad framework for monitoring financial stability on an ongoing basis. And third, explain the new approach as transparently as possible, so that the public and its representatives in Congress can provide oversight and hold us accountable for this work. Although I'll focus mainly on the stability efforts of the Federal Reserve, a number of federal regulatory agencies have responsibilities in this area. All of these agencies are represented on the Financial Stability Oversight Council, or FSOC, which is chaired by the Treasury Secretary and which provides a forum for interagency cooperation in responding to emerging risks. After 10 years of concentrated effort in the public and private sectors, the system is now much stronger, with greater capacity to function effectively in stressful times. In the banking system, we have implemented a post-crisis regulatory framework based on robust capital and liquidity requirements, a strong stress-testing regime, and mandatory living wills for the largest firms. As a result, banks now have much more high-quality capital than before (figure 1). The most recent stress tests indicate that, even after a severe global recession, capital levels at the largest banks would remain above regulatory minimums, and above the levels those banks held in good times before the crisis. Most systemically important financial institutions also now hold roughly 20 percent of their assets in the form of high-quality liquid assets--that is, safe assets that could be readily sold at short notice (figure 2). The share of these assets is about four times its pre-crisis level. Compared with other economies, lending and borrowing in the United States depend less on bank loans and more on funds flowing through a wide array of capital market channels. The crisis revealed that this capital-market-centric system, despite its many benefits, also provides more places where systemic risks can emerge. In response, Congress and the regulatory agencies have made many stability-enhancing changes outside of the banking system. For example, many derivatives transactions are now required to be centrally cleared, which, through netting, has reduced exposures and enabled better management of counterparty risk. Tri-party repurchase agreement (repo) reforms have substantially improved the resilience of that marketplace, in particular by limiting intraday loans. Before the crisis, prime institutional money market funds were permitted to report a constant, $1 share price so long as the value of the underlying assets remained near $1. This reporting convention, combined with the implicit support of the funds' sponsors, led investors to treat those funds like bank deposits, even though they were not insured in the same way. These funds are now required to report floating net-asset values, and after this reform investors chose to migrate to government-only funds, which are safer and less susceptible to runs (figure 3). These and other measures have reduced the risk that key non-bank parts of the system would freeze up in the face of market stress. Innovation and risk-taking contribute to the dynamism of our financial system and our economy. As Hyman Minsky emphasized, along with the many benefits of dynamism comes the reality that the financial system will sometimes evolve toward excess and dangerous imbalances. 
This reality underscores the vital importance of the second part of post-crisis reform: monitoring for emerging vulnerabilities. As laid out in our new Financial Stability Report, we have developed a framework to help us monitor risks to stability in our complex and rapidly evolving financial system. The framework distinguishes between shocks, that is, trigger events that can be hard to predict or influence, and vulnerabilities, defined as features of the financial system that amplify shocks. The report is organized around four broad vulnerabilities that have been prominent in financial crises through the centuries. Each of these vulnerabilities is often found to some degree even in healthy market-based systems, and there is not, at present, any generally accepted standard for assessing at what level the vulnerabilities begin to pose serious stability risks. In lieu of such a standard, we flag cases in which the vulnerabilities rise well beyond historical norms, and then form judgments about the stability risks those cases present. The first vulnerability is excessive leverage in the financial sector. If a highly leveraged segment of the financial system is buffeted by adverse events, the affected entities may all need to deleverage at the same time by selling assets, leading to what is called a "fire sale." Both the resulting decline in asset prices and the impaired ability of the segment to play its role in the economy can amplify the effects of a downturn. We saw this chain of events play out repeatedly in various parts of the financial sector in the weeks following the failure of Lehman Brothers in 2008. In our surveillance, we examine leverage across many types of financial institutions, including banks, insurance companies, hedge funds, and various funding vehicles. Currently, we do not detect a broad-based buildup of abnormal or excessive leverage. As with banks, capital levels at insurance companies and broker-dealers appear robust. In addition, securitization levels are far below their pre-crisis levels, and those structures that do exist rely on more stable funding (see figure 4). Our view into leverage and risk-taking outside the banking sector is admittedly incomplete, however, and we are always working to get a better view of emerging leverage excesses. The second vulnerability is funding risk, which arises when banks or nonbank financial entities rely on funding that can be rapidly withdrawn. If depositors or market participants lose faith in the soundness of an institution or the system as a whole, unstable funding can simply vanish in what is called a "run." During the crisis, we saw widespread runs, including at broker-dealers, some segments of the repo market, and money market mutual funds. These runs did severe damage, contributing to a generalized panic at the time. Had the authorities not stepped in, the damage could have been even more severe. Today we view funding-risk vulnerabilities as low. Banks hold low levels of liabilities that are able and likely to run, and they hold high levels of liquid assets to fund any outflows that do occur. Money market mutual fund reforms have greatly reduced the run risk in that sector. More generally, it is short-term, uninsured funding that would be most likely to run in a future stress event, and the volume of such funding is now significantly below pre-crisis peaks. 
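The "well beyond historical norms" comparisons that run through this framework can be illustrated mechanically: take a vulnerability measure, locate the latest reading within its own history, and flag readings in the extreme tail for closer review. The series and the 90th-percentile threshold below are hypothetical; the actual assessments combine many indicators with judgment.

    # Illustrative flagging of a vulnerability measure against its own history.
    # Data and threshold are hypothetical.
    import numpy as np

    def flag_vs_history(history, latest, pct_threshold=90.0):
        history = np.asarray(history, dtype=float)
        percentile = 100.0 * np.mean(history < latest)     # share of past readings below the latest one
        return percentile, percentile >= pct_threshold

    rng = np.random.default_rng(3)
    history = rng.normal(0.0, 2.0, 120)                    # e.g., a detrended debt-to-GDP ratio, in percentage points
    latest = 4.5
    pct, flagged = flag_vs_history(history, latest)
    print(f"latest reading at the {pct:.0f}th percentile of its history; "
          f"{'beyond historical norms, flag for review' if flagged else 'within historical norms'}")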
Taken together, the evidence on these first two vulnerabilities strongly supports the view that financial institutions and markets are substantially more resilient than they were before the crisis. Indeed, the American financial system has successfully weathered some periods of significant stress over the past several years. The third vulnerability is excessive debt loads at households and businesses. Credit booms have often led to credit busts and sometimes to painful economic downturns. When the bust comes, those who have overborrowed tend to sharply reduce their spending. Defaults typically rise faster than had been expected, which may put financial institutions into distress. These effects may combine to bring a serious economic downturn. This boom-bust pattern was clear in measures of household debt around the crisis period, with mortgage debt rising far above its historical trend and then contracting sharply (see figure 5). After the contraction, household debt has grown only moderately. The net increase in mortgage debt has been among borrowers with higher credit scores. While heavily indebted households always suffer in a downturn, all of this suggests that household debt would not present a systemic stability threat if the economy sours. Nonfinancial business borrowing presents a subtler story. With corporate debt, the United States has not faced a massive credit boom like that experienced with residential mortgages before the recent crisis. Instead, after controlling for its trend, business borrowing relative to GDP has risen during expansions, no doubt reflecting business optimism, and then fallen when the cycle turned, as some of that optimism proved unfounded (see figure 6). By this measure, the ratio of corporate debt to GDP is about where one might expect after nearly a decade of economic expansion: it is well above its trend, but not yet at the peaks hit in the late 1980s or late 1990s. Further, the upward trend in recent years appears broadly consistent with the growth in business assets relative to GDP. There are reasons for concern, however. Information on individual firms reveals that, over the past year, firms with high leverage and interest burdens have been increasing their debt loads the most (see figure 7). In addition, other measures of underwriting quality have deteriorated, and leverage multiples have moved up. Some of these highly leveraged borrowers would surely face distress if the economy turned down, leading investors to take higher-than-expected losses--developments that could exacerbate the downturn. The question for financial stability is whether elevated business bankruptcies and outsized losses would risk undermining the ability of the financial system to perform its critical functions on behalf of households and businesses. For now, my view is that such losses are unlikely to pose a threat to the safety and soundness of the institutions at the core of the system and, instead, are likely to fall on investors in vehicles like collateralized loan obligations with stable funding that present little threat of damaging fire sales. Of course, we will continue to monitor developments in this sector carefully. The fourth and final vulnerability arises when asset values rise far above conventional, historically observed valuation benchmarks--a phenomenon popularly referred to as a "bubble." The contentious term "bubble" does not appear in our work, however. 
Instead, our focus is on the extent to which an asset's price is high or low relative to conventional benchmarks based on expected payoffs and current economic conditions. Historically, when asset prices soar far above standard benchmarks, sharp declines follow with some regularity, and those declines may bring economic misery reaching far beyond investors directly involved in the speculative boom. We therefore pay close attention when valuations get to the extreme ends of what we have seen in history. Looking across the landscape of major asset classes, we see some classes for which valuations seem high relative to history. For example, even after standard adjustments for economic conditions, valuations on riskier forms of corporate debt and commercial properties are in the upper ends of their post-crisis distributions, although they are short of the levels they hit in the pre-crisis credit boom. We see no major asset class, however, where valuations appear far in excess of standard benchmarks as some did, for example, in the late 1990s dot-com boom or the pre-crisis credit boom. The asset class that gets the most attention day-to-day is, of course, the stock market. Today, equity market prices are broadly consistent with historical benchmarks such as forward price-to-earnings ratios (see figure 8). It is important to distinguish between market volatility and events that threaten financial stability. Large, sustained declines in equity prices can put downward pressure on spending and confidence. From the financial stability perspective, however, today we do not see dangerous excesses in the stock market. I mentioned the distinction between vulnerabilities and shocks, or triggers. In addition to monitoring vulnerabilities under our four-part framework, we also consult a broad range of contacts regarding sources of risk that might trigger distress at any given time. For example, discussions with contacts currently point to risks emanating from the normalization of monetary policy in the United States and elsewhere, the unsettled state of trade negotiations, Brexit negotiations, budget discussions between Italy and the European Union, and cyber-related disruptions. Having identified possible triggers, we can assess how a particular trigger is likely to interact with known vulnerabilities. A good current example is that of Brexit. U.S. banks and broker-dealers participate in some of the markets most likely to be affected by Brexit. The Fed and other regulators have been working with U.S. financial institutions that have operations in the European Union or the United Kingdom to prepare for the full range of possible outcomes to the negotiations. In addition, the scenarios used in the stress tests routinely feature severe global contractions and show that U.S. banks have the capital to weather even highly disruptive events. I have reviewed a few of the key facts that inform our thinking about financial stability, and you will find a great deal more detail in our new report. You will also find that the report does not come to a bottom line conclusion. As I noted earlier, we have limited experience with this monitoring, and there is no widely accepted basis for reaching a bottom line. Thus, the purpose of the report is to provide a common platform and set of readings from which policymakers and other interested parties can form their own views. Individual policymakers will sometimes differ in their assessments and on the relative weight they put on particular vulnerabilities. 
My own assessment is that, while risks are above normal in some areas and below normal in others, overall financial stability vulnerabilities are at a moderate level. In my view, the most important feature of the stability landscape is the strength of the financial system. The risks of destabilizing runs are far lower than in the past. The institutions at the heart of the financial system are more resilient. The stress tests routinely feature extremely severe downturns in business credit, and the largest banks have the capital and liquidity to continue to function under such circumstances. Because this core resilience is so important, we are committed to preserving and strengthening the key improvements since the crisis, particularly those in capital, liquidity, stress testing, and resolution. I'd like to conclude by putting financial stability and our two new reports in a longer-term context. To paraphrase a famous line, "eternal vigilance is the price of financial stability." We will publish these reports regularly as part of our vigilance. Over time, some may be tempted to dismiss the reports entirely or to overdramatize any concerns they raise. Instead, these reports should be viewed as you might view the results of a regular health checkup. We all hope for a report that is not very exciting. Many baby boomers like me are, however, reaching an age where a good report is, "Well, there are a number of things we should keep an eye on, but all things considered you are in good health." That is how I view the Financial Stability Report out today. We hope that this report and the Supervision and Regulation Report will be important tools, sharing Federal Reserve views and stimulating public dialogue regarding the stability of the financial system.
r181203a_FOMC
united states
2018-12-03T00:00:00
Celebrating Excellence in Community Development
brainard
0
Thank you, Chairman Powell. It is my honor to introduce Chair Yellen. When I first arrived at the Board, Chair Yellen offered me the opportunity to lead the Committee on Consumer and Community Affairs--even though I had little experience in this area. I greatly appreciate the opportunity she gave me. After visits to 16 communities across the country--both rural and urban--I can attest to the richness of our community development work. It helps us see the economy as it is experienced by Americans in their communities. It provides a valuable perspective on our monetary policy goals by putting names and faces on the aggregate unemployment statistics. It helps us spot problems in consumer credit in overstretched communities well before they show up in national statistics. And it is essential in helping our banks meet their affirmative obligations to the low- and moderate-income communities they serve and to understand what is likely to be most effective in lifting up the lives of people in challenged communities. Chair Yellen's leadership of the Federal Reserve Bank of San Francisco during some of the most challenging economic times for our country no doubt shaped her commitment to the Federal Reserve's community development work. One associate from the San Francisco Federal Reserve Bank recounts a meeting at the height of the crisis when staff suggested their exhausted boss should take a break from the round-the-clock emergency meetings. Then-President Yellen stated that she had an obligation to continue working until such time as families were no longer losing their homes, their livelihoods, and their pensions. That moment encapsulates Chair Yellen's consistent orientation to how our work touches the lives of American families. It is perhaps no accident that Chair Yellen delivered her first speech as Chair at a community reinvestment conference. In navigating the Federal Open Market Committee's objectives of price stability and full employment, Chair Yellen was attentive to low- and moderate-income communities, recognizing that Americans on the most precarious rungs of the ladder often feel the impacts of a downturn soonest and the longest. Chair Yellen brought the subject of economic disparities to the forefront of our conversations, consistently emphasizing the importance of an economy that works for everyone. As Chair, Janet Yellen continued her practice of meeting with community members, where she had in-depth conversations about disparities in employment, labor force participation, income, and wealth, recognizing that these direct interactions provide valuable insights no statistic or report can fully capture. For instance, she visited the Manufacturing Technology Center on Cuyahoga Community College's Metropolitan Campus where she participated in a roundtable discussion on the state of manufacturing workforce development in Northeast Ohio. We are committed to continuing Chair Yellen's legacy. In our service to the American people, we recognize that our monetary policy, financial stability, and supervisory activities have an important influence on the financial health of Americans in communities across the country. Promoting community development is one of the key purposes and functions of the Federal Reserve, and we have a responsibility to ensure that consumer and community perspectives inform Federal Reserve policy, research, and actions, including the perspectives of vulnerable communities. 
In his inaugural remarks as Chairman, Jay Powell underscored that it is our duty at the Federal Reserve to "approach every issue through a rigorous evaluation of the facts, theory, empirical analysis, and relevant research." That lens informs our community development responsibilities: we seek to learn from the experience of manufacturing workers, homeowners, community bankers, nonprofit executives, community organizers, and small business owners to help inform our policymaking. The Board also created the Community Advisory Council (CAC) to help provide regular insights into the conditions facing low- and moderate-income communities. Recently, CAC members provided informative public comments to the Board on the Community Reinvestment Act and current market conditions within low- and moderate-income communities. The Board's Committee on Consumer and Community Affairs, the corresponding committee on consumer and community affairs for Reserve Bank presidents, and community development staff across the System are continuing to advance this agenda. Over the past year, all 12 Reserve Banks have adopted a common community development strategic plan that seeks to advance the economic resilience and mobility of low- and moderate-income households and communities and to enhance public awareness of these issues. During Chair Yellen's 16 years as a public servant in the Federal Reserve, her words, actions, and research demonstrated a deep commitment to striving for an inclusive economy and recognizing the challenges faced by underserved communities. From her speeches to her policy deliberations, Janet made clear that her duty was to serve Main Street and all Americans. The Award for Excellence in Community Development that we are establishing tonight memorializes Janet Yellen's commitment to the people and places that make up our vibrant economy, including those who face challenges. Going forward, we will look to celebrate staff within the System like Ariel Cisneros--the first to receive this award in Janet Yellen's name--thus honoring those who innovate, trailblaze, and embody the same deep-seated commitment to low- and moderate-income communities that Janet Yellen espouses. With that, it is my great pleasure to introduce Chair Janet Yellen.
r181203b_FOMC
united states
2018-12-03T00:00:00
Celebrating Excellence in Community Development
powell
1
Thank you, Anna. It's an honor to be part of this important occasion. Tonight is an opportunity to recognize the vital contributions of the Federal Reserve's community development staff, and an opportunity to honor Chair Yellen, who did so much to advance the Fed's community development mission. Briefly, I would like to focus on the importance of promoting a strong economy that extends opportunity to all and on the role our community development staff plays to advance that goal. The Federal Reserve's mission is to promote a strong economy and sound financial system; I am glad to say we have made a great deal of progress toward those goals. Unemployment is 3.7 percent, the lowest in nearly half a century. Over 17 million jobs have been created during this expansion, with an additional 250,000 created in October. Beyond the labor market, there are other signs of economic strength. The steady decline in the unemployment rate is mirrored by the decline in financial hardship reported by respondents to the Federal Reserve's Survey of Household Economics and Decisionmaking over the past five years. Wage gains, increased household wealth, and elevated consumer confidence are supporting robust consumer spending. Since the crisis, we have also taken numerous steps to make the financial system safer and stronger, leaving it better equipped to support the financial needs of consumers and communities through good times and bad. However, the benefits of this strong economy and sound financial system have not reached all Americans. The aggregate statistics tend to mask important disparities by income, race, and geography. Moreover, the economy faces a number of longer-term challenges. While there have been gains in the pace of wage growth recently, wages for lower-income workers have grown quite slowly over the past few decades. Productivity has picked up in recent months after several years of very slow growth, but it is not clear whether this is a trend that will be sustained. An aging population is limiting growth in labor supply, which in turn limits potential growth. And a decades-long decline in economic mobility in the United States reflects the difficulty faced by lower-income Americans in moving up the economic ladder. The Fed's community development function plays a key role in helping us carry out our broad responsibilities. Information gathered by the community development staff ensures that the perspectives of individuals and communities inform the Fed's research, policy, and actions. Soliciting diverse views on issues affecting the economy and financial markets improves the quality of our research, the fairness of our policies, and the transparency of our actions. Raising awareness of emerging economic trends and risks makes regulation and supervision more responsive to evolving consumer financial services markets and technologies. The Fed's community development function also advances our Community Reinvestment Act responsibilities by analyzing and disseminating information related to local financial needs and successful approaches for attracting and deploying capital. These efforts strengthen the capacity of both financial institutions and community organizations to meet the needs of the communities they serve. 
In addition to providing us with a richer, more nuanced understanding of current economic and financial conditions, the Federal Reserve's community development staff is deeply engaged in helping lower-income and underserved communities overcome their challenges and capitalize on their assets. One thing that was apparent during the recession and the uneven recovery that followed was that people and the places they live are linked. When one struggles, both struggle. Successful community development invests in and builds up both the physical infrastructure and human capital in underserved areas. The Federal Reserve is uniquely positioned to bring together diverse stakeholders to disseminate information, exchange ideas, and identify shared interests that foster local partnerships and comprehensive solutions. This event is an opportunity to recognize your efforts to advance solutions that build resilient and more prosperous communities. I want to thank Ariel Cisneros--our distinguished inaugural recipient of the Yellen Award for Excellence in Community Development--as well as each of you for your individual and collective service to this effort. And speaking of service: It is fitting that this award for excellence in community development will today and thenceforth be given in honor of Janet Yellen. During your tenure as Chair, Janet, you elevated the importance of economic and financial inclusion and the Fed's role in community development. You reminded us that an inclusive economy is a vibrant economy. Governor Brainard, the principal oversight governor for community development, and I are honored to carry this message forward. The economy benefits when we successfully tap into the underutilized potential of more of our fellow Americans. Thank you, Janet, for everything that you accomplished as Chair and over a lifetime of public service that thankfully continues today. And congratulations, Ariel, for what you have accomplished in service to others, which has made you such a deserving recipient of this award.
r181205a_FOMC
united states
2018-12-05T00:00:00
Banks as Vital Infrastructure for Rural Communities of the West
quarles
0
It's a pleasure to be here at Stanford, an honor to be invited to speak by SIEPR and the Bill Lane Center--two institutions for which I have long had fondness and respect--and a great luxury to have the chance to talk for an hour with a group of people who share my love for a part of the world we call the West. Most of my day job enmeshes me, of necessity, in either broad systemic questions of global financial stability or the impossibly arcane minutiae of our convoluted and labyrinthine financial regulatory system. Both of those are perfectly worthy occupations but inescapably require a relentlessly global outlook. Yet my first intellectual passion as a very young man was for the history and life of a specific part of the world--the Western United States, as place and idea. Your own presence here in the audience suggests that many of you have been moved at some point in your life, as I have been, by these words: "If there is such a thing as being conditioned by climate and geography, and I think there is, it is the West that has conditioned me. It has the forms and lights and colors that I respond to in nature and in art. If there is a western speech, I speak it; if there is a western character or personality, I am some variant of it; if there is a western culture in the small-c, anthropological sense, I have not escaped it." So, that is why I say it is a pleasure and an honor but most especially a luxury for me to speak to an audience that shares my concern for and love of the West on the topic of how the preoccupations of my day job--banking and finance--affect this particular part of the world. We all know that throughout the history of the West, banking and finance have played an important role as vital infrastructure for the economy. That remains true today, although it is often overlooked in the traditional litany of issues critical to the West. A strong banking industry is necessary for households and businesses to engage in the spending, saving, and investment that constitute economic activity, and one of the purposes of financial regulation is to ensure that banks continue to be able to serve this purpose. So it is my plan today to talk about the economy of the West and link the recent and future performance of the western economy to the central and supportive role banks play as vital infrastructure in their communities. Let's start by defining our terms. What do we mean by the West? Before this audience, I approach this question with respect and a little caution, because I know that in a broad sense, it is what the Center for the American West has been addressing since its founding. That aside, it makes sense to use a definition, as the center often does, that includes parts of states in the middle of the country whose economies, heavily dependent on natural resources, ranching, and farming, more closely resemble Montana and New Mexico than they do states on the Eastern seaboard. Nevertheless, for the purposes of these remarks, I am going to use a less expansive definition comprising the 13 westernmost states. I do so partly because this is a common definition of the West, and because it is the one used by the U.S. Census Bureau in dividing the country into four regions, and this has helped me gather data for this speech more readily. The first thing to mention that distinguishes the West from the rest of the country is apparent on this map--land area.
The West is large and would appear even larger, of course, if it weren't necessary to shrink the vastness of Alaska to fit on one page. Even using my more modest definition, the West still represents half of the land area of the United States. This is important in this context because of one theme I will be exploring today, which is the economy of the rural West and the banking services available there. As this map should make plain, when we talk about rural America, to a significant extent we are talking about the rural West. Another way to put it is that while the West is half of the United States geographically, it is only a quarter of its population. Eight of the 13 states in our definition of the West are among the dozen in the nation with the lowest population density. Even after subtracting the large share of unpopulated government-owned land in the West, the overwhelming majority of the West is rural. Yet it is also true that, as most of you know, the West is the most urban region of the country. Five of the 15 largest metro areas in the United States are located in the West, and 90 percent of the entire region's population lives in cities, compared to 81 percent of the country as a whole. Nevada and Utah are among the least densely populated states, but 90 percent or more of their residents live in cities. Like the rest of the country, the West is urbanizing. I will return to the differences between urban and rural areas, but first, let's look at the economy of the West. The West accounts for 25 percent of U.S. gross domestic product, almost exactly the share of the U.S. population in the West. This isn't entirely a coincidence, because population relates to labor supply, which is one determinant of economic output. I mention this because it is also true that the change in population over time has a bearing on the economic health of communities and regions, and is part of the story for economic growth in the western United States. In recent years, that story has been a very positive one. The U.S. economy is strong, and the western economy is especially strong. While we don't have state or regional numbers for this year, from 2015 through 2017, GDP, after adjusting for inflation, grew two to nearly three times as fast in the West as in the rest of the United States. Since the Great Recession ended in 2010, real GDP has grown twice as fast in the West as in the Northeast, about 50 percent faster than in the Midwest, and a third faster than in the South. But there is another distinction to the western economy, one that many in this audience know well, that brings these high-flying results down to earth. While the good times are typically very good in the West, the bad times are usually pretty bad, and that was certainly true during the Great Recession. From 2008 through 2010, the western economy contracted four times as much as the average for all of the United States. The same is true for per capita personal income, which contracted more sharply in the West than elsewhere, as job losses in the western region during the recession were worse than elsewhere. Taking a longer perspective, however, even with the booms and the busts that have characterized the West for upwards of 200 years, the West is still the fastest growing region of the United States. Since 2000, real GDP has grown twice as fast in the West as in the Northeast and Midwest, and about 30 percent faster than in the South. Since 1988, the numbers are roughly the same.
Per capita personal income grew faster in the West during good years, outweighing the sharper losses during the recession, and since 2000 it has grown about 20 percent faster in the West than in the rest of the country. This picture of the West leading the nation in economic growth is also reflected in population, a key factor in that growth. From 2010 to 2017, the population of the West increased 7.4 percent, compared to 5.3 percent for the nation. One might guess that this strong growth is primarily in cities, and in fact reflects the drain of population from smaller towns that is reported to be happening all over the United States. But in fact, population gains in the West are quite well balanced between cities and towns, and smaller towns are doing better in the West than elsewhere in holding and even increasing their populations. In the cities of the West, those communities of 50,000 or more, the population grew 7.8 percent from 2010 to 2017, eclipsed by the 10 percent growth in southern cities but well above the 2 percent to 3 percent increase in the cities of the Northeast and Midwest. At the same time, the West led the other regions of the United States in growth for towns and cities from 10,000 to 50,000 residents. Now let's look at small towns, those with 5,000 or fewer residents. It is this cohort of towns and their people that to my mind is really the essence of rural America. There is a widespread impression that small towns all over America are shrinking and that population is shifting to larger towns and cities, and that is the story we can see in much of the regional data. From 2010 to 2017, the average small town shrank by 2 percent in the Northeast, by 1.4 percent in the Midwest, and grew only 1.3 percent in the South, while larger towns and cities grew substantially faster there and elsewhere. What about the West, where urbanization is happening more quickly than anywhere else? In fact, the average western small town grew 7.8 percent in those seven years, which is roughly the same healthy pace of growth registered in western towns and cities from the smallest to the largest. With a nod to Mark Twain, the message here is that reports of the death of small towns, at least in the West, have been greatly exaggerated. I don't mean to minimize the challenges small towns face, because they are considerable, or suggest that western small towns are immune to them, because they certainly are not. But the more positive message I come away with here is that a rising tide of prosperity in the West seems to be lifting communities large and small, and that the many instances of small towns in decline that we all hear about are counterbalanced by other small towns growing healthily. Another lesson is that population growth and the attendant economic growth in the West are not zero sum, and that small towns can still grow healthily while the metropolises of the West continue to attract people, partly from outside the region. I will return to the issue of how the economies of the urban and rural West differ, but the backdrop is that population data are promising for both.
Banking in the West
Now that I have outlined how the economy of the West relates to that of the rest of the United States, let me do the same for banking in the West. There is a lot to this topic, but my focus is banking as it is experienced at the retail level by households and small- and medium-sized businesses.
Again, I use the Census Bureau's list of 13 states to define the West, and for a simpler comparison, and in honor of Frank Church, I will refer to the other 37 states as "eastern" even though they cover several regions. Some of you are old enough to remember Frank Church, the U.S. Senator from Idaho--a Stanford grad and hero of mine when I was a teenager. When he ran for president in 1976 and had some success in the primaries, he was challenged by the press that said his victories were limited to the West, and he replied "What do you mean I can't win an eastern state? I won Nebraska!" One dimension to how households and most businesses experience banking is the number of banks competing for their services, and one difference between the West and the East is that the West has fewer banks, even if we count them on a per capita basis. In addition, banks in the West tend to be larger. One reason for the disparity in the average number and size of banks is that historically, a larger share of western states allowed statewide branching or had relatively limited branching restrictions, leading to the development of fewer and larger banks. In metropolitan areas of the West, the average number of banks was 23 in 2017, compared to 27 in eastern metro areas. In rural counties of the West, customers had access to an average of 4.6 banks in 2017, compared to 5.4 banks in the East. In both urban and rural areas, in both the East and the West, the average number of banks increased in the years leading up to the financial crisis but has been declining since then. While there is certainly more to say about banking services in cities, I want to focus on rural areas for the moment. Over the last 20 years, rural westerners have consistently had access to something like 20 percent fewer banks than rural easterners, which amounts to about one less bank per rural county. That may not sound like much, but that one less bank, on average, can make a big difference to the households and small businesses inhabiting those rural counties. In any community, access to credit is essential for economic growth. In any community, but especially in rural communities, small businesses are key drivers of growth. Small businesses heavily rely on banks for funding, and community banks, those with less than $10 billion in assets, account for a disproportionate share of bank lending to small businesses. But across the country, the number of community banks has fallen by half over the past 20 years, mostly due to consolidation. This fall has been slightly larger in the West than the East, but the reason I highlight this trend is that I believe it has significant consequences for rural communities in the West. The reason returns me to that map I showed earlier. The West is vast, and many rural communities in the West must deal with the challenges of isolation from larger towns and cities that few communities in the East face. Rural westerners have access to far fewer banks overall than rural easterners, but that understates the disparity when some western counties are the size of Maryland or Massachusetts and many are more than a hundred miles across, so that access to a limited number of banks can be difficult or out of reach. To provide some perspective on this challenge, let me share what westerners in one small town have to say about their access to banking services. 
This account is culled from a series of public meetings that the Federal Reserve is convening in communities all around the country, in support of our responsibilities overseeing community banks and promoting community development. Not quite a month ago, we held one such session in my home state of Utah, in the town of Green River, population 940, located on the eastern side of the state. It is just off an isolated stretch of interstate, and its economy depends on those passersby and tourism related to outdoor recreation. In 2014, Green River lost its only bank, and around that time, banks closed in several nearby communities. As a result, most residents of Green River who need banking services must drive the 52 miles to Moab, Utah. This loss has had a profound effect on households and businesses, essentially turning back the clock in Green River by decades, as far as its access to the banking system. Business owners say their work days have been curtailed by the need to drive two or three times a week to Moab to obtain change and make deposits. Businesses have become banks for many residents, agreeing to cash their checks. Because of this new service to residents, and the long distance to a real bank, retail businesses say they are carrying much larger amounts of cash, which has heightened security concerns and prompted some merchants to invest in new safes. Security is also a concern for residents who no longer have access to safety deposit boxes for valuables and important papers. Faster and more efficient electronic payments hold some promise to bridge these gaps, eventually, and the Federal Reserve is working hard on this and making significant progress. But one thing that technology cannot do is replace the knowledge and perspective of a local banker who is part of the community. Relationship-based lending that is the hallmark of community banking can stem losses during downturns, since community banks may be able to work with borrowers to avoid losses. Research has shown that small business lending at smaller banks declined less severely than at large banks during the last recession. Community banks face considerable competition. One of the reasons that community banks continue to succeed in many places is their understanding of their customers' needs and opportunities to invest in families and businesses. The loss of this relationship, for a community, means that needs will go unmet, and opportunities will be lost. In some communities, of course, this may be unavoidable, if a town has lost a major employer and has no new industry or plan to replace the jobs lost. But let me remind you of the very encouraging data I cited on the growth of small towns in the West. Overall, rural communities are growing at a healthy pace, and growing communities need banks. It is my hope that the opportunity for the future that this population growth suggests will support the local banks that communities need to thrive. A significant part of the Federal Reserve's recent regulatory focus has been aimed at streamlining regulations and reducing the regulatory burden on smaller and regional banks. The Fed, along with the other banking agencies, recently proposed a community bank leverage ratio that is designed to simplify significantly the standards banks must abide by for holding capital. As a result, qualifying community banks will only need to calculate and meet a single measure of capital adequacy, rather than multiple measures. 
We have also proposed a reduction in the burden of reporting requirements for community banks, and we expect soon to propose an exemption from the Volcker rule for community banks. For a subset of the smallest banks--those with less than $5 billion in total assets--we have also lengthened the amount of time between supervisory examinations and expanded eligibility of small bank holding companies that qualify for an exemption to the Federal Reserve's capital rules, a policy that was designed to promote local ownership of small banks and to help maintain banks in rural areas. Most recently, the Federal Reserve spearheaded a proposal to tailor regulation that applies to firms that are larger than community banks but whose potential failure generally does not pose risks to the financial system. These are firms with between $100 billion and $250 billion in total assets. Those changes are designed to reduce unnecessary regulatory burdens on specific institutions without any loss of resiliency for the financial system. By these steps, I believe the Federal Reserve is helping community banks remain competitive and play the central role they have long played as vital infrastructure for rural communities. The future of community banks in these communities is one reason that I'm optimistic about the economic prospects of small towns in the West, which continue to grow even as the region and the nation overall become more urban. That's a good thing, because small towns, and especially their values, have long helped define the West, and I hope they will continue to do so.
r181206a_FOMC
united states
2018-12-06T00:00:00
Welcoming Remarks
powell
1
Thank you for the kind introduction, and thank you to the Housing Assistance Council (HAC) for inviting me to be part of this discussion of rural housing. I understand that you will shortly be presenting awards to people who are working at the local and national levels and in both the public and private sectors and whose efforts have improved housing conditions for the rural poor. All of you who work in these roles are doing your country a great service by helping to advance economic opportunity in our communities. I want to thank the award recipients for the difference you make in the lives of people in rural communities. I am happy to report that our economy is currently performing very well overall, with strong job creation and gradually rising wages. The unemployment rate is 3.7 percent, the lowest since 1969. A strong job market has encouraged more people to participate in the labor market, another positive development. In fact, by many national- level measures, our labor market is very strong. As those at this conference are acutely aware, however, aggregate statistics can mask important variations between different demographic and income groups, as well as significant regional differences. For example, unemployment rates in some persistently poor rural counties remain much higher than the national figures. The annual average unemployment rate in 2017 exceeded 10 percent in 27 persistently poor rural counties, and the rate was 20 percent or more in 2 of those counties. Recent Fed research found that, since 2007, labor force participation rates for those in their prime working years in rural areas have increasingly lagged rates in urban areas. Labor force participation has been particularly low for those with only a high school diploma or less. Research has also found that business formation and employment growth during the recovery have been concentrated in large urban areas. Data and research findings like these remind us that, despite positive trends in national data, the benefits of the ongoing economic expansion are still not reaching some communities. Through the Fed's 12 Reserve Banks and their branches, we are able to get a clearer picture of conditions in individual communities across the nation. Each of the Reserve Banks has an active, well-staffed community development function--one of the great benefits of the Federal Reserve's structure. We get important and timely information on the state of local economic and financial conditions, including those affecting low- and moderate-income, as well as other underserved, communities. Our community development staff provide us with a more nuanced understanding of current economic and financial conditions. They also help people in low-income and underserved communities overcome the challenges they face. We support numerous initiatives in rural communities across the country, several of which I would like to highlight. The first initiative is the Federal Reserve Bank of St. Louis's longtime commitment to the Mississippi and Arkansas Delta region. This region has struggled for decades with persistent intergenerational poverty and a lack of resources and capacity to address this challenge. Most recently, in 2016 the St. Louis Fed launched the Delta Communities Initiative, and since then it has held 26 regional forums with more than 500 participants. These forums help to build awareness of promising tools and strategies for community and economic development. 
A recent survey of participants suggests that these forums are making a real difference. A second example is the Federal Reserve Bank of Dallas's commitment to advancing digital inclusion in low-income rural communities. Despite significant effort and advances made by public, private, and nonprofit organizations in recent years, nearly 30 percent of rural households continue to lack access to broadband internet service. Also, fewer than half of households earning less than $20,000 per year have such service. In an increasingly digital economy, lack of access to high-speed internet and the knowledge of how to make the best use of it limits the ability of families and entire communities to reach their full potential. To address this problem, the Dallas Fed has explored the role of the Community Reinvestment Act (CRA) in addressing the digital divide, the potential for technology to bridge the urban-rural divide in access to health care, and the critical importance of preparing workers for the digital economy. Third and last, community development staff at the Federal Reserve Board and several Reserve Banks have conducted research to better understand housing affordability challenges affecting rural communities nationwide, recognizing the importance of sufficient affordable housing to a community's economic vitality. Researchers found that, as in many urban areas nationwide, a large portion of renters in rural communities struggle to afford their rent--a finding that will not come as a surprise in this room. Fortunately, our staff also highlighted promising policy and practice solutions that have been implemented in some communities to try to address these challenges, including the establishment of dedicated funding for affordable housing and the elimination of exclusionary land use and zoning policies. While this type of research and community engagement work is a central component of our efforts to support rural areas, we also know that communities need resources and dedicated local partners to help implement many of the strategies involved. CRA has been an important tool for strengthening local community and economic development infrastructure since it was enacted in 1977. We also recognize that significant changes in the financial services industry since then have hindered the law's effectiveness, especially in rural communities, and that an update of the implementing regulations is appropriate. As my colleague Governor Brainard has noted, one of the principles guiding our CRA modernization work is that any redesign of CRA regulations should continue to encourage banks to seek opportunities in underserved areas, including rural communities. I understand that the HAC has done considerable research and stakeholder engagement to shed light on the barriers to the effective use of CRA in rural communities; all of that work will benefit our reform efforts. The Fed's ongoing series of roundtable discussions in communities across the country will also allow us to hear suggestions for improving CRA from local stakeholders, including many people from rural communities. These perspectives will inform our deliberations on this critical regulation, and we will make a summary of our discussions available to the public. In closing, while the economy is strong overall, we recognize that some communities have yet to feel the full benefits of the ongoing expansion. 
We are conducting research, collaborating with communities, and assessing financial regulations so that our nation's current prosperity will benefit small towns and cities alike. The work being done by tonight's award recipients and by each of you in this room is critical to making progress toward that goal. Thank you for being our partners in this work, and enjoy the conference.
r181207a_FOMC
united states
2018-12-07T00:00:00
Assessing Financial Stability over the Cycle
brainard
0
Financial stability is integral to achieving the Federal Reserve's objectives of full employment and price stability. We need only look back a decade to see the dramatic damage from financial vulnerabilities that increase unchecked: Millions of Americans lost their livelihoods and their homes, business losses and failures rose, and the government had to provide extraordinary support to the system. Since then, financial sector resilience has strengthened, and household balance sheets have been repaired over the course of a lengthy recovery. Today employment is strong, inflation is around target, and incomes are growing. If we learned anything from this experience, it is that we must be especially vigilant to safeguard the resilience of our financial system in good times when vulnerabilities may be building. That is why the Federal Reserve actively monitors the potential vulnerabilities to the financial system. Last week, for the first time, we released our assessment in the Financial Stability Report. Today I will offer a brief summary of the outlook, highlight areas where I see financial imbalances building, and touch on the implications for policy. Domestic economic momentum has been strong, as evidenced by the labor market. With the November data, monthly payroll gains have averaged 170,000 over the past 3 months, well above the pace necessary to absorb new entrants into the labor force. The share of the prime-age population (people ages 25 to 54) that is working is closing in on its pre-crisis level. By most measures, wages have accelerated over the past year and are now growing around 3 percent, the highest level since the crisis. These are welcome developments. While the most recent reading on core personal consumption expenditures, or PCE, inflation ticked down, indicators of underlying trend inflation remain encouraging overall, providing little signal of an outbreak of inflation to the upside, on the one hand, and reassurance that underlying trend inflation may be close to our target of 2 percent, on the other. The economy has grown 3 percent over the past year, and there are good reasons to expect growth to remain solid next year, supported by the strong underlying momentum in domestic demand. Consumer spending looks to be robust going into the fourth quarter, and ongoing gains in income and employment provide positive fundamentals. In addition, business investment should be solid, even with recent declines in oil prices. Sizable fiscal stimulus has provided an important boost to demand this year and will likely contribute somewhat further next year, given the usual lags in outlays and in the effects of tax cuts on business and household spending. The most likely path for the economy is positive, although some tailwinds that have provided a boost are fading, and we may face some crosscurrents. The global growth that provided a strong tailwind going into this year has moderated. The earlier strong growth in Europe and Japan appears to be softening toward trend. China is shifting to an accommodative policy stance to contend with a challenging trade environment and lagged effects from its earlier tightening. Here at home, the impetus to growth from fiscal policy is likely to fade going into 2020. And after being exceptionally accommodative, financial conditions have tightened in recent months. Financial conditions are still supportive of growth by many measures (figure 2), but less so than last year. There are risks on both sides of the economy's likely path.
In Europe, there are risks associated with deliberations over Italy's fiscal and debt trajectory and the United Kingdom's deliberations on the Brexit deal. Here at home, we hear from businesses that the uncertainty associated with trade policy and the implications for supply chains may weigh on business capital spending. Although it is reasonable to expect fiscal spending to be extended around current levels in real terms after the Bipartisan Budget Act expires, we cannot rule out that fiscal policy could become a headwind in 2020. The risks are two-sided. Business contacts report difficulties finding qualified workers and increased costs associated with inputs, tariffs, and transportation, along with somewhat greater ability to pass through those increases to consumer prices. Despite this, however, inflation remains muted overall. At 3.7 percent, the unemployment rate is at its lowest level in 49 years, and payrolls have been growing well above the pace that is consistent with labor market stabilization. Historically, the few periods when resource utilization has been similarly tight have seen elevated risks of either accelerating inflation or financial imbalances. Our goal now is to sustain the expansion by maintaining the economy around full employment and inflation around target. The gradual path of increases in the federal funds rate has served us well by giving us time to assess the effects of policy as we have proceeded. That approach remains appropriate in the near term, although the policy path increasingly will depend on how the outlook evolves. The last several times resource utilization approached levels similar to today, signs of overheating showed up in financial-sector imbalances rather than in accelerating inflation. In contrast to the past, the Federal Reserve now has a systematic forward-looking approach to identifying increases in financial vulnerabilities. This monitoring informs our policy discussions. Last week, the Board released its first Financial Stability Report to help inform the public and promote transparency and accountability as we carry out our financial stability responsibilities. While there has been substantial progress on reducing household debt burdens and increasing the resilience of the banking system, the Federal Reserve's assessment suggests that financial vulnerabilities associated with corporate debt are building against a backdrop of elevated risk appetite. Let me briefly review these developments in turn. In contrast to the years preceding the crisis, when household borrowing was growing at a pace far above that of gross domestic product (GDP), it has since come down and is now growing more slowly than the economy overall (figure 3). Moreover, while much of the increase before the crisis reflected borrowing that proved unsustainable, more recent borrowing has been concentrated among households with strong credit profiles. The regulated financial sector is also more resilient, owing to far-reaching reforms as well as favorable conditions. Large banks have increased both the size and quality of their capital buffers: The ratio of common equity to risk-weighted assets at large banks has increased by half relative to the pre-crisis average. It is now close to levels seen at smaller banks, although the risk-weighted capital ratio at large banks has moved down somewhat over the past year (figure 4).
In addition, insurers appear generally well capitalized; broker-dealers, including those not affiliated with large bank holding companies, have reduced their leverage; and the outstanding values of funding vehicles that embed significant leverage, such as certain securitized products, are much lower. In contrast, there has been some evidence of rising use of leverage by hedge funds over the past year and a half. Financial reform has reduced funding risks associated with banks and money market funds. Large banks subject to liquidity regulation rely less on unstable short-term wholesale funding and have thicker liquidity buffers. As a result of money market reforms, investors have migrated toward government-only funds, which pose low run risk, and away from prime institutional funds, which proved highly susceptible to runs during the crisis and required extraordinary government support (figure 5). In contrast, we are seeing elevated vulnerabilities in the nonfinancial business sector. Business borrowing has risen more rapidly than GDP for much of the current expansion and now sits near its historical peak (figure 6). The run-up in corporate debt has brought the ratio of debt to assets close to its highest level in two decades on an overall basis, and this is also true for speculative-grade and unrated firms (figure 7). And whereas previously, mostly high-earning firms with relatively low leverage were taking on additional debt, analysis of detailed balance sheet information indicates that, over the past year, firms with high leverage, high interest expense ratios, and low earnings and cash holdings have been increasing their debt loads the most. Historically, high leverage has been linked to elevated financial distress and retrenchment by businesses in economic downturns. Regarding corporate bonds outstanding, recent years have witnessed little change in the relative shares of investment-grade bonds and high-yield bonds. Credit quality has deteriorated within the investment-grade segment, where the share of bonds rated at the lowest investment-grade level has reached near-record levels. As of mid-2018, around 35 percent of corporate bonds outstanding were at the lowest end of the investment-grade segment, which amounts to about $2-1/4 trillion. In comparison, the share of high-yield bonds outstanding that are rated "deep junk" has stayed flat at about one-third from 2015 to 2018, well below the financial crisis peak of 45 percent. In an economic downturn, widespread downgrades of these low-rated investment- grade bonds to speculative-grade ratings could induce some investors to sell them rapidly--for instance, because lower-rated bonds have higher regulatory capital requirements or because bond funds have limits on the share of non-investment-grade bonds they hold. This concern may be higher now than in the past, since total assets under management in bond mutual funds have more than doubled in the past decade to about $2.3 trillion this year. These funds now hold about one-tenth of the corporate bond market, and the redemption behavior of investors in these funds during a market correction is unclear. Bond sales could lead to large changes in bond prices and overall financial conditions if technological, market, or regulatory factors contribute to strains on market liquidity--a possibility that has been relatively untested over the course of the expansion. 
Further down the credit quality ladder, there has been sizable growth in leveraged lending, accompanied by a notable deterioration in underwriting standards. Net issuance of debt to risky borrowers, which had stopped growing in late 2016, rebounded over the past year. Leveraged loans outstanding rose about 12 percent over the past 12 months and now stand around $1 trillion overall. While leveraged loans have traditionally had important investor protections, loan covenants for new leveraged loans have weakened dramatically. Covenant-lite, or cov-lite, transactions now represent roughly 80 percent of the entire leveraged lending market--up from less than 30 percent a decade ago, when they were associated primarily with stronger borrowers. Deals increasingly involve features that increase opacity and risk, such as less subordinated debt; "EBITDA (earnings before interest, taxes, depreciation, and amortization) add-backs," which could inflate the projected capacity of the borrowers to repay their loans; and "incremental facilities," which allow additional borrowing that is of equal seniority with the existing bank loan. The share of newly issued large loans to corporations with high leverage (debt to EBITDA ratios above 6) now exceeds previous peak levels observed in 2007 and 2014 (figure 8). Previously, much of this deterioration in underwriting appeared to be concentrated among nonbank lenders, but this year has witnessed a deterioration in underwriting at the largest banks. The widening adoption of practices that make risk harder to measure suggests a heightened focus on industry risk-management practices is warranted. A substantial share of the leveraged loans are packaged in collateralized loan obligations (CLOs). Gross issuance of CLOs hit $71 billion in the first half of 2018. This pace represents an increase of about one-third compared with the same period in the previous year. Many large banks are engaged in the origination of leveraged loans with an intent to distribute, often to CLOs. The originate-to-distribute model exposes banks to pipeline risk--the risk that some originated loans may be difficult to distribute if market conditions deteriorate. Although banks have improved pipeline management over the past decade, risk-management practices may have weakened somewhat recently in the face of strong investor demand. The direct exposures of the banking system, in the form of loan portfolios and warehousing exposures, can be tracked. But there are also indirect exposures, including through bank investments in CLOs on the order of $90 billion, that bear vigilance. More broadly, bank lines to the nonbank financial sector have increased. Loan funds have also become increasingly important in the leveraged loan market and are estimated to purchase about one-fifth of newly originated leveraged loans. If prices were to move sharply lower, a rush to redeem shares by open-ended mutual fund investors could lead to large sales of their relatively illiquid holdings, further exacerbating price declines and run incentives. To date, the default rate on leveraged loans has been at the low end of its historical range, and corporate credit conditions have been favorable, with low interest expenses and low expected default rates. However, if spreads rise sharply or economic conditions deteriorate significantly, we could see downgrades, refinancing challenges, rising delinquencies and defaults, and losses to investors.
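To make the leverage arithmetic behind these underwriting concerns concrete, here is a minimal illustrative sketch; the borrower, debt level, and add-back figures are hypothetical and are not drawn from the data cited above. It simply shows how crediting projected add-backs to EBITDA can move the same deal from above to below the debt-to-EBITDA threshold of 6 mentioned earlier.

```python
# Hypothetical figures, for illustration only: how "EBITDA add-backs" can
# flatter a borrower's reported leverage multiple (total debt / EBITDA).

def leverage_multiple(total_debt: float, ebitda: float) -> float:
    """Debt-to-EBITDA ratio, the leverage measure discussed above."""
    return total_debt / ebitda

total_debt = 600.0           # hypothetical borrower: $600 million of debt
ebitda_as_reported = 90.0    # trailing-twelve-month EBITDA, as reported
add_backs = 30.0             # projected synergies, one-time items, etc.

print(leverage_multiple(total_debt, ebitda_as_reported))              # about 6.7x -- above the 6x threshold
print(leverage_multiple(total_debt, ebitda_as_reported + add_backs))  # 5.0x -- appears comfortably below 6x
```

The debt load is identical in both calculations; only the projected earnings figure changes, which is one reason such features make risk harder to measure.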
It is thus particularly notable that the run-up in business debt occurred against a backdrop of generally elevated valuations (figure 10). Even with the recent volatility in equity markets and the recent widening of corporate bond spreads, a range of asset prices remain high relative to historical benchmarks. In particular, yields on high-yield corporate bonds relative to Treasury securities remain somewhat narrow on a historical basis despite recent increases. Similarly, although they have moved up in recent months, spreads on leveraged loans remain in the low end of their range since the financial crisis, which is notable given the evidence of weakening protections. Finally, capitalization rates on commercial real estate properties, which measure annual income relative to prices for recently transacted properties, have been low relative to Treasury yields. The generally high appetite for risk that we saw over the past two years makes the equity, corporate debt, and other asset markets more vulnerable to swings in market sentiment. In addition to generating losses for investors, declines in valuations could make it more challenging for firms to obtain or extend financing--especially among risky, indebted firms--which in turn could be amplified by the high levels of risky corporate debt. The assessment of financial vulnerabilities that I have outlined naturally raises the following question for policymakers: What is the appropriate risk tolerance? It goes without saying that we must take all appropriate steps to prevent a repeat of the Great Financial Crisis, which caused the greatest contraction in global economic activity since the Great Depression. But it is also worth remembering that financial imbalances played a key role in each of the past three U.S. downturns--the risky investments and maturity mismatches associated with the savings and loan crisis and junk bond collapse; the tech boom and bust; and, most dramatically, the subprime crisis. This suggests policy might seek to moderate financial vulnerabilities when they are likely to materially exacerbate an economic downturn, leading to deeper declines in output and higher levels of unemployment. Each of the past three U.S. recessions featured important financial imbalances, although they differed in important ways. Both economic theory and econometric evidence point to the risk that excesses in corporate debt markets could similarly amplify adverse shocks and contribute to job losses. The economics are straightforward. Over-indebted businesses may face payment strains when earnings fall unexpectedly, and they may respond by pulling back on employment and investment. The slowdown in activity lowers investor demand for risky assets, thereby raising spreads and depressing valuations. In turn, business losses accumulate, and delinquencies and defaults rise, reducing the willingness or the ability of banks to lend. This dynamic feeds on itself, potentially amplifying moderate adverse shocks into more serious financial strains or a recession. Given the risks to the financial system and economy from this potential adverse feedback loop, a strong case can be made that the financial system's buffers should be fortified when the economy is strong. Reinforcing capital buffers during the strong part of the cycle means that banks will have a cushion to absorb losses and remain sound during a subsequent downturn.
Thicker capital buffers help bolster the confidence of market participants when conditions deteriorate, helping prevent the downward spiral from a loss of confidence. And during the downturn, that extra buffer can be released to enable banks to continue lending and help mitigate its severity. History suggests that we should not expect the market to provide incentives for banks to build the necessary buffers when times are good; the essence of the cycle is that market sentiment becomes overconfident precisely when risk is actually highest. One of the roles for independent regulatory bodies such as the Federal Reserve is to serve as a counterweight. Moreover, as we saw in the last crisis, it is much costlier to rebuild capital in a downturn when earnings and risk appetite are low than to build buffers in an expansion when earnings are strong. At the Federal Reserve, the two important tools that can respond somewhat to rising vulnerabilities are the design of stress-test scenarios and the countercyclical capital buffer (CCyB). The annual supervisory stress test examines the resilience of large bank holding companies to a severely adverse scenario, which includes salient risks that can be adjusted over time. In recent years, the scenarios have been designed to explore severe dislocations in corporate credit markets as a salient risk. Nonetheless, the stress tests have limitations as a countercyclical tool during buoyant periods. For instance, while the severity of the stresses can be varied from year to year to address emerging risks to some degree, it is difficult to introduce entirely new scenarios each year to target specific sectoral risks without introducing excessive complexity. And while the stress tests and proposed stress capital buffer are designed to calibrate capital buffers for the riskiness of an institution's particular assets and exposures, the capital buffer does not vary systematically to counter the cyclicality that arises through elevated asset valuations and other channels. The limited ability of the stress tests to increase loss-absorbing capacity during buoyant economic times is illustrated in the results from recent years, where scenarios have involved increasingly severe recessions and strains in corporate debt markets but generally lower declines in capital ratios. More broadly, capital ratios at the largest banks have been flat and, more recently, modestly lower. In contrast, the first goal of the CCyB is to directly build resilience at large bank holding companies when there is an elevated risk of above-normal losses, which often follow periods of credit growth or rapid asset price appreciation. The second goal of the CCyB is to promote a more sustainable supply of credit over the economic cycle. The CCyB is expected to be reduced as credit growth slows in order to support credit supply at times when it might otherwise contract. As a rough rule of thumb, the criteria for implementing the CCyB described in the Board's framework of September 2016 are calibrated so that the CCyB will be above its minimum value of zero about one-third of the time--when financial vulnerabilities are assessed to be in the upper one-third of their historical distribution. There are several potential advantages to building additional resilience through the CCyB. First, countercyclical capital requirements are intended to lean against rising risks at a time when the degree of monetary tightening needed to achieve the same goal could be inconsistent with sustaining the expansion.
And countercyclical capital requirements build resilience, unlike monetary policy. Second, the banks that are subject to the CCyB could achieve a modest buffer simply by safeguarding the capital they have built up or by reducing payouts moderately. Third, the CCyB is a simple, predictable, and slow-moving tool that applies equally across all large banks. It does not single out shortfalls in particular banks or result in hard-to-predict volatility in individual banks' stressed capital requirements. Finally and critically, the additional capital implied by the CCyB across the system can be released when conditions deteriorate to ensure the ability of large banks to lend into a downturn. A number of countries have raised their CCyB setting above zero, and we can learn from their experiences, which have generally been positive. Our job now is to sustain the expansion by maintaining the economy around full employment and inflation near target. Recent history suggests that the business cycle and the financial cycle are increasingly intertwined. If history is any guide, as resource utilization continues to tighten, there is some risk that financial imbalances could grow. The U.S. financial system is much more resilient than before the crisis, owing importantly to strong financial reforms. Even so, the banking system's core capital and liquidity buffers have yet to be tested through a full cycle. At the same time, the appetite for risk among financial market participants rose notably over 2017 and much of 2018, and corporate borrowing has reached new heights amid rapid growth and deteriorating underwriting standards in riskier segments, such as leveraged lending. The mutual funds that have built up exposure to some of this risky debt have liquidity mismatches that could contribute to market dislocations in stressed conditions. This constellation of vulnerabilities could amplify adverse shocks that might materialize. At a time when cyclical pressures have been building and bank profitability has been strong, it might be prudent to ask large banking organizations to fortify their capital buffers, which could subsequently be released if conditions warrant.
r190109a_FOMC
united states
2019-01-09T00:00:00
Insurance Supervision and International Engagement
quarles
0
It needs no retelling in this audience that the insurance industry is of significant importance to the economy, both domestically and internationally. Insurers occupy a meaningful role in the financial sector, meeting the financial needs of consumers and businesses with a distinct business model that calls for appropriately tailored policies. As you know, the Federal Reserve has a role as consolidated supervisor of some insurers, a role we continually strive to fulfill in the most appropriate and tailored manner. The Federal Reserve also participates as a member of "Team USA"--together with the state insurance regulators and the Federal Insurance Office (FIO)--in the international insurance standard-setting process, including the ongoing development of an insurance capital standard (ICS) for internationally active insurance groups. In my remarks today, I will touch on these topics. In the hope of providing greater transparency, I will also provide some updates on the Board's forthcoming proposal on insurance holding company capital requirements, often referred to as the Building Block Approach (BBA). The U.S. insurance market, the largest in the world, generated over $2 trillion of direct written premium in 2017, over 10 percent of the total U.S. gross domestic product for that year and the highest direct premium volume in the years following the financial crisis. Over a quarter of that premium came from life insurers, including, of course, the institutions represented in this room. In addition to its role providing insurance products--taking on risk and enabling policyholders to plan financially--the insurance industry plays an immensely important role through its investments. In 2017, nearly 65 percent of the insurance industry's $6.5 trillion in cash and invested assets was from the life insurance industry. Over 70 percent of life insurers' invested assets were held in bonds, including corporate bonds, which fuel growth in the real economy. Historically, insurers have been relatively conservative, buy-and-hold investors. This has served both the industry and the economy well. As we all appreciate, life insurance and annuities are in many ways a spread business, with investment of premium dollars funding later payments to policyholders. In the years since the financial crisis, the insurance industry has shown resilience, with capital and surplus increasing at an average annual growth rate of 4.1 percent since 2007. With an approach to investment and financial management that focuses on the long term, the U.S. insurance industry continues to play a valued role in the financial sector. Now in its eighth year of having a supervisory role over certain insurers, the Federal Reserve continues to thoughtfully approach its mandates and authority. The Dodd-Frank Act gave the Federal Reserve regulatory responsibilities for insurance holding companies that choose to own a federally insured depository institution and those designated by the Financial Stability Oversight Council. The insurance thrift holding companies supervised by the Federal Reserve represent just under 10 percent of U.S. insurance industry assets and span a wide range of sizes, structures, and business activities.
With a core focus on ensuring the safety and soundness of the supervised insurance institutions, and protecting their subsidiary depository institutions, the Board aims to continue fulfilling its role as consolidated supervisor by tailoring its oversight to the insurance business of the supervised firms, complementing the existing work of state insurance supervisors. The Federal Reserve has aimed to develop policies that are insurance-centric and appropriate for insurance risks. For instance, the Board's advance notice of proposed rulemaking (ANPR) on insurance capital requirements set out two frameworks for capital standards that are each unlike the Board's capital rules for bank holding companies. The BBA, as set out in the ANPR, was fashioned as a framework that builds upon the regulatory capital rules of subsidiaries' functional regulators--state or foreign insurance regulators for insurance subsidiaries and federal banking regulators for insured depository institutions--to provide a consolidated capital requirement. We appreciate the comments we have received, including from the ACLI and other interested parties, and have worked hard to reflect the perspectives of commenters in developing the framework. We expect to publish a formal proposal in the not-too-distant future, and I will provide a high-level preview here today. It is worth sharing some further context for the Board's development of the BBA before previewing its design and key attributes. As outlined in the ANPR, the Board strove to develop a framework that encompassed all material risks across the entire supervised enterprise. We favored an approach that is as standardized as possible, rather than relying on internal capital models of supervised firms, in order to promote transparency and facilitate comparability across firms. We also aimed to strike an efficient balance between simplicity and risk sensitivity in order to ensure accurate reflection of risks while minimizing regulatory burden. To further limit regulatory burden, the Board envisioned a BBA that would rely upon existing U.S. accounting and capital frameworks to the greatest extent possible. In designing the BBA, there were, of course, a number of ideas for how we should construct this framework. At the outset, we decided against applying the Board's bank holding company capital rules to supervised insurance firms at the enterprise level, in light of the very different business models of insurance and banking. We considered that a capital approach akin to the European Solvency II framework would not adequately incorporate U.S. accounting frameworks that, relative to other approaches, tend to be less prone to volatility and procyclicality. Volatility in a valuation approach that is used in a capital standard can especially affect long-term contracts, with the potential for unintended consequences on the ability of insurers to provide long-term life insurance and retirement planning products. Moreover, use of an approach that entails more reliance on internal models could undermine our desire for consistent, cross-firm comparisons and can lack transparency to market participants and supervisors. Another framework that could have informed the development of the BBA is the ICS, but much of the ICS's evolution has been in the direction of a valuation method and overall framework that reflect approaches used elsewhere in the world. This may not be optimal for the United States insurance market.
Importantly, the BBA will appropriately reflect, rather than unduly penalize, long-duration liabilities in the United States, facilitating the continued availability of products in the U.S. that contribute to greater financial security for Americans. In the U.S., an aggregation-based approach like the BBA could also strike a better balance between entity-level and enterprise-wide supervision of insurance firms. Having spoken enough about what the proposed BBA is not, I should spend some time previewing what it is, and I suspect there is some interest in this room about that topic. The proposed BBA is an approach to a consolidated capital requirement that considers all material risks within the enterprise by aggregating the capital positions of companies under an insurance holding company, after making some adjustments and scaling them to a common capital regime. To streamline implementation burden while reflecting all material risks, I think it would make sense to use the NAIC's insurance capital framework as the common capital regime. As the name implies, the BBA constructs "building blocks"--or groupings of entities in the supervised firm--that are covered under the same capital regime. These building blocks are then used to calculate combined, enterprise-level capital resources and requirements. In each building block, the BBA generally applies the capital regime for that block to the subsidiaries in that block. For instance, in a life insurance building block, subsidiaries within this block would be treated in the BBA the way they would be treated under life insurance capital requirements. In a depository institution building block, subsidiaries would be subject to bank regulatory capital requirements. The financial crisis taught us that certain activities in an insurance enterprise--for instance, derivatives activity--could pose risks to the enterprise that may not always be reflected through affiliates subject to capital regimes. To address regulatory gaps and arbitrage risks, like those made manifest in the financial crisis, the BBA generally would apply bank regulatory capital requirements to nonbank/non-insurance building blocks. Once the enterprise's entities are grouped into building blocks, and capital resources and requirements are computed for each building block, the enterprise's capital position is produced by generally adding up the capital positions of each building block. In order for the BBA's aggregation to function appropriately, and to reflect the Board's supervisory objectives, the BBA needs to make certain adjustments to the building blocks. For instance, to compare similar activities across building blocks, the BBA would apply insurance capital rules consistently, without regard to permitted accounting practices granted by an individual state, thus uniformly applying statutory accounting principles as set forth by the NAIC. Measures to avoid double-counting, including double-leverage, that could arise from intercompany transactions would be built into the BBA's aggregation process. Other areas where adjustments would be made include provisions to comply with the Collins Amendment of the Dodd-Frank Act--for instance, allowing only instruments meeting the criteria under the Board's bank capital rules to qualify as capital for an insurance holding company--and technical adjustments. Aggregation requires a further step after adjustments are applied to each building block, a step that is frequently termed "scaling." 
Two building blocks under two different capital regimes cannot simply be added together if, as is frequently the case, each regime has a different scale for its ratios and thresholds. The BBA proposes to scale and equate capital positions in different regimes by analyzing historical defaults under those regimes; a stylized numerical sketch of this aggregation and scaling appears at the end of this discussion. Once the insurance holding company's aggregate capital position is calculated, the BBA would impose a minimum requirement that is calibrated to be consistent with the Board's role to ensure that the risks of the enterprise do not present undue risk to the safety and soundness of the depository institution. Our goal with the BBA is to capture all material risks of each supervised organization, leverage existing legal-entity standards, and minimize burden. In developing the BBA, we have been mindful of the role that the insurance industry plays through its buy-and-hold, long-term approach to investments. A capital standard can have incentive effects, sometimes substantial. A capital standard that uses market-based valuation can introduce volatility and procyclicality, and one that is excessively volatile or procyclical can influence a firm to veer away from a long-term perspective and concentrate instead on the short term. This can have undesirable consequences, including diminishing product availability. In contrast, a capital standard that is stable in its valuation, conservative in its design, and appropriately reflective of financial soundness can influence firms to plan for the long term, consistent with the nature of life insurance and retirement products, and similarly invest for the long term through assets like government and municipal bonds, corporate bonds, and infrastructure. Moreover, a capital standard like the BBA that largely builds on state-based insurance capital standards would tend to reinforce, rather than frustrate, the important role that insurers' investments play in our economy.
Engagement with the NAIC
In developing the BBA, we also have been mindful of the potential interaction with the valuable work of another Team USA member, the NAIC, to develop its Group Capital Calculation (GCC). We also applaud the NAIC's efforts to address life insurers' potential liquidity risks and develop a liquidity stress testing framework for large life insurers. The primary functional supervisor for insurance companies for which the Board is consolidated supervisor is a state insurance regulator. It is just good policy for authorities that supervise the same firms to coordinate efforts in order to streamline, seek harmony, and minimize inconsistencies. To that end, in August 2017, the Federal Reserve initiated contact with the NAIC and state insurance supervisors to engage in dialogue with the aim of achieving consistency, wherever possible, between the two capital frameworks under development. We have met frequently and engaged substantively with representatives of the NAIC and the states. Input from the NAIC and the states has helped identify areas of commonality while remaining respectful of the somewhat different objectives of the relevant supervisory bodies and legal environments. Some differences between the Board's BBA and the NAIC's GCC may arise. There are reasons for this. The Board's mandate includes protection of a federally insured depository institution and the attendant safety-and-soundness objectives. 
As to a firm's insurance subsidiaries, the states' focus is on policyholder protection, while the Board serves as the overall firm's consolidated safety-and-soundness supervisor. The Board's capital standard also must comport with federal law for insurance holding companies with depository institution subsidiaries, while the NAIC's GCC interfaces primarily with state laws. Moreover, because of the characteristics of the current population of insurance thrift holding companies, the BBA currently would only need to scale between two regimes: the NAIC's insurance risk-based capital regime and federal bank capital rules. By contrast, the firms to which the GCC may apply can encompass operations in a number of non-U.S. jurisdictions, calling for a much more extensive set of scaling parameters. The NAIC has announced its plan to conduct a field test of the GCC this spring. Likewise, to be transparent, gather additional input, and provide a valuable test of our approach, the Federal Reserve intends to conduct a quantitative impact study of the BBA as part of our rulemaking process. We hope supervised firms will take advantage of this opportunity to contribute valuable information and feedback on our approach. Our dialogue with the NAIC and state insurance regulators has been productive and helpful in the BBA's development. Moreover, this engagement has been helpful for Team USA's efforts in the international insurance standard-setting arena. It is our intent that the Federal Reserve's development of the BBA, together with the NAIC's development of the GCC, will assist with Team USA's advocacy of an aggregation method that can be deemed comparable to the ICS. In 2013, through its role as consolidated supervisor of certain insurers, the Federal Reserve became a member of the International Association of Insurance Supervisors (IAIS), beginning a collaboration with the other members of Team USA that continues to this day. The standards produced through the IAIS are, of course, not binding upon the United States. However, it remains in our national interest to engage in the international insurance standards-development process so that it produces standards that are appropriate for the U.S. market and consumers and for U.S. companies operating abroad. I see this philosophy as being important not only in the Federal Reserve's engagement in the IAIS, but also with the broader Financial Stability Board. Team USA's collaboration is prominently visible in our advocacy at the IAIS. In order for any form of an ICS to be implementable globally, it needs to be suitable for the U.S. insurance market. The current core proposal in the ICS would face implementation challenges in the United States. For instance, such a framework may fail to adequately account for the NAIC's Statutory Accounting Principles used in the United States, introduce excessive volatility, and involve excessive reliance on supervised firms' internal models. Among other things, this motivates our advocacy of an aggregation alternative, and the use of an alternative valuation method that derives from U.S. GAAP, in the ICS. Furthermore, we support the collection of information on an aggregation-based approach that would reside within the ICS, and actively participate, together with other jurisdictions that espouse aggregation-based approaches, in the development of such an approach for the ICS. Aggregation approaches bring advantages, including sensitivity to local products and risks. The BBA, in adding to Team USA's work on aggregation methods with a tangible example, can assist in our collective advocacy. 
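To make the aggregation and scaling mechanics described above concrete, here is a minimal, purely illustrative sketch. The entity names, dollar figures, scaling parameters, and minimum ratio below are assumptions made only for exposition; the Board's forthcoming proposal will define its own adjustments, scalars, and calibration.

```python
# Purely illustrative sketch of a building-block aggregation (not the Board's
# actual methodology). All names, figures, and parameters are hypothetical.

# Each building block reports capital under its own regime: available capital
# resources and required capital, both in billions of dollars.
building_blocks = [
    {"name": "life insurance block", "regime": "naic_rbc", "available": 12.0, "required": 4.0},
    {"name": "thrift block",         "regime": "bank",     "available": 2.0,  "required": 1.2},
    {"name": "nonbank block",        "regime": "bank",     "available": 0.6,  "required": 0.5},
]

# Hypothetical scalars that translate each regime's required capital onto a
# common (here, NAIC-style) scale; in practice such scalars might be estimated
# from historical default experience under each regime.
scalars = {"naic_rbc": 1.0, "bank": 2.5}

def aggregate(blocks, scalars, minimum_ratio=1.5):
    """Sum capital resources and scaled requirements across building blocks,
    then compare the enterprise ratio with an assumed minimum requirement."""
    total_available = sum(b["available"] for b in blocks)
    total_required = sum(scalars[b["regime"]] * b["required"] for b in blocks)
    ratio = total_available / total_required
    return ratio, ratio >= minimum_ratio

# Note: the actual framework would also net out intercompany positions (for
# example, double leverage) before summing; that adjustment is omitted here.
ratio, meets_minimum = aggregate(building_blocks, scalars)
print(f"Enterprise capital ratio: {ratio:.2f}x of scaled requirement; "
      f"meets assumed minimum: {meets_minimum}")
```

The point of the sketch is the design choice it reflects: each block is measured under the regime its existing functional regulator already applies, and only the final aggregation step requires translating requirements onto a common scale.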
In sum, we remain committed to discharging our domestic authority responsibly and in a way that recognizes the unique business of insurance companies. Additionally, we will continue to advocate for international insurance standards that promote a global level playing field and work well for the U.S. insurance market. Thank you again for the opportunity to speak with you today.
r190110a_FOMC
united states
2019-01-10T00:00:00
Monetary Policy Outlook for 2019
clarida
0
Happy New Year. I am very glad to be speaking to you here in New York, a city I lived and worked in for 30 years before joining the Federal Reserve Board in September. The new year is often a time when people make resolutions--resolutions to exercise more, learn a new skill, or spend more quality time with family and friends. My resolution, as a member of the Federal Open Market Committee (FOMC), is to work with my colleagues to implement a monetary policy that will sustain economic growth and maximum employment at levels consistent with our 2 percent inflation objective. As we begin 2019, the initial conditions for the real economy are favorable. Through the first three quarters of last year, gross domestic product growth averaged 3.2 percent. Private-sector forecasts as well as our Summary of Economic Projections indicate that, when the data for the fourth quarter are released, they will show the economy likely grew at 3 percent or perhaps a little faster in 2018 for the year as a whole. If so, economic growth in 2018 would be the fastest annual growth rate recorded in 13 years. In terms of the economic outlook, ongoing momentum heading into this year indicates that that above-trend growth is likely to continue in 2019. If the economy continues to grow in 2019 along the lines that I expect, in July the current expansion will become the longest in recorded U.S. history. The labor market remains healthy, with an unemployment rate near the lowest level recorded in 50 years and with average monthly job gains continuing to outpace the increases needed over the longer run to provide employment for new entrants to the labor force. Moreover, the declines in the unemployment rate have been widespread across racial and ethnic minority groups, though gaps for African Americans and Hispanics relative to whites remain sizable. At 3.9 percent, the overall unemployment rate is below the median of FOMC participants' estimates of its full employment level, u*, of 4.4 percent. That said, the participants' median estimate of u* has been falling for several years as strong employment gains have not triggered a worrying rise in price inflation. In a welcome development, nominal wage growth is picking up, with most measures now running around 3 percent on an annual basis. And, for the past couple of years, wage gains have been notably faster for lower-income workers. Aggregate wage gains are broadly in line with productivity growth and our 2 percent inflation objective, and they are consistent with a labor market that is operating in the vicinity of full employment. They are not, at present, a source of upward, cost-push pressure on price inflation. With regard to labor supply, we have had a pickup in labor force participation among prime-age workers (those 25 to 54 years old) that is, at least for now, boosting the supply side of the economy. The participation rate of prime-age workers has risen about 1-1/2 percentage points over the past few years. And participation in the job market may still have some further room to rise, as the prime-age participation rate is still a couple of percentage points below the levels that prevailed in the late 1990s, when the labor market was last this strong. Price stability, of course, is the other leg of our dual mandate, and PCE (personal consumption expenditures) inflation over the past 12 months has been running close to our 2 percent objective. 
That said, and notwithstanding strong economic growth and a low unemployment rate, inflation has surprised to the downside recently, and it is not yet clear that inflation has moved back to 2 percent on a sustainable basis. Because expectations of future inflation are such an important determinant of actual inflation, central banks are as much in the business of anchoring inflation expectations as they are of managing actual inflation. Longer-run inflation expectations, based on straight readings of inflation compensation from TIPS (Treasury Inflation-Protected Securities), have drifted downward, although, when adjusted for term premiums and liquidity, they remain near 2 percent. The University of Michigan Surveys of Consumers' measure of expected inflation over the next 5 to 10 years has been broadly stable but has edged down over the past few years and is now at the very lower end of the range that has prevailed historically. Inflation expectations of professional forecasters have remained stable and consistent with our 2 percent objective. At each future FOMC meeting, as I consider what, if any, adjustment to our policy stance is warranted to achieve and sustain our dual-mandate objectives, I will closely monitor the incoming data on inflation expectations as well as actual inflation, among the broad range of real and financial indicators that I consult. To me, it is important that any future policy decisions we may consider in 2019 be consistent with both pillars of our dual mandate. I will also be monitoring closely the incoming data on labor supply and productivity. Not only has aggregate demand growth been robust, but so, too, has been the growth in realized aggregate supply. Over the first three quarters of 2018, hours worked in the nonfarm business sector were up 2.0 percent (at an annual rate), and productivity was up 1.8 percent. Realized productivity growth over the past eight quarters has averaged 1.3 percent, which is up from the 0.7 percent average recorded between 2011 and 2016. Strong growth supported by supply-side gains in hours worked and productivity is not inflationary, as the experience of 2018 confirms. With labor supply and productivity growth in 2018 having surprised on the upside, some mean reversion in 2019 is not unreasonable to forecast. But right now, that is just a forecast, and if the positive developments on the supply side of the economy continue in 2019, they would need to be factored into the inflation outlook and thus the appropriate settings for monetary policy. As I have indicated previously, I believe we may have seen the bottom of the productivity slowdown, but how much of the recent welcome uptick in productivity growth can be sustained or extended is hard to judge at this point. It will depend in part on how much business investment spending adds to the stock of capital in the economy. We saw a welcome pickup in investment in the first half of last year, but growth of capital spending slowed notably in the third quarter and the manufacturing indexes from the Institute for Supply Management have softened, though other data are consistent with a rebound in business spending in the fourth quarter. If a pickup in the growth of investment spending were realized and sustained, it would be expected to contribute to future productivity growth. 
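As a purely illustrative aside, the back-of-the-envelope arithmetic behind these supply-side and wage observations can be sketched with the figures cited above. The additive approximations and the rounding are simplifying assumptions, not official estimates.

```python
# Back-of-the-envelope growth accounting using figures cited in the remarks.
# The additive approximations below are simplifications for illustration only.

hours_growth = 2.0          # nonfarm business hours, annual rate, 2018 Q1-Q3
productivity_growth = 1.8   # output per hour, annual rate, 2018 Q1-Q3

# Output growth is approximately the sum of growth in hours and productivity.
output_growth = hours_growth + productivity_growth
print(f"Implied nonfarm business output growth: roughly {output_growth:.1f} percent")

# Wage-growth consistency check: nominal wage growth near trend productivity
# growth plus the 2 percent inflation objective is not, by itself, a source
# of cost-push pressure on prices.
trend_productivity = 1.3    # average over the past eight quarters, as cited
inflation_objective = 2.0
benchmark_wage_growth = trend_productivity + inflation_objective
print(f"Benchmark nominal wage growth: roughly {benchmark_wage_growth:.1f} percent "
      "(observed measures are running around 3 percent)")
```

The two checks mirror the statements above: supply-driven output growth near 4 percent need not be inflationary, and wage growth of about 3 percent is broadly consistent with productivity growth plus the inflation objective.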
With a robust labor market and inflation running close to our 2 percent inflation objective, the Committee decided at its December meeting to raise the target range for the federal funds rate to 2-1/4 to 2-1/2 percent. That said, growth and growth prospects in other economies around the world have moderated somewhat in recent months, and overall financial conditions have tightened materially. These recent developments in the global economy and financial markets represent crosswinds to the U.S. economy. If these crosswinds are sustained, appropriate forward-looking monetary policy should respond to keep the economy as close as possible to our dual-mandate objectives of maximum employment and price stability. I will closely monitor the incoming data on these global economic and financial developments as, at each future FOMC meeting, I consider what adjustment to our monetary policy stance is warranted to achieve and sustain our dual-mandate objectives. With our December decision, the 2.5 percent upper limit of our target range for the federal funds rate is now equal to the lower end of the range of Committee participants' estimates of its longer-run equilibrium level, r*. Here, r* is defined as the level of the policy rate that, if sustained, would maintain full employment and price stability in the long run. As I have discussed in a previous speech, r* is both unobserved and time varying, so it must be inferred from macroeconomic and financial data. It is for this reason that, at this stage of the business cycle and with the economy operating close to our dual-mandate objectives, it will be especially important for our policy decisions to continue to be data dependent. We need to be data dependent in two related but distinct ways. First, we need to base our policy decisions on what trends in the data tell us about where the economy is at the time of each meeting relative to our dual-mandate objectives for unemployment and inflation. Second, we need to be data dependent by looking at a wide range of real and financial data that can provide information on where the economy is heading under appropriate policy, including the ultimate destination for u* and r*. Over the past seven years, FOMC participants have continually revised down their estimates of long-run u* and r* as the unemployment rate fell and historically low policy rates did not trigger a surge in inflation and inflation expectations above target. This process of learning about u* and r* as new data arrive continues and reinforces that we are not on a preset course. With inflation muted, I believe that the Committee can afford to be patient as we see how the data evolve in 2019 and as we assess what monetary policy stance is warranted to sustain strong growth and our dual-mandate objectives. In terms of monetary policy implementation, the FOMC is assessing how the demand for our liabilities, especially reserve balances held at the Fed by depository institutions and U.S. currency holdings, is evolving in a world in which regulation and prudence boost holdings of liquid assets by financial institutions. Ultimately, these factors, along with the choice we make with regard to our operating framework, will be the primary determinants of the size of our balance sheet, which since October 2017 has been shrinking as we allow our holdings of Treasury securities and agency mortgage-backed securities to roll off as they mature and prepay. 
There have been significant changes in financial regulation pertaining to high-quality liquid assets and liquidity coverage ratios, and the legacy of the Global Financial Crisis has likely led financial institutions to want to have higher liquidity even outside of regulation. And that means that we are going to be in a world in which financial institutions are likely either required or going to want to hold onto liquid assets, including reserves at the Fed. The FOMC is discussing the pros and cons of different longer-run approaches to monetary policy implementation and is still learning about the evolution of the demand for reserves in the banking system. As noted in the FOMC's Policy Normalization Principles and Plans, the Committee intends to, in the longer run, hold "no more securities than necessary to implement monetary policy efficiently and effectively." In that vein, as indicated in the minutes from our recent meetings, the Committee has been weighing the costs and benefits of an implementation system with abundant reserves. The current system for policy implementation with abundant reserves has, to date, served us well. We have good control of short-term money market rates in a variety of market conditions, and these rates have been effectively transmitting to broader financial conditions in the economy. However, while the assessment of our operating framework is ongoing, let me be clear that any decisions we make on the ultimate size of the balance sheet and the implementation of policy will be taken so as to be consistent with our goals of sustaining strong growth, maximum employment, and price stability. If we find that the ongoing program of balance sheet normalization or any other aspect of normalization no longer promotes the achievement of our dual-mandate goals, we will not hesitate to make changes. In November, the Federal Reserve announced that it will conduct a wide-ranging and public review in 2019 of how we go about achieving the twin goals of maximum employment and price stability assigned to us by the Congress. The review will cover the Fed's monetary policy strategy, policy tools, and communication practices and will include outreach to businesses, community groups, academics, and other interested parties. As Chairman Powell has indicated, with labor market conditions close to maximum employment and inflation near our 2 percent objective, now is a good time to take stock of how the Federal Reserve formulates, conducts, and communicates monetary policy. As part of this outreach effort, the Federal Reserve System will hold a research conference, and we will also be holding outreach and public events as we seek views from a wide range of interested parties. Beginning in the summer of 2019, the FOMC will draw on what it has learned from the conference and the System outreach events as it assesses possible ways in which the Fed's strategy, tools, and communication practices might evolve to best achieve, on a sustained basis, the twin goals of maximum employment and price stability assigned to it by the Congress. We anticipate making our findings public after the FOMC concludes this review sometime in 2020. The U.S. economy enters 2019 after a year of strong growth, with inflation near our 2 percent objective, and with the unemployment rate near 50-year lows. That said, growth and growth prospects in other economies around the world have moderated somewhat in recent months, and overall financial conditions have tightened materially. 
These recent developments in the global economy and financial markets represent crosswinds to the U.S. economy. If these crosswinds are sustained, appropriate forward-looking monetary policy should seek to offset them to keep the economy as close as possible to our dual-mandate objectives of maximum employment and price stability. As we have long said, monetary policy is not on a preset course. Going forward, we need, I believe, to be cognizant of the balance we must strike between (1) being forward looking and preemptive and (2) maximizing the odds of being right. For example, were models to predict a surge in inflation, a decision for preemptive hikes before the surge is evident in actual data would need to be balanced against the cost of the model being wrong. Speaking for myself, I believe we can afford to be patient about assessing how to adjust our policy stance to achieve and sustain our dual-mandate objectives. We begin the year as close to our assigned objectives as we have in a very long time. In these circumstances, I believe patience is a virtue and is one we can today afford. Thank you.
r190201a_FOMC
united states
2019-02-01T00:00:00
Strengthening the Community Reinvestment Act: What Are We Learning?
brainard
0
Thank you all for participating in our Research Symposium on the Community Reinvestment Act (CRA). I am happy to have an opportunity to learn from your extensive experience and expertise. At the Federal Reserve, we value the CRA as a critical tool for providing support to low- and moderate-income (LMI) families and their communities. And we are interested in strengthening the CRA as it encourages banks to help meet the credit needs of the communities they are chartered to serve. Today's research forum is one part of an extensive outreach effort we are undertaking to gather the best ideas for improving implementation of the Community Reinvestment Act. Over the past four months alone, all 12 of our Reserve Banks have hosted roundtables in locations around the country, from San Francisco to Boston, and from Rapid City to Puerto Rico. The purpose is to hear ideas on improving the CRA regulations from the bankers and community groups that have a stake in the CRA's success. In addition, we held two roundtables at the Federal Reserve Board earlier this week to gather perspectives from national organizations focused on policy topics, such as housing, small business lending, and consumer credit. We have also consulted with our advisory councils to gather their thoughts on CRA reform. We have asked our large and community bank advisory councils--the Federal Advisory Council and the Community Depository Institutions Advisory Council--about their experience with the CRA and suggestions for improvements. We have also sought community perspectives. At our most recent meeting with our Community Advisory Council, we asked for their recommendations for reform. Even though we decided not to join the Office of the Comptroller of the Currency in the publication of its August 2018 Advance Notice of Proposed Rulemaking concerning revisions to the CRA regulations, we have been reviewing the approximately 1,500 comment letters submitted by academics, banks and banking trade associations, community and consumer groups, and citizens. So what have we learned so far from the comment letters we have reviewed and the roundtables we have held? If there is one common thread, it is that support for the Community Reinvestment Act is broad and deep. Commenters across the board applauded the significant volume of CRA loans and investments that have supported LMI households and communities, as well as the benefits households and communities have realized from the CRA's focus on local retail financial services, small business lending, and community development lending, investments, and services. And they asked that the three banking agencies work together toward a joint rulemaking proposal so that CRA policies can be clearly and consistently applied across agencies. Second, there are some good ideas about how to modernize the procedures for setting the area in which the agencies assess a bank's CRA activities while retaining the core focus on place. This is not a simple challenge, and this morning's panel identified some promising solutions to the challenge of modernizing the definition of assessment areas to keep up with changes in banks' business models. I appreciated the panelists' insights on how to balance the importance of place with various business models, including to reflect the extensive use of digital channels and other changes in the banking industry. The public comments we have read so far suggest general agreement that there is a need for an update--but not a complete overhaul--of assessment areas through a balanced package of reforms. 
We have heard general support for assessment areas that reflect each bank's business model, recognizing that branch-based assessment areas work for many banks but that additional or different assessment areas may be appropriate for others. Third, we have received helpful input on tailoring CRA regulations to banks of different sizes and business models. Many of the comments we reviewed expressed support for retaining different performance tests for different types of banks, including the strategic plan option. We also heard this at the regional roundtables, where banker participants ranged from small community banks to large internet-only banks. It was clear that CRA regulations cannot be one-size-fits-all. Fourth, we have heard some good suggestions for ensuring that any modernization of assessment areas should keep in focus the goal of encouraging banks to seek out opportunities in underserved areas, including in this morning's panel on assessment areas. The concern about CRA hotspots and credit deserts was echoed in the comment letters, and several commenters offered helpful suggestions for addressing this problem going forward. And the need to create incentives for CRA capital to reach underserved communities was a theme we heard in our regional roundtables from both bankers and community groups. Fifth, we have received many suggestions about how to increase the consistency and predictability of CRA evaluations and ratings. Although we are still in the process of working through the public's comments, those we have read so far suggest general support for the view that the CRA regulations and examinations would benefit from more clarity, consistency, and predictability. Likewise, there is an openness to expanding the use of metrics that evaluate components of a bank's activity on an assessment area level, while recognizing the importance of also leveraging performance context information, including of a qualitative nature, so that bankers and examiners are able to identify and understand local community needs. The first panel this morning on metrics and evaluating performance also helped further our understanding in this area, with particular focus on the investment behaviors of CRA-motivated banks and on how we might strengthen the CRA to better evaluate a bank's performance in meeting the credit needs of its communities. Sixth, in both comment letters and roundtables, community and consumer groups emphasized the historical context of the CRA as it relates to redlining practices. To that end, they strongly supported the CRA retaining a proactive focus on reaching all underserved borrowers, including low-income communities and communities of color. The central thrust of the CRA is to encourage banks to ensure that all creditworthy borrowers have fair access to credit, and, to do so successfully, it has long been recognized that they must guard against discriminatory or unfair and deceptive lending practices. This has been an excellent convening so far, and I want to thank you for sharing your knowledge and insights with us. The Federal Reserve is a research-driven institution, and we want to be sure that we are aware of all the latest research on the effectiveness of the CRA and what the research has to say about potential regulatory improvements. Today's conversation is an opportunity not only to hear from external academic researchers, but also to have a robust conversation with practitioners about how this research might inform the Board's work. 
This afternoon, I look forward to hearing the conversation on the effectiveness of the CRA, past, present, and future. I have had opportunities to hear directly from stakeholders in a variety of settings, kicking off with a community development visit in Baltimore last April and most recently in Denver, at our first regional roundtable. The Denver roundtable was attended by state member banks and was hosted by the Federal Reserve Bank of Kansas City. I appreciated the robust conversation among knowledgeable individuals whose work touches on the CRA every day. Sitting around a table together provided an opportunity for me to hear community bankers reflect on what has worked well for their communities and what they see as challenges, and to provide thoughtful suggestions on what they think might work best going forward. It was also helpful to be exposed to some differences of views. The best approach to implementing the CRA in today's environment is a complex issue, so I value hearing a wide range of suggestions. In closing, I want to reiterate my own commitment to strengthening the CRA, which is widely shared across the Federal Reserve System. We aim to promote more CRA activity, not less. We think that simplifying and clarifying the regulations while strengthening local community engagement will help us accomplish that goal. Thank you for your help in this process.
r190206a_FOMC
united states
2019-02-06T00:00:00
Welcoming Remarks
powell
1
Thank you to all the educators who are here with us in Washington or are joining us online. I look forward to responding to your questions. But first, I have a few thoughts about the vital work you do as economics educators and its connection to what the Federal Reserve is trying to accomplish. I promise to be brief, because it is a school night. I am here today, and the Fed has organized this event, because of the importance of economics education. Some of your students may go on to become professional economists, but all of them, I hope, will apply the valuable lessons and the skills they have gained from economics in other careers and in other aspects of their lives. Studying economics can benefit students in multiple ways. The lessons of economics are valuable in a wide variety of vocations. Moreover, the knowledge gained will empower students as consumers, as managers of their own finances, and as informed citizens. Economics has been consistently useful to me over my career in law, finance, and government service. It is, of course, central to my current role as a monetary policymaker and financial regulator. In government, economic analysis is one of the principal tools we use in making policy decisions. Among other things, economics is an essential facet of the science of public policy. What policies actually work? Which ones sound good but don't work, or are actually counterproductive? Economics gives us the tools to answer those questions and help us make the best decisions on behalf of the public. Of course, economics is not only the basis for judgments and decisions made by the Fed and other government agencies. It also underpins the countless decisions by consumers, businesses, and investors that drive economic activity. The concepts you teach and apply in the classroom guide those decisions and even help explain human behavior outside of the workings of the economy. For example, to continue to grow and succeed, any business owner should understand the differences between fixed cost, variable cost, average cost, and marginal cost. Businesses and investors need to understand present or discounted value, but so should any parent or grandparent starting a college fund. Economics teaches us about the power of incentives, which are central to thinking about and understanding regulatory and tax policy. But incentives also help motivate people in a variety of other settings, such as encouraging students to do their best in school, helping reduce traffic jams, or even nudging someone to save more or to exercise regularly. Economics is a practical and powerful tool for understanding how we relate to each other. And that's why what you do for your students is so important. Like all teachers, you are helping prepare them for success in life. The knowledge you impart and the intellect and talents you help develop are tools that your students can use to achieve that success. Economics teaches analytical and critical thinking skills useful to anyone. Part of your students' success is their economic success as capable, creative, and productive members of the workforce and as consumers adept at managing their finances. Your students benefit from this education, but so does everyone else in society. We all benefit when better-educated citizens support economic policies that help our nation prosper. We all benefit from the capability, creativity, and productivity of our workforce, because nothing is more important to a healthy and growing economy. 
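As a small, purely hypothetical illustration of the present-value idea mentioned above, consider the college-fund example; the cost, horizon, and rate of return below are assumed numbers for exposition only.

```python
# Hypothetical present-value illustration of the college-fund example.
# The cost, horizon, and rate of return are assumed figures, not data.

future_cost = 100_000    # assumed cost of college 18 years from now, in dollars
years = 18
annual_return = 0.05     # assumed annual rate of return

# Present value: the amount that, invested today at the assumed return,
# would grow to the future cost.
present_value = future_cost / (1 + annual_return) ** years
print(f"Set aside today: about ${present_value:,.0f} to cover "
      f"${future_cost:,} in {years} years at {annual_return:.0%} per year")
```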
Responsible consumers skilled in managing their finances are better prepared to weather bad times, and stronger household finances overall can help sustain economic growth and mitigate a downturn. Stabilizing growth and mitigating a downturn, of course, are aspects of the Federal Reserve's mission. Monetary policy can be a powerful tool to achieve these ends, but, in truth, its powers are dwarfed by larger forces, such as the productivity of the American people and the strength of their finances. By educating students and supporting their future contributions to the economy as workers and consumers, all teachers, especially economics teachers, are furthering our goals at the Fed, so let me offer my further thanks for making our job easier. To help support your work as teachers, the Federal Reserve Board and the Reserve Banks conduct programs, organize events, and publish books to spread knowledge of economics, financial literacy, and the role of the Fed in promoting a healthy economy and financial system. You can find some of those resources at our website. Each of the Reserve Banks has community outreach and educational initiatives, and the outreach to economics teachers is coordinated by the System Economic Education Group, which has been chaired by Princeton Williams. At the Board of Governors, for some years we have operated a program called FedEd, which sends Board employees into high schools throughout the Washington, D.C., area. This outreach depends on several dozen volunteers from our staff--typically, recent college graduates--who help teach about the Fed, economics, and finance, and answer questions about work opportunities at the Board. The Federal Reserve is dedicated to promoting diversity in our ranks and in the economics profession, and FedEd and other programs across the Federal Reserve System have helped advance this goal by reaching many schools with significant numbers of minority students. Let me leave it there, and again thank you for participating in this town hall, and thank you for the valuable work you do every school day.
r190206b_FOMC
united states
2019-02-06T00:00:00
Inviting Participation: The Public's Role in Stress Testing's Next Chapter
quarles
0
Thank you, Nan, for that kind introduction, and thank you to President Mester of the Federal Reserve Bank of Cleveland for inviting me to speak this evening. I am honored to be here and to support the mission that you and the Council for Economic Education have worked so hard to advance--that every student in America gets a strong, early start on their financial education. That mission is critical in its own right, but it also reflects the deeply held value of participation--of giving young people the chance to shape not just their own futures, but also the futures of their communities and their country. Because so much of the language of finance is couched in terms of metrics and rationality, we often forget that finance is something we never do alone. It is, by definition, a collaboration, which helps us work together to achieve common goals. The Federal Reserve is no exception. Tonight, I want to briefly discuss the role that participation plays in the Federal Reserve's work and outline one effort to solicit broad participation--an upcoming conference on stress tests, intended to make those tests more open, transparent, and effective. Public institutions exist under a grant of trust from the people they serve, to pursue a specific policy goal. When the public holds an institution accountable for that grant, the institution becomes stronger. The Federal Reserve System we know today emerged through decades of legislation, public consultation, and debate--from the original Federal Reserve Act, which created the Federal Reserve System, to the Banking Act of 1935, which established the modern structure of the Federal Open Market Committee, to the Treasury-Fed Accord of 1951, which ensured the separation of monetary and fiscal policy. These changes made our economy and our country stronger, because they improved the Federal Reserve's ability to accomplish the mission Congress assigned it. Throughout this evolution, a key principle has been that accountability allows the Federal Reserve to be independent--that we are subject to challenge, to counterargument, and to the emergence of new evidence and ideas. For our work to remain legitimate, the public must be able to see, understand, and engage with our efforts; to reaffirm their support when we have earned it; and to offer informed guidance on when to change course. Accountability is only one reason the Federal Reserve relies on public outreach and participation. We also rely on participation for our effectiveness, because the best ideas in finance and economics can, and often do, come from a wide variety of sources. Agencies like the Federal Reserve are a collection of expertise--informed by experience and positioned to turn a broad range of information into policy. But we are not, and cannot be, a monopoly on insight or wisdom. The Federal Reserve recognizes these limits, and the need to invite new ideas, through a variety of initiatives. We seek out a qualified, diverse workforce, and foster an inclusive workplace. We meet frequently with a range of advisory councils, drawing on expertise in banking, modeling, and consumer and community finance. We have increased transparency around our policy process and issued new reports on financial stability and banking supervision and regulation, with new details about our work. Our staff publishes a wide range of economic and policy research and plays an active role in academic discourse. Monetary policy itself shows the value of participation and transparency. U.S. monetary policy is the sole responsibility of the Federal Reserve. 
Yet some of the most important innovations in the field have come from outside the Federal Reserve System. Since 1935, we have decided monetary policy by committee, a structure that has served us well because it is designed to capture different views of a wide and varied national economy. And over the past several decades, the FOMC has greatly increased its own transparency--from postmeeting announcements, to announcing an objective for inflation, to a published survey of economic projections, to postmeeting press conferences (which will now take place after every FOMC meeting). As many of you know, over the course of 2019, we will be reviewing our monetary policy strategy, tools, and communication practices, and we will hold a research conference on the subject with outside speakers, as well as "Fed Listens" events at a number of Reserve Banks, to hear from a broad range of constituencies. But these improvements are more than a simple matter of disclosure. They are an invitation to participate, and a way to provide the public with the means and opportunity to inform our work. This year, we are taking similar steps to improve a cornerstone of our post-crisis rules. Supervisory stress tests offer an independent and valuable lens on the health of the banking system. They offer us a forward-looking measurement of bank capital, a view of common and systemic risks across the banking sector, and a broader understanding of the health of the financial system. The results are valuable for markets, analysts, and ultimately, the participating firms. Ten years have passed since the Federal Reserve conducted its first supervisory stress tests. That initial experiment helped stabilize financial markets and shore up our banking system at a critical and uncertain time. Our challenge now is to preserve the strength of the test, while improving its efficiency, transparency, and integration into the post-crisis regulatory framework. To that end, the stress tests have not remained static. Just in the past several days, the Board acted to suspend stress tests this year for lower-risk firms--generally, those with total assets between $100 billion and $250 billion. That move follows the passage of the Economic Growth, Regulatory Relief, and Consumer Protection Act. The extended cycle provides administrative burden relief for these institutions and recognizes the different risks that they typically pose--especially compared to the largest and most complex firms, whose failure poses the greatest risk to the real economy. Even with this change, the stress tests remain a core part of our supervision of these firms. Our experience with this "interim" year will inform the move to a permanently longer testing cycle--a change that would, of course, be subject to a full notice and comment process. Improvements like these are necessary to ensure our supervisory framework evolves from its post-crisis origins to an effective steady state. The question of how best to consolidate the gains from the first 10 years of stress testing deserves the attention and effort of the country's best minds. We should welcome changes and novel ideas, even when they explore stress testing in a new and unfamiliar light. In July, as a forum for such ideas, we will host a public conference focused on the transparency and effectiveness of stress testing. The event will convene panel discussions, drawing on a mix of presenters with industry, academic, and regulatory backgrounds. It will involve written papers, which will be compiled and published to spur further research. 
We expect the insights from the conference to inform the evolution of our stress-testing framework--and we hope to continue the conversation well after the conference ends. This input is as essential to our work as any public outreach we do. Stress testing provides insight into a dynamic financial system, and our stress-testing process must be dynamic as well. More broadly, the core of the Federal Reserve's independence is a broad consensus around the value and public worth of our mission. The Federal Reserve is the steward and trustee of that mission, but the public is its owner. To serve the public, we must not just allow input, but welcome it; not just permit debate, but foster it; not just allow participation, but treat it as essential to our work. Thank you.
r190210a_FOMC
united states
2019-02-10T00:00:00
Ideas of Order: Charting a Course for the Financial Stability Board
quarles
0
Let me begin by thanking the Bank for International Settlements and Agustin Carstens for the invitation to join you today. It is an honor to deliver my inaugural speech as chair of the Financial Stability Board (FSB) here, in Hong Kong, with colleagues from around the world, representing central banks that are committed to advancing financial stability not only in their home countries but also globally. I would like to discuss with you tonight my view of how the work of the FSB must evolve, and some key principles that I think should inform that work. My predecessor, Bank of England Governor Mark Carney, guided the FSB for the last seven years along a path that was formidably challenging. The Global Financial Crisis had exposed fault lines in the financial system that had to be addressed immediately, comprehensively, and vigorously. The body of post-crisis regulation that has resulted, though it involved the energy and efforts of a kaleidoscopically varied host of standard setters, regulators, and central banks--including all the institutions in this room--was nonetheless accomplished under the aegis and at the instigation of the FSB. It was a tour de force of orchestration, and it has unquestionably made the financial system safer and more resilient. Today, however, the post-crisis reform agenda has been largely completed. Basel III is final, the largest global banks have substantially more capital and liquidity, over-the-counter derivatives markets are safer, and steps have been taken to address the risks of too-big-to-fail institutions. Through greater monitoring and policy measures, the FSB is addressing risks from non-bank financial intermediation. And there has been remarkable progress on the difficult and unsung task of establishing workable resolution regimes that are consistent with the FSB's clearly defined principles. Yes, we all have work to do to ensure full, timely, and consistent implementation of the agreed reforms, and, yes, we will do that work, but it is time for the FSB to turn more of its energy and attention to the future. Tonight I would like to outline a few core principles that should guide this pivot forward. First, engagement: to maintain the legitimacy of our work, to increase understanding of it, and to enhance its effectiveness, we must improve our outreach and transparency--including to our membership, other global authorities, the public, and key stakeholders. Second, rigor: as we devote more attention to evaluation of new and evolving risks in the financial sector, we must ensure that our assessment of vulnerabilities is based on cutting-edge thinking and a disciplined methodology. And third, analysis: regulation has evolved rapidly in the last decade, and--if we are doing our jobs right--will continue to evolve with rapid developments in the financial sector. An important part of our work must be continual, critical analysis of the effects of regulation with an eye to making useful improvements where possible. Let me begin with the principle of engagement, and let me lay the groundwork for this discussion by reviewing the way the FSB was established and its mandate, to see how we can continue to fulfill that mandate going forward. As we all know, the FSB was born out of the crucible of the 2007-09 Global Financial Crisis--a crisis that demonstrated in the starkest possible way the importance of global financial stability to the well-being of families and businesses around the world. 
In the months following the peak of the crisis, the world was struggling with financial market turmoil, and the resultant macroeconomic effects were felt by people everywhere around the world. It was clear that the response to this crisis needed to be global, and the G7 and G10, without any emerging market representation, were not the right bodies to organize a global response. As such, the Heads of State and Government of the G20 called for the Financial Stability Forum (FSF), a relatively small and unmuscular group, to expand its membership and to strengthen its institutional framework. The result was the Financial Stability Board, which was designed as a mechanism for national authorities, global standards-setting bodies, and international authorities to identify and address vulnerabilities in the global financial system and to develop stronger regulatory and supervisory policies to create a more resilient global financial system. This new group is more representative of the interconnected global economy and financial system and can more effectively mobilize to promote global financial stability than anything that existed before. Whereas the FSF included only 11 jurisdictions (all of which were advanced economies), the FSB includes 24 jurisdictions and 73 representatives; its membership includes all the members of the G20, and 10 of its member jurisdictions are emerging market economies. In fostering global financial stability, the actions of the FSB have the potential to affect the global economy and financial system in important ways. Success in promoting global financial stability should benefit everybody, through more sustainable and stronger economic growth. At the same time, financial stability policy will also affect institutions and markets beyond the FSB's membership. Recognizing the wide-reaching effects of its work, the FSB must seek input from a broad range of stakeholders, each of whom brings a different perspective to the issues under consideration. While we are directly accountable to the G20, we are, through the G20, accountable to all of the people affected by our actions. In my view, that means we must engage in genuine, substantial dialogue with all of these stakeholders, to a greater and more effective degree than we have in the past. Let me take an example that may be of particular relevance to many of you here tonight: our Regional Consultative Groups, or RCGs. These are six groups around the globe--covering the Americas, Asia, the Commonwealth of Independent States, Europe, the Middle East and North Africa, and Sub-Saharan Africa--that bring together FSB members with about 70 additional jurisdictions. These groups help the FSB get broader input into its policy development and improve our outreach efforts. Each RCG meets once or twice a year to discuss policy development, regional and global financial vulnerabilities, and other current topics of particular interest to the respective region. For instance, topics discussed at the last meeting in Asia included capital flow volatility, cyber resilience of financial institutions, and fintech. These RCGs are great in concept, but they have struggled in practice. We have already begun a study to look back at how the RCGs have operated since their creation in 2011, how they interact with the FSB, and what best practices have been learned. Even while that study is underway, I am committed to improving our mechanisms for reaching out to countries outside the FSB for genuine learning about the effects of FSB actions. 
I plan to attend a number of RCG meetings each year because of the importance I place on making sure these groups are truly useful tools, and we will improve these vital conduits of two-way information in order to improve our effectiveness and our transparency. While the RCGs may be of particular interest to some here tonight, they are just the tip of the iceberg for what we have to do to improve our engagement and outreach. Because the FSB's authority is ultimately derived from the people and from the political power given to authorities in their countries, we have a responsibility to seek input from and to provide information about our deliberations and actions to the broader public. To that end, for the first time in the FSB's history, we will shortly publish our work program to the public to provide people with a full picture of the issues we are investigating over the coming year. Finally, there are the businesses, institutions, and market participants that are directly affected by the policy recommendations of the FSB. We currently engage with those entities to gather information as we consider recommendations to the G20. For example, we have in the past conducted public consultations on FSB policy recommendations--but often on a very short timetable, which limits the ability for true exchange. We have now established an expectation that the public consultation period will be at least 60 days. The key point is that when it comes to outreach to inform our work and transparency in the communication of it, we must do more, and more eagerly. This discussion of engagement, outreach, and transparency may seem like an odd place to begin a "vision" speech, but I think it is foundational. The FSB must maintain its legitimacy in order to be effective, and to do that we have to work hard to hear from all relevant parties when deliberating. What's more, we have to do so publicly and methodically. Everyone around the world should understand that we only make recommendations once we have gathered and considered all points of view. Process is important, and good process leads to good substance. With that said, let me turn to the core piece of the FSB's mission--assessing and mitigating vulnerabilities, especially in the nonbank sector. The FSB's work has a natural flow. We strive to identify vulnerabilities in the financial system that could threaten financial stability when a shock hits. When a vulnerability has been identified, we examine possible policy responses. If a policy is recommended to the G20 and adopted, we then monitor the implementation of that recommendation globally. Finally, when enough time has passed for the enacted policies to have had measurable effects, we study those effects. For much of the post-crisis period, our focus was on developing policy recommendations to address vulnerabilities made apparent by the crisis. Now, however, with nearly all of the post-crisis reform agenda complete, the FSB needs to put more of its resources into identifying new vulnerabilities in a financial sector that continues to evolve and to studying the effects of the many reforms that have been enacted. Let me start by discussing vulnerability assessment. The reforms that have been implemented over the past decade have changed the financial system for the better. However, that does not mean we are immune from future financial crises. It means that we have boosted the financial system's resilience to some of the types of shocks and vulnerabilities that precipitated the crisis. 
To be sure, some of these measures, like higher bank capital and liquidity requirements, are effective against a wide range of shocks. However, we cannot be complacent and assume that we are safe from all shocks. As a result, the FSB has decided to undertake a review of its framework for assessing vulnerabilities to ensure that we are at the cutting edge of financial stability vulnerability assessment. This work will be undertaken by our committee charged with assessing vulnerabilities, which is chaired by the president of De Nederlandsche Bank. This should be a framework that starts from first principles and benefits from substantial dialogue with nonbanks as well as banks, regulators, and other relevant official bodies. There is a lot that we can draw on. The crisis led to an explosion of work in this area aimed at improving the ability of authorities to identify financial vulnerabilities in order to be able to take appropriate action in a timely fashion. I trust that the framework will harness the strength of the broad and diverse membership of the FSB, that it will be forward looking, and that it will be flexible enough to handle a financial system that will continue to evolve over time. This will not be easy--developments like the emergence of crypto-assets may challenge any framework--but that makes the goal of a robust framework all the more important. Finally, let me turn to the FSB's evaluation of the effects of reforms that have been implemented. In the past few years, we have reached a point where many of the reforms have been in place for long enough to have effects that we can measure and analyze. For those reforms, we need to ask several key questions. First, to what extent are those reforms having the intended effects and building a more resilient financial system? Second, have those reforms had any unintended, adverse effects that we can address? These first two questions are, perhaps, apparent. We should also ask a third question, however, that is equally important: can we achieve a strong level of financial resilience with reforms that are more efficient, simple, transparent, and tailored? If so, we owe it to everyone affected by the policies recommended by the FSB to try, because there is a strong public interest in the efficiency of the financial sector, just as there is in its safety and soundness. If reforms are unnecessarily burdensome and we can achieve strong resiliency more efficiently and simply, we should be able to boost sustainable financial and economic activity, thus benefitting everyone. Evaluating reforms in light of these questions requires rigor--in terms of the selection of evaluation topics, the choice of analytical methods, the assessment of effects, and the development of policy conclusions. This rigor is critical for the quality and credibility of the evaluation work. For this reason, the FSB developed a clear Evaluation Framework, under which the evaluations are based, as much as possible, on quantitative data to measure the costs and benefits of reforms. Equally critical to a successful review is a truly open mindset with which to evaluate the relevant data. One public measure of whether we are doing this review process well will be whether we recommend any improvements or revisions on the basis of it. In any system as complex and consequential as the body of post-crisis financial regulations, there will always be aspects--and sometimes material aspects--that can be improved on the basis of experience and analysis. 
A credible review process that is both rigorous and dispassionate will find a few. Last year, the FSB completed the first two evaluations under its framework, of the effects of reform on infrastructure finance and the clearing of derivative contracts. We are currently engaged in work examining the effects of reforms on the financing of small and medium enterprises, which are the lifeblood of many of the world's economies. We will consult publicly on the findings of this evaluation in June, ahead of the G20 Summit in Osaka. And we are in the process of launching an important study on the effects of reforms aimed at ending too big to fail. This evaluation is being led by Claudia Buch, vice president of the Deutsche Bundesbank. This is an important time for the FSB. We are nearing completion of the post-crisis reform agenda, a major accomplishment. With that comes the opportunity to turn our focus to ways in which we can improve the FSB and prepare it for the next phase of its existence. We will work diligently to enhance our transparency and to expand our efforts to reach out to as many stakeholders as possible. We will prepare for the next crisis by making sure that our framework to assess vulnerabilities to financial stability is state of the art and remains so going forward. And we will work hard to maintain the important reforms in place, ensure they are working as intended, and, where possible, improve them.
r190211a_FOMC
united states
2019-02-11T00:00:00
A Conversation on Community Banking
bowman
0
Good morning. It is a pleasure to be here today to talk about the Federal Reserve's commitment to and oversight of community banking. I appreciate the invitation to attend the American Bankers Association's Conference for Community Bankers, as community banking has been a focus of my career and plays a vital role in supporting our economy. I believe it is particularly important to bring together community bankers at conferences like this to discuss issues impacting your banks and the communities that you serve. As many of you know, before joining the Federal Reserve, I was a community banker and more recently had the privilege to serve as the Kansas State Bank Commissioner. While I'm not the first community banker to serve on the Board, I now have the honor of being the first governor to fill the role designated for someone with community banking experience on the Federal Reserve Board, a position that was created by statute in 2015. I plan to fulfill this unique responsibility by traveling widely and listening closely to community bankers, consumers, small business owners, community leaders--all of the stakeholders with an interest in this area. I will take back the knowledge I gain from these discussions and use it to improve our work. And in the process, I hope to help you better understand what the Federal Reserve is doing and what we are trying to accomplish. In doing so, I am confident our work will be more effective and efficient. As you all know well, community banks are a critical engine of the economy, and they play a key role in providing access to credit in communities of all sizes--big, small, rural, and every size in between. Community bankers not only assist in making people's dreams come true--whether the dream is starting a small business, buying a home or farm, or financing a car. They also provide critical leadership in their communities in many ways--including serving on local boards of schools and hospitals, donating to nonprofits, and volunteering in the community. For all of these reasons, and more, I firmly believe a thriving community banking sector is important to the health of the economy. Like many of you, I witnessed firsthand how community banks were significantly affected by the Global Financial Crisis, a crisis they did not cause. In my work as state bank commissioner, I learned how bank failures affected cities and towns across the country, and in my home state of Kansas I saw the profound effects a single failure can have on a community. To ensure that community banks can continue to meet the credit needs of their communities, the Federal Reserve and other banking agencies strive to achieve a fair balance between safety and soundness and reducing unnecessary regulatory burden. Given the straightforward nature of community banking, regulators have an obligation to develop and refine approaches to supervision that fit the smaller size and less-complex risk profiles of these banks. If we keep our focus on appropriately tailoring regulatory requirements for community banks so they may continue to prudently thrive, then community bankers should be able to devote more resources and time to serving their customers and communities. Ultimately, when access to credit is limited, communities suffer and so does the larger economy. 
In view of these goals and contributions, and an understanding that systemic risk is not likely to be posed by any single community bank, the Federal Reserve continues to tailor supervision and refine our approach to risk-focused examinations of community banks. We are also charged with supervising financial institutions to make sure they comply with applicable federal consumer protection laws and regulations. Here too, we apply a risk-focused approach to consumer compliance supervision, focusing most intensely on those areas involving the greatest compliance risk. Similarly, we want to ensure that rules that address the risks posed by the business models of the largest banks do not unintentionally create barriers to entry or unnecessary burden for community banks. During the remaining time with you today, I will touch on the condition of community banks and the supervision and regulation of community banks. I am also interested in hearing your perspective on the challenges and opportunities in the current community bank landscape. When I analyze how community banks are faring, I always keep in mind that the range of institutions we call community banks is remarkably diverse. As Kansas State Bank Commissioner, I oversaw banks that had four employees and less than $20 million in assets up to institutions with more than 100 employees and more than $1 billion in assets. It's the same picture nationwide. If we look more closely, though, we can see that three out of four community banks hold assets of less than $500 million. As a regulator, I am particularly interested in how our work affects institutions of this size. One good reason for this interest is the distinct contribution community banks make to economic activity. For example, while community banks account for just 17 percent of financial industry assets, they are responsible for some 53 percent of bank lending to small businesses. Further, the health of the community banking sector has improved significantly since the financial crisis. Over the past decade, the majority of community banks have maintained high common equity tier 1 capital levels--consistent with the "well-capitalized" designation under regulatory capital standards. Though there are considerable challenges to the community bank model, these banks continue to post strong earnings, which, in turn, contribute to healthy capital accounts. In fact, given the sound condition of the banking industry, there were no community bank failures in 2018. Our job now is to ensure that community banks continue to remain strong. That requires bankers and supervisors alike to stay vigilant in the management and supervision of risks facing these institutions and the community bank model. While banks are performing well and loan portfolios are growing, we want to ensure that loans are underwritten prudently. We also want bankers to actively manage concentrations of credit risk, and be mindful that strong lending activity can strain liquidity. For example, concentrations of commercial real estate are rising, and are quite high at some banks, prompting us to remind bankers of the difficulties that such concentrations presented in the past. We also continue to focus on concentrations of agricultural credit. In an ongoing effort to understand the emerging risks related to agricultural lending, the Federal Reserve meets semiannually with its Community Depository Institutions Advisory Council. In late March of this year, the Federal Reserve will be hosting this council. 
I view meetings like these as important learning opportunities where I can hear directly from leaders in the field about the challenges and opportunities in community banking. These meetings and direct lines of communication with community bankers help to inform one of the Board's primary goals of ensuring that both supervisory programs and regulations are appropriately tailored to the size, complexity, and risk of a financial institution. The Federal Reserve and other federal banking agencies have demonstrated a commitment to reducing regulatory burden, especially on community banks, while maintaining safety and soundness. In particular, the Federal Reserve has acted to implement provisions of S. 2155 that provide relief to community banks. Several proposals were issued at the end of last year, and I'd like to encourage this group to submit your views and comments on these proposals. It is important for the Federal Reserve and other regulatory agencies to receive input directly from community bankers. One of these important proposals is to implement the community bank leverage ratio. While the community bank leverage ratio proposal would increase the minimum leverage ratio for banks that opt in to the new framework, the proposal would allow qualifying community banks to opt out of the more complicated risk-based capital framework. Other proposals include raising to $400,000 the threshold for determining when an appraisal is required for a residential real estate transaction, and excluding community banks from the Volcker rule. The Federal Reserve also acted to raise the asset threshold from $1 billion to $3 billion of total consolidated assets for the Small Bank Holding Company and Savings and Loan Holding Company Policy Statement, which allows holding companies that have limited access to the capital markets to take advantage of using debt in bank acquisitions. This helps foster local ownership of small banks. The change also exempts eligible small holding companies from consolidated risk-based capital rules, a significant burden reduction. In December, the federal banking agencies issued a final rule allowing qualifying insured depository institutions with less than $3 billion in total assets to benefit from an extended 18-month on-site examination cycle. This increases the former threshold of $1 billion and provides examination burden relief to a substantial number of community banks with relatively simple risk profiles. With respect to supervision, the Federal Reserve continues to tailor and reduce burden by conducting portions of community bank examinations offsite. However, we also understand the importance of face-to-face interaction and continue to be responsive to bankers' requests for on-site examinations. The Federal Reserve has also implemented a risk-focused supervisory program designed to identify low-risk activities within state member banks and apply appropriately streamlined examination work programs to these activities and, conversely, to identify high-risk activities within state member banks for prompt supervisory attention. This enhanced tailoring of supervision minimizes regulatory burden for the many community banks that are well managed, and directs supervisory resources to higher-risk activities where they are most needed to contain the risks that can result from aggressive banking strategies. Similarly, the Federal Reserve tailors its supervision of holding companies based on the size, complexity, and risk profile of each institution. 
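To make the dollar thresholds described above concrete, here is a minimal, hypothetical sketch in Python. It is not part of the speech or any official guidance: the bank figures and function names are invented for illustration, the leverage ratio is simplified, and actual qualification for these forms of relief involves additional criteria beyond asset size.

```python
# Illustrative sketch only: a simplified check of the relief thresholds
# mentioned above. Real qualifying criteria involve additional conditions
# (risk profile, off-balance-sheet exposures, and so on).

def leverage_ratio(tier1_capital, average_total_assets):
    """Simple leverage ratio: tier 1 capital divided by average total assets."""
    return tier1_capital / average_total_assets

def summarize_relief(total_assets, tier1_capital):
    """Summarize, in simplified form, the size-based relief discussed above."""
    return {
        "leverage_ratio_pct": round(100 * leverage_ratio(tier1_capital, total_assets), 2),
        "extended_18_month_exam_cycle": total_assets < 3_000_000_000,
        "small_bhc_policy_statement": total_assets < 3_000_000_000,
    }

# A hypothetical $450 million community bank with $45 million of tier 1 capital.
print(summarize_relief(total_assets=450_000_000, tier1_capital=45_000_000))
# -> {'leverage_ratio_pct': 10.0, 'extended_18_month_exam_cycle': True,
#     'small_bhc_policy_statement': True}
```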
As I conclude, I would emphasize how crucial it is to balance effective regulation and supervision to ensure the safety and soundness of community banks while also ensuring that undue burden does not constrain the capacity of these institutions to support the communities they serve. As I previously noted, one of the most important aspects of my job, as I see it, is to have open lines of communication and feedback between regulators and community banks and bankers. As a former community banker and state regulator, I understand how clear communication can help us all do our jobs better. So, I encourage you--and everyone with a stake in this work--to share your thoughts on the impact of regulation on community banks and the communities they serve. This dialogue is especially important as we continue to work to tailor our supervision and regulation to the size and risk profile of the institutions we oversee. Although we have different responsibilities, I believe we can agree that we must keep our financial system strong, while maintaining the ability of community banks to fulfill their important role in our economy.
r190212a_FOMC
united states
2019-02-12T00:00:00
Encouraging Economic Development in High-Poverty Rural Communities
powell
1
Thank you for that kind introduction. I also want to thank Hope Enterprise Corporation and Mississippi Valley State University for hosting us on this lovely campus. It is a great pleasure to visit the Mississippi Delta. This region gave America the priceless musical heritage of the Blues. It was also the site of some of the watershed events of the civil rights movement. But the Delta, like many rural areas, has long confronted the challenge of poverty. Today I'll speak about three areas of opportunity for addressing the challenges of rural poverty--education and workforce development, entrepreneurship and small business development, as well as access to financial services--the topic of this conference. MVSU's mission, providing young people with an affordable college education, is itself a very powerful antipoverty program--for its graduates, of course, but also for other communities that are touched by the university. MVSU opened its doors in 1950 as Mississippi Vocational College. The vocation of every member of its first class of 200 students was teaching. And even as it added other programs and became a full-fledged university, MVSU has continued to train many of the teachers who enrich, inspire, and spark the imagination of children who grow up in the Delta. MVSU and other historically black colleges and universities, or HBCUs, play a crucial role in their communities. For much of the 20th century, HBCUs were the primary means for black men and women to obtain the education needed for middle-class or professional jobs. Larger percentages of HBCU students, compared with students at predominantly white institutions, come from lower-income families and are the first in their families to attend college. This university and other HBCUs support intellectual leadership, creativity, and innovation. I want to honor you for this proud history and for your continuing commitment to a better future for your students and for the region where many of your students were born and will build their lives and raise their families. Today, data at the national level show a strong economy. Unemployment is near a half-century low, and economic output is growing at a solid pace. But we know that prosperity has not been felt as much in some areas, including many rural places. The Federal Reserve can help by carrying out our monetary policy mission of supporting maximum employment and price stability. We also support strong communities by conducting research, promoting community development, and enforcing laws like the Community Reinvestment Act, which helps ensure that people have adequate access to financial services wherever they live. We not only work with communities, we are in communities, through the presence of our 12 regional Reserve Banks. Poverty remains a challenge in many rural communities. Indeed, 70 percent of the 473 "persistent poverty" counties in the United States are rural. Unemployment and mortality rates remain high in these communities. Along with lower incomes and wealth, the rate of business start-ups in these areas is lower. And their residents have less access to financial services. Many of these disparities have existed for generations, and in some places have roots in a history of discrimination. These areas also generally lack diverse industries and employment options and often have suffered from a decline in a traditional industry. In Appalachia, for instance, timber, coal mining, tobacco, and textiles have long been in decline. 
Likewise, the number of jobs in agriculture and low-skilled manufacturing, mainstays of the Delta's economy, is decreasing as a result of automation and outsourcing. High-quality education and training play a crucial role in extending opportunity in rural areas, starting with early childhood education. But in many rural communities, access to high-quality preschool education is limited. Mississippi is one of several mostly rural states where nearly half of residents lack access to good-quality childcare, which is the main source of early childhood education. Many decades of research also confirm that children who grow up in areas with better-quality K-12 schools or in classes with higher-quality teachers have better outcomes in life. Later in life, workforce training and education are most effective when they train workers in skills needed by local employers. Rural areas where traditional industries are declining and where new employers may be moving in often experience a mismatch between the skills of local workers and those demanded by the new employers. Training and retraining programs tend to be centralized and concentrated in areas with a higher population density, which puts many rural areas at a disadvantage. Education benefits both the student and the community. It is true that some young people leave their hometowns to seek greater opportunity. However, it is also true that young families, drawn back home by family and social ties, are the largest source of in-migration in many rural communities. Young families are more likely to come if they believe their children will get a good education. Returnees often bring back important skills and experiences and make meaningful contributions to the local economy. An excellent example is Tim Lampkin, an MVSU graduate. Tim grew up and graduated from high school in Clarksdale. After graduating from MVSU, he moved to Mobile for work. He wasn't gone long, though, before he felt called to return to make his home here in the Delta. Since moving back, Tim has founded several businesses, as well as a nonprofit that helps people of color start and grow their own businesses. As with Tim and his clients, entrepreneurship opportunities can motivate residents who have left rural communities to return home--or to keep them from moving away in the first place. Business ownership represents an important source of income and wealth for both owners and their employees. Recognizing this fact, successful communities find ways to help residents turn what they know and do best, including skills they may have gained from industries in decline, into profitable small businesses. One example of a community taking this approach is Clarksdale, Mississippi. About 10 years ago, civic and business leaders assessed what made Clarksdale special and how those characteristics could help the community thrive. Since Clarksdale calls itself "Home of the Blues," attracting tourists using the local Blues culture seemed promising. Over the years, Clarksdale has created new jobs by investing to improve its downtown, and has supported local businesses that cater to tourists interested in the community's musical heritage. This growth is also creating new businesses not directly related to tourism--for example, Dr. Williams opened her urgent care facility last spring, partly in response to the closure of nearby Quitman County Hospital. She worked with local organizations to develop a business plan and obtain the financing needed to open a business that is critical to the community. 
DeWitt, Arkansas, is another rural community where entrepreneurship is sparking a local revival. DeWitt is a town of about 3,000 people. Like Itta Bena and many small towns, it has seen a steady decline in population over the past few decades. The economy there is a mix of manufacturing and agriculture, and many families have farmed the land for generations, such as the family of Tami and Troy Hornbeck. The Hornbecks grew up in DeWitt, starting and eventually selling a successful business built on a new soybean varietal that grows well in the South. More recently, they have been helping lead a new industry in their community using waste vegetable oil from local restaurants and oil pressed from camelina, a crop farmers are beginning to grow locally. This cooperative effort involves a partnership between a community college, a university, the local government, farmers and restaurants, and a Community Development Financial Institution (CDFI). While still in the early stages, it has led to the creation of several new businesses and imbued the town with a new sense of what is possible. Entrepreneurs such as the Hornbecks and Dr. Williams need support to succeed, the kind of support HBCUs and other institutions of higher education have long provided. MVSU itself is supporting rural minority entrepreneurs through its participation in the HBCU Entrepreneurial Ecosystem Initiative, which teaches students how to start and grow a business. But inspiring stories like these are not as common as they could be. Entrepreneurship has the potential to play a greater role in poor rural areas, particularly in areas whose residents are predominantly black and other people of color. Recent surveys have found relatively high levels of interest in owning businesses among young people of color. Indeed, nationwide, black women represent one of the fastest-growing groups of entrepreneurs. Yet research by the Federal Reserve suggests people of color experience greater challenges in gaining access to credit to start or expand businesses, which leads me to my last topic and the focus of this forum: access to financial services. Access to safe and affordable financial services is vital, especially among families with limited wealth--whether they are looking to invest in education, start a business, or simply manage the ups and downs of life. Family income and savings represent the largest source of funding for students' education. However, more than half of students or their families also borrow to finance their studies. And racial wealth disparities make black students more likely than white students to borrow for their education. Likewise, racial wealth disparities mean access to credit is critical for would-be entrepreneurs who must borrow to supplement personal savings and support from friends and family. Banks often consider credit scores when evaluating loan requests, however, and high-poverty regions tend to have a larger percentage of people with no established credit rating or a low credit score. This combination of low wealth and low credit scores limits access to credit or causes borrowers to turn to higher-cost credit. Fortunately, CDFIs are helping potential borrowers improve their creditworthiness and providing them with safe and affordable credit. While CDFIs fill a critical gap in some communities, most consumers use other banks and credit unions. In rural areas, this often means a community bank. Industry consolidation has led to a long-term decline in the number of community banks. 
While most rural communities continue to be relatively well served, that is less often the case in communities with high poverty rates. In 2018, Federal Reserve staff members met with leaders in rural areas across the country that had recently experienced a bank branch closure. We found that small businesses, older people, and people with limited access to transportation are most affected. We also learned that the loss of the branch often meant more than the loss of access to financial services; it also meant the loss of financial advice, local civic leadership, and an institution that brought needed customers to nearby businesses. Regulation and supervision need to be carefully tailored to suit the size and business model of different types of institutions. At the Fed, we have renewed our efforts to avoid unnecessary regulatory burden on community banks, which provide essential credit in their local communities. Another means to address the issue of branch closures is the Community Reinvestment Act, or CRA, which encourages banks to help meet the credit needs of the communities they are chartered to serve. The CRA has been an important tool for strengthening local communities. The trend toward fewer branches and increased use of technology to deliver financial services presents a particular challenge to our current approach to CRA evaluations. Specifically, the current regulations use a bank's branches to define its assessment area, the area for which it is evaluated for CRA purposes. To the extent that banks serve much broader areas using online or other non-branch delivery systems, or have so many assessment areas that examiners cannot do a thorough evaluation in each, the financial needs of many rural communities may be overlooked. We believe that revisions to the CRA's implementing regulations should more effectively encourage banks to seek opportunities in underserved areas. To summarize my main points today, people in rural communities who are struggling with persistent poverty need access to high-quality education from preschool through college. They need support for their aspirations to own their own businesses. And they need access to safe and affordable credit. I will conclude where I began, by applauding both MVSU and Hope for their long-standing contribution toward bettering the lives of the people of Itta Bena and similar rural communities.
r190222b_FOMC
united states
2019-02-22T00:00:00
The Federal Reserve's Review of Its Monetary Policy Strategy, Tools, and Communication Practices
clarida
0
I am pleased to participate in this year's U.S. Monetary Policy Forum, which, since its inception, has brought together policymakers, academics, and market participants to share ideas and perspectives on U.S. monetary policy. Today I would like to discuss the broad review of the Federal Reserve's monetary policy framework that we are undertaking this year. We will examine the policy strategy, tools, and communication practices that we use to pursue our dual-mandate goals of maximum employment and price stability. In my remarks, I will describe the motivation for and scope of this review and will preview some of the events we are planning as part of it. The U.S. Monetary Policy Forum is an excellent venue for this presentation. For more than a decade, it has focused attention and timely analysis on critical issues confronting the Federal Open Market Committee (FOMC). Its programs have drawn on the latest economic research and considered a range of views. Similarly, the Federal Reserve's review of its monetary policy framework will be transparent, will be open-minded, and will seek perspectives from a broad range of interested individuals and groups, including academics, other specialists, and the public at large. Motivation for the Review. The fact that the System is conducting this review does not suggest that we are dissatisfied with the existing policy framework. Indeed, we believe our existing framework has served us well, helping us effectively achieve our statutorily assigned dual-mandate goals of maximum employment and price stability. Nonetheless, in light of the unprecedented events of the past decade, we believe it is a good time to step back and assess whether, and in what possible ways, we can refine our strategy, tools, and communication practices to achieve and maintain these goals as consistently and robustly as possible. I note that central banks in other countries have conducted periodic reviews of their monetary policy frameworks, and their experience has informed the approach we are pursuing. As Chairman Powell has indicated, with the U.S. economy operating at or close to our maximum-employment and price-stability goals, now is an especially opportune time to conduct this review. The unemployment rate is near a multidecade low, and inflation is running close to our 2 percent objective. By conducting this review, we want to ensure that we are well positioned to continue to meet our statutory goals in coming years. In addition, the Federal Reserve used new policy tools and enhanced its communication practices in response to the Global Financial Crisis and the Great Recession, and the review will evaluate these changes. Furthermore, U.S. and foreign economies have significantly evolved since the pre-crisis experience that informed much of the research that provided the foundation for our current approach. Perhaps most significantly, neutral interest rates (r*) appear to have fallen in the United States and abroad. Moreover, this global decline in r* is widely expected to persist for years. The decline in neutral policy rates likely reflects several factors, including aging populations, changes in risk-taking behavior, and a slowdown in technology growth. These factors' contributions are highly uncertain, but irrespective of their precise role, the policy implications of the decline in neutral rates are important. All else being equal, a fall in neutral rates increases the likelihood that a central bank's policy rate will reach its effective lower bound (ELB) in future economic downturns. 
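A rough back-of-the-envelope calculation helps show why a lower neutral rate raises the odds of hitting the ELB. The sketch below uses hypothetical values of the neutral real rate and a 2 percent inflation objective; the numbers are illustrative assumptions, not Federal Reserve estimates.

```python
# Illustrative arithmetic (hypothetical numbers, not Federal Reserve estimates):
# with a lower neutral real rate r*, the neutral nominal policy rate is lower,
# leaving less room to cut before hitting the effective lower bound (ELB).

def conventional_easing_room(r_star, inflation_target, elb=0.0):
    """Approximate percentage points of rate cuts available from neutral to the ELB."""
    neutral_nominal = r_star + inflation_target
    return neutral_nominal - elb

for r_star in (2.5, 1.0, 0.5):   # hypothetical neutral real rates, in percent
    room = conventional_easing_room(r_star, inflation_target=2.0)
    print(f"r* = {r_star:.1f}%  ->  about {room:.1f} pp of conventional easing room")
```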
More frequent encounters with the ELB, in turn, could make it more difficult during downturns for monetary policy to support spending and employment, and keep inflation from falling too low. Another key development in recent decades is that inflation appears less responsive to resource slack. That is, the short-run Phillips curve appears to have flattened, implying a change in the dynamic relationship between inflation and employment. A flatter Phillips curve is, in a sense, a proverbial double-edged sword. It permits the Federal Reserve to support employment more aggressively during downturns--as was the case during and after the Great Recession--because a sustained inflation breakout is less likely when the Phillips curve is flatter. However, a flatter Phillips curve also increases the cost, in terms of economic output, of reversing unwelcome increases in longer-run inflation expectations. Thus, a flatter Phillips curve makes it all the more important that longer-run inflation expectations remain anchored at levels consistent with our 2 percent inflation objective. Scope of the Review. The Congress has assigned the Federal Reserve responsibility to conduct monetary policy "so as to promote effectively the goals of maximum employment, stable prices, and moderate long-term interest rates." Our review this year will take this statutory mandate as given and will also take as given that inflation at a rate of 2 percent is most consistent over the longer run with the congressional mandate. Our existing monetary policy strategy is laid out in the Committee's Statement on Longer-Run Goals and Monetary Policy Strategy. First adopted in January 2012, the statement has been reaffirmed at the start of each subsequent year, including at the FOMC's meeting last month with unanimous support from all 17 FOMC participants. The statement indicates that the Committee seeks to mitigate deviations of inflation from 2 percent and deviations of employment from assessments of its maximum level. In doing so, the FOMC recognizes that these assessments of maximum employment are necessarily uncertain and subject to revision. According to the Federal Reserve Act, the employment objective is on an equal footing with the inflation objective. As a practical matter, our current strategy shares many elements with the policy framework known in the research literature as "flexible inflation targeting." However, the Fed's mandate is much more explicit about the role of employment than that of most flexible inflation-targeting central banks, and our statement reflects this by stating that when the two sides of the mandate are in conflict, neither one takes precedence over the other. We believe this transparency about the balanced approach the FOMC takes has served us well over the past decade when high unemployment called for extraordinary policies that entailed some risk of inflation. The review of our current framework will be wide-ranging, and we will not prejudge where it will take us, but events of the past decade highlight three broad questions. The first question is, "Can the Federal Reserve best meet its statutory objectives with its existing monetary policy strategy, or should it consider strategies that aim to reverse past misses of the inflation objective?" Under our current approach as well as that of most flexible inflation-targeting central banks around the world, the persistent shortfalls of inflation from 2 percent that many advanced economies have experienced over most of the past decade are treated as "bygones." 
This means that policy today is not adjusted to offset past inflation shortfalls with future overshoots of the inflation target (nor do persistent overshoots of inflation trigger policies that aim to undershoot the inflation target). Central banks are generally believed to have effective tools for preventing persistent inflation overshoots, but the effective lower bound on interest rates makes persistent undershoots more likely. Persistent inflation shortfalls carry the risk that longer-term inflation expectations become poorly anchored or become anchored below the stated inflation goal. In part because of that concern, some economists have advocated "makeup" strategies under which policymakers seek to undo, in part or in whole, past inflation deviations from target. Such strategies include targeting average inflation over a multiyear period and price-level targeting, in which policymakers seek to stabilize the price level around a constant growth path. These strategies could be implemented either permanently or as a temporary response to extraordinary circumstances. For example, the central bank could commit, at the time when the policy rate reaches the ELB, to maintain the policy rate at this level until inflation over the ELB period has, on average, run at the target rate. Other makeup strategies seek to reverse shortfalls in policy accommodation at the ELB by keeping the policy rate lower for longer than otherwise would be the case. In many models that incorporate the ELB, these makeup strategies lead to better average performance on both legs of the dual mandate and thereby, viewed over time, provide no conflict between the dual-mandate goals. The benefits of the makeup strategies rest heavily on households and firms believing in advance that the makeup will, in fact, be delivered when the time comes--for example, that a persistent inflation shortfall will be met by future inflation above 2 percent. As is well known from the research literature, makeup strategies, in general, are not time consistent because when the time comes to push inflation above 2 percent, conditions at that time will not warrant doing so. Because of this time inconsistency, any makeup strategy, to be successful, would have to be understood by the public to represent a credible commitment. That important real-world consideration is often neglected in the academic literature, in which central bank "commitment devices" are simply assumed to exist and be instantly credible on decree. Thus, one of the most challenging questions is whether the Fed could, in practice, attain the benefits of makeup strategies that are possible in models. The next question the review will consider is, "Are the existing monetary policy tools adequate to achieve and maintain maximum employment and price stability, or should the toolkit be expanded? And, if so, how?" The FOMC's primary means of changing the stance of monetary policy is by adjusting its target range for the federal funds rate. In the fall of 2008, the FOMC cut that target to just above zero in response to financial turmoil and deteriorating economic conditions. Because the U.S. economy required additional policy accommodation after the ELB was reached, the FOMC deployed two additional tools in the years following the crisis: balance sheet policies and forward guidance regarding the likely path of the federal funds rate. 
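A small numerical sketch may help fix ideas about how a makeup strategy differs from treating past misses as bygones. The figures below are hypothetical, and the rule is a simplified average-inflation calculation for illustration only, not an FOMC policy.

```python
# Illustrative sketch (not an FOMC rule): contrast a "bygones" inflation target
# with a simple makeup (average-inflation / price-level) strategy.
# All numbers are hypothetical.

def makeup_inflation(target, past_inflation, makeup_years):
    """Average inflation rate needed over `makeup_years` so that inflation
    averages `target` over the whole window (past misses are made up)."""
    total_years = len(past_inflation) + makeup_years
    required_total = target * total_years
    return (required_total - sum(past_inflation)) / makeup_years

past = [1.5, 1.5, 1.5]          # three years of 1.5 percent inflation at the ELB
target = 2.0

# Bygones strategy: simply aim for 2 percent from now on.
print("bygones target going forward:", target)

# Makeup strategy: overshoot so that the five-year average equals 2 percent.
print("makeup target over the next 2 years:", makeup_inflation(target, past, 2))
# -> 2.75 percent, which illustrates why the credibility of a promised future
#    overshoot is central to whether the strategy works in practice.
```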
The FOMC altered the size and composition of the Fed's balance sheet through a sequence of three large-scale securities purchase programs, via a maturity extension program, and by adjusting the reinvestment of principal payments on maturing securities. With regard to forward guidance, the FOMC initially made "calendar-based" statements, and, later on, it issued "outcome-based" guidance. Overall, the empirical evidence suggests that these added tools helped stem the crisis and support economic recovery by strengthening the labor market and lifting inflation back toward 2 percent. That said, estimates of the effects of these unconventional policies range widely. In addition to assessing the efficacy of these existing tools, we will consider additional tools to ease policy when the ELB is binding. For example, as is presently Bank of Japan policy, the FOMC could, when the ELB is binding, establish a temporary ceiling for Treasury yields at longer maturities by standing ready to purchase them at a preannounced floor price. During the crisis and its aftermath, the Federal Reserve reviewed this tool and some others deployed by foreign central banks but ultimately found them wanting relative to the alternatives it did pursue. But the review will reassess the case for these and other tools in light of more recent experience in other countries. The third question the review will consider is, "How can the FOMC's communication of its policy framework and implementation be improved?" Our communication practices have evolved considerably since 1994, when the Federal Reserve released the first statement after an FOMC meeting. Over the past decade or so, the FOMC has enhanced its communication practices to promote public understanding of its policy goals, strategy, and actions, as well as to foster democratic accountability. These enhancements include the Statement on Longer-Run Goals and Monetary Policy Strategy; postmeeting press conferences; various statements about principles and strategy guiding the Committee's normalization of monetary policy; and quarterly summaries of individual FOMC participants' economic projections, assessments about the appropriate path of the federal funds rate, and judgments of the uncertainty and balance of risks around their projections. As part of the review, we will assess the Committee's current and past communications and additional forms of communication that could be helpful. For example, there might be ways to improve communication about the coordination of policy tools or the interplay between monetary policy and financial stability. Activities and Timeline for the Review. The review will have several components. We will be conducting town hall-style "Fed Listens" events this year. We will hear from a broad range of interested individuals and groups, including business and labor leaders, community development professionals, and academics. The first of these events will take place Monday in Dallas. Another is scheduled at the Federal Reserve Bank of Minneapolis in early April, and other Reserve Banks will host events over the course of the year. In addition, we will sponsor a System research conference on June 4-5, 2019, at the Federal Reserve Bank of Chicago, with speakers and panelists from outside the Fed. 
The sessions will include overviews by academic experts of themes that are central to the review, including the FOMC's monetary policy since the financial crisis, assessments of the maximum sustainable level of employment, alternative policy frameworks and strategies to achieve the dual mandate, policy tools, global considerations, financial stability considerations, and central bank communications. Other sessions will feature panels of community leaders who will share their perspectives on the labor market and the effects of interest rates on their constituencies. We expect to release summaries of the "Fed Listens" events and to livestream the Chicago conference. Building on the perspectives we hear and on staff analysis, the FOMC will conduct its own assessment of its monetary policy framework, beginning around the middle of the year. We will share our conclusions with the public in the first half of 2020. The economy is constantly evolving, bringing with it new policy challenges. So it makes sense for us to remain open-minded as we assess current practices and consider ideas that could potentially enhance our ability to deliver on the goals the Congress has assigned us. For this reason, my colleagues and I do not want to preempt or to predict our ultimate finding. What I can say is that any refinements or more material changes to our framework that we might make will be aimed solely at enhancing our ability to achieve and sustain our dual-mandate objectives in the world we live in today.
r190222a_FOMC
united states
2019-02-22T00:00:00
The Future of the Federal Reserve's Balance Sheet
quarles
0
When I was asked to participate on this panel in the middle of last year, the prevailing metaphor regarding Federal Reserve balance sheet policy was "as boring as watching paint dry." Well, times have changed, and I commend the conference organizers for their foresight. Today I would like to discuss some of the recent decisions and lay out a rough framework for some further issues that are on the horizon. In January, after much discussion, including in previous meetings, the FOMC announced its intent to continue operating in a framework of ample reserves. In this regime, active management of the reserve supply is not needed. The Federal Reserve controls the level of the federal funds rate and other short-term interest rates primarily through the use of administered rates, including the rate paid on reserve balances and the offered rate on overnight reverse repurchase agreements. This regime is sometimes referred to as a floor system, because the administered rates place a floor under the rate at which banks and others will lend in the federal funds market. In adopting this framework, the Committee stated its intention to continue operating as it has for the past decade. The announcement was an important step in our normalization process. And we are now set up to make further decisions on the eventual size and composition of our balance sheet. Before providing more context on those decisions, let me first provide a little more detail around our decision to remain in the current framework of ample reserves. The most important factor in the decision was that the current system has worked very well. It has supported the achievement of our dual-mandate objectives of maximum employment and price stability. And it has shown itself to be flexible and well suited to maintaining interest rate control through various changes in money markets, bank regulation, and the Federal Reserve's balance sheet. Since the FOMC began lifting interest rates in December 2015, money market rates have generally moved closely with the federal funds rate, which in turn has followed changes in administered rates. Now that the decision on the operating framework has been made, a natural next step is to contemplate the appropriate size of the Fed's balance sheet and reserves and the process for getting there. In line with the requirements of operating with ample reserves- -and boosted by the growth in nonreserve liabilities--the Fed will maintain a larger balance sheet and reserve supply relative to the pre-crisis period, with the goal of remaining on the flat portion of the reserve demand curve. I would note that reserves have already declined appreciably from their peak, falling by $1.2 trillion to a current level of around $1.6 trillion. At the same time, we have seen a substantial increase in our nonreserve liabilities, such as currency in circulation and the Treasury General Account balance. In our statement on Policy Normalization Principles and Plans, we outlined an intention to hold no more securities than necessary to implement monetary policy efficiently and effectively. As the balance sheet continues to shrink, we are now in the process of determining that necessary size. Ultimately, the size of the balance sheet will be determined by a number of factors, including demand for nonreserve liabilities, such as currency (which has been rising), and, importantly, the quantity of reserves necessary to remain reliably on the flat portion of the reserve demand curve. 
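The "flat portion of the reserve demand curve" can be illustrated with a stylized example. The functional form, the administered rate, and the other parameters below are hypothetical choices made only for illustration; this is not the Federal Reserve's model of money markets.

```python
import math

# Stylized reserve demand curve (hypothetical parameters, not a Fed model):
# when reserves are scarce, the overnight rate trades well above the
# administered rate; when reserves are ample, the curve flattens and the
# rate settles near the rate paid on reserves, which is what a "floor"
# regime relies on for interest rate control.

def overnight_rate(reserves_bln, admin_rate=2.40, scarcity_premium=2.0, decay=0.004):
    """Overnight rate implied by a stylized, exponentially flattening demand curve."""
    return admin_rate + scarcity_premium * math.exp(-decay * reserves_bln)

for reserves in (200, 800, 1600):   # billions of dollars, hypothetical levels
    print(f"reserves ${reserves}B -> overnight rate ~{overnight_rate(reserves):.2f}%")
# At high reserve levels the scarcity premium is negligible: that is the flat
# portion of the curve on which the policy rate is pinned by administered rates.
```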
Survey results suggest that banks have greatly increased their demand for reserves in the post-crisis period. Responses to the September Senior Financial Officer Survey report that banks would be comfortable with a level of reserves in the system in the neighborhood of $800 billion, taking into consideration the level of interest rates at the time. In part, this increased demand reflects a response to regulatory changes introduced after the crisis. These changes include, importantly, the liquidity coverage ratio (LCR), which requires firms to hold sufficient high-quality liquid assets to cover potential outflows during times of stress. Reserves, along with Treasury securities, are favored under the LCR, and, consequently, firms currently meet a sizable fraction of their LCR requirements by holding reserves. Notwithstanding survey results, the level of reserve demand remains quite uncertain. It is possible that, over time, the preferences of banks will shift, or that demand will prove more price elastic than banks are currently expecting. As I have discussed previously, bank holdings of reserves to meet LCR requirements could shift toward Treasury securities, as aggregate reserves decline, without much upward pressure on the federal funds rate. That said, even if uncertain, it is probably safe to say that reserve demand is much higher than before the crisis. As we work to calibrate ample reserves, there are some tradeoffs that are worth noting. For example, we could operate with a level of reserve balances at the lower end of what might be considered ample. In that case, there would likely be occasions when unexpected declines in the supply of reserves or increases in the demand for reserves would require an open market operation to offset temporary upward pressures on the federal funds rate. Alternatively, we could operate with an average supply of reserves large enough to keep the federal funds rate determined along the flat portion of the reserve demand curve even with an unexpected shift in the supply of or demand for reserves. This approach would be operationally convenient but would also leave the size of the balance sheet and reserves larger than necessary most of the time. In my view, it might be appropriate for us to operate somewhere in between these two extremes, with a sizable quantity of reserves large enough to buffer against most shocks to reserve supply. On those few days when that buffer is likely to be exhausted, we could conduct open market operations to temporarily boost the supply of reserves. With so much uncertainty over the level and slope of the reserve demand curve, a degree of caution is warranted. As outlined in the minutes of the January FOMC meeting, the Committee has discussed ending the reduction in the Fed's aggregate asset holdings sometime in the latter half of this year, with still-ample reserves in the system. At that point, one option discussed, without any decision having been made, is to hold the level of total assets roughly fixed for a time. Even as the total size of the balance sheet remains fixed, the composition of the liabilities would gradually change, in part as demand for currency grows in line with the economy. Over time, the gradual increase in nonreserve liabilities would displace reserves as the overall balance sheet remains fixed. This plan would substantially reduce the pace of the decline in reserves, allowing us to gradually approach our assessment of the appropriate amount of reserves for the efficient and effective implementation of monetary policy. 
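The tradeoff between the size of the reserve buffer and how often temporary open market operations would be needed can be illustrated with a toy simulation. The shock process and dollar figures below are hypothetical assumptions chosen for illustration and are not calibrated to actual Federal Reserve liabilities.

```python
import random

# Illustrative simulation (hypothetical numbers): the larger the average buffer
# of reserves above the "ample" threshold, the less often day-to-day swings in
# nonreserve liabilities push reserves low enough to require a temporary
# open market operation.

random.seed(0)

def share_of_days_needing_omo(buffer_bln, shock_sd_bln=60, days=10_000):
    """Fraction of simulated days on which a random reserve drain exceeds the buffer."""
    hits = sum(1 for _ in range(days) if random.gauss(0, shock_sd_bln) < -buffer_bln)
    return hits / days

for buffer in (25, 75, 150):   # billions of dollars above the ample threshold
    print(f"buffer ${buffer}B -> OMO needed on ~{share_of_days_needing_omo(buffer):.1%} of days")
```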
Of course, in the longer run, once we reach our preferred level of reserves, the balance sheet would have to resume growth to match a continued increase in demand for nonreserve liabilities. I would like to wrap up with a brief discussion of some of the other decision points we will encounter as we continue the process of normalizing our balance sheet. In particular, what does the Committee judge to be normal in regard to the type and duration of assets that we will hold? On composition, in line with our previously announced normalization principles, I favor a return to a balance sheet with all Treasury securities, allowing our mortgage-backed securities (MBS) holdings to run to zero. In those principles, we also state that while we do not expect sales of MBS as part of the normalization process, later we would be open to limited sales to reduce or eliminate residual holdings of MBS. In regard to duration, moving to shorten the duration of our holdings could increase the Fed's ability to affect long-term interest rates if the need arose. However, it might be preferable to have the composition of our Treasury holdings roughly match the maturity composition of outstanding Treasury securities, minimizing any market distortions that could arise from our holdings. Over the course of our upcoming meetings, I look forward to what promises to be an interesting discussion on these issues with my colleagues. Finally, in assessing our balance sheet policy, it is important to point out that the Fed remains entirely focused on meeting its statutory dual-mandate objectives of maximum employment and price stability. The normalization of the balance sheet is not a competing goal. If ever it appears that our plans for the balance sheet are running counter to the achievement of our dual-mandate objectives, we would quickly reassess our approach to the balance sheet.
r190223a_FOMC
united states
2019-02-23T00:00:00
Is Economics for Me? Increasing the Participation of Black Women in Economics
brainard
0
It is an honor to be a part of the inaugural Sadie T. M. Alexander conference in economics and related fields. I want to thank each of you in this room and those of you livestreaming who form the Sadie Collective. Special thanks go to Mykelle Richburg and Gifty Opoku-Agyeman for taking the initiative and launching this organization and this conference to increase the representation of black women in the field of economics. I applaud your efforts, and I support your mission. The story of Dr. Sadie Tanner Mossell Alexander is an inspiration for us all. A woman of firsts--the first African American to receive a Ph.D. in economics and the first woman to receive a law degree from the University of Pennsylvania--she was a pioneer who knocked down some daunting doors. The question for the women in this room here today is whether you will follow her through those doors and maybe knock down a few of your own. Let me start by making two observations about economics. First, the field of economics is rooted in evidence and research. The second observation follows strongly from the research and the evidence: Economics has a diversity challenge. There is a stubbornly persistent lack of diversity in the economics profession. Year after year, minorities and women are underrepresented in the pool of individuals awarded a doctorate in economics in the United States relative to their share in the broader population, and the gap is especially acute for women of color. According to the American Economic Association, only a small number of black/African American women were awarded a doctorate in economics in the United States, along with eight black/African American men, out of a total pool of 1,150 economics Ph.D.'s awarded overall. The economics diversity gap starts even earlier. As Amanda Bayer and David Wilcox have documented, women and minorities are also underrepresented in undergraduate economics programs. From 2011 to 2015, women accounted for only 31 percent of undergraduate degrees in economics--substantially below their 57 percent share of all four-year undergraduate degrees. For black women, the gap is even bigger: they accounted for 1.5 percent of undergraduate economics degrees compared with a 6.2 percent share of all undergraduate degrees. A growing body of research and evidence makes clear that the quality of the economics profession and its contribution to society will be greater when a broader range of people are engaged. Research shows that greater diversity results in better outcomes--it broadens the range of ideas and perspectives brought to bear on solving problems, and it brings important insights to the analysis of our economy. Experimental studies likewise demonstrate the benefits of diversity for group deliberations and decisionmaking. For instance, one well-known experiment found that racially diverse groups of students outperformed other groups in solving problems, and another found similar benefits from gender diversity. Turning to my own institution, it is notable that when the Congress established the Federal Reserve System, it took great care to ensure there would be a diversity of perspectives around the decisionmaking table in terms of regional representation. That regional diversity is built into the structure of the Federal Open Market Committee. But we have not lived up to that standard on other dimensions of diversity. For instance, it was not until 2017, more than 100 years after the creation of the Federal Reserve, that the first African American, Raphael Bostic, was chosen to lead a Reserve Bank. We need to do better than that, and we will continue our efforts until the group of people around that table is more like America. 
To achieve our goals, we will need to improve the diversity of the economics ecosystem more broadly. Of course, the Federal Reserve System hires people with all kinds of expertise--from lawyers to law enforcement, from financial analysts to data scientists. But our footprint is especially large in the economics job market, where as a System we routinely hire one of every 25 newly minted economics Ph.D.'s each year. In short, we have a significant stake in the diversity and vibrancy of the economics profession overall. I have discussed why it is important for the Federal Reserve and for our economy to see more women of color embracing the field of economics. But what's in it for you? There are many ways your career may unfold in which economics might fit. Some of you may decide your passion lies elsewhere as an undergraduate, and later find that advanced studies in economics align with your career interests. Others may already feel that economics is your calling. And just as there are a variety of reasons that people decide to deepen their studies in economics, so too there are many career fields where a degree in economics can be a powerful enabler. In my case, although I did not have an undergraduate economics degree, I later decided to pursue advanced study in economics because it provided a rigorous analytical framework and a data-driven approach to addressing important problems facing America. Growing up, I saw how every family's well-being is affected by their financial resilience and their economic opportunities. I decided to study economics because it provides powerful tools to help promote a better future for many Americans. I have found different ways to approach that throughout my career--from teaching the next generation of problem solvers, to assessing the challenges facing American manufacturers, to seeing the potential of microfinance for financial inclusion, to promoting maximum employment and stable inflation in my current job. I recognize that it is challenging to look around a classroom or conference room and not see colleagues who you can relate to. It is humbling to imagine how that first economics class looked to Dr. Sadie Alexander. But now there are a number of leading scholars and practitioners in the field to provide inspiration and serve as role models. Today, you will hear from several, including Dr. Julianne Malveaux, Dr. Willene Johnson, and Dr. Lisa Cook, who will share with you their perspectives and discuss their important and interesting work. So what might you find by pursuing economics? As I have noted, you can influence people's lives for the better. You can craft policy to change our world. You can teach and help shape the next generation. You can find the answers to questions that matter most to you. You can develop the intellectual framework and tools that will enable you to pursue a range of opportunities, not just in economics, but also in business, finance, policy, and nonprofits. I hope you will take away the message that the field of economics would be the richer for your engagement and has much to offer as you decide how to make your contribution. There are many opportunities for you in the field of economics, both here and abroad, in government, academia, and the private sector. Dr. Sadie Alexander knocked down the door, and I hope you will consider following her through it.
r190228b_FOMC
united states
2019-02-28T00:00:00
U.S. Economic Outlook and Monetary Policy
clarida
0
Thank you for the opportunity to participate in the 35th Annual Economic Policy Conference of the National Association for Business Economics. Before we begin our conversation, I want to share a few thoughts about the outlook for the economy and monetary policy. The U.S. economy expanded at a robust pace in 2018, and my baseline outlook for 2019 foresees somewhat slower but still-solid growth in the year ahead. In July, just about four months from now, the current economic expansion will become the longest on record. The Federal Reserve is charged by the Congress with achieving and sustaining a dual mandate of maximum employment and price stability, and the economy is as close as it has been in many years to meeting these goals. The unemployment rate is near the lowest level recorded in 50 years, and average monthly job gains have continued to well outpace the increases needed over the longer run to provide jobs for new entrants to the labor force. Most measures of nominal wage growth are running at or somewhat above a 3 percent pace, and recent wage gains have been strongest for lower-skilled workers. Moreover, the strength of the labor market appears to have encouraged people to join the labor force and others, who might have left it, to continue working. The labor force participation rate--the share of people who are either working or looking for work--has moved up 1/2 percentage point over the past year, and the participation rate of prime-age workers (those 25 to 54 years old) has risen about 1-1/2 percentage points over the past few years. Inflation, as measured by the 12-month change in the price index for personal consumption expenditures (PCE), is estimated to have been a little bit below 2 percent of late, largely because of recent declines in energy prices. However, core PCE inflation, which excludes food and energy prices and tends to be a better indicator of future inflation, is estimated to have been about 2 percent. Market-based measures of inflation compensation have moved lower, on net, since last summer, though they have increased some recently, and some survey-based measures of longer-term inflation expectations are little changed. That said, taken together, the evidence suggests that measures of expected inflation are at the lower end of a range that I consider to be consistent with our price-stability goal of 2 percent PCE inflation. While my baseline outlook for growth, employment, and inflation is a positive one, a number of crosscurrents that are buffeting the economy bear careful scrutiny. Global growth is slowing, particularly in China and Europe. Global policy uncertainty remains elevated. And financial conditions have been volatile, making efforts to extract signal from noise more challenging. As I have indicated in recent speeches, monetary policy at this juncture needs to be especially data dependent, with the federal funds rate now in the range of Federal Open Market Committee (FOMC) participants' estimates of its longer-run neutral level. Moreover, with employment and inflation now at or close to our dual-mandate objectives, the FOMC in its January statement indicated it can afford to be patient as we assess the need for further adjustments in our policy stance. Going forward, we need, I believe, to be cognizant of the balance we must strike between (1) being forward looking and (2) maximizing the odds of being right given the reality that the models that we consult are not infallible. 
For example, were a model to predict a surge in inflation, a decision for preemptive hikes before the surge is evident in actual data would need to be balanced against the considerable cost of the model being wrong. Given muted inflation and stable inflation expectations, I believe we can be patient and allow the data to flow in as we determine what future adjustments to the target range for the federal funds rate may be appropriate to strike this balance. We also decided at our January meeting to maintain our current operating regime--a "floor" system--for implementing monetary policy. The FOMC will continue to set the stance of policy by establishing the target range for the federal funds rate. The interest on excess reserves rate will be our primary tool to keep the federal funds rate in the target range. In this regime, we will provide an ample supply of reserves in the banking system to ensure that we remain on the flat portion of the reserve demand curve and that the federal funds rate is insulated from shocks to reserve demand and supply. With this decision on our operating regime made, the Committee can now decide on the appropriate timing and pace for concluding our balance sheet drawdown. In the longer run, the ultimate size of the balance sheet will be determined by the demand for Federal Reserve liabilities such as currency and reserve balances. Finally, in November, we announced a review of the Federal Reserve's monetary policy strategy, tools, and communications practices. In this review, we will listen carefully to a broad range of stakeholders offering a full range of perspectives from across the country, and we will draw on these insights as we assess how best to achieve and maintain maximum employment and price stability in the most robust fashion possible. Taking these viewpoints on board, the FOMC will begin its own discussions this summer on how we might refine our framework and will provide a public assessment after the review is completed. Thank you for your attention, and I look forward to our conversation.
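As a side note on the inflation measures referenced in these remarks, the short Python sketch below illustrates how a 12-month inflation rate is computed from a price index, and how a "core" reading simply uses an index that excludes food and energy. This is not Federal Reserve code; the function name and the index values are hypothetical, chosen only so the output roughly matches the "a little bit below 2 percent" headline and "about 2 percent" core readings described above.

# Minimal sketch (hypothetical numbers): 12-month inflation from a price index.
# Inflation over 12 months = (index_now / index_year_ago - 1) * 100.
def twelve_month_inflation(index_now: float, index_year_ago: float) -> float:
    return (index_now / index_year_ago - 1.0) * 100.0

# Hypothetical headline and core PCE price index levels (not actual data).
headline_now, headline_year_ago = 110.1, 108.2  # includes food and energy
core_now, core_year_ago = 109.9, 107.7          # excludes food and energy

print(f"headline PCE inflation: {twelve_month_inflation(headline_now, headline_year_ago):.1f} percent")
print(f"core PCE inflation:     {twelve_month_inflation(core_now, core_year_ago):.1f} percent")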
r190228a_FOMC
united states
2019-02-28T00:00:00
Recent Economic Developments and Longer-Term Challenges
powell
1
It is a pleasure to speak here this evening at the 87th Awards Dinner. Tonight I will start with the near-term outlook for the U.S. economy. Then I will turn to a topic that is inspired by the Citizens Budget Commission's mission statement, which focuses on the "well-being of future New Yorkers." I imagine that future New Yorkers attending this dinner in 50 years may not look back on the near-term outlook in February 2019 as very interesting or important. So, tonight, after a brief review of the here and now, I will focus on an issue that is likely to be of more lasting importance: the need for policies that will support and encourage participation in the labor force, promote longer-term growth in our rapidly evolving economy, and spread the benefits of prosperity as widely as possible. Beginning with the here and now, Congress has charged the Federal Reserve with achieving maximum employment and stable prices, two objectives that together are called the dual mandate. I am pleased to say that, judged against these goals, the economy is in a good place. The current economic expansion has been under way for almost 10 years. This long period of growth has pushed the unemployment rate down near historic lows (figure 1). The employment gains have been broad based across all racial and ethnic groups and all levels of educational attainment. And while the unemployment rate for African Americans and Hispanics remains above the rates for whites and Asians, the disparities have narrowed appreciably as the economic expansion has continued. Nearly all job market indicators are better than a few years ago, and many are at their most favorable levels in decades. After lagging earlier in the expansion, wages and overall compensation--pay plus benefits--are now growing faster than a few years ago (figure 3). It is especially encouraging that the labor force participation rate of people in their prime working years, ages 25 to 54, has been rising for the past three years. More plentiful jobs and rising wages are drawing more people into the workforce and encouraging others who might have left to stay. In addition, business-sector productivity growth, which had been disappointing during the expansion, moved up in the first three quarters of 2018. Rising productivity allows wages to increase without adding to inflation pressures. Sustained productivity growth is a necessary ingredient for longer-run improvements in living standards. The price stability side of our mandate is also in a good place. After remaining below our target for several years, inflation by our preferred measure averaged roughly 2 percent last year (figure 4). Inflation has softened a bit since then, largely reflecting the recent drop in oil prices. Futures markets and other indicators suggest that oil prices are unlikely to fall further, and if this proves correct, oil's drag on overall inflation will subside. Consistent with that view, core inflation, which excludes volatile food and energy prices and often provides a better signal of where inflation is heading, is currently running just a touch below our 2 percent objective. Signs of upward pressure on inflation appear muted despite the strong labor market. While the data I have discussed so far give a favorable picture of the economy, it is also important to acknowledge that not everyone has shared in the benefits of the expansion to the same extent, and that too many households still struggle to make ends meet. 
In addition, over the past few months we have seen some crosscurrents and conflicting signals about the near-term outlook. For instance, growth has slowed in some major economies, particularly China and Europe. Uncertainty is elevated around some unresolved government policy issues, including Brexit and ongoing trade negotiations. And financial conditions have tightened since last fall. While most of the incoming domestic economic data have been solid, some surveys of business and consumer sentiment have moved lower. Unexpectedly weak retail sales data for December also give reason for caution. Given the positive outlook but also muted inflation pressures and the crosscurrents I have just described, the FOMC has indicated that we will be patient as we determine what future adjustments to the target range for the federal funds rate may be appropriate to support our dual-mandate objectives. This common-sense risk-management approach has served the Committee well in the past. I will turn now from the near-term outlook to the question of how the economy will perform over the long haul. From 1991 through 2007, the economy expanded annually at about 3 percent, similar to the pace for much of the second half of the 20th century. Since 2007, however, growth has averaged just 1.6 percent. If the earlier 3 percent growth had persisted over the past 12 years, incomes today would be almost 20 percent higher than they now are. From the standpoint of future Americans, if the slower growth persists for a half-century, incomes will end up roughly half of what they would have been. Why has growth slowed, and what can we do about it? To understand the causes of the slowdown, it is useful to divide growth into two components: (1) growth in the cumulative number of hours worked by all workers and (2) growth in the amount of output derived, on average, from each hour of work. We refer to output per hour of work as "labor productivity." From 1991 through 2007, when the economy expanded at a 3 percent average rate, hours worked increased about 1 percent a year and economy-wide productivity rose about 2 percent a year. Since 2007, both of these growth factors have slowed by about half, with hours worked increasing only 0.5 percent annually from 2008 to 2018 and productivity rising just 1 percent on average. Growth in hours worked has slowed, in part, because of slower U.S. population growth. Birth rates have edged down, and immigration has slowed. Not only is the total population growing more slowly, but the share of the population in their prime working years is falling steadily as the very large baby-boom generation is moving into retirement. Demographic factors are generally slow moving and predictable, and there is no surprise in the fact that slower population growth and the retirement of the baby boomers are now contributing to slower growth in the total amount of work performed in the economy. There is another factor contributing to the slower growth in hours, however, and this factor is more surprising and more troubling. To be counted as "in the labor force," a person must either be employed or have looked for work within the past four weeks. The share of people of working age who are actually in the labor force has fallen significantly since the late 1990s. This decline raises the important question of why people of working age have increasingly chosen not to work. The data suggest that there are both positive and more problematic forces at work. For example, among those aged 16 to 24, participation in the labor market has fallen from about 65 percent in the 1990s to 55 percent now (figure 6). 
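A minimal back-of-the-envelope check of the growth arithmetic above, assuming the annual growth rates cited in the passage (about 3 percent before 2008 and about 1.6 percent since) and treating GDP growth as approximately the sum of growth in hours worked and growth in output per hour; the code is purely illustrative and not an official calculation.

# Back-of-the-envelope check of the compounding arithmetic above (illustrative only).
fast, slow = 0.03, 0.016  # approximate pre-2008 and post-2007 average annual growth rates

gap_12yr = (1 + fast) ** 12 / (1 + slow) ** 12 - 1
ratio_50yr = (1 + slow) ** 50 / (1 + fast) ** 50
print(f"after 12 years, incomes are about {gap_12yr:.0%} higher on the faster path")
print(f"after 50 years, incomes are roughly {ratio_50yr:.0%} of what they would have been")

# Decomposition: GDP growth is approximately hours growth plus productivity growth.
print(f"1991-2007: about {0.01 + 0.02:.0%} = 1% hours growth + 2% productivity growth")
print(f"2008-2018: about {0.005 + 0.01:.1%} = 0.5% hours growth + 1% productivity growth")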
But this drop in participation appears to reflect young people getting more education. The fraction of this age group who are neither in school nor in the labor force has held fairly constant at around 11 percent, and measures of school enrollment are up. Higher educational attainment is much more important in today's job market than in the past, and investing in education today has long-term benefits for both the student and society. Statistics confirm that higher educational attainment is associated with higher labor force participation, lower unemployment, and higher wages. Turning to those aged 25 to 54, the participation picture is more troubling. Among prime-age men, participation has been falling for more than 60 years, with the decline averaging about 1.5 percentage points per decade (figure 7). For women, participation rose over the second half of the 20th century until peaking in the late 1990s. Since then, women's participation has dropped just a bit. To put these numbers in context, let's look at data from other advanced economies. Prime-age male participation has fallen some across most of these economies since 1995 (figure 8). But the decline in the United States has been much larger than most, and U.S. participation was below the middle of the pack at the outset. As a result, the United States now has the fourth lowest participation rate among 34 advanced economies. For women's participation, the details are different, but the bottom line is similar. In the mid-1990s, the United States ranked in the upper tier for prime-age women's participation, but since then participation by women has advanced rapidly in many countries while it has declined slightly in the United States. Now the United States is sixth lowest among these 34 countries. Researchers have investigated numerous possible reasons for the decline in prime-age participation. Among men, the drop in participation is much sharper for those with only a high school education or less. The drop for women is also sharper for those who are high school educated. This pattern is consistent with the idea that a modern economy demands ever-higher skills, and that workers without those skills are being left behind (figure 9). But the international experience suggests that this outcome is not inevitable: The drop in participation among those with less education is much smaller in some comparable countries than in the United States. The research into labor force participation in the United States and across the world does not find a magic fix, but it does suggest a variety of policies that might better prepare people for the modern workforce as well as support and reward labor force participation. I should note that the Fed has neither the tools nor the mandate to directly address the forces that are holding back labor force participation. We can contribute by fostering a strong labor market, in accordance with our mandate. While it is not the Fed's role to advocate particular labor force policies, I do want to put a spotlight on this important issue. I strongly believe policies that bring prime-age workers into productive employment, particularly those who may have been left behind because of low skills or educational attainment, could bring great benefits both to those workers and to our economy. The second factor accounting for the slowdown in GDP growth is the slower pace of labor productivity growth, or output per hour worked. 
When measured annually, labor productivity growth is volatile, but focusing on five-year averages, we can see that from 1975 through 2007, productivity growth averaged about 2 percent while fluctuating between about 1 and 4 percent (figure 10). Since then, growth seems to have settled at the low end of that historical range. Unlike the situation with labor force participation, the slowdown in productivity growth is also evident in most advanced economies, and the U.S. experience is roughly comparable to that of other countries. There is an ongoing debate over the causes and implications of this global slowdown in productivity growth. Some argue that the rapid growth seen over much of the 20th century was historically anomalous, and that we are destined to return to the slower growth of centuries past. Others are more optimistic that strong growth can return. Many have noted that during the current expansion, investment and capital accumulation have been lower than in previous expansions, so perhaps we just need to invest more. Unfortunately, it does not seem to be that simple. Standard reasoning holds that capital per worker drives productivity. While we have had slower capital accumulation of late, we have also had slower growth in labor supply--hours worked. Thus, capital per worker, according to some analysis, has continued to increase roughly at its pre-recession trend. In this view, the productivity problem is not simply one of inadequate investment. Researchers have proposed several reasons why, even if the quantity of investment has kept up, recent investment may be leading to smaller productivity advances than in the past. The more optimistic analysts argue that we may be in a productivity lull while businesses work to realize the full benefit of advances that are embedded in recent investment. Others suggest that productivity-advancing ideas are inherently harder to find and exploit than in the past, implying that slower productivity growth may be with us for the long haul. This debate is unlikely to be resolved anytime soon. In the meantime, we should look for policies that will create an environment in which productivity can flourish. We need policies that support innovation and create a favorable environment for investment in both the skills of workers and the tools they have. Indeed, the recent tax reforms were designed in part to boost capital investment and thus productivity. Once again, my goal tonight is to highlight the importance of growth-enhancing policies. Because these policies are not the province of the Fed, I will not advocate for particular approaches. Instead, I will just observe that researchers and policy analysts have proposed many promising ideas that may be capable of attracting wide support. Policies that succeed in enhancing productivity growth would greatly benefit future generations of Americans. To conclude, the United States is currently in the midst of one of the longest economic expansions in our history. Unemployment is low and inflation is close to our 2 percent objective. My colleagues and I on the FOMC are focused on using our monetary policy tools to sustain those favorable conditions. Tonight I have also highlighted some longer-term challenges we face, including low labor force participation by prime-age workers and low productivity growth. By promoting macroeconomic stability, the Fed helps create a healthy environment for growth. But these longer-term issues require policies that are more in the province of elected representatives. 
The nation would benefit greatly from a search for policies with broad appeal that could promote labor force participation and higher productivity, with benefits shared broadly across the nation.
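As a footnote to the productivity discussion above, which notes that annual productivity growth is volatile and is better read through five-year averages, the Python sketch below shows one simple way to compute such a trailing average; the series here is hypothetical and merely stands in for the published productivity data.

# Trailing five-year average of annual productivity growth (hypothetical series, in percent).
annual_growth = [3.1, 0.8, 2.4, 1.2, 0.5, 1.0, 1.3, 0.7, 1.8, 1.1]

window = 5
five_year_avg = [
    sum(annual_growth[i - window + 1 : i + 1]) / window
    for i in range(window - 1, len(annual_growth))
]
print([round(x, 2) for x in five_year_avg])  # one smoothed value per year, starting in year 5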