Everything posted by David Broadland

  1. November 2017 Recent scientific studies show how resident orca populations are affected by diminishing chinook runs and—critically—why the chinook are disappearing. RIVERS RUNNING INTO PUGET SOUND have perennially low returns of chinook salmon—currently estimated at just 10 percent of their historic levels—even though many of them are enhanced with hatcheries. Last year, scientific research connected this decline to secondary sewage treatment plants discharging partially-treated effluent into Puget Sound. Last June, a group of Washington scientists published a study showing the extent to which the decline in the birth rate of the Southern Resident Killer Whale population, listed as “endangered” by both the Canadian and US federal governments, is linked to the precarious state of the Salish Sea’s chinook salmon. Puget Sound chinook, which were given “threatened” status under the US Endangered Species Act in 1999, have become a cross-border issue. Recovery of both Puget Sound chinook and the Southern Resident Killer Whale population would require investment of many billions of dollars by Washington State in new sewage treatment infrastructure. While taking action to protect both the orca and chinook is required by US federal law, Washington State currently has no plans to make that investment. Is our southern neighbour ignoring its responsibility to be a good environmental steward? Killer Whales can be long-lived (“Granny,” above, lived past 100), but their birth rate is dependent on chinook salmon, a threatened species in Puget Sound. (Photo: markmallesonphotography.com) LAST JUNE, A BRILLIANT SEVEN-YEAR-LONG STUDY that correlated the declining birth rate of the Southern Resident Killer Whale population with falling chinook salmon numbers, mercilessly compared what’s happening to the remaining orcas to the mass starvation of the Dutch population at the hands of German Nazis during World War II. The authors stated: “The Nazis closed off the borders of Holland between October 1944 and May 1945, causing massive starvation over a 5–8 month period, with good food conditions before and after. There was a one-third decline in the expected number of births among confirmed pregnant woman during the under-nutrition period. Conceptions during the hunger period were very low. However, women who conceived during the hunger period had higher rates of abortion, premature and stillbirths, neonatal mortality and malformation. Nutrition had its greatest impact on birth weight and length for mothers experiencing hunger during their second half of gestation, when the fetus is growing most rapidly.” The inclusion of the word “Nazis” in a peer-reviewed scientific study on the reproductive dynamics of an endangered whale population may strike some as odd, but the Dutch Famine, as the above events are known, was highly unusual: it took place in a well-developed, literate population that kept excellent health records and the vast majority of those affected survived. Thus it was one of the first events in human history for which scientists had accurate, reliable records to help them understand what health impacts occur when a population of mammals is starved. The orca scientists found that a similar dynamic between food availability and birth rate has been impacting the Southern Resident Killer Whale (SRKW) population, but with one big difference: For the orca, this is not a one-time event. For them, a months-long famine now occurs almost every year. 
Dr Samuel Wasser, the study’s lead author, is a research professor of conservation biology at the University of Washington. Wasser’s team gathered evidence from 2008 to 2014. They found that 69 percent of detectable pregnancies in the SRKW population failed during that period. Of those failed pregnancies, the scientists found, “33 percent failed relatively late in gestation or immediately post-partum, when the cost is especially high.” That high cost included an increased risk of mortality for the would-be mother. The scientists observed: “Low availability of chinook salmon appears to be an important stressor among these fish-eating whales as well as a significant cause of late pregnancy failure, including unobserved perinatal loss.” They added: “However, release of lipophilic toxicants during fat metabolism in the nutritionally deprived animals may also provide a contributor to these cumulative effects.” In other words, not only are the orca being starved, but when a starved, pregnant orca begins burning off her fat reserves in response to the scarce supply of food, toxins bioaccumulated in her fat reserves—such as PCBs and PBDEs—begin to have more of an impact on her health, such as a reduced ability to fight infections. This could contribute to the demise of the fetus and increase the risk to the mother’s life. As a consequence of these conditions, the study found “the 31 potentially reproductive females in the SRKW population should have had 48 births between 2008–2015. Yet, only 28 births were recorded during that period. The 7 adult females in K pod have not had a birth since 2011, and just two births since 2007. The 24 females in the remaining two pods (J and L) have averaged less than 1 birth per pod since 2011, with no births in 2013, but had 7 births in 2015. One of the two offspring born in 2014 died.” As of this writing, the population has dwindled to 76 whales. As recently as 1996 there were 98 orca in the 3 pods. How did the scientists determine that 69 percent of all pregnancies failed? After all, many of the pregnancies terminated early on, and there would have been no visible signs that the females had been pregnant. How does one detect whale pregnancies? Detection dogs. Tucker, one of Wasser’s orca poop detection dogs (Photo: University of Washington) Over the seven years of the study, the scientists intermittently followed J, K and L pods through the Salish Sea and used specially-trained dogs stationed at the bow of the research vessel to sniff for orca poop, and then point out its location to the scientists. The poop was collected and later genotyped (associated with a known individual whale) and analyzed for hormone measures of pregnancy occurrence and health. The scientists also looked for chemical indicators of nutritional and disturbance stress in the poop. By making the same measurements over time, they were able to distinguish between nutritional stress caused by low availability of chinook salmon, and disturbance stress caused by the presence of nearby boats. Fisheries scientists had previously estimated that 70 to 80 percent of the SRKW population’s year-long diet consists of chinook salmon. The whales are thought to prefer chinook over other species of salmon partly because they use echolocation to find their prey. Since adult chinook are physically larger (they can weigh as much as 55 kilograms) than adults of other salmon species, chinook might be easier for orca to find. 
As well, there are runs of chinook returning to spawn in different river systems in the spring, summer and fall (sockeye, coho and chum return only in the fall). In the past that meant a reliable, almost year-round supply of chinook. And chinook may be preferred by the orca simply because of its higher fat content compared to other salmon. Canada’s Department of Fisheries and Oceans (DFO) estimates that reliance on chinook rises to 90 percent during July and August as the resident orca target returns to the Fraser River and rivers flowing into Puget Sound. Although the link between the abundance of chinook salmon in the Salish Sea and the physical health of the southern resident population was known, Wasser’s research provides the first confirmation that low availability of chinook is suppressing the population’s birth rate and endangering the health of reproductive females. Wasser included comparison over the seven years of the study of the two main chinook runs that are keeping the southern orcas alive: the Columbia River early spring run and the Fraser River summer and fall runs. Depending on the timing of those runs, and how many fish were in them, the southern resident orca experienced more or less intense famines through the winter months and between the spring and summer runs. Estimating how many more chinook would need to be in the Salish Sea to make up for the southern orcas’ nutritional deficit wasn’t part of Wasser’s research. But in 2010, DFO estimated the nutritional requirement of the southern resident orca population, which then numbered 87, at about “1200 to 1400” chinook per day. Over the five-month period the orca occupy their critical habitat in the Salish Sea each year, that would amount to 180,000 to 210,000 chinook. Wasser’s research shows the whales weren’t catching enough chinook in 2010 and the deficit is threatening the population. Yet in the Salish Sea in 2010, the total number of chinook caught by commercial and sport fisheries, plus the number of chinook that escaped to spawn, was about 500,000 fish. (These numbers are from the US EPA and the Pacific Salmon Commission.) Of those, 320,000 returned to their natal rivers to spawn. The 180,000 fish taken by commercial and sports fishers were split roughly in half between Canada and the US, even though 94 percent of the spawning fish were headed for the Fraser River in Canada. Only 6 percent were headed for rivers in Puget Sound. Note that the total catch taken by humans is roughly equivalent to the catch required by orca. The quickest way to end the orca famine would be to end the commercial and sports fisheries for chinook in the Salish Sea, and Canadian scientist David Suzuki called for that action following the release of Wasser’s study. To recover chinook populations, however, will require a deeper understanding of why they are declining. A comparison of the Southern Resident Killer Whale population with their northern cousins helps in that understanding. Wasser noted the “significantly lower” fecundity rate of SRKW compared to the Northern Resident Killer Whale (NRKW) population. From a 2011 study by Ellis, Tower and Ford, we know that in 1974 there were 120 whales in the NRKW population; by 2011 that had risen to 262. According to Canada’s Species at Risk Registry, the population grew to 290 by 2014. DFO used this number in its 2017 reports. Above: Both NRKW and SRKW populations feed primarily on chinook, but one population of whales is growing while the other has stagnated since 1974. 
Data from DFO and The Center for Whale Research.

Over that same period, though, the SRKW population went from 70 to a high of 98 in 1996 and then dropped to the current 76. Although both resident groups experienced a decline in population after 1996-1997 following significant declines in chinook runs, the northern population then recovered and grew steadily while the southern population has languished. As mentioned above, scientists have determined that both orca populations prey heavily on chinook as they return to spawn. It’s also known that, while their territories overlap, the northern orca rely on chinook returning to spawn in rivers north of the Salish Sea. The relative strength of the northern population compared to the southern, then, suggests the low availability of chinook that’s limiting growth of the southern orca population is a result of something that’s happening to the southern chinook that’s not happening to the northern chinook. What could that be?

The most dangerous period in a chinook salmon’s life, according to fisheries scientists, is its first year. Research scientist Dr James Meador, an environmental toxicologist with the US National Oceanic and Atmospheric Administration (Fisheries) in Seattle, estimates the current first-year survival rate of Pacific Northwest ocean-type juvenile chinook salmon at 0.4 percent. That’s four-tenths of one percent. Another way of stating that is that 99.6 percent of ocean-type chinook salmon die in their first year. That year is spent in their natal river, their natal estuary and marine waters not too far from that estuary. Since this is where almost all of the mortality occurs, it follows that any substantial recovery of chinook numbers would require conditions in these areas to improve. A doubling of the current rate of survival in that first year—so that only 99.2 percent of them die—could double the number of fish that return to spawn. We’ll come back to Meador later.

Wasser and his University of Washington team concluded their paper with this noteworthy comment: “Results of the SRKW study strongly suggest that recovering Fraser River and Columbia River chinook runs should be among the highest priorities for managers aiming to recover this endangered population of killer whales.” What about Puget Sound, where chinook runs are listed as “threatened”? Historically, according to Jim Myers of the Northwest Fisheries Science Center in Seattle, the Puget Sound chinook runs were about 25 percent greater than the Fraser River’s. But in 2010, according to the US EPA and Pacific Salmon Commission, Puget Sound returns were only 6 percent of Fraser River returns. The much bigger hole in chinook numbers is in Puget Sound. Shouldn’t international attention be focussed there?

Instead of accepting responsibility for the role it has played in the orca famine, Washington State has shifted public attention away from its lack of action, thereby reducing the chances of the Southern Resident Killer Whales’ survival. Now the situation is getting critical. The EPA recently downgraded the endangered whales’ survival status from “neutral” to “declining.” Time is running out. Wasser, on sabbatical, was unavailable to explain why the recovery of Puget Sound chinook stocks shouldn’t be a priority in the effort to recover the southern population of killer whales.
However, an examination of two scientific studies published by Meador shed light on why Wasser and other fisheries researchers might not regard recovery of the Puget Sound runs as a likely prospect to save the orca. The decline of the Southern Resident Killer Whales may be linked to the low survival rate of juvenile Chinook salmon in contaminated Puget Sound estuaries. (Photo by Roger Tabor, US Fish and Wildlife Service) IN 2013, DR JAMES MEADOR published the study “Do chemically contaminated river estuaries in Puget Sound affect the survival rate of hatchery-reared chinook salmon?” Meador was with the Ecotoxicology and Fish Health Program at the Northwest Fisheries Science Center in Seattle. NFSC is a division of NOAA. In that study, Meador observed: “Ocean-type chinook salmon that rear naturally or are released from a hatchery migrate in the spring and summer to the estuary as subyearlings (age 0+) and reside there for several weeks as they adjust physiologically to seawater and increase in size and lipid content before moving offshore to marine waters… Conversely, juvenile coho salmon spend their first year in freshwater and migrate to the estuary in the spring or summer as yearlings (age 1+), generally spending only a few days in the local estuary before proceeding to more open waters. This major difference in life history can have a large effect on the degree of toxicant exposure in contaminated estuaries, which can affect fish in several ways, including impaired growth, altered behavior, higher rates of pathogenic infections, and changes to physiological homeostasis, all of which can lead to increased rates of mortality.” The physiological process of a juvenile salmon acclimatizing to saltwater is known as “smolting.” The juveniles become “smolts.” Meador examined the records from hatcheries on major rivers flowing into Puget Sound over the 36 years between 1972 and 2008. Some of the rivers had contaminated estuaries while others were considered uncontaminated. He determined the difference in the chinook smolt-to-adult return rate between rivers with contaminated estuaries and those with uncontaminated estuaries. Meador noted that the smolt-to-adult return rate is the “primary metric to assess life-cycle success.” He did the same analysis for hatchery coho in these rivers. Coho pass quickly through their natal estuaries and so would be far less impacted by contaminants in that estuary. The coho data, Meador clarified, “was used as another line of evidence to test the hypothesis that contaminated estuaries are one of the main factors determining the rate of survival for chinook.” And that’s what he found: Coho survival was not substantially impacted by contamination in their natal estuary. Meador noted that “Salmonid survival is dependent on a large number of factors, many that co-occur. The analysis presented here is simplistic, but highlights an important relationship between hatchery chinook survival and contaminated estuaries. Because this analysis examined the smolt-to-adult survival rate in fish from a large number of hatcheries and estuaries over several years in one geographical location, many of these factors were likely accounted for and therefore had less of an effect on the overall results.” As mentioned earlier, mortality in the first year of an ocean-type chinook is high. Meador described this as follows: “Survival for first-year ocean-type chinook in the Pacific Northwest has been estimated at 0.4 percent. 
Rates of survival over successive years are considerably higher for 2-, 3-, 4-, and 5-year-old fish at 60 percent, 70 percent, 80 percent, and 90 percent, respectively. Clearly, first-year survival is important for chinook, and most of the mortality for first-year ocean-type chinook is attributed to predation, poor growth, pathogens, starvation, and toxicants.” Meador determined whether or not a particular estuary was “contaminated” or “clean” based on existing records of contaminants found in juvenile chinook tissue in that estuary, records of sediment contamination, and whether or not the estuary had been listed as a contaminated site. He noted that most of the data on contaminants he was able to access had focussed on polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs). The scientists did not make their own measurements of contaminants in the estuaries, nor did they speculate on the possible sources of such contamination. They simply compared the statistical differences in survival rates for chinook smolts between apparently contaminated estuaries and apparently uncontaminated estuaries. Meador concluded that “when all data were considered…the mean survival for juvenile chinook released from hatcheries into contaminated estuaries was 45 percent lower than for fish outmigrating through uncontaminated estuaries.” In other words, a contaminated natal estuary causes a nearly two-fold reduction in survival compared with uncontaminated estuaries. Wow. That was quite a discovery: A single factor that doubled the mortality of a threatened species of fish that was known to be the cornerstone of the diet of an endangered species of whale. Meador’s data was confined to juveniles that came from hatcheries. Does his conclusion apply to river-reared chinook? Meador’s study reported that, except for the Skagit River, Puget Sound rivers are all dominated by hatchery-bred chinook. But, for juveniles whose parents spawned in rivers, the effect of contaminants may be even greater than for hatchery-bred fish. Meador noted that “wild juvenile chinook spend approximately twice as long in the estuary as do hatchery fish, which would likely increase their exposure to harmful chemicals.” If the incidence of a contaminated natal estuary was limited to one or two of Puget Sound’s smaller rivers, this effect might not be of too great consequence. But that’s not the case. Some of the Sound’s largest river systems have contaminated estuaries. For example, the Snohomish and Puyallup rivers have the second and third largest drainage areas in the Puget Sound Basin, an indication of their potential for rearing chinook. Two forks of the Snohomish—the Skykomish and the Snoqualmie—have, according to Washington fisheries scientists, the potential for up to 84,000 spawners. But over the last four decades these rivers have been averaging only 4,500, a mere 5 percent of this river system’s potential. Meador’s research suggests this and other rivers’ collective capacity to provide nourishment for a healthy Southern Resident orca population is being cut in half, year after year, by the contamination in their estuaries. But what contamination? The Puyallup River—which once hosted one of the largest chinook salmon runs in Puget Sound—now hosts the Tacoma Central Wastewater Treatment Plant, which is permitted to discharge up to 10,000 kilograms of suspended solids per day into the river’s estuary, habitat critical to juvenile chinook. 
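The survival arithmetic in Meador’s 2013 study is easy to reproduce. The short Python sketch below is not taken from the paper; it simply multiplies together the year-by-year survival rates quoted above and shows what the reported 45 percent reduction looks like when, as the article suggests, the damage is done during the estuary phase of the first year.

# A rough reconstruction of the survival arithmetic described above; not Meador's own code.
# First-year survival for ocean-type chinook is about 0.4 percent, and survival in years
# two through five is roughly 60, 70, 80 and 90 percent respectively.

FIRST_YEAR_SURVIVAL = 0.004                      # 0.4 percent survive year one (clean estuary)
LATER_YEAR_SURVIVAL = [0.60, 0.70, 0.80, 0.90]   # years 2, 3, 4 and 5

def smolt_to_adult(first_year):
    """Fraction of juveniles that survive through year five, given first-year survival."""
    survival = first_year
    for rate in LATER_YEAR_SURVIVAL:
        survival *= rate
    return survival

clean = smolt_to_adult(FIRST_YEAR_SURVIVAL)
# Meador found a 45 percent lower survival rate for fish from contaminated estuaries;
# here the whole reduction is applied to the first year, where the estuary exposure occurs.
contaminated = smolt_to_adult(FIRST_YEAR_SURVIVAL * 0.55)

print(f"return rate, uncontaminated estuary: {clean:.3%}")          # about 0.12 percent
print(f"return rate, contaminated estuary:   {contaminated:.3%}")   # about 0.07 percent
print(f"reduction factor: {clean / contaminated:.2f}")              # close to two-fold

Because the later-year rates are the same in both cases, the 45 percent first-year penalty carries straight through to the number of returning adults, which is why it amounts to a nearly two-fold reduction; by the same arithmetic, doubling first-year survival would roughly double the number of spawners.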
IN 2016, MEADOR PUBLISHED “Contaminants of emerging concern in a large temperate estuary” in the scientific journal Environmental Pollution. The CECs targeted in the study included a long list of pharmaceutical and personal care products, hormones, and a number of industrial compounds. Many of these substances, the authors observed, “are potent human and animal medicines.” They considered their targets to be just a “representative subset” of CECs in the environment, not a comprehensive list of what’s actually there. The scientists estimated there are over 4000 CECs leaking out into the ecosphere.

Meador referenced his earlier study, noting that “juvenile chinook salmon migrating through contaminated estuaries in Puget Sound exhibited a two-fold reduction in survival compared to those migrating through uncontaminated estuaries.” His choice of targets suggests that he suspected secondary sewage treatment plants might be the source of the contamination that is causing that two-fold reduction in juvenile chinook survival. He noted that “some CECs are poorly removed by wastewater treatment plant processing or are discharged to surface waters, including streams, estuaries, or open marine waters due to secondary bypass or combined sewer overflows.” Having found no other research by other scientists along this line of investigation, Meador noted that “bioaccumulation and comparative toxicity to aquatic species constitutes the largest data gap in assessing ecological risk” posed by CECs.

Meador’s team targeted 150 contaminants. They focussed on three estuaries, two considered to be contaminated and one uncontaminated. The two contaminated estuaries (Puyallup River and Sinclair Inlet) each had one or more secondary sewage treatment plants discharging treated effluent into the rivers on which they were located. The third, the Nisqually River estuary, which doesn’t have a sewage treatment plant above it, was intended as a reference—an uncontaminated estuary to establish the extent to which the other two were contaminated. The researchers took water samples from the estuaries and effluent from the treatment plants and analyzed each for the 150 target contaminants. As well, they netted juvenile chinook and Staghorn sculpin from the estuaries and performed whole-body tissue analyses for contaminants.

Eighty-one of the CECs were found in effluent being discharged from the treatment plants; 25 were detected in the estuaries. To the surprise of the researchers, nine of the CECs were detected in the water column of the Nisqually estuary, which they had supposed was uncontaminated. Their data indicated an even more disturbing situation: “Collectively, we detected 42 compounds in whole-body fish. CECs in juvenile chinook salmon were detected at greater frequency and higher concentrations compared to Staghorn sculpin.” Finding more CECs in fish tissue than in estuary water meant juvenile chinook were quickly bioaccumulating the CECs. Moreover, the chinook were absorbing a higher dose of toxins in just a few weeks than were the Staghorn sculpin, which spend their entire lives in the estuary.

Of the targeted contaminants, 37 were found in chinook. This included, from A to Z: Amitriptyline, Amlodipine, Amphetamine, Azithromycin, Benztropine, Bisphenol A, Caffeine, DEET, Diazepam, Diltiazem, Diltiazem desmethyl… well, you get the picture. How might multi-contaminant doses lower the survival rate of juvenile chinook?
The scientists found “several compounds in water and tissue that have the potential to affect fish growth, behavior, reproduction, immune function, and antibiotic resistance,” all of which could lead to early mortality. But they also noted that even if individual contaminants weren’t at a lethal level in tissue or organs, the cumulative effect of so many different contaminants in the juvenile chinook at the same time could very well be lethal—the drug-cocktail effect that so many humans experience, sometimes with fatal results. The scientists put this finding in the context of Puget Sound as a whole: “The greater Puget Sound area contains 106 publicly-owned wastewater treatment plants that discharge at an average total flow about 1347 million litres per day (Washington Department of Ecology (2010)). Our study examined two of these with a combined total of 71 million litres per day. The output for these two wastewater treatment plants alone was on the order of kilogram quantities of detected CECs per day into estuarine waters of Puget Sound. Considering the low percentage of commercially available pharmaceutical and personal care products analyzed in this study and the amount of effluent discharged to Puget Sound waters, it is possible that a substantial load of potentially harmful chemicals are introduced into streams and nearshore marine waters daily. If the concentrations from the two studied effluents are representative of that from other wastewater treatment plants in Puget Sound, then it is reasonable to assume that inputs to streams and nearshore waters are substantial and likely on the order of 121 kilograms per day (approximately 44,000 kilograms annually) and even higher if secondary treatment bypass, permitted flows, maximum outputs, unmeasured compounds, septic system contributions, and transboundary contributions are considered.” Some of Puget Sound’s largest secondary sewage treatment plants. There are 106 publicly-owned sewage treatment plants in the Puget Sound Basin. Many are located on or near to the natal estuaries of threatened chinook salmon runs. All of Puget Sound is considered to be an estuarine ecosystem. The data the scientists collected contained another ominous finding. The concentrations of the targeted contaminants found in the effluent from the treatment plants were unexpectedly high, by American standards. Meador found that “a large percentage of the chemicals detected in Puget Sound effluents are among the highest concentrations reported in the US, which may be a function of per capita usage of these compounds or the treatment processes used at these wastewater treatment plants.” One final, noteworthy point: In the estuary that was thought to be uncontaminated—the mouth of the Nisqually—the researchers found 9 of the targeted contaminants in estuary water and 13 in chinook. Meador observed, “Based on our water and fish data, the Nisqually estuary was more contaminated than expected, which highlights the difficulties of establishing suitable non-polluted reference sites for these ubiquitously distributed CECs.” This observation has an interesting implication with respect to Meador’s earlier study, mentioned above, in which he was comparing the survival rates of juvenile chinook between contaminated estuaries and those considered uncontaminated. The Nisqually estuary was on the “uncontaminated” side of the ledger in that study, but on investigation it was, in reality, merely less contaminated. 
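Before returning to that comparison, it is worth unpacking the flow-scaling behind the “121 kilograms per day” figure quoted above. The sketch below is a reconstruction rather than the study’s own calculation: it assumes, as Meador’s team did, that the two sampled plants are representative, and the 6.4 kilograms per day used for those two plants is back-calculated from the published Sound-wide estimate, so treat it as illustrative.

# Flow-scaling behind the Sound-wide CEC load estimate quoted above (a reconstruction,
# not the study's own calculation).

TOTAL_FLOW_ML_PER_DAY = 1347     # average discharge of all 106 publicly-owned plants
SAMPLED_FLOW_ML_PER_DAY = 71     # combined discharge of the two plants actually sampled

# The study reports "kilogram quantities" of detected CECs per day from the two sampled
# plants; this value is back-calculated from the published 121 kg/day estimate and is
# used here only to show how the scaling works.
SAMPLED_CEC_LOAD_KG_PER_DAY = 6.4

scaling_factor = TOTAL_FLOW_ML_PER_DAY / SAMPLED_FLOW_ML_PER_DAY        # about 19x
soundwide_load = SAMPLED_CEC_LOAD_KG_PER_DAY * scaling_factor           # about 121 kg/day

print(f"scaling factor: {scaling_factor:.1f}x")
print(f"estimated load to Puget Sound waters: {soundwide_load:.0f} kg/day, "
      f"or roughly {soundwide_load * 365:,.0f} kg/year")                # about 44,000 kg/year

As the study itself notes, this is a floor rather than a ceiling: it covers only the targeted compounds and leaves out secondary-treatment bypasses, unmeasured chemicals, septic systems and transboundary sources.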
Would Meador’s finding of a two-fold difference in mortality have been even larger if he had actually had a number of pristine estuaries to compare with those that are contaminated?

IN AN EARLIER STORY (“Washington’s phony sewage war with Victoria,” Focus, May 2016) we reported on the 32.4 million kilograms of suspended solids permitted to be discharged by 77 of Puget Sound’s largest wastewater treatment plants each year. Attached to those solids are many contaminants, including PCBs and PBDEs, not targeted by Meador’s study, but known to have a negative impact on the health of fish and their sources of food. The additional impact on chinook smolts, after they leave their natal estuaries and migrate through this near-shore chemical soup—dubbed “Poisoned Waters” by the 2005 PBS film of that name—is hinted at by the Puget Sound Basin’s 10-fold decline in chinook returns from historic numbers. As the urbanization of Puget Sound’s shores has spread, and the daily recontamination of marine and estuarine waters has grown, the chinook and the Southern Resident Killer Whales have been pushed closer and closer toward extinction. This intense urbanization—right beside the critical habitat of both whales and their prey—is not occurring for the Northern Resident Killer Whale population, and that difference may be the deciding factor in the different birth rates of the two populations.

Given the seriousness of the situation and the headlines in the media about drugged fish in Puget Sound, one might have reasonably expected that Washington State’s political leaders would respond to Meador’s findings. After all, what Everett-Seattle-Tacoma residents were flushing down their toilets into Puget Sound by way of sewage treatment plants was doubling the rate of mortality of a fish already listed as threatened under the Endangered Species Act. They did respond, but apparently only to deflect attention away from Puget Sound’s contamination from sewage plants. To do that they pointed at…Victoria.

Just two days after an embarrassing drugged-chinook story appeared in the Seattle Times, Washington State Representative Jeff Morris boldly announced a proposal to ban Washington State employees from claiming travel expenses for trips made to Victoria until Victoria built a sewage treatment plant just like the ones around Puget Sound. A week later, Morris sent a letter to Victoria Mayor Lisa Helps claiming that “chemical loading” from Victoria’s marine-based sewage treatment system poses a “long-term risk” to “our shared waters.” Morris’ letter was signed by 36 other Washington legislators whose districts border on Puget Sound. The legislators’ letter informed Helps: “We recognize the shared risk in short-term loss of tourism activity on both sides of the border from publicity surrounding [Victoria’s lack of secondary sewage treatment]. However, we believe the long-term damage to marine mammals, in particular, but all marine wildlife, does more long-term damage to ecotourism.”

Washington State Representative Jeff Morris

Morris’ idea that extinctions should be prevented because they’re bad for tourism highlights the gap between a politician’s level of understanding of this critical issue and the depth of knowledge that has been created by scientists like Wasser and Meador.
If State legislators were drawing up an action plan for the recovery of Puget Sound, they could do worse than to put on their list: “Read some science about contamination.”

The Washington legislators’ proposal to discourage State employees from travelling to Victoria—a move they didn’t follow through on—wasn’t the only action precipitated by Meador’s science. There was a bureaucratic response as well. The Puget Sound Partnership (PSP), which describes itself as “the State agency leading the region’s collective effort to restore and protect Puget Sound,” undertook two related “actions” after Meador’s study had been published. One of those was “Action 0156,” which directed the University of Washington to conduct an “analysis of impacts…from Victoria, BC sewage.” Nowhere to be found on PSP’s long list of actions was any analysis of the impacts from the 106 publicly-owned sewage treatment plants around the Sound that are permitted to discharge over 32.4 million kilograms of suspended solids each year.

The PSP also committed to “Action 0048,” which was “Identifying sources of contaminants harmful to juvenile salmon.” PSP reports that after the expenditure of $273,000, the project is “off-schedule.” Contacted by Focus, the Washington State Department of Ecology—the agency responsible for undertaking the analysis—clarified that the study “was not actually funded.” It appears that little else on the “Action” list for the Sound’s recovery is funded, either. PSP estimated its list of “Actions” for 2016 would cost $130 million, but acknowledged that only $17 million of that had been found. Washington’s Department of Ecology confirmed that, as of 2016, the State had no plans to upgrade or relocate any of the existing large sewage treatment plants on Puget Sound.

Washington State says it’s committed to the recovery of Puget Sound. That would require the State to act on its scientists’ findings about the ecological impacts of ongoing contamination from its sewage treatment facilities. Unfortunately, the State’s current course doesn’t appear likely to produce anything that the Southern Resident Killer Whales will be able to chew on.

David Broadland is the publisher of Focus Magazine.
  2. September 2017

To create a realistic pathway to a low-carbon regional transportation system, science—not activism—needs to lead the way forward.

IT HAD LONG BEEN MY UNDERSTANDING that cycling—all on its own—would become a significant part of the solution for reducing local transportation emissions. However, when I used the Capital Regional District’s most recent comprehensive travel survey to estimate the relative amount of work done by each form of transportation at the regional level, I was flabbergasted to find that cycling accounts for such a tiny share: 1.5 percent in 2011. The amount of work done by each transportation mode can only be compared when you consider the total distance travelled each day by CRD residents using each type of transport.

Replacing the work done by fossil-fuelled automobiles is essential if we’re going to reduce emissions. But how much of that work can be replaced by humans exerting themselves by cycling or walking instead of driving? More than is currently the case in our region, no doubt. But when we consider how to shift enough of the work done by automobiles to more energy efficient modes of transportation, like walking, cycling, and transit, the magnitude of the challenge facing us becomes clear. There has to be a huge shift in how people move around, quickly. Why time is such a critical part of the equation should be obvious, and the Trudeau government’s announcement late last year of a mid-century emissions goal establishes the rate of descent for making reductions.

The perplexing question is: What do we shift to? Cycling and walking are part of the solution, but there needs to be a massive shift of the work done by cars to public transit. If other places that have already made this change are any indication of what Victoria will choose to do, the role of cycling and walking will largely be for making the first short leg of a trip made by public transit. While we’re seeing local governments create isolated pockets of inordinately expensive improvements for cycling, there’s little evidence that the region is on the verge of making sensible (let alone massive) investments in public transit. I pointed this out in the last edition in “Mayor Helps’ 1.5 percent solution,” which was subtitled, “Local government’s response to reducing transportation emissions may be wishful thinking. Or foolish.”

New two-way protected cycling corridor in Downtown Victoria

Responding at a local level to the existential threat posed by climate change, rising sea level and ocean acidification—all caused by carbon emissions—will be a transformative, Herculean task that requires constant, difficult conversation about the path we should be on. If we Earthlings don’t do this work—including the conversations—we’re cooked.

What is the task facing us? According to the CRD, 55 percent of emissions generated in the region come from fossil-fuelled vehicles. Unless there is a significant and quick decline in their use, the planet will be at increasing risk of runaway warming. We simply can’t take a long-term approach to this shift. How rapidly do we need to act? The Trudeau government’s overall emissions goal is to lower them by 80 percent (compared with 2005 levels) by 2050. As yet, no targets have been set for individual economic sectors, but it’s reasonable to assume that the transportation sector’s contribution would have to be on the order of 80 percent, give or take a few percentage points.
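A rough sense of the rate of descent implied by that goal can be had with a few lines of arithmetic. The sketch below is illustrative only: it assumes current emissions are roughly at 2005 levels and that reductions follow a straight line starting in 2016, assumptions the article does not spell out, and under them the required cut by 2030 lands at roughly a third, in line with the “about 34 percent” figure used in the rest of this story.

# Illustrative check on the descent rate implied by an 80 percent cut (relative to 2005)
# by 2050. Assumes emissions today are roughly at 2005 levels and fall in a straight line
# from 2016 onward; these are simplifying assumptions, not figures from the article.

BASELINE = 1.0                      # 2005 emissions, normalized
START_YEAR, END_YEAR = 2016, 2050
TARGET_2050 = 0.20 * BASELINE       # 80 percent below the 2005 baseline

def emissions_on_linear_path(year):
    """Emissions in a given year on a straight-line path from START_YEAR to END_YEAR."""
    progress = (year - START_YEAR) / (END_YEAR - START_YEAR)
    return BASELINE + progress * (TARGET_2050 - BASELINE)

cut_by_2030 = 1 - emissions_on_linear_path(2030) / BASELINE
print(f"cut required by 2030 on a straight-line path: {cut_by_2030:.0%}")   # about 33%

Different choices of start year and baseline move the result by a few percentage points either way, which is why the figure is described throughout as “about 34 percent.”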
To be on the most gradual descent that would get Canada to that goal, transportation emissions, and those from other sectors, would need to be reduced by about 34 percent over the next 12 years. Canada’s mid-century emissions target, announced in late 2016, means an overall emissions reduction of 34 percent by 2030—12 years from now. To put that time frame into perspective, consider that the City of Victoria started the process to replace the Johnson Street Bridge in 2008. It will, hopefully, open for traffic in 2018, ten years later. The amount of time left before 2030 is only a little longer than the City of Victoria needed to build a 156-metre-long bridge. What would this rapid transformation mean for drivers of fossil-fuelled cars in Victoria? Collectively, over the next 12 years, we will have to either drive 34 percent less distance each day, get new vehicles that use, on average, 34 percent less fuel, shift 34 percent of our travel to non-fossil-fuel modes of transportation, or employ a strategy that combines some or all of these. What is the CRD’s plan for responding to the goals announced by the Trudeau government in late 2016? In its already-outdated 2014 Regional Transportation Plan (RTP), the CRD noted: “Long-term transportation planning efforts and investments are therefore needed to help reduce GHG emissions and adapt to a changing climate—both requirements are fundamental principles to all of the themes elaborated in this RTP. This means focusing on integrating land use and transportation planning to support sustainable transportation choices and reduce trip distances.” The CRD’s short-term plan is to double ridership on public transit by 2030 and build more cycling and pedestrian infrastructure. Will this suffice to meet our national emission reduction target? The short answer is a definite “No.” I’ll show you the arithmetic for that conclusion later on. In “Mayor Helps’ 1.5 percent solution” I used the CRD’s most recent and most comprehensive survey of the region’s transportation system, done in 2011. It showed that autos accounted for 88 percent of the distance travelled in the CRD each day. By comparison, public transit accounted for 7.1 percent, walking 1.7 percent, and bicycles 1.5 percent. I questioned whether the CRD’s plan would be able to significantly shift the share of the work being done by the various modes of transportation enough to significantly reduce emissions. These numbers baffled cycling advocates, who were more familiar with “mode share” to describe cycling’s contribution to our transportation needs. Mode share is a way of comparing the number of individual trips made by each form of transportation in a day. Using mode share, both a 3-kilometre trip on a bicycle and a 10-kilometre drive in a car are given equal weight. Although the CRD’s 2011 information shows bicycling had a mode share of 2.8 percent in the region, in certain places and for certain trip purposes, such as commuting to work in the City of Victoria, cycling’s mode share can be considerably higher. The Victoria area isn’t much different from Vancouver, where cycling accounts for about 1 percent of total distance travelled. Notably, Metro Vancouver’s equivalent of the travel study done by the CRD includes such information, whereas the CRD does not. 
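The difference between the two measures is easiest to see with a small, made-up example. The trip counts and distances below are invented purely for illustration, not drawn from any survey; they show how a mode can have a respectable mode share while doing very little of the transportation work that actually burns fuel.

# Mode share versus share of total distance travelled, using invented numbers.
# These trip counts and lengths are for illustration only; they are not CRD data.

daily_trips = {
    # mode: (number of trips, average trip length in kilometres)
    "car":     (100, 10.0),
    "transit": (12, 8.0),
    "walk":    (10, 1.0),
    "bicycle": (8, 3.0),
}

total_trips = sum(count for count, _ in daily_trips.values())
total_km = sum(count * length for count, length in daily_trips.values())

for mode, (count, length) in daily_trips.items():
    mode_share = count / total_trips
    distance_share = (count * length) / total_km
    print(f"{mode:8s}  mode share {mode_share:5.1%}   distance share {distance_share:5.1%}")

# With these numbers the bicycle has about a 6 percent mode share but covers only about
# 2 percent of the kilometres. That is the same kind of gap the CRD's 2011 survey shows
# (2.8 percent of trips versus 1.5 percent of distance), and it is the kilometres shifted
# out of fossil-fuelled cars, not the trip count, that determines the emissions saved.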
Share of total distance travelled by each mode of travel (Source: 2011 Metro Vancouver Regional Trip Diary Survey Analysis Report)

Presenting basic information about the work done by components of transportation systems in this way might be discouraging to cyclists. However, when the primary consideration is reduction of emissions, “mode share” provides no useful information. As laid out in the CRD’s emissions reduction plan, the task will be to shift some fossil-fuelled auto use to a combination of transit, cycling and walking. Only by including the distance travelled, which reflects all the current realities about where people live, study, work and play and how far they have to travel each day to accomplish what they need to do, can we gauge how much energy needs to be shifted from autos to other modes. To put it as plainly as possible, a 34 percent reduction in emissions would require, after factoring in small increases in fuel efficiency and a small shift to electric vehicles, a shift of about 25 percent of the distance travelled in fossil-fuelled autos to non-fossil-fuelled modes over the next 12 years. I’ll elaborate on this later.

AS MENTIONED ABOVE, my use of “total distance travelled” to compare the current energy contribution of different modes baffled cycling advocates. Former City of Victoria councillor John Luton, who has played a lead role in promoting cycling infrastructure projects in the region, wrote on Facebook, “Stories emerging from unreliable sources claim that CRD numbers show that only 1.5 percent of trips in the region are bicycle trips.” Luton went on to state, “Promoters of this theory are dishonest or unable to understand statistical information…The premise used to sell this fairy tale is that total mileage equals number of trips. That is false.…lying about the numbers is not a useful contribution to these discussions.” Edward Pullman, president of the board of directors of the Greater Victoria Cycling Coalition, responded to Luton: “Spot on John. By focussing exclusively on total distance travelled, folks that commute long distances become more important than those that live closer to their destinations. It’s a bizarre perversion of commuter choices.” Contacted by email, neither Luton nor Pullman could explain what their comments had meant. The story did not propose that “total mileage equals number of trips,” as Luton claimed.

Former MLA and cycling advocate David Cubberley asserted: “There are no useful analytics involved in focussing on total distance travelled.” In a letter to Focus, Paul Rasmussen wrote, “Using the percentage of total miles travelled by mode… seems designed to minimize the positive impact of cycling.” The idea that our story was intentionally “designed to minimize the positive impact of cycling” occurred to other readers, as well. Transportation planning consultant Todd Litman wrote a lengthy response to our story in an online blog in which he claimed I had written that bicycle lanes were “wasteful” and “unfair to motorists.” On the basis of those claims—neither of which was made in our article, or intended—Litman continued to assert what possessing such beliefs must indicate about the writer, including this zinger: “Critics like Broadland imply that cycling facilities only benefit a small number of serious cyclists—those who ride expensive racing bikes wearing lycra.” Nothing like that, though, was either stated or intended in our story.
Luton, Pullman, Cubberley, Rasmussen and Litman are all in a position to influence the CRD’s plan for reducing emissions and the expenditure of many millions of dollars in public resources, yet none of them seemed able to understand what the CRD’s own numbers say about the magnitude of the energy shift that will be required to meet the federal target. Instead, they mounted a defense of cycling on the basis of other details we reported—or didn’t report—about the new Pandora Avenue protected bike lanes. Litman complained: “By extrapolating the Pandora bike lane cost to other Downtown arterials, Broadland estimates that Victoria’s cycling program will cost $16 million, which is almost certainly an exaggeration since the first project is always more costly than those that follow.” But the City’s record of underestimating and hiding project costs is a matter of public record. For example, when City councillors voted to replace the Johnson Street Bridge in 2009 they understood the project would cost $40 million. It’s now close to triple that. A more prudent reporter would have pushed the City’s bike lane estimate much higher. I simply extended the City’s actual cost per kilometre for the Pandora lanes—which was higher than the City’s budget estimate—to the full length of the protected corridor it plans to build. Merely reporting the likely cost of the planned Downtown protected network was, it seemed, enough to set the cycling advocates’ sense of fairness on fire. Rasmussen wrote, “Broadland criticizes the cost of the project—which he claims will be $16 million—over twice as much as the City says it will cost. In the eight years I’ve lived in Victoria, this is the first time that any entity has spent any significant amount of money on bike infrastructure. Meanwhile, just off the top of my head, I can count three significant projects for automobile traffic within the CRD in the last few years—the McTavish Interchange at $24 million, the Johnson Street Bridge project at $100 million and counting, and the McKenzie Interchange project at least $85 million. So that’s at least $210 million for car infrastructure just in major projects. Maybe even $16 million for something that promotes a clear social good isn’t so much?” Rasmussen could have included the $30-million Leigh Road Interchange (aka The Bridge to Nowhere) in Langford on that list, but let’s examine his claim a little more closely. The cost of the new McKenzie interchange, for example, includes the cost of space for cyclists, pedestrians and public transit. The new Johnson Street Bridge also includes space for those three non-car modes. In fact, 53.5 percent of the bridge’s available deck space is dedicated to pedestrians and cyclists. If the final cost of the bridge is $115 million—which it will be once hidden and as-yet undetermined costs for landscaping and additional protective fendering are included—should 53.5 percent of that cost be assigned to cycling and walking? That would be $62 million. Moreover, the public record of how this project unfolded shows that cycling advocates greatly overstated the extent to which the old bridges were being used by cyclists and their exaggerations helped to inflate the project into the public works nightmare it has become (See “Juking the stats,” Focus November 2011). 
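For readers who want to check them, the two cost figures in that passage come from simple multiplication. The sketch below reproduces the arithmetic; note that the roughly $3-million-per-kilometre rate is back-calculated from the $16-million estimate over the planned 5.3-kilometre Downtown corridor (the article gives the estimate, not the rate), so it is illustrative rather than an official number.

# Reproducing the two cost calculations discussed above. The per-kilometre rate is
# back-calculated from the $16-million estimate and is illustrative, not an official figure.

# 1. Extending the Pandora Avenue per-kilometre cost to the planned Downtown network.
NETWORK_KM = 5.3
NETWORK_ESTIMATE = 16_000_000                   # the article's extrapolated estimate
implied_cost_per_km = NETWORK_ESTIMATE / NETWORK_KM
print(f"implied cost per kilometre: ${implied_cost_per_km:,.0f}")          # about $3 million

# 2. Assigning Johnson Street Bridge costs in proportion to deck space, as the article does.
BRIDGE_COST = 115_000_000                       # projected final cost cited in the article
WALK_CYCLE_DECK_SHARE = 0.535                   # share of deck space for pedestrians/cyclists
print(f"bridge cost attributable to walking and cycling deck space: "
      f"${BRIDGE_COST * WALK_CYCLE_DECK_SHARE:,.0f}")                      # about $62 million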
Comparison of the space for autos (red) and cyclists and pedestrians (green) on the new Johnson Street Bridge (Source: PCL drawing)

In Litman’s response to our story he wrote, “Cyclists just want a fair share of public resources (transportation funding and road space). What would be fair? You could argue that it should be about equal to cycling’s mode share: if 5 percent of trips are by cycling then it would be fair to invest 5 percent of public resources in cycling facilities. But this is backward looking since it reflects the travel patterns that occur under current conditions, ignoring ‘latent demand,’ the additional cycling trips that some travellers want to make but cannot due to inadequate facilities. To respond to these demands it would be fair to invest the portion of money and road space that reflects the mode share after those programs are completed; if comprehensive planning is likely to result in 10 percent cycling mode share, it would be fair to invest 10 percent of transportation funds and road space in cycling facilities.”

Litman’s point isn’t particularly relevant to a discussion focussing on whether proposed bicycle and LRT infrastructure will effectively address emissions reduction, but it’s worth exploring. The record at the City of Victoria shows that transportation infrastructure decisions have been wonky, but not in the direction Litman claims. Again, consider the new Johnson Street Bridge. In the only reliable survey comparing trips across the bridge—published in a 2010 economic assessment used by the City to promote a new bridge—cycling and walking accounted for about 6 percent of mode share during periods of the year when those modes are at their peak. In the winter that share drops. Yet the new bridge will provide them with over 53 percent of the available deck space. So far there is no evidence to suggest mode share for cycling and walking will ever reach 53 percent, but they got it anyway.

The City of Victoria Engineering Department’s traffic counts on the Johnson Street Bridge used in a 2010 economic impact analysis to support a new bridge: Autos on left, buses centre, bicycles on right.

Reading the various responses to our story, I got the strong impression that cyclists were not willing to consider the story’s core idea: Transportation infrastructure decisions need to more strongly reflect the urgent need to reduce transportation emissions, and we need better, more timely information on vehicle use in the CRD in order to gauge the effectiveness of the strategies that are being employed to reduce emissions. By “better” I mean more trustworthy information, the gathering of which is insulated from the influence of special interest groups like the Greater Victoria Cycling Coalition, engineering and project management corporations, or current and former politicians.

In email exchanges with Litman and others, it emerged that, in their minds, Focus had written the wrong story. The cycling advocates were furious that our article focussed so narrowly on the issue of emissions reduction rather than fully explaining all the other benefits that more cycling infrastructure would bring, such as cleaner air, greater personal safety for cyclists and a reduction in vehicle congestion. Litman wrote, “Public investments should be evaluated based on total benefits and costs.
My report, ‘Evaluating Active Transportation Benefits and Costs’ (vtpi.org/nmt-tdm.pdf ) provides a framework for doing just that: it identifies about a dozen categories of impacts (benefits and costs) that should be considered when evaluating walking and cycling policies and programs, including direct impacts on users, and indirect impacts on society. Your column only considered two benefits: increased user safety and climate change emission reductions. That is grossly incomplete and undervalues cycling improvements.” Our story, in fact, made no attempt to examine “increased user safety” beyond presenting Mayor Helps’ publicly stated position. Nor was it our purpose to present any of cycling’s other benefits. Our focus was on emissions reduction and getting better information. Litman encourages us to evaluate cycling infrastructure on the basis of total benefits and cost, but this would be an exceedingly speculative endeavour. Consider cost. The 2011 CRD Pedestrian and Cycling Master Plan—the only plan for building cycling infrastructure in the member municipalities of the CRD—estimated the cost of a region-wide bicycle network at $275 million. But that plan didn’t include any cycling improvements on Pandora Street. Yet it’s still the “Master Plan.” Indeed, the plan estimated costs of $3.3 million for 22.7 kilometres of “priority” bike lanes in the City of Victoria. But that’s a lower cost than the actual cost incurred for only 1.4 kilometres of protected bike lanes on Pandora (which wasn’t in the plan). And, optimistically, the plan estimates the cost of “all projects” (54.7 kilometres) in the City of Victoria at $12.4 million. Yet that won’t even cover the four legs of the 5.3-kilometre-long protected network in the Downtown core. The plan’s estimates for other municipalities seem even wilder, if that’s possible. For example, it put the cost of 26.5 kilometres of bikeway in View Royal at $36 million. Why would $36 million be spent way out in View Royal and only $12.4 million in Victoria? By the way, the consultant who wrote the CRD’s Master Plan lived in Oregon. Even if we did have a good grasp of the benefits an advanced cycling network might provide, the cost estimating that has been done so far is deeply flawed. So how can a useful cost-benefit analysis be conducted? Again, the CRD needs more trustworthy information gathered by a process that’s insulated from special interest groups. In any case, cyclist-centric claims about mode share, costs and fairness—and the backlash from other parts of the community those claims generate—are diversions for which we no longer have time. Shouldn’t the choice about how to transform our transportation system be simpler than that? Shouldn’t it be: Are we going to make a serious attempt to meet the federal emissions target or not? If we are, what do we need to do to accomplish that? Personally, I’m not interested in writing about all the benefits of a “sustainable” transportation system if that system won’t come anywhere close to meeting our 2030 emissions reduction target. So here’s the crux of the problem: The emissions reduction potential of an improved cycling network, if that’s all that’s executed, is limited. 
A paper published by Litman quoted results from “a detailed study of five US communities with active transport improvements” which found the improvements resulted in a reduction of “one to four percent of total automobile travel.” A “one to four percent” reduction would be the equivalent of rearranging the deck chairs as the ship is sinking. We need a 34 percent reduction in 12 years. Let’s shift back to what our regional transportation system would need to look like by 2030 so that we could meet that target. To get a clearer picture, let’s start in the Netherlands.

The Netherlands has invested billions of dollars in public transit and infrastructure for bicycles and pedestrians. Is this a solution for Victoria?

STATISTICS NETHERLANDS REPORTS that, in 2015, with 1.1 bicycles for each of its nearly 17 million inhabitants, that country had “the highest bicycle density in the world.” Featured prominently in its depiction of that country’s transportation system is a chart showing the percentage that each different mode contributed to transportation of people on land—bicycles, cars, buses, trains, walking, etcetera. Percentage of what? The percentage of the total distance travelled:

Domestic distance travelled by transport mode in the Netherlands (Source: Statistics Netherlands)

According to Statistics Netherlands, cars accounted for 73 percent of the total distance people travelled within their country. Public transit provides 12 percent, bicycles 7 percent and walking 3 percent. The City of Amsterdam, considered to have the greatest regional participation in cycling of any large European city, also publishes comparisons of the extent to which each transportation mode is used within that city, both by mode share and total distance travelled:

Mode share (left) and share of total distance travelled (right) in the City of Amsterdam (Source: City of Amsterdam)

The combined mode share for cycling and walking amounts to 54 percent (30 + 24). Yet when the total-distance-travelled lens is applied, together they account for 14 percent (12 + 2). The Dutch, rightfully proud of their extensive use of bicycles for transportation, have no problem being transparent about how much of the work of transporting people is done by each mode. Cars, at 54 percent, still account for the majority of the work done. (According to TomTom, an Amsterdam-based company that measures vehicle congestion all over the globe, Amsterdam’s traffic congestion is increasing; it’s already at a level higher than many American cities.)

In the CRD, 88 percent of that work is being done by cars. The 34 percentage points of difference between Victoria’s and Amsterdam’s reliance on fossil-fuelled cars to transport people is, completely coincidentally, equal to the shift Victoria would need to make by 2030 to be on a path that would meet the federal mid-century goal. In other words, Victoria would need to become Little Amsterdam (Amsterdam has a metropolitan population of 1.6 million, Victoria’s is 368,000) within 12 years—the equivalent of a moonshot. Amsterdam’s achievements, it should be noted, include extensive bus, tram, metro and railway networks which provide the means to extend the length of a trip that a person starts and ends as a pedestrian or a cyclist. This achievement has taken many decades and many billions of dollars. For example, the city’s 73 kilometres of underground metro lines have a current value of $30-40 billion.

Amsterdam’s highly developed public tramway, metro and railway system. Bus routes aren’t shown.
Estimated cost? Unknown, but the 9.5-kilometre North-South Line (shown by the blue line), a new metro line currently under construction, will cost the equivalent of $4.6 billion CAD. What would Victoria need to do to knock 34 percent off its emissions tally? Let me take you through that exercise, but keep in mind that this is an arithmetical exercise performed only to provide you with a sense of the magnitude of the challenge we face. To do it we need to start with some basic assumptions. First, let’s assume 4 percent of fossil-fuelled auto travel in the CRD shifts to electric cars over the next 12 years (it’s currently less than 1 percent). That would take care of 4 percent of transportation emissions and our reduction requirement would fall to about 30 percent. If there’s a quick breakthrough in super-capacitor technology, which could replace the lithium ion batteries currently used in electric vehicles, this shift could eventually be much higher. But even such an unexpected breakthrough wouldn’t have a big impact over the next 12 years. Secondly, let’s assume there will be only minor emission reductions as a result of people using cars with higher fuel efficiency. In the USA earlier this year, Trump ordered a review of Obama’s regulations requiring much greater fuel efficiency by 2025. There’s broad expectation in the US that those standards will be rolled back, partly because car manufacturers have made the case that Obama’s regulations can’t be met without making cars unaffordable. Canada harmonizes with the US on such matters, so higher fuel efficiency seems like a long shot. Still, let’s include a conservative five percent reduction in car emissions due to fuel efficiency gains by 2030. Now we’re down to the need for a 25 percent reduction from taking other actions. Most people are aware of the need to reduce emissions and believe they already limit their travel to only what’s essential. That leaves government only one option: somehow persuading drivers to replace 25 percent of their current auto travel with a combination of public transit, bicycling or walking. How will we be persuaded? There would be no need for a carbon tax if people would voluntarily limit their auto use to the level governments told them was necessary. But we’re not like that, so implementation of a much higher carbon tax to start pushing the most cost-sensitive drivers out of their cars would have to occur soon. The Province’s account of BC’s emissions shows the current level of the carbon tax doesn’t appear to be having much bite, especially with gas prices as low as they are. So our last assumption is that much more serious fuel-cost persuasion will begin soon. With current total travel by autos in the CRD running at approximately five million kilometres each day, 25 percent of that—or 1.25 million kilometres per day—would need to be shifted from cars to buses, walking and cycling. However, in reducing the distance driven by autos by 25 percent, we would also likely displace 25 percent of the 1 million kilometres travelled in autos by passengers each day. So the shift to public transit, walking or bicycles would need to amount to about 1.5 million kilometres per day. Doubling the mode share of buses by 2030—the CRD’s stated goal—would cover about 500,000 kilometres of the required shift. The remaining 1 million kilometres of the shift would fall to walking and cycling. 
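For readers who want to check the arithmetic, here is a minimal sketch of the exercise above, using only the rough figures already stated in the text; nothing in it is new data.

```python
# Back-of-envelope check of the mode-shift arithmetic described above.
# Every input is a rough figure stated in the text, not new data.

REDUCTION_TARGET = 0.34      # required cut in transportation emissions by 2030
EV_SHIFT         = 0.04      # assumed share of auto travel switching to electric cars
EFFICIENCY_GAIN  = 0.05      # assumed cut in car emissions from better fuel efficiency

DRIVER_KM_PER_DAY    = 5_000_000   # approx. auto-driver kilometres travelled in the CRD each day
PASSENGER_KM_PER_DAY = 1_000_000   # approx. auto-passenger kilometres each day
BUS_DOUBLING_KM      = 500_000     # kilometres per day covered by doubling bus mode share

# Reduction that still has to come from replacing car travel outright
remaining_cut = REDUCTION_TARGET - EV_SHIFT - EFFICIENCY_GAIN          # 0.25

driver_km_shifted    = remaining_cut * DRIVER_KM_PER_DAY               # 1,250,000 km/day
passenger_km_shifted = remaining_cut * PASSENGER_KM_PER_DAY            #   250,000 km/day
total_shift          = driver_km_shifted + passenger_km_shifted        # 1,500,000 km/day

walk_cycle_shift = total_shift - BUS_DOUBLING_KM                       # 1,000,000 km/day

print(f"Total daily shift away from autos: {total_shift:,.0f} km")
print(f"Left over for walking and cycling: {walk_cycle_shift:,.0f} km")
```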
When added to their current levels, that would mean that cycling and walking would account for about 1.2 million kilometres each day, or about 18 percent of the total distance travelled—in just 12 years time. Now compare that with Amsterdam. Its combined total for bicycles and walking is 14 percent of the total distance travelled—a level that has taken several decades and billions of dollars invested in infrastructure for walking, bicycles, buses, subways, trams and commuter rail. Moreover, Amsterdam has packed 1.6 million people into an area about the same size as Victoria’s metropolitan area. That high population density, over four times Victoria’s, is essential for the financial viability of Amsterdam’s expansive, complex and costly public transit system. For the CRD’s vaguely-outlined plan to work, the distance travelled by cycling and walking would have to increase by about 600 percent (over levels in 2011) within 12 years. For a City with a steadily aging population and a so-so transit system, is this realistic? Has the CRD come up with the moonshot plan that will reduce the region’s transportation emissions by 34 percent within 12 years? So far, only minimal information has emerged into public view about how the region’s public transit system will evolve so its mode share doubles by 2030. What seems evident is that the rationale stated in the CRD’s Regional Transportation Plan for very expensive rapid transit is much more of a response to brief periods of traffic congestion—along the Trans Canada Highway out to Langford, and the Pat Bay Highway out to Sidney, during peak commuting periods—than it is a response to the need to cost-effectively reduce emissions throughout the day. The assumption that such congestion will continue on the Trans Canada, even after the new McKenzie Road interchange is complete, is founded on the debunked theory that most future growth in the region will occur in Langford. The 2016 census data shows that over the past 15 years—Langford’s glory years—the Core’s share of the metropolitan population has hardly changed, dropping from 68 percent to 65 percent. That strongly suggests the best place to focus future investment in public transit is where most of the people already live—in Victoria and Saanich. Instead, the CRD could be the first government in history to plan for an LRT to Nowhere. After the next 12 years, of course, the same rate of shift from autos to public transit, cycling and walking would have to continue—right through to 2050. Keep in mind, too, that transportation emissions in Canada amount to about 24 percent of total emissions, so to be on the most gradually descending path to 2050, all the other sectors would need to be reducing their emissions as well. That will impact all of our lives in ways that, at this point, we haven’t yet imagined. But unless we do it—according to the world’s best scientific minds—we’re cooked. Is Victoria’s political culture up to the task of getting us through this daunting challenge? The short answer may lie in the record of the attempt to build a new Johnson Street Bridge. An even more chilling possibility is hinted at by the misplaced effort to convert Victoria’s safe, source-controlled, low-cost, tidal-powered marine-based sewage treatment system to a land-based system that will cost Victorians billions of dollars over the life of the infrastructure that’s being built. 
According to DFO scientists, land-based sewage treatment will have a negligible effect on environmental conditions in the Strait of Juan de Fuca. The existing marine-based system was endorsed by an overwhelming number of Victoria’s marine scientists and current and former public health officials. One of the DFO scientists I spoke with during those deliberations was Sophie Johannessen, the lead author of the peer-reviewed study that found land-based treatment would have a negligible effect on environmental conditions in the Strait. I asked Johannessen if there was anything the community could do that would have a more positive effect on marine ecosystems than moving Victoria’s marine-based sewage treatment system onto land. “I think so, yes,” Johannessen said. “We could reduce our greenhouse gas emissions, enact source control for persistent contaminants, and reduce other local pressures on the marine biota.” The local political culture didn’t listen to the scientists. Instead it followed Mr Floatie to Seattle and started the never-ending process of flushing billions of dollars down our toilets. On atmospheric emissions, the scientists have spoken loudly and clearly: there’s a pressing need to act. In response, will our politicians be led by special interest groups? Or will their decisions be based on science and evidence? David Broadland is the publisher of Focus Magazine.
  3. July 2017 Local government’s response to reducing transportation emissions may be wishful thinking. Or foolish. IS THE CITY OF VICTORIA’S STRATEGY to create protected bike lanes in the Downtown core a well-thought-out plan to make bicycling safer, relieve vehicle congestion and move Victoria in the direction of a low-carbon future? Or is it another case—like the Johnson Street Bridge Replacement Project—of the City unintentionally displaying its proven tendency toward decision-based evidence-making? The first component of the strategy—a $3.5-million, 1.2-kilometre-long corridor on Pandora between Cook and Store—became operational in May. By mid-June the City’s PR team announced “the number of cyclists using the new bike lanes is very encouraging” with “nearly 40,000 bicycle trips” made along the corridor in its first month of operation. That timeframe, and the numbers, included Victoria’s Bike to Work Week, an annual outpouring of temporary enthusiasm. A second protected bicycle corridor—1.2 kilometres of Fort from Cook to Wharf—was approved by City of Victoria councillors on June 8. Construction is scheduled to begin in September. The City plans to expand these corridors to Wharf, Humboldt and Cook. At the cost per kilometre of the Pandora corridor, the 5.3-kilometre-long Phase 1 would cost about $16 million—funding that depends almost entirely on the availability of grants from the Gas Tax Fund. The rationale behind the protected lanes—as opposed to cyclists sharing the existing infrastructure with automobiles—is to increase the safety of cyclists. But creating protected lanes has resulted in the removal of auto parking space, already in short supply in the Downtown core much of each day. The Pandora corridor removed 43 auto parking stalls; another 30 will be removed on Fort Street. At that rate of parking space removal, Phase 1 would see about 175 spaces disappear. Before construction of the protected corridors began, the City had fewer than 2000 on-street parking spaces Downtown. So Phase 1, originally planned to be complete by the end of 2018, will see the loss of nearly 10 percent of on-street parking in the Downtown core. The City’s aim appears to be to quickly replace a significant fraction of motorized individual transport with unmotorized individual transport. For people who drive a car, truck or van Downtown and don’t see themselves as likely to ever switch to a bicycle, the new situation feels like an attempt to force them to make a change they can’t or don’t want to make, and carries a whiff of social engineering. Some Downtown businesses have expressed concern that making vehicle parking Downtown less available will discourage potential clients and impact their businesses. But Victoria Mayor Lisa Helps has argued that protected bicycle corridors will make auto parking more available, not less. Her theory is that by making biking around Downtown safer, people who in the past would only travel there by auto will now be encouraged to come by bicycle. Victoria’s 40-ish mayor is an avid cycle commuter and she now has a protected corridor that runs most of the 1.4 kilometres from her home in Fernwood to her place of work at City Hall. Implicit in the City’s decision to proceed along this course is the belief that cycling, especially commuting to work by bicycle, needs to be encouraged and allowed to grow far beyond current levels. Why are they doing that? 
Here’s the City’s official position on “why”: “Encouraging cycling, along with walking and transit use, is an important strategy to manage expected population growth and support community health, affordability, economic development, air quality and climate action objectives. As the City grows in population, we will need to shift some of our trips to transit, cycling and walking because these are much more efficient modes of transportation than single occupancy vehicles.” The City supports its position with data it attributes to the 2011 Census, indicating that 10.6 percent of people living within the City of Victoria cycle to work. That’s Canada’s highest per capita incidence of commuting by bicycle. It’s hard to argue with federal census data that counts (almost) every single person in the country and has a margin of error close to zero. The City is hoping to build on that encouraging number and calls its plan “Biketoria.” While the City’s vision sounds progressive and smart, the best available data about transportation in Victoria calls into question the City’s emphasis on cycling and walking—and perhaps transit, too. Let’s start with Victoria’s claim to fame, the 10.6 percent of Victorians who cycle to work. It turns out that number wasn’t obtained directly from the 2011 Census. Instead, the “10.6 percent” figure comes from the 2011 National Household Survey, which was voluntary and produced data with a margin of error much higher than zero. Since good transportation planning requires good transportation data, it’s important to understand why one of the fundamental numbers supporting the City’s Biketoria initiative is probably flawed. The National Household Survey asked participants only one question about transportation: How did the person filling out the survey “usually get to work”? There were 11 modes of transportation listed (auto driver, auto passenger, transit, walking, bicycle, etc.) and the respondent could choose only one. How did multi-modal commuters decide how to respond? We don’t know, but it’s well-known that commuter cycling ebbs in the darker, colder, wetter half of the year, so it’s reasonable to assume that some cyclists must be using other forms of transportation to get to work at least part of the year: walking, transit, some might even drive an auto. But the National Household Survey didn’t allow for such complexity. Nor did it attempt to gauge the distance people had to travel to work. As a guide for transportation planners, then, the National Household Survey doesn’t really qualify as a reliable tool for making multi-million-dollar transportation decisions. Yet it is cited as one of the primary sources upon which the City based its case for building protected bicycle corridors. The other source the City cites is the 2011 CRD Origin-Destination Household Travel Survey. But a careful read of the data in that survey, especially when compared with the data the survey produced in 2006, raises questions about the City’s direction. According to the CRD’s 2011 survey, only 3.8 percent of trips within, into and out of the City of Victoria over a 24-hour period were made by bicycle. When the average distances of trips made by different modes of transportation are factored in, bicycles accounted for less than 2.5 percent of the total distance travelled using all modes. 
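To make the distinction between a mode’s share of trips and its share of distance travelled concrete, here is a minimal sketch of the conversion; the bicycle trip share echoes the 2011 figure above, but the other trip shares and all of the average trip lengths are hypothetical placeholders, not numbers from the Origin-Destination survey.

```python
# Converting a mode's share of trips into its share of distance travelled.
# The bicycle trip share echoes the 2011 figure cited above; the other trip
# shares and all average trip lengths are hypothetical placeholders, NOT
# values taken from the Origin-Destination survey.

trip_share  = {"auto": 0.80, "transit": 0.06, "walk": 0.10, "bicycle": 0.038}
avg_trip_km = {"auto": 10.0, "transit": 8.0,  "walk": 1.0,  "bicycle": 4.0}

# Distance contributed by each mode, per average basket of trips
distance_by_mode = {m: trip_share[m] * avg_trip_km[m] for m in trip_share}
total_km = sum(distance_by_mode.values())

for mode, km in distance_by_mode.items():
    print(f"{mode:8s} {trip_share[mode] * 100:5.1f}% of trips -> "
          f"{km / total_km * 100:4.1f}% of distance travelled")
```

With placeholder inputs like these, a 3.8 percent trip share for bicycles works out to well under 2.5 percent of the distance travelled, which is the pattern the survey data shows.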
Moreover, the Origin-Destination survey didn’t capture trips that were made to move goods or to provide services—trips by taxi drivers, social workers, delivery services, healthcare providers, transit drivers—it’s a long list, and almost none of it is done by bicycle. If bicycles currently account for only a tiny fraction of the total distance travelled each day in the City of Victoria, how realistic is it that large numbers of Victorians will soon become cyclists? While Copenhagen’s large contingent of cyclists is held up as a model for Victoria to aspire to, the average age of a person living in Copenhagen is 35.9 years and has been falling for many years. In the City of Victoria, the average is 44.5 and is projected to rise for many years. As people get older, they generally spend a lot less time on bicycles, especially in hilly places like Victoria. Perhaps that’s one reason why the Origin-Destination surveys for 2006 and 2011 show that, for the whole CRD, the daily mode share for bicycles dropped slightly over those five years, from 3.2 percent to 2.8 percent. Yet the official goal in the CRD is to raise that to a regional level of 15 percent by 2038. Is this realistic? The answer to that becomes clearer when we consider the cumulative distance travelled each day by residents of the Capital Regional District (see table below). According to data in the 2011 Origin-Destination survey, about 6.6 million kilometres are travelled within the CRD each and every weekday (note that’s each day, not week). Of that travel, 72.7 percent was as the driver of an auto and 15.6 percent as a passenger in an auto. That means that about 88 percent of all travel within the CRD relies on autos. Share, by transportation mode, of total distance travelled within the CRD on a weekday Source: 2011 CRD Origin-Destination Household Travel Survey, conducted by Malatest and Associates Ltd. The survey did not capture commercial traffic or traffic originating outside of the CRD. The 2011 survey is the most recent data available. Only 1.5 percent of the distance travelled is by bicycle. As noted above, the survey does not capture commercial trips made to move goods or to provide services. If commercial traffic were included, bicycles would likely drop to little more than one percent. So bicycles currently account for a tiny fraction of the actual distance people cover in getting from point A to point B in the CRD. Again, is it realistic to think that bicycles—currently providing about 1 percent of the work being done by our regional transportation system—will supply 15 to 20 times as much work in 20 years? Without improving the cycling network, City and CRD transportation planners won’t know whether their long-term goal is achievable. Unfortunately, though, as they experiment, the protected lanes may unintentionally increase emissions by delaying vehicles making right-hand turns off Pandora, resulting in hours of additional engine idling each day (see the short video below). Unless use of the corridor increases dramatically, it could be argued it’s doing more harm than good most of the day. The Pandora Avenue protected bicycle corridor includes new traffic signals that delay right-hand turns off Pandora by 25 seconds at each of six intersections. As is shown in this video, this will increase emissions from autos even though there are few cyclists using the lane. You might be wondering why I am quoting a study done in 2011. The Origin-Destination surveys are done every five years, but the 2016 survey has been delayed. 
John Hicks, senior transportation planner at the CRD, told Focus the 2016 version, which would normally have been released about now, was pushed back a year so that 2016 federal census data could be used more directly in determining required sample sizes. A call for credentials was issued by the CRD in April and the survey will be conducted during the same months as the 2011 survey. It should be released in March 2018. So, for now, we are dependent on the 2011 data, and that shows bicycles only provide a tiny fraction of the travel needs of people throughout the CRD. Regardless of whether the loss of nearly 10 percent of the Downtown core’s on-street parking is or isn’t a reasonable trade-off for a greater level of safety for bicyclists, the claim that these corridors will play a significant role in reducing carbon emissions seems like a refusal to accept the obvious: Most people prefer to use four-wheeled motorized personal transport. So at least some of the CRD’s and the municipalities’ efforts in transportation planning ought to include how that strong preference can be incorporated in a transportation system that evolves toward a low-carbon future. For example, why not incorporate charging stations for electric cars into the protected bicycle corridor infrastructure? Providing electricity for free would create an incentive for electric vehicles Downtown. Even so, such ideas would be little more than civic acknowledgment of the need to reduce emissions since the vast majority of motorized vehicles depend on fossil fuels and likely will for many years to come, according to auto industry experts. To produce a significant reduction in CRD transportation emissions, a more sophisticated approach than painting bicycle lanes on roads will be needed. That will, at least to begin with, require helping auto drivers and auto passengers reduce their use of fossil-fueled vehicles, while accepting their choice for how to get around. How can that use be downsized? The data the CRD has collected, if it’s accurate, contains some interesting possibilities. Comparing the 2006 and 2011 surveys, it appears two shifts in the use of autos were underway. One was good news, the other bad. First the bad news. According to the CRD’s Origin-Destination surveys, between 2006 and 2011, about 78,000 fewer trips were taken as auto passengers each day. Where did the passengers go? It appears that many of them might have become drivers. In 2006, drivers accounted for 59 percent of non-commercial trips. But by 2011 that had climbed to 64 percent. If nothing else had changed, this would have meant more vehicles travelling each day—and higher emissions. But—and this is the good news—regional transportation emissions per person may have declined in spite of the trend of passengers becoming drivers. That’s because the average number of daily trips per person in the CRD decreased after 2006 by 4.8 percent. That translates to CRD residents driving about 43,000 fewer kilometres each weekday than in 2006. These two factors—the incidence of single-occupancy vehicles and the average number of daily trips taken by CRD auto users—suggest possibilities for emissions reduction that don’t involve converting car drivers to cyclists. (Again, this is only true to the extent that the data collected for the Origin-Destination surveys for 2006 and 2011 is accurate.) 
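To make the interplay between those two levers concrete, here is a minimal sketch of a toy single-occupancy model; the change in driver share and the 4.8 percent drop in trips are the survey figures quoted above, while the population, baseline trips per person and average trip length are hypothetical placeholders, not survey values.

```python
# Toy model of the two levers described above: the share of auto trips made as
# the driver (each driver trip puts a vehicle on the road) and the number of
# trips people take. The 0.59 -> 0.64 driver-share change and the 4.8 percent
# drop in trips are the survey figures quoted in the text; everything else is
# a hypothetical placeholder.

AVG_TRIP_KM = 8.0   # hypothetical average auto trip length, km

def daily_vehicle_km(population, auto_trips_per_person, driver_share):
    """Vehicle-kilometres driven per day under a simple single-occupancy model."""
    return population * auto_trips_per_person * driver_share * AVG_TRIP_KM

base = daily_vehicle_km(population=350_000, auto_trips_per_person=3.0, driver_share=0.59)

# Lever 1: passengers become drivers (driver share rises) -> more vehicle-km
more_drivers = daily_vehicle_km(350_000, 3.0, 0.64)

# Lever 2: people simply take fewer trips -> fewer vehicle-km
fewer_trips = daily_vehicle_km(350_000, 3.0 * (1 - 0.048), 0.59)

print(f"baseline:     {base:,.0f} vehicle-km/day")
print(f"more drivers: {more_drivers:,.0f} ({(more_drivers / base - 1) * 100:+.1f}%)")
print(f"fewer trips:  {fewer_trips:,.0f} ({(fewer_trips / base - 1) * 100:+.1f}%)")
```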
The CRD needs to gather more data that provides decision-makers, elected officials and the media with answers to basic questions, such as: Why did CRD residents reduce the number of their trips between 2006 and 2011? Is there some way to incentivize that shift in behaviour? If the City of Victoria can spend $16 million on safer bicycling for a few thousand bicyclists—using money collected from auto drivers through the Gas Tax Fund—why can’t the CRD refund a few million a year back to auto drivers who can prove a significant reduction in the distance they travel each year, or who have switched to an EV? And what were the factors that turned auto passengers into auto drivers? What would it take to reverse that trend? Can local government, especially the CRD, play a role in facilitating that reversal? With the huge growth in the use of cell phones, iPads and apps, why does the CRD not have its own high-profile regional rideshare system in place, one that can connect people who are about to make similar trips by car? The absence of timely, deep, reliable data on transportation in the CRD will make it difficult for the community to make sensible decisions based on evidence. One example of how badly politicians can steer the public interest—when they make a decision and then look for evidence that supports it—is the CRD’s controversial LRT initiative. That began in 2009 as the provincial NDP’s response to the then-Campbell government’s transit initiatives in Vancouver. Local NDP MLA Maurine Karagianis stated back then: “The Campbell government’s transit plan focuses almost entirely on projects in the lower mainland while the rest of BC, including Victoria, has been ignored. The Capital Region seeks to avoid sprawl by building an innovative, high quality public transit system with LRT between Downtown and the western communities.” By 2012, that we-want-one-too logic had ballooned into a live, billion-dollar proposal to build an LRT between “Downtown and the western communities.” Note that the western terminus of such a system wouldn’t have been the “western communities,” but rather Langford. When politicians start pounding the drum for some large infrastructure project, which they hope will distinguish them from their political competitors, the only thing that might prevent them from making a big, expensive mistake is credible, accurate, up-to-date information. With the LRT proposal, if a billion was going to be spent anywhere, should it really be used to connect Downtown with Langford? The 2011 Origin-Destination survey included a graphic of the “Desire Lines” in the CRD (see graphic below). These represent the most heavily-travelled routes people take in moving around the CRD each day. In the illustration, you can see that, by far, the strongest flows are between Downtown and south Victoria, from Downtown to Uptown, and from all three of those areas out to the University of Victoria. Notice the feeble desire line out to Langford. Desire Lines in the CRD, from the 2011 Origin-Destination survey. The illustration excludes trips that originate outside of the CRD, such as trips from anywhere north of Langford, which contribute much of the traffic on the at-times congested Trans Canada Highway. The LRT the NDP was proposing would likely not be used by such travellers. The most prominent desire lines show where an LRT should be located if the goal were to reduce emissions and create a more compact community. 
A 19-kilometre-long loop that connected Downtown, Oak Bay, the Shelbourne Valley, UVic and Uptown would follow arterial roads that already pass within a kilometre or so of tens of thousands of existing homes. Over time, the presence of a transit line would encourage even greater population density in those already-developed areas. According to the desire lines, a 15-kilometre (one way) route from Downtown to Langford wouldn’t make sense. Yet the Regional Transportation Plan’s rationale for LRT sees that route “as a possible means to significantly curb pressure on auto infrastructure in high growth areas.” By “high growth area” the CRD means Langford, which has the highest relative population growth rate in the CRD. But in terms of absolute growth—which includes growth in population, commercial and institutional development and employment—the area of Victoria and Saanich already heavily criss-crossed by desire lines is experiencing more than twice the growth of Langford. With the NDP about to become government, will an LRT to Langford be resurrected? Quick! Using Google Earth, someone needs to count all the dwellings and places of employment within walking distance of the above two routes. Why isn’t that information already available? Choosing the wrong first route for LRT would have a devastating effect on the long-term prospects of reducing emissions in the CRD. What the CRD really needs, before spending countless millions on pet transportation projects that address a tiny fraction of the CRD’s emissions problem, is credible and comprehensive information about what it would take to get people who live here (as opposed to Danish twenty-year-olds) to change their travel behaviour. Obtaining that information would cost money, of course. One way to fund such data gathering would be to use the Gas Tax Fund. Unfortunately, that huge chunk of cash—which is taken from drivers of vehicles that run on fossil fuels—is used almost exclusively for cycling infrastructure or non-transportation-related projects in the CRD: an agricultural strategy here, a tennis court there, water system upgrades all over the place, even a fire hydrant in Shirley. The tax isn’t being used, however, for any initiative that might one day seriously lower carbon emissions. Perhaps that’s because actually reducing the use of fossil-fuelled vehicles would diminish the flow of money to the Gas Tax Fund, and that, in turn, would start to dry up funding for local politicians’ pet projects. It’s an interesting dynamic, one addiction feeding another. How do we get free of it? Please let me know what you would do. David Broadland is the publisher of Focus Magazine.
  4. January 5, 2020 The Climate Leadership Team massaged an engineering report to justify policy directions the City had already taken. AN ENGINEERING COMPANY’S REPORT obtained from the City of Victoria through an FOI request shows that the City cheated on its first attempt to plot a critical path to lower territorial greenhouse gas emissions. The way in which the report’s findings were changed suggests that the City was intent on manufacturing information for its Climate Leadership Plan that would provide support for policy directions it was already pursuing, or wanted to pursue. Stantec Engineering was hired by the City to assess the municipality’s emissions in 2017. The City published its Climate Leadership Plan (CLP) in 2018 (see link at end of story). Focus reviewed the CLP in mid-2019. While the 66-page report is full of high-level visions and soft goals, the only hard information about emissions, and how they might be reduced, was a set of numbers that appeared in percentage breakdowns of the sources of emissions, and in a wedge graph titled “Pathways to 2050 GHG Emissions Reductions.” These were attributed to a “GPC Compliant Inventory, 2017.” Focus requested the inventory and the City released Stantec’s report to us in late October. There are several interesting differences between the information in the City’s Climate Leadership Plan and Stantec’s report. Stantec estimated that GHG emissions within the municipality’s boundaries in 2017 were 465,482 tonnes. It classified those emissions by categories that were in accordance with the Global Protocol for Community-Scale Greenhouse Gas Emissions Basic+ (see link at end of story). But the City’s CLP used “387,694” tonnes and “370,000” tonnes on different pages, thereby reducing at least 77,788 tonnes of emissions with six taps on a keyboard. If the emissions Stantec estimated had been used, the paper pathway the City had plotted for reducing those emissions by 2050 would have missed its target by a wide margin. A more telling difference between Stantec’s and the City’s account of emissions is the way in which the categories used by Stantec were changed by the City. The GPC protocol has established categories of territorial emissions that allow comparison with other jurisdictions and provide a method for consistently measuring progress from year to year. Adhering to the GPC categories creates transparency, which in turn allows accountability. Adhering to the GPC Basic+ protocol is also a requirement for any city that wants to be listed on the Carbon Disclosure Project’s A-List, or is a signatory to the Compact of Mayors. Because of the way the City altered Stantec’s reported emissions, the CLP doesn’t meet the requirements of either of those projects. Neither is it GPC compliant. Perhaps the City ought to take the “Leadership” claim out of its climate action plan. The City eliminated three of the seven categories for which Stantec had found significant emissions (see pie charts above). That included the category “Industrial Processes and Product Use (IPPU),” which had the highest rate of growth in Victoria—66 percent over the last 10 years. The City also eliminated the GPC categories “Transboundary Transportation” and “Off-Road Transportation,” which accounted for, combined, 35 percent of all territorial emissions. 
Lastly, the City moved multi-unit residential buildings out of the GPC category “Residential Buildings” and lumped them in with “industrial” and “commercial, institutional” buildings under the GPC category “Commercial & Institutional Buildings and Facilities.” Perhaps to stymie any efforts at holding the City accountable (like this story), that combined category was reported as having emissions of 124,062 tonnes, only slightly higher than the 123,370 tonnes Stantec had attributed to just commercial and institutional buildings in its assessment. Trying to figure out the City’s rationale for doing that produces a sensation in my brain that I imagine is something like having a mini-stroke. In a similarly puzzling shift, the City made a separate category for single-family homes and held it responsible for a bigger percentage of emissions than Stantec had found for multi-family and single-family residential housing combined. It may be entirely coincidental, but there is a move afoot at City Hall, led by Mayor Lisa Helps, to eliminate single-family zoning throughout the City of Victoria. If it comes to that, the mayor and her supporters will be able to point to the Climate Leadership Plan and say, “Look, our GPC Compliant Inventory shows this will address a big source of emissions.” Another of City Hall’s controversial directions might be at the heart of the difference between Stantec’s findings and the City’s spin on Stantec’s findings regarding transportation emissions. Stantec found that “On Road Transport” accounted for nine percent of total territorial emissions. Victoria’s version boosted that to 40 percent. This category is intended to measure emissions from cars, trucks and buses that don’t cross the City of Victoria’s boundaries. In other words, it’s not intended to include vehicles that make longer trips, too long for most people to make by walking or cycling. Emissions that result from longer trips are counted under “Transboundary Transportation,” a category the City eliminated. In the City’s version of reality, cars, trucks and buses making short trips on its streets are the single biggest emissions problem by far. That version supports its choice to spend money and create community division in the hope of getting people to cycle instead of driving a car. Stantec found that “Off-Road Transportation” (marine, aviation, other) accounted for 12.4 percent of emissions, even higher than on-road transportation. Yet the City’s climate-action brain trust deep-sixed these emissions altogether, perhaps influenced by the tourism lobby. This is classic decision-based evidence-making. In early 2019, City staff requested that council approve a $540,000 increase in spending related to further development of its climate initiatives. Those initiatives included expanding the size of the City’s public relations department. After publication addendum: The City did not respond to questions presented to it about its Climate Leadership Plan and the numbers it contained. David Broadland is the publisher of Focus.
  5. May 2017 The project seems to be a complete fiasco. But is that just a perception created by something in the air? IN A REPORT HE DELIVERED to Victoria City council in late March, Johnson Street Bridge Project Director Jonathan Huggett did a 180-degree flip-flop on one of the project’s costly screw-ups. Before I tell you about that, though, I have to provide the reader with a caveat-emptor kind of warning about my story. The fact is, I may be suffering from a mind-altering overdose of carbon dioxide. I don’t think I’m making this up, but I might be. I came to realize this was a real possibility after coming across a 2015 study by research scientists at Harvard, State University of New York, and Syracuse University. I was earnestly googling away for what might be in the air that could possibly explain the widespread mental confusion we’re seeing south of the border these days. Is it something in the water? No, it’s in the air. These scientists reported that human cognitive abilities are significantly and adversely affected by the concentration of carbon dioxide that we are now regularly exposed to inside many buildings. Their work confirmed two previous but smaller studies that had come to much the same conclusion. The cognitive functions most severely impacted, the research found, were the ability to use information and the ability to strategize. So I need to warn you: I wrote this story while sitting inside a building. Moreover, my subject—Mr Huggett’s flip-flopping report—was presumably also written while the author was inside a building. Even worse, because of the likelihood of elevated levels of carbon dioxide wherever you are right now, your ability to process my potentially confused reporting of a potentially confused report could be compromised. By the end of this story, you may be completely dazed and confused. Before venturing into that minefield, consider this: The only real solution to adverse levels of indoor carbon dioxide is thorough ventilation with fresh, outdoor air. But, as the level of carbon dioxide outdoors continues to increase as a result of carbon emissions from human activity, ventilation will increasingly fail to make any difference. How bad could this get? The worst-case scenario is that global concentration of atmospheric carbon dioxide will one day reach the levels that significantly affect human cognition. Confusion begins around 800 to 900 parts per million. Currently, outdoor levels are about halfway there and rising. Donald Trump seems eager to get all of us fully there, but being even halfway seems to allow for craziness enough. So reader beware, and let’s look—for the billionth time—at the Johnson Street Bridge project, whose nine-year history so far provides plenty of circumstantial evidence that carbon dioxide levels during City council meetings in Victoria need to be carefully investigated. JONATHAN HUGGETT, it turns out, is the most highly-paid official currently working for the City of Victoria. At $20,000 per month, he’s making more than even City Manager Jason Johnson, who hired him. Including expenses and taxes, Huggett is billing Victoria taxpayers approximately $295,000 per year. Not bad for a guy who lives in Surrey, only needs to report to City council four times a year, and isn’t required to answer questions from reporters. Since he’s so highly paid—by taxpayers—and since some of his claims about the project have seemed to be at odds with the public interest, Huggett’s reports beg for a detailed examination by local media. 
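As a rough illustration of how those billing figures could add up, here is a minimal sketch; the $20,000 monthly fee is the figure reported above, while the expenses-and-taxes markup and the assumed span of years are my own guesses, not numbers from the City.

```python
# Rough reconstruction of the billing figures cited above. The $20,000 monthly
# fee is from the text; the markup for expenses and taxes and the assumed span
# of years are illustrative guesses, not figures from the City.

MONTHLY_FEE = 20_000
MARKUP = 0.23  # assumed expenses-and-taxes markup, chosen to land near ~$295,000/year

annual_billing = MONTHLY_FEE * 12 * (1 + MARKUP)
print(f"Approximate annual billing: ${annual_billing:,.0f}")   # about $295,000

# At that rate, roughly four to four-and-a-half years on the job (April 2014 into 2018)
for years in (4.0, 4.5):
    print(f"{years} years: ${annual_billing * years:,.0f}")    # roughly $1.2 to $1.3 million
```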
He has told Focus he’s too busy to answer our emailed questions, although he has made frequent appearances on local talk-radio programs. As Huggett’s open-ended contract with the City notes, the City also has a highly-paid “designer and project manager,” MMM Group. Since 2009 the City has paid MMM about $16 million for its services. As the “owner’s representative,” MMM, supposedly, would ensure the City’s interests were given top priority by the company building the bridge, PCL Constructors Westcoast Inc. So why does the City need Huggett? Can’t MMM be trusted to do its job? According to his contract, Huggett was brought in by City Manager Johnson in April 2014 to “undertake an independent review of the Project, including assessment of the relationship between the City, MMM and PCL, to evaluate the current status of the project and potential risks to its successful completion.” But after undertaking that review and providing a report in July 2014, Huggett was appointed “Project Director.” He has spent the time since then providing quarterly reports formerly written by City employees in collaboration with MMM. The breakdown in trust between the City and its project manager became public in 2014 when both PCL and MMM began to present the City with claims for additional costs even though the City had been assured that project costs had been capped by a “fixed price” contract. Huggett’s first report to City councillors assured them that the City didn’t have a fixed-price contract. For some reason, councillors liked what they heard and Huggett’s monthly cost then escalated. With an extended period of legal battles likely to follow physical completion of the bridge, Huggett can expect to receive a monthly cheque from the City at least through 2018. If that’s the case, his own work on the project will add roughly $1.3 million to the cost of the new bridge. It’s unclear whether that amount has been fully included in any of the quarterly updates Huggett has delivered to City councillors. It should also be noted that Huggett does not track the bridge’s costs. That’s done by the City’s finance department. As well, the City is represented by an outside law firm—as well as its own highly-paid legal staff—on legal issues related to the project. Huggett has stated publicly a number of times that his job is to make sure the project gets completed. But Huggett has sometimes presented opinions to City council and the public that haven’t been based on facts. His use of—let’s call them alternative facts—has had the effect of protecting the reputations of professional engineers who have screwed up on this project rather than protecting the public interest. One good example of Huggett’s use of alternative facts was his response to a story Focus published about how the level of seismic protection stipulated for the bridge—the seismic design criteria—was secretly downgraded from the level that MMM had recommended. The essential facts of that story are these: MMM recommended to the City in 2010 that the new bridge be able to withstand a magnitude 8.5 earthquake and the City agreed to pay an additional $10 million for that recommended higher level of protection. However, after initial estimates from the construction companies bidding to build the bridge were received in 2012, project engineers realized that the bridge would cost much more than they had hoped. At least one of the companies also expressed concerns about the unusual design’s inherent seismic risk. 
For whatever reason—whether it was to reduce costs in an attempt to save a failing project or because the engineers realized the peculiar design could not withstand a magnitude 8.5 earthquake without irreparable damage—the project’s target seismic protection level was lowered. The decision to build the bridge to a lower seismic standard was made in secret—that is, without City council’s knowledge—and that broke the agreement City managers had made to seek elected officials’ consent to change the project’s scope. More importantly, the downgrading of the seismic design criteria meant the bridge could be more easily damaged by an earthquake. It also made it more likely the bridge would be unrepairable following a smaller earthquake. When Focus published a story pointing this out, Huggett’s response was to obscure what had occurred. His explanations never acknowledged the existence of the Johnson Street Seismic Design Criteria document which proved the change had been made. This document was an integral part of the construction contract the City signed with PCL. Instead, Huggett provided City councillors with a report in which a critical paragraph of the code governing construction of bridges had been altered so that it appeared that the lower standard to which the bridge had been built was in accord with the requirements of the (altered) code. This was a truly remarkable sleight of hand, and I have wondered whether carbon dioxide might have been involved. What else could explain Victoria City council’s utter lack of ambition to look more closely at the issue? The City was in a position to demand that MMM return $10 million of its $16 million payment for its failure to provide a bridge with the level of seismic performance it had recommended. And what explains Huggett’s course of action? Instead of pursuing MMM, he misquoted the bridge code. A partially-redacted email (obtained by FOI) from an MMM employee to Huggett following the creative rewrite of the seismic code expressed MMM’s relief “since the seismic issues appear to be contained for the time being.” Huggett never publicly admitted that such “issues” even existed, but it’s apparent that MMM expected the issue might resurface. So now we come to Huggett’s 180-degree flip-flop. (Also see the slideshow: Seismic rip-off on the Johnson Street Bridge) I RECENTLY REPORTED WHAT HUGGETT has said about the issue of fendering on the north side of the bridge. Fendering is the protective barrier placed around the support piers of a bridge to minimize the damage that could be done if a ship or barge accidentally hit the piers. Huggett told councillors in July 2015 that more extensive fendering was needed on the north side of the bridge than had initially been planned because, as it turned out, “The new bridge is somewhat less robust than the existing structure.” In explaining why this would add significantly more cost to the project than had been stipulated in the so-called “fixed-price” contract, Huggett told councillors that the north-side fendering had been “clouded-out” in a contract drawing. That indicated, he said, “It is not in the original contract.” But a review of the “fixed-price” contract by Focus strongly suggested that the cost of the fendering had been included, even if the final design of the north-side fendering had not been fully worked out. 
In response to an FOI request from Focus, the City said it could not find the “clouded-out” contract drawing that Huggett had referred to, further eroding the credibility of his claim that the contract did not include the north-side fendering. In spite of these facts, Huggett continued to maintain that the additional cost of the north-side fendering could be substantial and would have to be borne by City taxpayers. The cost has been rumoured to be as high as $10 million. A rendering of fendering on the north side of the new Johnson Street Bridge from Jonathan Huggett’s March 2017 quarterly report to Victoria City Council, in which it was described as “one option.” One Victoria engineer estimated the installation could add $10 million to the cost of the project. Who was Huggett representing by taking this position? He is being paid $20,000 each month by Victoria taxpayers. Shouldn’t his positions reflect that? Let me boil this down to two points. First, why would Victoria be getting a bridge that was “less robust” than the existing bridge? Questions raised about the ability of the existing bridge to withstand the forces exerted on it by even a minor earthquake were the very rationale used for building a new bridge. Yet, according to Huggett, the new bridge would be less robust than the old bridge. Rather than openly accepting this apparent project failure, shouldn’t Huggett have been advocating for a better outcome? Second, why didn’t Huggett take the position that the cost of all fendering was in the PCL contract? In his report to City council in March, Huggett reversed his position and admitted that PCL’s fixed-price contract was “supposed to cover all fendering costs.” Huggett also provided details about the issue that have been kept secret for two years. Huggett revealed two errors: one made before the construction contract was negotiated with PCL and one afterward. Both subsequently “impacted” the design of the fendering, and hence its cost, Huggett reported. The first error was the relocation in early 2012 of an underwater duct bank containing numerous telecommunications cables, including fibre optic cables connecting CFB Esquimalt to the world. That $1.6 million project was engineered and overseen by MMM. According to Huggett, though, the duct bank “was not moved sufficiently far enough to allow for easy construction of fendering systems. Without additional protection measures, piles cannot be driven close to the duct bank as in the event of a ship collision the piles might move and damage the duct bank.” Unbelievable, but—according to Huggett—true. By the way, the duct bank was relocated even before the City had a final bridge design, let alone a signed construction contract. At the time, City managers insisted such work needed to proceed in order for the project to meet its March 2016 completion deadline so that federal funding would not be lost. (Arbitrary deadlines and high levels of carbon dioxide are a truly awesome combination of conditions under which City councillors are asked to make important decisions, don’t ’ya think?) The second error identified by Huggett involved the City’s property at 203 Harbour Road. According to Huggett, “The City sold 203 Harbour Road to Ralmax as it was assumed the land was not needed for the construction of the bridge. This impacts an economical design since access to the water side frontage of 203 Harbour Road must be preserved.” That’s not quite true, though. 
The City actually transferred 203 Harbour Road and other adjacent properties to the Province in 2014 in exchange for the Crystal Garden property on Douglas. The Province then sold the Harbour Road properties to Ralmax. Regardless, Huggett is implying that whoever negotiated the transfer of 203 Harbour Road to Ralmax neglected to obtain an agreement allowing a minor intrusion on that property’s riparian access, which would have made economical fendering for the bridge project possible. Wow. I bet the negotiating room had poor ventilation. Following delivery of Huggett’s March report to councillors, he appeared on CFAX. Among other things, Huggett told listeners the City hoped to recover, through legal action, the additional cost of fendering from the bridge’s “designer.” In Huggett’s contract with the City, the bridge’s “designer” is identified as MMM. A review of what MMM committed to in writing on the design and cost of fendering suggests that the City will have little chance of recovering that cost from MMM. But still, this is a complete flip-flop from Huggett’s previous position that the cost of north-side fendering was explicitly excluded from the original contract—and so the City would have to suck it up. Could he also flip-flop on the seismic issue and assist the City in getting MMM to return $10 million for that fiasco? Not likely. To flip-flop on the seismic issue would require that Huggett explain why he rewrote the bridge seismic code for a council report. That would be awkward for him to explain. Perhaps he could invoke a carbon-dioxide defence. SPEAKING OF CARBON DIOXIDE, one of the original premises used to justify building a new bridge in 2009 was that the existing double-bascule bridge presented a daily discouragement to thousands of would-be cyclists who, promoters claimed, were just waiting for a new bridge so they could abandon their daily commute by car. That would reduce carbon emissions, they said. Bicycle access across the railway bridge was eliminated in April 2011. If the bridge was a choke point before then, it has been even worse in the six years since. The prolonged disruption of vehicle traffic—with long waits on both sides of the bridge only adding to overall vehicle emissions—was never part of the bridge promoters’ calculations. The longer the bottleneck lasts, the more ridiculous the claim of reducing carbon emissions becomes. When will it end? The project has been on hold for months, waiting for completion in China of the lifting part of the bridge, which will span the remaining 41-metre gap. So far, fabrication of that one section of the bridge has taken over three years. How is that going? Explaining the project’s schedule—and why the bridge won’t be finished anytime soon—has been a major part of Huggett’s $20,000-per-month assignment. In his September 2016 report to the City, Huggett said that Chinese fabricators had been working at fitting the rings to the trusses in preparation for a “trial fit-up.” “Painting of the structure will commence shortly,” Huggett reported. Three months later, Huggett’s report noted that the Chinese fabricators had experienced difficulty fitting the first ring to the first truss, but Huggett expressed optimism that what the fabricators had learned would speed up fitting the other ring and truss together. It didn’t. 
Almost four months later, Huggett presented photographs that showed most of the major components had been fitted together, although there was no photographic proof that the north-side truss and ring had been matched. Photos published by the City showed Chinese workers apparently ready to lift the north-side truss into place on March 16. The photographs suggest painting of the bridge parts might be weeks—if not months—away. Yet Huggett had reported six months earlier that painting would “commence shortly.” So when is Victoria getting its new bridge? According to PCL’s original construction schedule, it would take slightly more than six months between the date the steel components were delivered to Victoria and the date the bridge could be opened for traffic. It would take another three months after that before the Blue Bridge could be removed and the project completed. So far, PCL hasn’t completed any of the tasks on its original schedule in less time than predicted. So, with the final shipment of steel components not expected to get to Victoria until September—according to Huggett—six months after that would put the bridge opening for traffic in February 2018, and project completion in early May 2018. One has to wonder: If those Harvard scientists are right about carbon dioxide affecting human cognitive function, did Shanghai’s notoriously dirty air play a role in the Chinese fabricators’ stumbling performance on Victoria’s new bridge? That seems possible. And there’s plenty of evidence of mental confusion at play on this project right here in Victoria, too. If there’s something in the air that’s making it more difficult for people to make good decisions, it’s a global phenomenon. Which means, of course, I, too, could be dazed and confused on the Johnson Street Bridge. How about you? David Broadland is the publisher of Focus.
  6. March 2017 Project promoters are still claiming the new bridge will be “world-class” and “iconic.” Unfortunately, they may be right. IN A RECENT RADIO INTERVIEW, City of Victoria Mayor Lisa Helps described the new Johnson Street Bridge as “iconic” and “world-class.” Those words were optimistically attached to the project back in 2009 and Helps’ use of them eight years later is a bit like Donald Trump describing his popular-vote loss as “a massive landslide victory.” Both are ignoring, or don’t know, the factual history of their respective projects. So far, nearly four years of bridge construction has produced what looks like an ordinary concrete highway overpass with the middle missing. If the bridge is going to be “iconic” and “world-class” in the way that Helps meant, the missing piece will have to be so architecturally stunning and engineeringly remarkable that it’s able to lift the dull heaviness of what’s been built out of mediocrity. Unfortunately, evidence is mounting that the City has committed a world-class blunder. Fabrication of the missing part of the bridge—which will span a gap of 46 metres (151 feet)—has proved to be extraordinarily difficult for Chinese welders. They began work in early 2014 on a much-simplified version of the span originally designed by noted bridge architect Sebastien Ricard of WilkinsonEyre Architects in London, England. Not simplified enough, apparently. By mid-2014, quality control inspectors found the fabricated sections of the bridge had been made of such poor-quality steel, or so badly welded, that they had to be scrapped. Three years after starting, the Chinese welders were reported to be struggling to fit the pieces together. According to project reports, once the Chinese fabricators get the parts to fit, and assuming everything else goes smoothly from that point onward, both in China and Victoria, construction of the new bridge will be completed by early 2018. If that projection holds up, the 156-metre-long, 15-metre-wide (1) bridge will have taken five years to build. The City announced (2) the start of construction on May 17, 2013. Compare that with the indisputably “iconic” and “world-class” French bridge, the Millau Viaduct (photo below), completed in 2004. The 2460-metre-long, 27.6-metre-wide bridge (3) floats 270 metres above the Tarn River. It took three years to construct. A comparison of the project costs (4) (5) is also revealing. A standard method of comparing the cost of bridges is to divide the project cost by the area of the bridge’s highway deck (which are the dimensions given above). Doing that arithmetic (6) for the Millau and Victoria projects, we find that each square metre of bridge deck on the new Johnson Street Bridge will cost five times as much as a square metre of bridge deck on the Millau Viaduct (adjusted for inflation to 2016). So, in a way, Helps could be right. Victoria’s new bridge could very well be judged an “iconic” and “world-class” example of how not to build a bridge. The project’s problems go deeper than mere extreme cost and long construction delays, though. Many of the original objectives of the project—like architectural significance, a wider navigational channel, and seismic protection up to magnitude 8.5—had to be ditched as the project’s real costs became unhinged from consultants’ promises. 
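For readers who want to redo the deck-area comparison described above, here is a minimal sketch; the deck dimensions are the ones given in the text, while the two project costs are left as placeholders to be filled in from the sources cited at (4) and (5), adjusted to a common year’s dollars.

```python
# Sketch of the standard cost-per-deck-area comparison described above.
# Deck dimensions are those given in the text; the project costs are left as
# placeholders to be taken from the cited sources and adjusted for inflation.

def cost_per_deck_area(project_cost, deck_length_m, deck_width_m):
    """Project cost divided by the area of the bridge's highway deck."""
    return project_cost / (deck_length_m * deck_width_m)

jsb_area    = 156 * 15       # 2,340 square metres of deck
millau_area = 2460 * 27.6    # 67,896 square metres of deck
print(f"Deck areas: Johnson Street {jsb_area:,} m2, Millau {millau_area:,.0f} m2")

JSB_COST    = None   # placeholder: Johnson Street Bridge project cost, 2016 dollars
MILLAU_COST = None   # placeholder: Millau Viaduct project cost, 2016 dollars

if JSB_COST and MILLAU_COST:
    ratio = (cost_per_deck_area(JSB_COST, 156, 15)
             / cost_per_deck_area(MILLAU_COST, 2460, 27.6))
    print(f"Johnson Street costs {ratio:.1f}x as much per square metre of deck")
```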
But the story of why the project kept costing more, even as its promoters secretly stripped away promised benefits and features is, at its core, the story of what happens when old blunders are covered over by new blunders. The project was originally justified on the basis that the existing 1924 Joseph Strauss-designed bascule bridge had not been built to any seismic standard, and might collapse in an earthquake. Focus learned through freedom of information requests that City officials had been advised—in writing—by both of the first two engineers involved in the project, Joost Meyboom and Mark Mulvihill, that the City should seismically upgrade and rehabilitate the existing historically-significant bridge rather than replace it. Meyboom told the City that work could be done for $8.6 million (7). What followed was a long series of blunders and misrepresentations by City officials and private engineering consultants that, piled one on top of another, has led to a spectacular design failure and a series of cover-ups that have attempted to hide that failure. A full account of all the misrepresentations is beyond the scope of this article, but one particular misrepresentation, the impact of which is now working its way into the local economy, is worth exploring in depth. This particular misrepresentation was the inevitable consequence of rushing a poorly-understood design through a competitive bidding process in which all the bidders were warning the City that the project was risky in terms of cost and engineering considerations. Instead of doing the right thing—pausing the project to thoroughly assess the design—its promoters ignored the warnings and hid these concerns from elected decision-makers and the public. IN MID-JANUARY 2017, a letter (8) from Seaspan Marine to the City of Victoria was leaked to media outlets in Victoria, including Focus. Seaspan is a prominent tug and barge company operating on the West Coast. It frequently pulls barges and guides other vessels through the narrow channel spanned by the Johnson Street Bridge. In the letter, Seaspan told the City that recommendations to lower the speed at which it and other operators could make such transits, coupled with the “doubling of the transit distance”—a result of the project’s hasty decision to leave the concrete support piers of the existing bridge in place—“undermines safety rather than enhances it.” As a result, Seaspan’s Vice-President of Operations Paul Hilder wrote, “we will have to curtail barge service to businesses above the bridge and cease performing bridge assists to other operators.” Hilder requested that the City “reconsider their position to seek a reduce[d] speed limit from Transport Canada and the Victoria Harbour Master.” The current speed limit past the bridge is five knots. The City would like that reduced to 3.5 knots. Interviewed on CFAX radio, current Johnson Street Bridge Project Director Jonathan Huggett was asked if the speed change was being brought forward because a lower speed limit would allow the use of less robust fendering on the north side of the bridge. Huggett said that the issue was one of whether spending more money on fendering would be an appropriate use of public resources. More robust fendering would cost more money. The public resources at stake are not insignificant. It’s rumoured that more robust fendering, which would allow the current maximum recommended speed of five knots to be maintained, could cost in the neighbourhood of $10 million. 
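The physics behind the trade-off Huggett described is straightforward: the energy a fender system must absorb scales with the square of a vessel's speed, so dropping the limit from five knots to 3.5 knots roughly halves the impact energy. The sketch below is a minimal, back-of-envelope illustration of that relationship; the 5,000-tonne barge-and-tug displacement is an assumed figure chosen only for illustration, not a number from the project's studies.

```python
# Minimal sketch: kinetic energy of a loaded barge at the two transit
# speeds discussed above. The 5,000-tonne displacement is an assumed,
# purely illustrative figure, not one from the project's studies.

KNOT_IN_METRES_PER_SECOND = 0.5144

def impact_energy_mj(mass_tonnes, speed_knots):
    """Kinetic energy in megajoules: 0.5 * m * v**2."""
    v = speed_knots * KNOT_IN_METRES_PER_SECOND
    return 0.5 * (mass_tonnes * 1000) * v ** 2 / 1e6

for knots in (5.0, 3.5):
    print(f"{knots} knots: {impact_energy_mj(5000, knots):.1f} MJ")

# Because energy scales with the square of speed, a 3.5-knot transit
# carries (3.5 / 5) ** 2, or about 49 percent, of the energy of a
# 5-knot transit, whatever mass is assumed above.
```

That square-law relationship is presumably what lies behind the trade-off: at a lower speed limit, a lighter and cheaper fender system has far less energy to absorb.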
The commercial interests of the Middle and Upper Harbour customers that Seaspan serves are also significant. Whose interest should prevail? Lost in the ensuing public discussion of whether the City should pay for more fendering protection so that barges could be pulled at the speed the mariners thought was safest, were two underlying questions. First, why was the cost of fendering on the north side of the bridge left out of the construction contract—if in fact it was—when councillors were asked to approve the contract in December 2012? Secondly, why would the new bridge not be able to withstand a five-knot collision on its north side if it was protected with the same kind of fendering that has protected the existing bridge on its north side? The existing bridge has been able to withstand hits by vessels moving at five knots over its 93-year life without it incurring damage to its lifting mechanism. It continues to provide reliable service. The City has avoided providing factual answers to these key questions. No wonder. Factual answers backed by evidence would reveal why the Johnson Street Bridge Replacement Project will likely be an engineering case study on how not to build a bridge. To understand why, we need to go back to when the fendering issue first became public. Project Director Huggett brought the bad news about the north-side fendering into the public realm at a City council meeting on July 16, 2015. Back then, he had a slightly different story. At that meeting, Huggett told councillors that fendering for the north side of the bridge needed to allow for a five-knot speed and would add an additional $3 million—more or less—to the cost of the bridge. Councillor Ben Isitt asked Huggett: “Could you remind us why the fendering isn’t included in the scope of the contract with PCL?” PCL is the company building the bridge. Isitt was at the critical December, 2012 in camera meeting at which councillors were given the details of the contract and urged to approve it by City staff. In response to Isitt, Huggett asserted the existence of a contract drawing, one that Isitt apparently hadn’t been shown, in which the north-side fendering had been “clouded out,” signifying that it was not part of the agreement between PCL and the City. Huggett stated, “At the time we went forward with the contract it was left as an issue to be resolved.” A few moments later, again referring to fendering on the north side of the bridge, Huggett was even more definite: “It was not in the original contract.” After that meeting, Focus filed an FOI request for the document Huggett referred to as proof that the north-side fendering had not been included in the 2012 construction contract. The City was unable to locate any such document (9). Indeed, the PCL contract seems as explicit about the design and cost of fendering as it is about any other detail covering construction of the bridge and related structures. In its response to our FOI request, the City informed us it had “eight fendering drawings created in 2013 for the north side of the new bridge” that they said “do contain three drawings in which portions are clouded to identify portions of the fendering system that were put on hold.” But that was well after the councillors were shown the details of the contract and asked to approve it. In other words, there was nothing in the PCL contract itself to signify that north-side fendering was not included, but, as the project advanced, changes to the proposed fendering were contemplated. 
The City also informed us that it would not release those 2013 documents because the fendering was part of an ongoing legal mediation with PCL and release of the documents “could compromise the City’s negotiating position at the mediation table.” In other words, north-side fendering was included in the 2012 construction contract approved by councillors, but it has since become a bargaining chip in the unfolding legal dispute between the City and all of the other parties involved in the troubled project. Focus has obtained records (10) from the City that show Huggett had been made aware that senior City staff had agreed to a deal with PCL with respect to the cost of fendering, a deal which apparently didn’t include seeking approval from Councillor Isitt and other elected officials. Huggett was informed of this before he told councillors at the July 16, 2015 meeting that north-side fendering was “not in the contract.” Focus contacted Huggett for his explanation but he did not respond to repeated emails. So let me back up a bit and address the other fundamental question about the fendering issue that the City doesn’t want to answer: Why would the new bridge not be able to withstand a five-knot collision on its north side with the kind of fendering that has been protecting the existing bridge? At the council meeting at which Huggett first made this issue public, he also explained to councillors why fendering was so vital: “The new bridge is somewhat less robust than the existing structure,” he told councillors. Bingo. He continued: “The last thing I need is a barge to hit the rest pier and knock it two inches out of alignment. For one, I don’t know how I’d get it back again having knocked it out of alignment and then I’m faced with an inoperable bridge. You’ve got $100 million invested in the water here and I’ve got to protect it.” What Huggett was saying, in effect, is that if an outgoing barge loaded with scrap metal hits the new bridge, it is more likely to be made inoperable than would be the case if the same barge hit the old bridge. By “inoperable” we mean unable to lift or lower the moving part of the bridge. For a project that was originally justified on the basis of the existing bridge not being robust enough to survive the forces imposed on it by a significant earthquake—and thus posed a threat to public safety—this is an extraordinary admission of project failure. Huggett’s admission, by the way, apparently went right over the top of councillors’ heads. What characteristic of the new bridge makes it “somewhat less robust” to marine collisions than the existing bridge? We put that question to Huggett but he didn’t respond. But it’s not difficult to understand what’s really at issue. The first thing to note is that the north side of the main pier has been left unprotected since it was completed. This structure is called the “bascule pier.” It will house all of the machinery used to lift the bridge, and it supports the weight of the “bascule leaf”—the moving part of the bridge being fabricated in China. If protecting the pier itself was so critical, wouldn’t that have been put in place as soon as the pier was finished? Many loaded barges have been towed past it already. From the absence of protection, it’s not unreasonable to infer that it’s not the pier itself that’s vulnerable, but the bascule leaf that the pier will support. But why would that be so vulnerable? 
Imagine a tug pulling a barge full of scrap metal headed south from the Upper Harbour toward the new bridge—which in real life is a regular occurrence. The bridge would lift to its upright position to provide clearance for the tug and barge. But imagine that a strong tailwind suddenly catches the barge and the combination of wind, an ebb tide and a narrow channel results in the barge swinging around and striking the main pier of the bridge with great force. What would happen to the bascule leaf? Try to picture it: the erect span projects 50 metres into the air above the bridge deck—as high as a 15-storey building. When the barge hits the pier, how will that heavy steel projection behave? This is a particularly crucial design issue for this bridge, which has a one-of-a-kind feature: The bascule leaf's 15-metre-diameter rings float on steel rollers and are not attached to the bascule pier. There is no central axle that's bolted to the pier that will hold the leaf in place if the bridge is hit by a barge or an earthquake. There's nothing but the leaf's own weight to keep it in place. And, bizarrely, when this bridge is in the fully erect position, it's top-heavy. As the bridge lifts, its centre of gravity actually rises. When it's in the fully-raised position, more than half of the weight of the moving part of the bridge is above the highway deck and there is nothing—other than the wide stance of the rings—to keep the bascule leaf from being tipped to one side in reaction to the pier being hit by a barge. If the bridge were hit in a strong northerly wind that was already pressing the top-heavy leaf sideways, what would happen? Would there be enough momentum from the loaded barge transferred to the upright bascule leaf to tip it over sideways or shift it enough to damage the lifting machinery? Could the bridge get stuck in the upright position with no way, as Huggett put it, "to get it back again having knocked it out of alignment"? You might think that all of this would have been worked out years ago. But it wasn't. Only in 2016, seven years after the open-ring design had been chosen, did the project evaluate "the severity of forces on the bridge and its associated structures resulting from impacts during tug and barge transit through the waterway between the Upper and Lower Harbors passing through the new Johnson Street Bridge when open." The study, undertaken by Seattle's Pacific Maritime Institute, determined that the worst probable impact would occur to the north side of the bridge's main pier. The force of such an impact was estimated to be 1200 tonnes. What effect would that have on the bascule leaf in the open position? The City isn't saying, but what we do know is that the project proceeded in 2013 without such an evaluation. Now the City faces the additional cost of ensuring that a 1200-tonne impact never occurs. Let me summarize: The City can't provide any evidence for Huggett's assertion that the fendering for the north side of the bridge was explicitly excluded from the 2012 PCL contract. And, although the City and Huggett would not answer questions about the positional stability of the unattached bascule leaf in a barge collision, what is known suggests the project realized—after construction had begun—that the experimental design created an unforeseen vulnerability. This has been the modus operandi of the project since 2009.
At critical moments, when it was realized the open-ring design would produce a construction-cost risk or a seismic risk or an operational risk, the project’s promoters hid the risks. They misled councillors and the public about the flawed design to get more money to keep the project moving forward. The most iconic, world-class moment on this long downhill slide occurred in November 2012 when City managers made their recommendation to councillors on the three construction bids. At a closed-to-the-public meeting, the managers urged councillors to allow them to begin negotiations for a contract with PCL, even though the company had produced a design in which every single element of the bridge had changed significantly from the design envisioned by the City’s project manager MMM, and WilkinsonEyre. Even though it wasn’t in the interest of any of the three bidding companies to alienate the City’s influential project manager, all had produced polite but scathing criticisms of the design and supporting engineering done by MMM. Two of the companies’ bids were based on completely different mechanical lifting concepts. PCL’s quickly-produced adaptation was the only option left to City managers for proceeding with the project. The City officials failed to relate any of the information in the critical reviews to councillors. Rather than accepting the realities exposed by the companies’ critiques—that MMM had greatly under-priced and under-engineered the design—the officials instead hid these concerns, and the accompanying financial risks, from councillors. Many of the senior City managers who played a direct role in this deception later departed abruptly as the implications of a hastily-conceived design on cost and construction duration became clear. Their replacements have been kept busy ever since hiding the ways in which the project had to be scaled back, including seismic protection, fendering, and the original architectural vision. WilkinsonEyre has now removed all traces of its association with the project from its website. As for the deceived, although then-Mayor Dean Fortin was removed by voters, most of the councillors who had the wool pulled over their eyes are still sitting around the council table, asking Mr Huggett polite questions about pathways and the kind of grass being planted on the bridge approaches. At a December council meeting City Manager Jason Johnson told those councillors that a “mid-term lessons learned” exercise on the project had been completed by City staff. Focus asked Johnson whether that exercise had included public input and whether the results were available to the public. In his response, Johnson didn’t answer either question directly but said the City “will release all of the findings when the bridge is finished.” More likely, the project will be protected under a shroud of legal advice for years to come, and making the “lessons learned” public would—and I’m just taking a wild stab in the dark here at what the City will say—“compromise the City’s negotiating position.” Thus City officials, former and current, will be spared public exposure of the role they played in the building of Victoria’s iconic, world-class blunder—and will be free to move on to other projects. David Broadland is the publisher of Focus.
  7. January 2017 Justin Trudeau linked approval of Trans Mountain to Alberta’s “100-megatonne cap” on oil sands emissions. Independent analyses suggest that cap has already been exceeded. Further expansion of oil sands exports could give Alberta a stranglehold on Canada's allowable emissions by 2028. WHEN PRIME MINISTER TRUDEAU announced approval of the Trans Mountain pipeline expansion project, he linked that to Alberta’s goal of limiting emissions from oil sands production. “We could not have approved this project without the leadership of Premier Notley, and Alberta’s Climate Leadership Plan—a plan that commits to pricing carbon and capping oil sands emissions at 100 megatonnes per year,” Trudeau told Canadians. The prime-ministerial logic here is challenging. Just ten days before, his Environment Minister Catherine McKenna had announced Canada’s emissions goal for 2050 would be 150 megatonnes—for the whole country. To accomplish that would require reducing national emissions by increments of 18 megatonnes every year from now until 2050. Yet Trudeau’s first action following McKenna’s announcement was to approve a project that would allow Canada’s annual emissions to grow by 18 megatonnes. Even though they pull in opposite directions—one to higher emissions and the other to lower—Notley’s promise and McKenna’s goal amount to the same thing. They’re both paper-thin promises that can be broken at any time depending on who is governing Alberta and Canada. At the Trans Mountain announcement Trudeau said, “Climate change is real. It is here. And it cannot be wished or voted away.” On his assertion that climate change is real, a majority of British Columbians would probably agree. But both Trudeau and Notley can be voted away, and so can their legislation. An expanded pipeline from Alberta to BC’s south coast, on the other hand, will create a permanent increase in risk to both the environment and southwest BC’s economy. Many Vancouverites and Victorians won’t let it happen without a fight—a physical one if it comes down to that. But Trudeau’s linking of Trans Mountain with Notley’s pledge of “capping oil sands emissions at 100 megatonnes per year” creates a challenge for the prime minister. Where is the proof that limit hasn’t already been exceeded? If it can be shown that oil sands emissions are already over 100 megatonnes, would he rescind approval of the project? And on whom should the burden of proof fall? Trudeau also said that Trans Mountain—by allowing oil sands production bound for export to grow substantially—would be good for Canada, economically. While that assertion might have been true in the economic paradigm in which continuous growth in fossil fuel emissions was assumed to be a sign of economic health, in the new paradigm in which Trudeau and McKenna hope to lead Canada— one where national emissions must shrink by another 18 megatonnes every year—does it make any sense at all? Let’s start by examining the fundamental premise behind that “100-megatonne cap,” which is that it hasn’t already been exceeded. WHERE DID JUSTIFICATION FOR a “100-megatonne” cap come from? Was the concept dreamed up by the Alberta Petroleum Marketing Commission? Consider why the number “100” might have been chosen. Who wouldn’t celebrate reaching 100? But is there any scientific evidence that supports that cap? None has been offered. Indeed, there are strong indications Alberta’s oil sands projects have long passed that symbolic mark. Let’s begin with what Environment Canada claims. 
In 2014—the most recent year for which it has published figures (“Greenhouse Gas Emissions, April 2016”) describing oil sands-related emissions—they were put at 67.8 megatonnes. A “megatonne” is a million metric tonnes. Environment Canada provides only three numbers in its inventory of greenhouse gas emissions to support that figure: one for “Oil sands—upgrading,” another for “Oil sands—in situ” and a third for “Oil sands—mining and extraction.” That’s it. That’s Environment Canada’s entire breakdown of emissions for an industry regularly described as the “fastest-growing source of emissions in Canada.” Again, those three numbers added up to 67.8 megatonnes in 2014. The unavailability of information from the federal government around this highly controversial industry is startling. But because of the controversy—the oil sands have an international reputation as being a “dirty” source of energy—several independent analyses have been conducted to determine oil sands emissions intensity. By “emissions intensity” we mean the amount of greenhouse gases released for each barrel of bitumen produced. Such analyses include carbon dioxide, methane, nitrous oxide and other GHGs. The independent analyses—which have had varying levels of independence from the Alberta government and the oil sands industry—were conducted to compare the emissions intensity of fuels derived from oil sands bitumen with fuels refined from other sources of crude oil. Most of the studies divide the entire life cycle of a fuel into stages and assign an emissions intensity value to each stage. The stages include extraction, upgrading, transportation by pipeline to a refinery, refining, delivery from refineries to distribution terminals, and so on through to combustion. The emissions that Environment Canada attributes to the oil sands industry in Alberta are limited to those from extraction, upgrading and pipeline transportation. Very little of Alberta’s bitumen is refined in Canada, and refining emissions are inventoried by Environment Canada in a separate category. So when Trudeau approved Trans Mountain because Alberta promised to cap “oil sands emissions,” it’s only those first three steps—extraction, upgrading and pipeline transportation—that are included. The independent studies have arrived at different values for the overall carbon intensity of those first three steps. Using an average of those values, along with the oil sands production records of Alberta Energy Regulator and the National Energy Board, we can determine a reasonably good estimate of emissions attributable to those first three steps. What stands out in doing that arithmetic is that only by using a value for emissions intensity from the very bottom of the range produced by the independent studies could a value of “67.8 megatonnes” be obtained for oil sands emissions in 2014. In our effort to confirm Environment Canada’s oil sands emissions, we used the average values for “Canadian Oil Sands” “extraction” and “crude transportation” determined by a 2014 study conducted by the US Congressional Research Service (US CRS). That office describes itself as “providing policy and legal analysis to committees and Members of both the House and Senate, regardless of party affiliation.” Its report was a meta analysis of six previous studies that determined emissions from the oil sands. 
The US CRS determined an average emissions intensity of about 20 grams of carbon-dioxide-equivalent for each megajoule of bitumen produced, including extraction, upgrading and pipeline transportation. That works out to 122 kilograms of carbon-dioxide-equivalent emissions per barrel of bitumen produced. To cover the additional energy required for upgrading, we used a standard 10 kilograms of carbon-dioxide-equivalent emissions per barrel. When those numbers are applied to the oil sands' 2015 production volumes recorded by Alberta Energy Regulator, emissions from Alberta's oil sands operations grow to about 116 megatonnes. That suggests oil sands emissions could already be significantly higher than Notley's 100-megatonne cap. To obtain Environment Canada's much lower, official level of emissions for the oil sands projects, carbon intensity values about one-half of those determined by the US CRS would need to be used (11 grams of carbon dioxide for each megajoule of bitumen produced). A study done by the Jacobs Consultancy in 2012 placed oil sands production emissions in that range. (This study was not included in the US CRS's analysis.) But that study's authors noted, "Jacobs Consultancy has not made an analysis, verified, or rendered an independent judgment of the validity of the information provided by others." The Jacobs study was commissioned by the Alberta Petroleum Marketing Commission. That Alberta government organization's mandate includes responsibility "for exploring new opportunities for building new markets for oil and gas products within North America and abroad, and improving access to current and new markets for oil sands products…" Do I need to point out that the APMC is trying really hard to sell more bitumen? An earlier study done by Jacobs for the Alberta Energy Research Institute in 2009 was included in the US CRS study. That study determined values much closer to 20 grams of carbon dioxide equivalent for each megajoule of bitumen produced. A 2013 scientific study, "Historical trends in greenhouse gas emissions of the Alberta oil sands (1970–2010)" by Jacob Englander et al, also provides data that challenges the Alberta/Environment Canada version of emissions. It considered data from each of the oil sands projects and put production emissions intensity at 20 to 22 grams of carbon dioxide equivalent for each megajoule of bitumen produced. It estimated emissions related to extraction, upgrading and pipeline transportation in 2010 were about "70 megatonnes." Applying the large increase in daily production that has occurred since 2010 to Englander's estimate, annual emissions from the oil sands in 2015 would be approximately 117 megatonnes. Additional scientific research published in 2015 by Sonia Yeh et al on the net emissions associated with land-use impacts resulting from oil sands production helps to illustrate the significant undercounting of emissions that is occurring. The authors note: "We found that land use and GHG disturbance of oil sands production, especially in-situ technology that will be the dominant technology of choice for future oil sands development, are greater than previously reported." Based on expected production levels out to 2030, the authors estimated emissions as high as 10 megatonnes per year just from land use impacts. The 2013 Englander study put land-use impact for in-situ production at zero, so even its finding of emissions intensity is likely an undercount of actual emissions (Englander contributed to the Yeh study).
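For readers who want to check the arithmetic behind these estimates, here is a rough sketch of the conversion. The 20-gram-per-megajoule intensity and the 10-kilogram upgrading allowance are the figures quoted above; the energy content of a barrel of bitumen (about 6.1 gigajoules) and the 2015 production split (roughly 2.5 million barrels a day, about a million of them upgraded) are assumptions chosen to be consistent with the numbers in the text, not values pulled directly from Alberta Energy Regulator records.

```python
# Rough reproduction of the oil sands emissions arithmetic above.
# The 20 g CO2e/MJ intensity and the 10 kg/barrel upgrading allowance
# come from the text; the per-barrel energy content and the 2015
# production split are assumptions consistent with the quoted figures,
# not values taken directly from Alberta Energy Regulator records.

INTENSITY_G_PER_MJ = 20          # US CRS average: extraction, upgrading, pipeline
BARREL_ENERGY_MJ = 6_100         # assumed energy content of a barrel of bitumen
UPGRADING_KG_PER_BARREL = 10     # standard allowance for upgraded barrels

kg_per_barrel = INTENSITY_G_PER_MJ * BARREL_ENERGY_MJ / 1000
print(f"Per barrel: about {kg_per_barrel:.0f} kg CO2e")          # ~122 kg, as above

total_bbl_per_day = 2.5e6        # assumed 2015 bitumen production
upgraded_bbl_per_day = 1.0e6     # assumed portion sent to upgraders

annual_megatonnes = 365 * (
    total_bbl_per_day * kg_per_barrel
    + upgraded_bbl_per_day * UPGRADING_KG_PER_BARREL
) / 1e9
print(f"Annual: about {annual_megatonnes:.0f} megatonnes CO2e")  # ~115 Mt
```

Under those assumptions the total lands at roughly 115 megatonnes, in line with the estimates of 116 and 117 megatonnes described above.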
Yet Englander's value for emissions intensity translates to overall oil sands emissions being nearly twice as high as Environment Canada admits. The current scientific evidence and level of uncertainty, then, conflict with information created by industry and government marketing organizations. Yet that clash is invisible in Notley's vaunted Climate Leadership Plan. In the 97-page "Report to Minister" that launched the plan, feel-good aspirations about possible reductions in oil sands emissions intensity abound, but there isn't a single direct account of current oil sands emissions. There is an indirect reference—in a pie chart—that, if a reader does the arithmetic, suggests emissions might have been 58 megatonnes in 2013. But the avoidance of a rigorous accounting of current oil sands emissions in Notley's plan is a flashing yellow light: What are current emissions, and what do they include? Focus requested a detailed inventory of all greenhouse gas emissions from Alberta's Climate Change Office. The only data it could provide was collected under Alberta's Greenhouse Gas Emissions Reporting Program. That information covered only half of Alberta's acknowledged overall emissions and was limited to "facilities" that emitted 50,000 tonnes or more each year. The most recent report that's available covers 2013 and doesn't reflect significant increases in oil sands production since then. It put 2013 emissions at 58 megatonnes, just like the pie chart in Notley's Climate Leadership Plan. Since 2013, Alberta oil sands production has increased by about 629,000 barrels per day. That increase alone, at the US Congressional Research Service's carbon intensity average of 20 grams of carbon-dioxide-equivalent for each megajoule of bitumen produced, would have added close to 30 megatonnes. Added to Environment Canada's dubious 2013 account of oil sands emissions, Alberta would now be at 94 megatonnes. Let me sow a little more doubt about Environment Canada's account of emissions. In the same publication in which it provides its brief three-number summary of oil sands emissions mentioned earlier, it also summarizes "fugitive emissions" for all of Canada's oil and gas industry, including the oil sands. Fugitive emissions are the greenhouse gases that escape from tailings ponds, oil sands mine faces, oil and gas valves, pumps and pipelines, and so on. Environment Canada claims 30.5 megatonnes of oil- and gas-related fugitive emissions for 2012 (see its Table A.4). Yet provincial breakdowns of emissions data from Canada's National Inventory Report of emissions filed with the UN for 2012 show that fugitive emissions produced by the oil and gas industry were actually 61 megatonnes. In other words, there are 30 megatonnes of fugitive emissions from Canada's oil and gas industry that are missing from Environment Canada's description of the industry's emissions. The lion's share of oil and gas fugitive emissions, by the way, is released by Alberta—35 megatonnes each year. That missing 30 megatonnes largely makes up the difference between the public perception of where oil sands emissions are currently (68 megatonnes) and Notley's Cap (100 megatonnes). When questioned by Focus, Environment Canada was unable to explain why fugitive emissions from the oil and gas industry were not fully counted in its depiction of national oil and gas sector emissions. It noted that the missing emissions were included in Canada's National Inventory Report as submitted to the UN.
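The two figures of roughly 30 megatonnes each mentioned above can be checked the same way. The sketch below applies the per-barrel intensity from the previous sketch to the 629,000-barrel-per-day production increase since 2013, and subtracts Environment Canada's published fugitive-emissions figure from the National Inventory Report figure; everything in it is simple arithmetic on numbers already quoted above.

```python
# Checking the two roughly 30-megatonne figures discussed above.

KG_PER_BARREL = 122              # per-barrel intensity from the previous sketch

# 1. Emissions added by the post-2013 increase in oil sands production.
added_bbl_per_day = 629_000
added_megatonnes = added_bbl_per_day * KG_PER_BARREL * 365 / 1e9
print(f"Post-2013 production growth: about {added_megatonnes:.0f} Mt per year")

# 2. Fugitive emissions missing from Environment Canada's summary.
national_inventory_mt = 61.0     # oil and gas fugitive emissions, 2012, National Inventory Report
environment_canada_mt = 30.5     # Environment Canada's published figure for 2012
print(f"Unaccounted fugitive emissions: about {national_inventory_mt - environment_canada_mt:.1f} Mt")
```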
Trudeau and McKenna know that Canada's emissions reporting procedures need to be improved and have proposed legislation to accomplish that. Under Stephen Harper's climate-change-skeptical government, the reporting threshold for an industrial emitter had been 100,000 tonnes per year but was lowered to 50,000 tonnes in 2010. Now Environment Canada is hoping to move that down to 10,000 tonnes. But on the day Trudeau approved Trans Mountain, he expressed certainty that the facts and figures were on his side. "This is a decision based on rigorous debate, on science and on evidence," Trudeau said. "We have not been and will not be swayed by political arguments—be they local, regional or national." Prime Minister Trudeau linked approval of Trans Mountain to oil sands emissions not exceeding 100 megatonnes. But the best analysis that's been applied to measurement of those emissions suggests they could already be as high as 120 megatonnes. That's not a political argument. It's a serious question about the "evidence" Trudeau is using. NOTLEY'S CAP, the promise to somehow hold oil sands emissions to no more than 100 megatonnes, presumes they're currently either 58 (Alberta) or 68 (Environment Canada) megatonnes. Through the cap, Alberta is giving itself permission to ramp up oil sands production by about 50 percent above current levels. The Canadian Association of Petroleum Producers' 2016 projection to 2030 shows oil sands production climbing to around 3.5 million barrels a day by about 2028 and then beginning to accelerate. At the same time, the Trudeau government is acting on its commitment to significantly reduce Canadian emissions by imposing an escalating price on carbon for any province that doesn't follow its lead. The contradiction of facilitating oil sands growth while discouraging the use of fossil fuels with a carbon tax or fees is jarring enough. But the bizarre, long-term consequences for the Canadian economy of these two initiatives, if they both play out as hoped for by Trudeau and Notley, seem to have been overlooked. As national emissions decline, emissions caused by production of bitumen destined for export will come to dominate Canada's carbon budget. If Alberta's fossil-fuel exports have a stranglehold on allowable emissions, its oil and gas industry could choke off economic opportunity in the rest of Canada. Oil and gas extraction has high emissions per dollar of economic value that it creates. Other industries in the same boat, like electricity and heat utilities, construction, manufacturing, forestry and agriculture, will be required to pay the same level of carbon fees for their activities even though their products—electricity, heat, infrastructure, housing and food—are essential to the well-being of Canadians. Are exports of fossil fuels to the US necessary for Canadians to have a good quality of life? Where is the proof of that? How soon might the strangling of the Canadian economy begin? For the analysis below, we start with Environment Canada's numbers. Environment Canada reports that, in 2014, 192 megatonnes of emissions were attributable to the oil sands and conventional oil and gas industries. As noted earlier, however, Environment Canada had removed 30 megatonnes of fugitive emissions from that account. If we put them back in, emissions related to Canada's oil and gas industries were 222 megatonnes. How much of that was attributable to exported fossil fuels?
That year, according to the National Energy Board, 77.5 percent of crude oil and 47.5 percent of natural gas was exported. If those percentages are applied to the appropriate components of Environment Canada’s 222 megatonnes, emissions related to net exports of natural gas and oil, including bitumen, total 146 megatonnes. So, two years ago, using Environment Canada’s suspect numbers, just emissions resulting from production of fossil fuels destined for export were already pushing McKenna’s mid-century goal of 150 megatonnes for Canada’s entire emissions budget. If oil sands production continues to grow at the rate projected by Alberta Energy Regulator, then emissions from producing fossil fuels for export will climb at about the same rate. You might ask: Won’t there be improvements in emissions intensity? The previously-mentioned study by Englander et al indicates the industry has flat-lined on improvements in emissions intensity since about 2005 and the increase in the extent of in-situ extraction, which is more emissions intensive than surface mining, could cancel out any efficiency gains that might be possible through improvements in technology. In-situ extraction involves injecting steam into the ground deep below the surface and, in effect, melting the bitumen out of the sands that contain it. It’s a process that involves burning a lot of natural gas to heat up water for steam. That form of extraction is expected to account for 60 percent of all bitumen production by 2025. By 2035, emissions from production of fossil fuels destined for export could eat up more than 50 percent of all allowable emissions as Canada reduces its national emissions towards McKenna’s goal of 150 megatonnes. By 2045, producing fossil fuels for export could consume all of Canada’s allowable emissions. If oil sands emissions have been underestimated to the extent suggested by the US CRS emissions intensity finding and other studies, emissions related to fossil-fuel exports could consume half of Canada’s carbon budget by 2028—and all of it by 2040. Not included in this analysis is the potential for a large increase in emissions that would result from an increase in export of natural gas from Alberta, not covered by Notley’s Cap. The province’s vast and largely untapped reservoir of shale gas—estimated by Alberta to be 110 times larger than its conventional gas reserve—could come under intense development pressure if natural gas production in the US begins to falter. The inevitable consequence of allowing oil sands production to grow—rather than starting to cut it back now—will be that Canada’s allowable emissions will be dominated by production of low-value bitumen for export, mainly to the US. Canada would never be able to turn off our powerful neighbour’s supply. Our country’s economic role in the world would then be to serve as America’s pump jockey. When Trudeau announced approval of Trans Mountain he told his audience: “I have said many times that there isn’t a country in the world that would find billions of barrels of oil and leave it in the ground while there is a market for it.” The prime minister is apparently stuck on that idea and is unable to see that it no longer fits with the more fundamental need to lower carbon emissions and prepare properly for the low-carbon economy that Canada needs to build. Meanwhile, as Alberta’s premier flails about in a sea of low-value hydrocarbons, her promises threaten to pull the rest of Canada under with her. Trudeau has thrown her a life-ring, but to what end? 
David Broadland is the publisher of Focus Magazine.
  8. November 2016 The choice of the controversial site over Rock Bay will lead to hundreds of millions in costs that could have been avoided. UNPUBLICIZED WARNINGS from the engineering company Stantec to the Seaterra Commission in 2013 show there's a big difference between what the public has been told and what CRD bureaucrats and their corporate proxies know about a wastewater treatment plant at McLoughlin Point. Simply put, a plant squeezed onto the tiny McLoughlin site is going to present regional taxpayers and the environment with big problems. Soon. Within a few years of the plant's commissioning, costly new treatment capacity will have to be built elsewhere to avoid the expense and environmental impacts resulting from the heavy use of chemicals that will be needed to keep the plant operating to federal regulation standards. Senior CRD bureaucrats aware of these circumstances failed to disclose to the public McLoughlin's serious limitations during a 2-year-long reconsideration of the site's suitability. As a result of these circumstances, and the CRD's recent move to start planning for a second wastewater facility in Colwood, Victoria taxpayers will likely be facing a bill for three widely separated treatment plants at an additional cost of hundreds of millions of dollars above what it would have cost to construct a single expandable plant at the relatively spacious Rock Bay site.

Image caption: The relative sizes of the McLoughlin Point site (left) and the Rock Bay site. Rock Bay is 2.7 times larger than McLoughlin. Stantec told the CRD in 2013 that Victoria's sewage treatment needs could exceed the capacity of a plant at McLoughlin by 2018. CRD staff failed to inform the public of that limitation.

Image caption: A 35,000 kilogram/day treatment plant located at McLoughlin Point (left) and Rock Bay (right). The McLoughlin plant couldn't be expanded to meet future population growth. The Rock Bay site has abundant room for expansion (red areas). Thus a treatment plant at McLoughlin will result in a much costlier system of decentralized plants, a situation CRD elected officials have said should be avoided.

Below I will describe a number of issues that arise from the diminutive physical size of the McLoughlin Point property, which a peer review had warned the CRD in 2009 was "extremely small" for a sewage treatment plant. It would appear that issues vital to the public's understanding of this project have been deliberately hidden from both elected officials and the public. At the end of this article I will examine the question of whether the withholding of this information may have created an avenue for a court challenge of the project. Let's start here: In September 2013, Stantec engineers responded to what they called "pointed questions" about the capacity of a proposed wastewater treatment plant at McLoughlin Point to handle expected liquid flows and organic loads. The engineers' written response to these questions, which had been submitted by the Seaterra Commission then overseeing the project, included the distinct possibility that the plant's design capacity could be exceeded by the time the plant was expected to become operational in 2018. But, if that happened, the engineers told the commission, "CEP operation would most likely be implemented to maintain adequate capacity until 2040." Focus was given a tip that led to the Stantec memo. A search of CRD records indicates Stantec's September 2013 warning was never shared by CRD staff with elected officials in a public meeting.
By "CEP operation" the engineers meant "chemically-enhanced primary treatment" (CEPT), a costly and increasingly contentious add-on to primary treatment that is sometimes employed to reduce the level of phosphorus and/or nitrogen being discharged to waters that are particularly sensitive to eutrophication, such as lakes. During CEPT operation, three different chemicals are injected into the influent as it flows through a wastewater plant, increasing the rate at which solids are removed. But those chemicals end up in the sludge produced by sewage treatment and create a big problem: The sludge can't be incinerated, used as fertilizer, or recycled in any useful way. UBC engineering professor Dr Don Mavinic, an expert on sewage treatment, told Focus in 2014: "This is a huge problem in Ontario right now. It's become very contentious. Very few landfills will accept the sludge now. Most incinerators won't touch it. Ontario has ended up with this chemical soup that has to be stored somewhere because you can't do anything with it." In Victoria's case, DFO scientists have determined that eutrophication isn't a concern. But CEPT is also used in plants that have reached the upper limit of their design capacity. The aging Lions Gate treatment plant in North Vancouver—slated for replacement by 2020—began using CEPT in 2014 as it bumped up against its capacity limit. That a new plant at McLoughlin Point would need to implement CEPT soon after it had been constructed in order "to maintain adequate capacity," as Stantec acknowledged in 2013, is extraordinary. In the recent 18-month-long consideration of optional sites, McLoughlin wasn't on the table. As a result, questions about the site's suitability lay dormant and Victorians were never informed that the excess capacity of a treatment plant there could be used up as early as 2018. Prompted by a letter to CRD directors from this reporter, the issue of McLoughlin Point's limited capacity was raised at a CRD Board meeting on September 14. At that meeting, elected officials voted to proceed with the McLoughlin treatment plant. But before that vote, CRD directors were given an opportunity to question members of a "Project Board." The Project Board's Chair, Jane Bird, and Vice Chair Don Fairbairn—both Vancouver residents who have no previous experience directly related to sewage treatment—took questions about the Project Board's choice of McLoughlin Point over other options. CRD Director Colin Plant asked whether the McLoughlin plant would have sufficient excess capacity. Fairbairn told Plant, "We have the highest level of confidence that under a low, medium, high population growth scenario, this plant will have adequate capacity for a minimum of 20 years…It can be very difficult for a non-technical person, such as myself, to understand. That's why we do have to rely upon the expert opinions of firms like Stantec, as well as on the years of expertise with your staff." Fairbairn's response ignored the advice Stantec had given the Seaterra Commission in 2013.
Its expert opinion then was: "At an increased growth rate of 2.1 percent, the plant capacity is reached much sooner by the year 2018…To cope with the high growth rate scenario, CEP operation would most likely be implemented to maintain adequate capacity until 2040." Now Fairbairn was claiming Stantec's expert opinion was that, under any population growth scenario, capacity would last "for a minimum of 20 years." For clarity, the organic loading capacity of the plant referred to by Stantec in 2013—35,000 kilograms per day—was exactly the same as that of the plant Fairbairn was referring to. Various documents authored by Stantec and other consultants show the critical limiting design factor for a McLoughlin plant is organic loading—referred to by wastewater engineers as biochemical oxygen demand—not hydraulic flow. Stantec's 2013 projection that peak organic loading in Victoria's sewers could reach McLoughlin's limited design capacity by 2018 was based on a population growth projection of 2.1 percent per year. At a growth rate of half that (lower than the CRD is currently using for its projections), Stantec's arithmetic shows the McLoughlin plant could run out of capacity by 2023. That date is within a few years of the CRD's hoped-for completion date of 2020. Bird and Fairbairn did not respond to requests from Focus for information. The CRD refused to answer questions related to McLoughlin's capacity limitations. CRD directors have been told that CEPT would be employed during significant "wet weather events," but they have never been told—in public—that its regular use could be needed as early as 2018 as a result of the plant's capacity being exceeded. Yet Mavinic's 2014 concern that CEPT chemicals create sludge that "you can't do anything with" seems to have been incorporated in the Project Board's two recommended options for dealing with McLoughlin's sludge. Both options included perpetual storage of the sludge in "biocell reactors," which would be, essentially, permanent hills of toxic poop composting beside Willis Point Road, waiting for someone to figure out what to do with them. Residents in the area worried about the impact of the piles on air and groundwater quality will have to hope that a safe way to dispose of the sludge will be found one day. The Project Board only suggested they could be "mined" for a "beneficial use" once such a use had been discovered. The evidence indicates, then, that three vitally important pieces of information about a plant at McLoughlin Point were hidden by CRD staff from both elected officials and the public while the community evaluated other site locations: its very limited excess capacity; the consequent need for ongoing use of CEPT soon after it is completed; and how CEPT limits what can be done with the sludge produced by the plant. Obscuring of these facts continues. THERE ARE TWO OTHER CONCERNS arising from McLoughlin's limited capacity that have also been kept out of view by the CRD: First, how McLoughlin's small size limited what treatment technology could be used there; and second, the huge additional cost that will result from the need to provide additional capacity using a system of decentralized treatment plants. Let's look at the former first. The same 2009 peer review that judged McLoughlin to be "extremely small" questioned the CRD's initial choice of membrane bioreactor (MBR) technology for secondary treatment and suggested the CRD assess biological aerated filter (BAF) technology as well.
That’s the secondary treatment process the CRD eventually chose and, in 2013, the Seaterra Commission, in its “pointed questions” start-up phase, asked for an explanation of that choice. Stantec engineer Dr Bob Dawson’s reply to the Seaterra Commission described the physical process involved in a BAF plant, and he made a number of observations. Dawson wrote, “BAFs are relatively recent proprietary systems developed in Europe over the last 15 to 20 years and have been gradually introduced into North America over the last 10 years—a similar development timeline as membrane processes.” But if the technology was so new—Wikipedia calls it an “emerging technology”—then why would the CRD risk using it in Victoria? That’s covered by a second observation made by Dawson: “[BAF] is particularly applicable for locations where there is limited space for construction of a plant…” In other words, McLoughlin Point’s tiny size dictated the use of a highly compact form of treatment for which there was a very short track record. So what is the experience with BAF in Europe, where it has been used for five or ten years longer than in North America? Here’s what AECOM engineers who were making a comparison of wastewater treatment technology options for Jersey, one of the Channel Islands, in 2014, said about BAF: “Biological Aerated Filters are not recommended for consideration due to the associated high capital and operational costs. Generally, BAF technology produces effluents with very low suspended solid concentrations. However, after backwash cycles, this can deteriorate resulting in poorer quality effluent, which will reduce the effectiveness of the UV disinfection plant.” AECOM, by the way, is the global wastewater engineering company that’s one of three partners in Harbour Resource Partners. That’s the consortium that won the contract to build a BAF plant at McLoughlin Point in 2014, a contract recently resurrected by the Project Board. So, because of McLoughlin Point’s tiny size, Victoria is getting an apparently problematic treatment technology that, compared to more proven technologies, has higher capital and operating costs. Stantec’s explanation of BAF technology to the Seaterra Commission included information about the filter bed media utilized by the process. Stantec’s memo contained a photograph of expanded polystyrene beads, the filter media used, for example, in one of the few other BAF plants in Canada at Kingston’s Ravensview treatment plant. Polystyrene beads are a soft, friable plastic and since the polystyrene filter bed would be eroded over time by the effluent passing through it—especially if it contains fine, gritty precipitate introduced by CEPT—one can easily imagine a BAF plant being a perpetual source of microplastics flowing into the Strait of Juan de Fuca. When asked by Focus what filter-bed medium would be used at McLoughlin, Stantec replied, “The filter media for the BAF has not been selected yet as design is not complete.” The contract, however, has been awarded and the CRD would have no real control over what filter bed media is used. A search of CRD records indicates Stantec’s August 2013 explanation of BAF technology to the Seaterra Commission was never shared by CRD staff with elected officials at an open, public meeting. NOW LET’S LOOK AT how McLoughlin’s small size will lead to a system of decentralized treatment plants and huge additional costs. 
Stantec’s 2013 warning to the Seaterra Commission about the site’s limited capacity to accommodate future population growth in the region offered a mitigating strategy—the ongoing use of CEPT. But there’s another solution that would avoid the use of CEPT—building a second treatment plant at a different location. That strategy is actually incorporated in the CRD’s current Liquid Waste Management Plan. CRD staff have said, in several reports, that a second plant should be built in the West Shore because that’s “where most of the growth is occurring.” I’ll show later that this prognostication is demonstrably incorrect, but first consider how the strategy of building a second plant at a different location completely contradicts what the CRD has been saying all along about economy of scale. If a second plant location could be avoided, wouldn’t taxpayers stand to save many millions—perhaps hundreds of millions—of dollars on capital, operating and borrowing costs? That had always been the position of CRD staff and pro-McLoughlin politicians when they were dismissing the idea of distributed treatment plants as being uneconomical compared to a single plant at McLoughlin. Indeed, the Project Board’s final report states that splitting McLoughlin’s capacity between two plants would increase the capital cost by $245 million. So, by the Project Board’s own reckoning, decentralization would have increased capital costs by 32 percent. That, in turn, would result in higher borrowing costs. Presumably, operating costs would be higher as well. Paradoxically, then, although the Project Board’s report confirms there is a very high cost that comes with a decentralized system, its choice of McLoughlin Point guarantees that Victoria will get a decentralized system—and the higher costs. Once McLoughlin’s capacity has been reached, what would an additional plant cost? The Urban Systems-Carollo options analyses, done earlier this year as part of the 18-month-long consideration of optional sites, estimated that by 2030 an additional $250 to $310 million would need to be spent for additional capacity. That estimate didn’t include additional conveyancing costs, which would likely add another $100 million. So with McLoughlin Point as the first step in a decentralized system, the experts are predicting additional costs of $350 to $410 million by 2030. It’s noteworthy that an outlook to 2030, as was included in the Urban Systems-Carollo analyses, doesn’t appear anywhere in the Project Board’s final report, and isn’t reflected in its estimates of cost per household. What’s readily apparent from the engineers’ estimates of the high cost of decentralization and the high cost of additional capacity is that a single expandable treatment plant could save the community hundreds of millions of dollars in capital costs compared to two widely-separated treatment plants located at McLoughlin Point and in Colwood or Langford. To save those hundreds of millions, though, a site larger than McLoughlin Point would have needed to be available. As we know, such a site is available—at Rock Bay. Yet the Project Board’s comparison of McLoughlin with Rock Bay gave not one iota of value to Rock Bay’s ability to accommodate expansion far into the future. This, too, is extraordinary. The Rock Bay site is 2.7 times larger than McLoughlin. Stantec’s rudimentary positioning of a treatment plant at Rock Bay for the Project Board’s report shows just how much of the Rock Bay site was left unused. 
That room for expansion would have completely eliminated the costly and environmentally-problematic reliance on CEPT “to maintain adequate capacity.” That advantage, too, was given zero value by the Project Board. It’s also possible that Rock Bay is large enough to accommodate a form of treatment that has lower capital and operating costs than BAF. The Project Board’s comparison of McLoughlin Point with Rock Bay used essentially the same BAF plant on both locations. That must have made for an easy comparison of cost (they should be close to equal), but did Stantec consider a technology with lower capital and operating costs for the much larger site at Rock Bay? It claims, without providing any evidence, that Rock Bay wasn’t large enough to accommodate conventional activated sludge technology. But the new Lions Gate plant in North Vancouver will be sited on a smaller parcel of land than Rock Bay, will be able to process a greater liquid load than the McLoughlin plant, has enough room for on-site anaerobic digesters—and uses lower-cost activated sludge treatment. It’s expected to be expandable to meet the needs of the North Shore well past 2100. While Fairbairn advised Plant to rely on “the expert opinions of firms like Stantec,” the expert opinions of Stantec have had a habit of selectively disappearing into the bowels of the CRD. Is there a memo somewhere in those depths explaining why Stantec never looked very hard at options other than a BAF plant at McLoughlin Point? LET'S BACK UP TO CONSIDER THE CRD'S PLAN to build a second treatment plant in either Colwood or Langford. If you think this is unlikely, or not particularly imminent, consider this: When CRD directors voted to go ahead with a treatment plant at McLoughlin Point, they also committed to spend $2 million on initial planning for a second treatment plant in Colwood. Why would the Project Board have made this recommendation if, as Fairbairn put it, “under a low, medium, high population growth scenario, [McLoughlin] will have adequate capacity for a minimum of 20 years…”? The Project Board, CRD staff and Stantec know that building a plant at McLoughlin Point with limited capacity for future growth means the development of a plan for a second plant needs to start immediately, and that’s what the CRD is doing. The extra business is obviously good for Stantec, but why would the CRD prefer that course instead of choosing Rock Bay, where hundreds of millions of taxpayers’ dollars could be saved by avoiding a decentralized system? The Project Board claimed a plant at Rock Bay would cost $155 million more than one at McLoughlin. Much of that difference is in the higher cost of land at Rock Bay. The Project Board said the cost difference was “material,” meaning significant, but it didn’t give any material value to the highly valuable room for expansion at Rock Bay. No, the relatively small difference in capital cost doesn’t explain the Project Board’s choice of McLoughlin over Rock Bay. If Rock Bay had won out over McLoughlin, that could have been construed as a professional and political defeat for all those CRD staff and elected officials who have insisted that the $80 million spent on planning and 10 years of talking had correctly identified McLoughlin as the best location. With careers in the balance, McLoughlin was the emotional favourite. Other than that, though, there doesn’t appear to be any real justification for the choice. 
In fact, when the question of why the CRD would choose to put a second plant in Colwood is examined carefully, it becomes clear that a third plant—likely located in Victoria—will be needed in about 20 years. Even though Colwood and Langford contribute little more than seven percent of the current wastewater load, the CRD plans to put a second plant there anyway. Why? The CRD’s rationale is based on an out-of-date belief that “most of the growth is occurring” there. But over the last six years this belief has proven to be a delusion. The CRD’s own figures show that the increase in the number of people living in the “Core” municipalities has been almost twice that of Langford and Colwood combined. Moreover, when all sewage-generating development is considered—residential, commercial, institutional and industrial—the wrong-headedness of the CRD’s strategy is even more evident. Over the past six years, based on the value of building permits issued in each municipality, the core has seen 2.5 times as much growth in long-term wastewater-producing development as Langford and Colwood. The vast majority of that growth is occurring in Victoria and Saanich. Witness the numerous construction cranes in the Downtown core right now. There is nothing like this happening in Langford and Colwood. This recent reversal in the focus and form of development, from the suburbs to urban cores, from low density to high density, is taking place elsewhere in North America, including in cities like Vancouver and Toronto. Inevitable changes in public policy around energy, housing and transportation in response to the threat of climate change and ocean acidification will accelerate this phenomenon. As a result of the CRD’s miscalculation of where most growth will occur, putting a limited-capacity plant at McLoughlin and planning for a second plant in the West Shore will put taxpayers in jeopardy of having to pay for three plants. There are two reasons for that. First, a second plant on the West Shore won’t be able to serve future growth in Saanich and Victoria without a hugely-expensive reconstruction of the sewer trunks. That’s never going to happen. Second, after a plant is built in Colwood or Langford, the small portion (about seven percent) of McLoughlin’s capacity that would be freed up would soon be gobbled up by growth in Victoria and Saanich. So, 20 years from now, Victorians will be looking for a third treatment site—one that will have to be located in either Victoria or Saanich, where most of the region’s growth is occurring. Where will it go? Clover Point is a likely candidate. If the cost of decentralization—going from one to two plants—is about 30 percent of the project cost, as the Project Board’s numbers indicate, what would be the additional cost of building three plants instead of one? Forty percent? Fifty percent? A far more logical, less expensive alternative would have been to put one central plant at Rock Bay—followed by incremental expansion of capacity there as required. The remediated contaminated site at Rock Bay was identified during extensive public consultation as the location for treatment most preferred by the public. Its First Nations owners were eager to sell. The site is already surrounded by industrial operations that provide essential building materials for constructing a city—gravel, concrete, asphalt and beer—businesses that are unlikely to go away in the future. The location was also supported by the mayors of Victoria and Esquimalt. 
In spite of all those strong positives, the previously rejected McLoughlin site magically became the recommended option—even though it wasn’t even part of the recent 18-month-long consideration of options. But wait…by not being on the table, the CRD avoided examination of any of McLoughlin’s strong negatives (see above). THE CRD HAS HIDDEN FROM THE PUBLIC many significant aspects of this project: McLoughlin’s limited capacity, the need for the use of CEPT, the way in which CEPT would restrict what could be done with the sewage sludge, the known problems with BAF technology, the need for—and cost of—additional capacity, including the certainty of a second plant and the likelihood of a third plant. Yet the provincial Environmental Management Act allows the CRD to proceed with its flawed plan without the need for elector consent through a referendum. In ordinary circumstances, such issues as I’ve outlined here would have been hashed out in public by opposing sides in a referendum. A citizen’s right to be asked by a municipal government for permission to borrow large sums of money to provide that citizen a service is a basic right in Canada. The Environmental Management Act takes that right away in the case of implementing a Liquid Waste Management Plan. But the Province’s published guidelines promise that electors will be “adequately” consulted. Given the circumstances I’ve described above, there is grave doubt that consultation has been adequate. With the failure of Victoria’s political representatives to address these issues— they, too, have been kept largely in the dark—do Victoria electors have any avenue through the courts? I outlined these issues to Victoria lawyer John Alexander, a litigation partner with the law firm Cox Taylor. I noted the EMA’s promise of adequate consultation and asked Alexander if there was any avenue for a judicial review of the Province’s expected approval of the CRD’s McLoughlin-based Liquid Waste Management Plan (LWMP). Alexander replied, “From a legal perspective, the question would be stated: Does the Province’s published non-statutory consultation requirement create a legitimate expectation that an Order imposing a LWMP would not be made without consultation?” Alexander pointed to a 1990 Supreme Court of Canada ruling which states, in part, “[the doctrine of legitimate expectations] is simply an extension of the rules of natural justice and procedural fairness. It affords a party affected by the decision of the public official an opportunity to make representations in circumstances in which there otherwise would be no such opportunity. The court supplies the omission where, based on the conduct of the public official, a party has been led to believe that his or her rights would not be affected without consultation.” “In other words,” Alexander wrote, “the court sets aside the decision on the basis that [it was] as if some required procedural step was not properly taken.” Focus readers interested in supporting a legal challenge of the Province’s approval of the CRD’s plan for McLoughlin Point can express that support by contacting us at 250-388-7231, email at focuspublish@shaw.ca, or by using the "Contact Us" form on this website. If a legal challenge is organized by Victoria electors, Focus will connect you with the organizers of that challenge. David Broadland is the publisher of Focus.
  9. September 2016 
The city was once targeted by Sierra Legal Defence Fund for the level of "harmful" chemical contaminants in its wastewater. Twelve years later, advanced source control has reduced those contaminants to a level lower than is allowed in Canada's drinking water. 
IN MID-AUGUST, Victoria architectural firm D’Ambrosio Architecture + Urbanism released drawings of a design created by “an international team” for a wall around a sewage treatment plant on McLoughlin Point at the entrance to Victoria’s harbour. Writing in the past tense, as though the idea might have already been superseded by some better one, the firm stated: “The architectural strategy embraced the industrial nature of the facility. It consisted of a series of 184 concrete columns forming a palisade around the process and operation buildings. The buildings housing operational and administrative functions engage the columns, creating a visually calm and collected interface between the industrial facility and the sensitive harbour waterfront.” Aside from obscuring the plant, the 184 sturdy concrete columns in D’Ambrosio’s design don’t suggest any utilitarian purpose. Instead they seem to be there simply to memorialize something big: A war, perhaps, or some other sad outcome of human misjudgement. Considering the evidence challenging the wisdom of moving the region’s marine-based treatment system to land, D’Ambrosio’s vision is, intentionally or not, one of the most honest public statements made about the treatment project so far. 
D’Ambrosio Architecture + Urbanism's vision of how a sewage treatment plant at the entrance to Victoria Harbour could appear 
Later in this story I’m going to follow D’Ambrosio’s lead and be creative, but in a different direction. Rather than artful, I’m going to be practical and answer these questions: What’s wrong with the current marine-based treatment system, and how can we fix that? First though, I want to tell you about one of the great strengths of the current system, a very progressive approach to wastewater treatment that obviously has much potential but will likely be one of the first casualties of land-based secondary treatment: the CRD’s Regional Source Control Program. The best way for me to describe its potential is to show you what it has achieved since 2003. Back in 2004, the Sierra Legal Defence Fund (now called Ecojustice) petitioned the federal government “to effectively address the pollution of the marine environment by persistent organic pollutants including PCBs.” Ecojustice targeted Victoria’s outfalls in its petition and named ten different contaminants it identified as “harmful”: Polychlorinated biphenyls (PCBs), oil and grease, mercury, lead, silver, cadmium, copper, zinc, polycyclic aromatic hydrocarbons (PAHs) and “halogenated compounds.” For each contaminant, Ecojustice provided an estimate of the amount being discharged over a two-year period. It based its numbers on CRD data obtained through an FOI, but some of those numbers now seem unsupportable—either too high or too low—based on data since made public by the CRD. (Contacted by Focus, Ecojustice did not explain the numbers it used in its petition.) Using data collected by CRD scientists between 2003 and 2014 for the substances named in Ecojustice’s petition, reductions can be estimated: The discharge of oil and grease has been reduced by 93 percent, mercury by 95 percent, silver by 94 percent, cadmium by 72 percent, lead by 62 percent, and PAHs by 28 percent. 
Although there are several halogenated compounds found in the CRD’s effluent, consider the case of pentachlorophenol, once commonly used by homeowners in Victoria as an insecticide and a wood preservative. In 2004, PCP was detected in 100 percent of the CRD’s samples. By 2014, the CRD could no longer detect PCP in the effluent. What about PCBs? Ecojustice took a single sample from each outfall in both 2001 and 2003 and included those in its 2004 petition. That 2001 figure was 2100 grams per year and was well-used in campaigns to discredit the marine-based, source-controlled system. In 2015, DFO scientist Sophie Johannessen published a peer-reviewed study that estimated the annual discharge of PCBs at 340 grams per year. Based on those numbers, there has been a reduction of about 84 percent in the amount of PCBs being discharged, even while the residential population using the outfalls increased. The 95 percent reduction in mercury since 2004 is the strongest affirmation of the potential of source-controlled treatment. Victoria’s effluent now has even less mercury per litre than the Annacis Island secondary treatment plant on the Fraser River. But Victoria has done this without the immense cost of land-based secondary treatment or any of the environmental risks associated with land-based disposal of the chemically-contaminated biosolids that Annacis Island and other secondary and tertiary treatment plants produce. CRD scientist Chris Lowe, who has overseen the region’s wastewater monitoring program for many years, credits the reductions to the Regional Source Control Program and to reductions in the general use of certain materials in our culture. Digital photography has largely replaced silver-based photography, for instance. To put the current level of chemical contamination from Victoria’s wastewater in some perspective, I compared the CRD’s 2014 data for the substances Ecojustice identified as “harmful” with Health Canada’s current Drinking Water Guidelines for those substances. That’s right, I compared Victoria’s sewage with Canada’s drinking water. The result is surprising (see table below). Health Canada’s Guidelines specify an upper limit for the allowable concentration of contaminants for water to be safe for humans to drink. For example, the concentration of mercury allowed in Canadian drinking water is .001 milligram per litre. That turns out to be 100 times higher than the average concentration of mercury in the sewage that passed through the Macaulay Point outfall during 2014. Health Canada allows 25 times more cadmium and twice as much lead in drinking water than can be found in Victoria’s sewage. Not all of the contaminants targeted by Ecojustice are limited by Health Canada’s Guidelines. For example, copper and zinc are considered beneficial to human health and the Guidelines set no health-related limits for these. The US EPA’s National Primary Drinking Water Regulations, however, have established a limit for copper in America’s drinking water. That level is 10 times higher than the level of copper currently found in Victoria’s wastewater. Similarly, Health Canada’s Guidelines don’t give an allowable limit for polychlorinated biphenyls (PCBs), but the EPA does. It allows PCBs in drinking water at a concentration that’s 50 to 60 times higher than is currently found in the effluent passing through Victoria’s two marine outfalls. 
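Both of those comparisons can be checked from the figures quoted above. The minimal sketch below uses only those numbers; the implied mercury concentration in the effluent is derived from the stated guideline and ratio, not from measured data reproduced here.

```python
# Two quick checks using figures quoted in the article.

# 1. PCB reduction: Ecojustice's 2001 figure vs Johannessen's 2015 estimate
pcb_2001_g_per_year = 2100
pcb_2015_g_per_year = 340
pcb_reduction = 1 - pcb_2015_g_per_year / pcb_2001_g_per_year
print(f"PCB discharge reduction: {pcb_reduction:.0%}")   # roughly 84 percent

# 2. Mercury: Health Canada's drinking-water limit is reported to be 100 times
#    the average 2014 concentration at the Macaulay Point outfall.
drinking_water_limit_mg_per_L = 0.001
ratio_limit_to_effluent = 100
implied_effluent_mercury = drinking_water_limit_mg_per_L / ratio_limit_to_effluent
print(f"Implied mercury concentration in effluent: {implied_effluent_mercury:.5f} mg/L")
```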
Two of the contaminants on Ecojustice’s list, copper and zinc, both of which come mainly from deteriorating domestic water supply pipes and fittings, have seen only minor reductions since 2004. Does the failure of source control to limit the amount of copper and zinc warrant the investment of billions in public resources over the life of a land-based treatment system? Ecojustice might think it does, but environmental protection policy in jurisdictions around North America doesn’t reflect that view. Let me tell you about copper, zinc and “the initial dilution zone.” While both copper and zinc are considered essential for human health, they are potentially harmful to aquatic life even at low concentrations. To protect organisms against such contaminants in wastewater discharged to bodies of water, the BC Ministry of Environment has developed “Water Quality Objectives” that are orders of magnitude more stringent than Health Canada’s Drinking Water Guidelines. These objectives must be met, but the undiluted effluent from secondary and tertiary treatment plants simply can’t meet them. In fact, sewage treatment is known to increase the amount of dissolved copper, which makes copper more immediately available to cause harm to aquatic organisms. This amplification effect is even more pronounced for zinc. None of BC’s secondary treatment plants meet the Province’s standard for either copper or zinc. For example, Vancouver’s Annacis Island secondary treatment plant—located right on the migration route of Fraser River sockeye—exceeds BC’s water quality objectives by factors of 4 and 10 for zinc and copper respectively. How does the Province get around this conundrum? It does that by incorporating into its environmental regulations what’s known in wastewater treatment policy as “the initial dilution zone.” What is that? It’s an imperfect place, a volume of water where conditions are moving from not-so-good to better. The BC Environmental Protection Branch, responsible for overseeing the appropriate implementation of BC Water Quality Objectives, states: “Objectives do not apply within an initial dilution zone, which is the initial portion of the larger effluent mixing zone. The extent of initial dilution zones is defined on a site-specific basis, with due regard to water uses, aquatic life, including migratory fish, and other waste discharges.” In other words, the effluent inside an outfall isn’t required to meet water quality objectives. It doesn’t have to meet the objectives a second or two after being discharged, either. It’s allowed a certain distance away from the outfall—usually 100 metres—to become diluted enough that it effectively meets the regulation water quality objectives. Notice that all of this is determined on a “site-specific” basis. Once the effect of the initial dilution zone is taken into account, the CRD’s discharge of copper and zinc meets the Province’s stringent water quality objectives just as well as Annacis Island’s secondary treatment plant does. Why does the Province take this approach? The Environmental Protection Branch’s explanation is blunt: “If initial dilution zones did not exist, it would mean that effluent quality would have to meet water quality objectives, which would be costly and impractical.” Costly and impractical. 
Say those words a few times, roll them around your mind and try to get a feel for why the Province, the US EPA and every other jurisdiction in North America—except one—has adopted the initial dilution zone (aka “mixing zone, “zone of initial dilution,” etc) as a fundamental policy tool for making practical decisions about environmental protection and wastewater. That one exception, of course, is Environment Canada’s Wastewater Systems Effluent Regulations, brought into being by Stephen Harper’s government. Those regulations judged that Victoria’s effluent, before being discharged into the initial dilution zone, had too high a concentration of “suspended solids,” which, to translate as accurately as possible, means “digested food.” Obviously, “food” is not one of the substances that can be successfully source controlled by the CRD. As contaminants go, this one is all appearance and no cause for concern. According to marine scientists, the amount of digested food Victorians discharge to the ocean does not constitute an environmental risk. DFO’s Johannessen put it in context in her 2015 study: The discharge of digested food from the two outfalls represents .03 percent—that’s three one-hundredths of one percent—of the total suspended solids discharged by all sources to the Strait of Georgia and the Strait of Juan de Fuca. Yet that’s the only factor Environment Canada’s regulation used to push Victoria in the costly and impractical direction it’s now headed. There’s the appearance that something necessary has been done, but there’s no evidence that there is a problem that needed fixing. The CRD’s highly-effective source control program will likely be one of the first victims of Environment Canada’s unhelpful regulation. The serious financial burden of a land-based secondary treatment system will inevitably result in a quest for cost-saving measures at the CRD. Other communities with secondary sewage treatment don’t have advanced source control programs, so why should Victoria? Chop. Victoria’s unique approach to wastewater treatment—reducing contaminants by keeping them out of the environment in the first place—is a challenge for many to understand and appreciate. It has remained Victoria’s big secret. Why? Because the CRD has done a terrible job of communicating its successes to the community, and media here have largely ignored the story. The CRD’s unwillingness to toot its horn has likely reduced the impact of the program over what would have been possible with a better-funded, more broadly understood and supported initiative. That dearth of communication also applies to the marine-based treatment system’s strengths and weaknesses and its ability to protect public health. IN A 2008 EDITORIAL in the scientific journal Marine Pollution Bulletin, nine Victoria and Vancouver marine scientists refuted the basis on which then-Environment Minister Barry Penner had ordered Victoria to abandon its carefully-engineered and smoothly operating marine-based treatment system. The scientists noted, “The concept of natural sewage treatment has been criticized in the media, but in fact waste treatment is well recognized as a useful ecosystem service contributing to human well-being. The focus of environmental protection is changing to preserving such ecosystem services to the benefit of both human beings and the natural environment. 
It makes no sense to replace a natural ecosystem service with a human creation that is energy inefficient and has other harmful environmental consequences.” Let me remind you, briefly, how Victoria’s “natural treatment system” works. Many people know there are screening and settling plants at Clover Point and Macaulay Point where a lot of solids are removed, but they don’t seem to know what happens out in the Strait of Juan de Fuca. The plants on the points remove from the wastewater anything solid that’s larger than 6 millimetres in diameter, which is roughly the size of a pea. Oil and grease are scraped off in the settling tanks and the remaining effluent—99.9 percent water—flows by gravity down a pipe to the outfall. The outfalls consist of a pipe sitting on the seabed leading to a specially-engineered section called a diffuser. Located 55 to 60 metres below the surface, the diffuser has carefully spaced and oriented ports that direct the effluent upwards. Macaulay Point’s diffuser is 135 metres long and about 1700 metres from the nearest shoreline. Clover Point’s diffuser is 196 metres long and 1100 metres from the nearest shore. The ends of both outfalls are capped and the effluent is forced out of 28 small ports at Macaulay and 37 ports at Clover. It’s not “dumped,” as the Times Colonist relentlessly claims. At Clover Point, the effluent is dispersed from 37 small jets into a 200-metre-wide by 60-metre-high wall of cold, turbulent, highly-oxygenated salt water that’s travelling at speeds of up to one metre per second, depending on the strength of the tides. At that speed, as much as 12,000 cubic metres of water passes over the diffuser in a single second, quickly diluting the effluent and beginning the physical process of killing off bacterial contaminants. Victoria’s treatment system harnesses a tremendous source of renewable energy—almost twice the peak spring flow of the Fraser River—to do that work. DFO’s Johannessen describes the turbulent river of water off Clover Point as “like a giant washing machine.” The “plume,” as the rising effluent is known, is washed in the direction of the current. CRD monitoring of the plume shows it seldom reaches the surface of the water (four times in 2014), and then only during periods of intense winter or spring rains. But the top of the plume is generally trapped five metres or more below the surface. At that depth, while bacteria are rapidly dying as a result of the harsh physical conditions, there’s little possibility of contact with humans. The testimony from knowledgeable, local experts about the efficacy of this approach to killing bacteria is unequivocal: Six past and current public health officers—Dr Richard Stanwick, Dr John Millar, Dr Shaun Peck, Dr Brian Emerson, Dr Brian Allen and Dr Kelly Barnard—have stated: “There is no measurable public health risk from Victoria’s current method of offshore liquid waste disposal.” In spite of their assurance, though, there are some pertinent questions about what might happen in the future as the region’s population grows and there’s more sewage. Would the plume break through to the surface more frequently? And what happens when the tides change? Doesn’t the current passing over the diffuser slow down, stop, and then change direction? Then what happens to the plume at slack tide? Does the rapid rate of dilution stop? I’ve spoken with many local engineers, scientists and interested residents over the past several years about these questions, and they’ve provided all sorts of creative solutions. 
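Before getting to those ideas, here is a minimal check of the tidal-flow figure quoted above, assuming the simple rectangular cross-section the description implies (200 metres wide by 60 metres high, moving at one metre per second).

```python
# Quick check of the peak tidal flow over the Clover Point diffuser.
# The rectangular cross-section is a simplification used only to reproduce
# the article's arithmetic; it is not a hydrodynamic model.

width_m = 200.0      # width of the water "wall" over the diffuser
height_m = 60.0      # depth of the water column above the diffuser
speed_m_per_s = 1.0  # peak tidal current

volume_per_second_m3 = width_m * height_m * speed_m_per_s
print(f"Water passing over the diffuser at peak current: {volume_per_second_m3:,.0f} m^3/s")
# -> 12,000 m^3/s, the figure cited in the text
```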
I’ve knit some of what they’ve told me into what follows. According to these experts, there’s a way of improving the physical characteristics of the outfalls’ plumes that would make Victoria’s treatment system even more effective at dispersing chemical and biological contaminants and extend its life far into the future. The improvements become possible if a completely different problem—contamination of near-shore waters by the release of sewage during significant rainstorms—is solved first. Let me take you through these ideas, starting with solving a real problem for $50 million. The nine local marine scientists who wrote the editorial in Marine Pollution Bulletin in support of Victoria’s “natural sewage treatment” system noted that, prior to Penner’s 2006 order to the CRD, “an independent expert scientific review had been completed under the auspices of the Society of Environmental Toxicology and Chemistry (SETAC). This independent review made an important point that appears to have been overlooked by the Minister and others in favour of secondary treatment. Specifically, stormwater, sanitary and combined overflows, and other discharges, particularly into the surface waters in Victoria’s harbours, present more pressing environmental issues than the current offshore submarine sewage discharges.” None of those “more pressing environmental issues” have been addressed, and that has created bizarrely unpredictable results. In March of this year, Washington State Representative Jeff Morris and 36 of his fellow legislators threatened an economic boycott of Victoria. The incident that triggered their outrage was an article in the Times Colonist about what’s called a “combined sewer overflow,” or CSO. Morris thought the paper’s story was about Victoria’s deep-water marine outfalls, but it wasn’t. It was about a very ordinary problem, common even in Morris’ 40th District. During heavy rainfalls, sewage collection systems that don’t have enough hydraulic capacity—the ability to absorb liquid—spill the excess through short beach outfalls into near-shore waters. When such events occur in Victoria, there’s a big fuss in the TC as CRD officials warn residents of the possibility of contamination of beaches. It’s common to blame these spills on the two deep-water marine outfalls, but there’s no connection. CSOs are a problem unto themselves, and land-based treatment plans developed by the CRD so far would only address a fraction of the problem. Seattle is currently fixing a much worse CSO problem than Victoria’s, partly by increasing the hydraulic capacity of the Murray Basin collection system. That involves building a big storage tank that can absorb surges in the amount of liquid in the sewers during storms. Once a storm has passed, and liquid levels in the collection system have dropped, the contents of the tank can be slowly released, and a CSO is avoided. The CRD’s McLoughlin Point plan included building such a tank in Gordon Head—the so-called Arbutus attenuation tank. Construction of that tank would have allowed the CRD to put a screen on one of three remaining unscreened beach outfalls on the East Coast Interceptor. Unscreened beach outfalls have given Morris and Mr Floatie much material to work with—“floatables” is the usual euphemism—in their misleading campaigns against Victoria’s marine-based treatment system. But only increasing the hydraulic capacity of the collection system will eliminate floatables, Mr Floatie and Representative Morris from the region’s politics. 
The cost of accomplishing that can be estimated from the projected cost of the Arbutus attenuation tank project. The CRD and its consultants predicted the Arbutus tank would cost $9.5 million and its construction would eliminate all overflows in the East Coast Interceptor portion of the system for all downpours up to a “one-in-five-year storm event.” That’s the CSO standard required by the Province. To bring the whole system up to that standard would require a total of five Arbutus-like tanks scattered strategically around the core area. The total capital cost of such an increase in hydraulic capacity would be in the neighbourhood of $50 million. Imagine if the CRD decided to listen to the scientists and actually solved the CSO problem. Doing that would also allow improvement of the marine-based treatment system. Here’s how that would work. Those tanks, if used only to absorb downpours and eliminate CSOs, would be empty almost all of the time. According to the CRD, over a six-year period the region experienced 160 CSOs. This means that having the capacity to reduce CSOs down to the Province’s standard would result in storage tanks that would be empty about 90 percent of the time. During that time, they could safely be used for another purpose: controlling the outfall plumes. Let me describe how that would work. The volume of liquid flowing to the outfalls as a result of human activity varies through a 24-hour period in a pattern that’s very predictable. See the oscillating line in the graph above, which shows how the flow to the outfalls varies over eight days. The consistent pattern of our daily use of sinks, showers, bathtubs and toilets results in a peak flow around breakfast, after which it falls off until mid-afternoon and then rises to another peak just before we go to bed. The flow falls to a minimum while we sleep, and then the cycle repeats itself. Imagine if we could even out the flow so that it’s more or less constant throughout the day (indicated by the red line in the graph above). That would reduce the maximum flow from the diffusers, which would be like going back several decades in time to when the maximum flow was about 65 percent of what it is today. The plume would then be even less likely to reach the surface. How could the flow be evened out? The experts say that could be accomplished by using the CSO tanks to hold back some of the flow generated between breakfast and bedtime and then letting it go in the wee hours of the night. Those tanks would also make it possible to reduce or pause the flow of effluent from the diffusers during the period when the tidal current slows, stops and then reverses direction. This would further reduce the likelihood that the plume could break through to the surface. It might also lessen the amount of organic material that’s deposited in the footprint of the initial dilution zone during slack tide. There could even be bells and whistles: Large tanks full of sewage would contain a lot of thermal energy. The Southeast False Creek Neighbourhood Energy Utility in Vancouver, for example, uses thermal energy captured from sewage to provide space heating and hot water to nearby buildings. Perhaps the False Creek example points to where such tanks could be located: below the car-parking level in new Downtown condominium towers. In exchange for providing that service, or as an inducement to provide it, the City could allow a couple of extra floors on a limited number of new buildings. 
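The mechanics of that flow-equalization idea are simple enough to sketch. The hourly figures below are invented for illustration only (the real diurnal curve is the one in the graph referred to above); the point is just to show how a tank sized to hold the daytime surplus lets the discharge run at the daily average, which the experts quoted here put at roughly 65 percent of today’s peak.

```python
# Minimal sketch of flow equalization using a storage tank.
# Hourly flows are hypothetical, in thousands of cubic metres per hour.
hourly_flow = [
    1.5, 1.3, 1.2, 1.2, 1.4, 2.0,   # overnight minimum
    3.6, 4.4, 4.6, 4.2, 3.8, 3.5,   # morning peak around breakfast
    2.9, 2.7, 2.7, 2.8, 3.0, 3.3,   # afternoon lull
    3.9, 4.2, 4.3, 3.9, 3.2, 2.5,   # evening peak before bedtime
]

target = sum(hourly_flow) / len(hourly_flow)  # equalized (constant) discharge rate

storage = 0.0       # simplification: tank starts the day empty
peak_storage = 0.0
for flow in hourly_flow:
    storage += flow - target      # fill when above average, drain when below
    storage = max(storage, 0.0)   # tank can't hold less than nothing
    peak_storage = max(peak_storage, storage)

print(f"Peak hourly flow today:     {max(hourly_flow):.1f}")
print(f"Equalized discharge rate:   {target:.1f}")
print(f"Equalized as share of peak: {target / max(hourly_flow):.0%}")
print(f"Rough tank volume implied by this made-up curve: {peak_storage:.1f} (thousand m^3)")
```

With these illustrative numbers the equalized discharge works out to about 65 percent of the daily peak, consistent with the figure cited above; the actual tank volumes would have to be sized from the CRD’s measured flows.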
These tanks could also act as a catchment for particles of lead, zinc and copper coming from corroding plumbing pipes and fittings upstream, thereby keeping those materials out of the marine environment. It would even be possible to fit the tanks with standard water-oil separators that would further reduce the amount of oil and grease discharged to the Strait of Juan de Fuca, allowing the oil to be recycled. Why isn’t Victoria going to do this—or something along these lines? Perhaps Franc D’Ambrosio’s grand vision of 184 decorative concrete columns hints at the answer to that question. His project isn’t meant to be a response to a compelling physical issue. Same with the treatment project. It’s an opportunity for some people to make money and for others to make a name for themselves as architects, political fixers, activists—even as journalists. Mostly it fulfills a promise made ten years ago by former BC Premier Gordon Campbell to former Washington State Governor Christine Gregoire. Whatever the original objective might have been, the actual environmental impact of what will be chosen has since become of little consequence. In fact, knowing the impact has been carefully avoided. Here’s a telling example of how pervasive the avoidance of truth has been on this issue. Recall that at the beginning of this story I referred to a 2004 petition by Ecojustice to the federal government. The information Ecojustice used to press its campaign against Victoria’s treatment system turned out to be flawed, and in the years since, scientists have confirmed the CRD’s source control program has removed a big chunk of the contaminants. During that time, many local marine scientists publicly questioned the value of land-based treatment and expressed concerns about its associated environmental risks. By late 2012, enough doubts had been raised about the CRD’s direction that motions were put forward that would have provided time for seeking further input from the scientific community on the actual risks, if any, posed by Victoria's marine-based system. One might expect that an organization like Ecojustice would have been in full support of such an exercise. Who wouldn’t want to confirm that the best direction was being taken? Apparently, Ecojustice didn’t want to know. In a letter sent to CRD directors before they voted on the motions, Ecojustice lawyers implicitly threatened the CRD with legal action under the Species at Risk Act if it approved any further consideration. The motions failed. David Broadland is the publisher of Focus.
  10. July 2016 
Contamination of local politics by a false pretence and a toxic promise may require primary treatment at the ballot box. 
ENVIRONMENT MINISTER BARRY PENNER ordered the CRD to shift to land-based sewage treatment in 2006. His claim that Victoria’s outfalls were contaminating the seabed has since been proven untrue. As well, Washington State legislators have provided evidence that Penner’s action was prompted by an unpublicized agreement between then-Premier Gordon Campbell and then-Washington Governor Christine Gregoire. Was the legislated right of Victoria electors to control their own financial resources stripped from them under false pretences? Are there other reasons why the Province is justified in preventing Victorians from making a democratic decision through a referendum about what form of sewage treatment would be best for the community? OVER THE NEXT 30 YEARS, Victoria-area households will pay somewhere in the neighbourhood of $1.2 to $2.2 billion to fund borrowing by the Capital Regional District for land-based sewage treatment. The costs of operating those facilities over that period will add another $650-$900 million to the cost of treatment—a service that numerous local marine scientists and health officials have said will provide little or no measurable health or environmental benefit. Once initial annual costs have been settled, electors will be expected to keep paying for this service in perpetuity. The legal right of Victoria electors to choose by a referendum whether or not they are willing to incur the debt those billions in payments would finance was taken from them in 2006. That right is generally protected by provincial legislation, but in this case the need for consent was overturned by a never-before-used section of the Environmental Management Act. That protected right now appears to have been taken under false pretences. At the time, the Province claimed an area of the seabed around each of the city’s two marine outfalls was so contaminated that they could each be designated a “contaminated site” under BC’s Contaminated Sites Regulation. It was widely accepted in the community at the time that the pollution had to be stopped and recalcitrant taxpayers could not be allowed to stand in the way of environmental protection. Then-BC Environment Minister Barry Penner justified this action on the basis of what came to be known as the MacDonald Report. That report has since been exposed as fundamentally flawed and its main conclusion just plain wrong. Commissioned by the Province, environmental scientist Donald MacDonald had analysed four years of data gathered by the CRD about what was in the sediment on the seafloor in the area adjacent to each outfall. Although MacDonald admitted he had “insufficient data” to “thoroughly evaluate sediment quality conditions,” he felt he could do “a preliminary investigation.” Based on this preliminary evaluation, MacDonald reported that sediments at the outfalls “are sufficiently contaminated to warrant designation…as a contaminated site.” His report didn’t include an analysis of the source, or sources, of the contamination suggested by the CRD’s data. The outfalls were assumed to be the source. MacDonald included in his report a flow chart that showed the five steps in the process of determining whether such a site was “legally contaminated.” He noted that the second step had not been completed. 
To determine whether a site is “legally contaminated” would have required completion of the second step followed by three additional, onerous steps. Penner didn’t bother to complete even the second step. MacDonald’s report was dated May 2006, but by that July Penner had ordered the CRD to create a plan for treatment. His order was made under Section 24(3) of the Environmental Management Act. Its use implied that a significant environmental harm was occurring and suspension of the basic principle of elector assent was therefore justified. This allowed Penner to run around the step-by-step requirements of the Contaminated Sites Regulation, and it allowed him to order treatment without having to specify what, precisely, sewage treatment needed to stop. Penner could have used the Abatement of Municipal Pollution section of the Environmental Management Act to order the CRD to address potential contamination, but that section would have limited such work to that “reasonably necessary to control, abate or stop the pollution,” or to remediation. Under that section, the legal requirement for electoral approval would also have been suspended, but the changes that the CRD would be required to make would have been limited to what was “reasonably necessary” to meet provincial regulations. Penner’s ministry would have been obligated to detail precisely what was “reasonably necessary.” He didn’t do that. Instead, he used Section 24 and opened up Pandora’s box. In his order to the CRD Penner stated: “To ensure value for taxpayers, I encourage the CRD to consider new technologies and alternative financing and delivery options, including the potential for private sector development.” Given that vague direction, it was perhaps inevitable that, 10 years later, the cost of the CRD’s considerations would have mounted to $70 million and the community would be divided into three camps over what action needed to be taken. But during that time, two facts have emerged that challenge the right of the Province to enable the CRD to proceed any further without seeking elector approval. First, over the past ten years the CRD has continued to monitor the sediment chemistry at the outfalls. Report after report has shown that, aside from occasional exceedances of permitted levels of a few substances, neither outfall would have qualified as a “contaminated site” under the Provincial regulation. Specifically, in 2011, environmental scientists with Golder and Associates completed an extensive study that looked at the trend in contamination at the outfalls between 1991 and 2009. They concluded the data “does not provide strong evidence that toxicity or other biological responses are expected.” In 2012, a scientific study authored by Mark Yunker, Avrael Perreault and Chris Lowe presented information that has explained the presence of unexpectedly high levels of polycyclic aromatic hydrocarbons (PAHs) in sediments to the east of the Macaulay outfall. In a wonderful piece of scientific detective work, their analysis eliminated both Penner’s theory—contamination by PAHs from wastewater—and a subsequent theory that the contamination was the result of the sinking of the collier San Pedro off Brotchie Ledge in 1891. By analysing the chemical signature of the predominant PAHs in the contaminated sediments, the scientists were able to determine a more likely source: “dredged sediment containing pyrolised coal waste from a former coal gas plant in Victoria Harbour” that had been dumped there long before the outfall was even built. 
At Clover Point, it turns out, there is so little sediment on the rocky bottom to test that reliable samples are difficult for scientists to even obtain. Nevertheless, the data from the last sediment survey conducted there in 2012 showed only a single reading in one location for only one substance—copper—that was above the Province’s guidelines. CRD scientist Chris Lowe told Focus that the as-yet unpublished data for the 2015 sediment survey showed the latest reading for copper at that location was a little more than one-half of the 2012 reading. In other words, although there is seabed contamination near the outfalls, the contribution from the outfalls to that contamination is limited and there’s no evidence of worsening environmental conditions. This is what local marine scientists have been saying for several years. The second piece of evidence that has emerged that challenges the Province’s removal of the requirement for elector consent originated in Olympia, Washington. A letter written to Victoria Mayor Lisa Helps by Washington State Representative Jeff Morris and signed by 37 other Washington legislators confirmed that Penner’s order to the CRD was, in fact, motivated by an unpublicized agreement made between then-BC Premier Gordon Campbell and then-Washington Governor Christine Gregoire in June 2006. Campbell and Gregoire and their respective cabinets had met at that time as part of a process “to enhance trade opportunities and create stronger ties between the two jurisdictions.” According to the legislators, during discussions relating to Vancouver’s hosting of the 2010 Olympics, Gregoire told Campbell her government was unhappy about promises made about sewage treatment in Victoria that had not been kept. As a result of that, the legislators claim, Penner ordered the CRD “to make good on those promises.” According to Morris, then, Penner’s order to Victoria was part of a trade deal. The contamination claimed by the MacDonald Report provided Penner with a plausible rationale for ordering Victoria to shift to land-based treatment. Invoking Section 24 ensured that Victoria electors would not be able to stand in the way of Campbell’s promise to Gregoire. David Broadland is the publisher of Focus.
  11. May 2016 
Puget Sound is a mess of sewage and toxic chemical discharges. Should Victoria taxpayers have to pay for Seattle’s sins? 
WASHINGTON STATE’S OPPORTUNISTIC WAR OF WORDS against Victoria’s science-endorsed form of sewage treatment reopened on a new front in February. With the cost of placating Washington’s claims of environmental damage to international waters now hovering near $1 billion, Victoria could have lobbed some scientific evidence across the border. As usual, however, Victoria taxpayers were deserted by their own elected representatives, who backed down without uttering a contrary word. Yet the timing and substance of Representative Jeff Morris’ stun-grenade attack were so suspect that anyone with a pen could have poked them full of holes. On February 23, the Seattle Times reported that researchers had found 92 chemicals of concern, some associated with drugs—from caffeine to cocaine—in the tissue of juvenile Chinook salmon netted in Puget Sound estuaries into which sewage treatment plants discharge effluent. The researchers also found the chemicals in the effluent from these plants. The Times story noted that scientist James Meador’s earlier research had shown that juvenile Chinook salmon swimming through contaminated estuaries in Puget Sound die at nearly twice the rate of fish elsewhere. Other scientific research has linked nutritional stresses experienced by endangered southern resident orca to the decline in abundance of chinook salmon. Then, just two days after the embarrassing drugged-Chinook story appeared, Morris announced a legislative proposal that would ban Washington State employees from claiming travel expenses for trips made to Victoria until Victoria builds a sewage treatment plant. Meador’s research was pushed off Seattle front pages and replaced with a story that linked the Sound’s sewage problems to Victoria. A week after that, Morris sent a letter to Victoria Mayor Lisa Helps claiming that “chemical loading” from Victoria’s marine-based sewage treatment system poses a “long-term risk” to “our shared waters.” Morris’ letter was signed by 36 other Washington legislators whose districts also border on Puget Sound. Was the legislators’ initiative greeted with guffaws in Victoria? Not at all. A press release from BC Environment Minister Mary Polak stated, “Washington state residents can rest assured that Greater Victoria will have sewage treatment in the near future.” Meanwhile, Helps wrote a letter to a Victoria newspaper stating: “I want the public and our colleagues in Washington to know we take their actions seriously.” Morris’ letter to Helps outlined the 23-year history of BC politicians being “forced”—that’s the word Morris used—to accept Washington’s position that Victoria’s treatment system is somehow affecting Puget Sound’s environmental health. Morris’ letter provided point-by-point proof that BC’s acceptance of Washington’s claim had been obtained either by threat of economic boycott or the offer of a deal too good to refuse. That deal-making included, according to Morris, BC Premier Gordon Campbell agreeing in 2006 to command Victoria to build land-based treatment in exchange for Washington Governor Christine Gregoire’s support for Vancouver’s 2010 Olympic bid. Morris neglected to include in his timeline the 1994 findings of a joint panel of eminent BC and Washington marine scientists. 
Their report, The Shared Marine Waters of British Columbia and Washington, noted that “waters off Victoria theoretically could contain about 20 times as much dissolved sewage effluent from Vancouver and Seattle as from Victoria itself.” The scientists also noted that, in Puget Sound, Victoria’s contribution to the concentration of sewage effluent would be slightly more than one percent of Seattle’s. Morris’ demand that Victoria get on with construction of a sewage treatment plant was apparently precipitated by a fuzzy February 15 Times Colonist story headlined: “Heavy rain prompts health advisories at capital’s outfalls.” The story seemed to report that Victoria’s outfalls were unscreened and, following a period of heavy rain, were discharging “floatables, including plastics” that were washing up on local beaches. Based on his understanding of that story, Morris predicted widespread damage to the economy and environment: “We recognize the shared risk in short-term loss of tourism activity on both sides of the border from publicity surrounding this issue. However, we believe the long-term damage to marine mammals, in particular, but all marine wildlife does more long-term damage to ecotourism.” Morris finished his letter by asserting: “We can no longer tolerate the long-term risk that the chemical loading caused by Victoria CRD’s inaction has brought to our shared waters.” I wrote to Morris inviting him to answer questions about his letter and the issue, especially his claim about “chemical loading” of “shared waters.” He wrote back but refused to respond to questions about chemical loading because I had used the words “marine-based treatment.” That’s the expression 10 prominent BC marine scientists have used to describe Victoria’s current treatment system. Morris wrote: “Using the term ‘marine based treatment’ to describe dumping raw sewage into our shared waters is demonstrative of a story bent that I do not want to participate in.” Even as Morris refused to participate, though, he dug himself a deeper hole: “The straight-face test was the [Times Colonist] article pointing out that several outfalls were not even screened. My impression is that you don’t have a separate surface water collection system from your dumping of raw sewage in the Straight [sic] of Juan de Fuca. In the USA it is a requirement that surface water be a separate collection system from the primary and secondary sewage treatment systems.” Morris’ impression is wrong and so is his understanding of US requirements. Victoria’s separation of storm drains and sewers is just like that of Seattle and 10 other Puget Sound communities, including Anacortes, Bellingham and Mt Vernon, all of which are wholly or partly in Morris’ 40th Legislative District. Each of these cities has storm drains and sewers that have some interconnections, usually by design but sometimes through deterioration. During periods of heavy rain, sewage can flow from sewers into storm drains and this results in what sanitary engineers call “combined sewer overflows” (CSOs). CSOs act as a relief valve to prevent sewage from backing up into homes or overflowing land during peak rain events. This condition exists throughout North America, but construction of a treatment plant in Victoria would have no direct impact on CSOs here. That’s because the flow of sewage into storm drains occurs upstream of sewage treatment plants. 
Not only is Morris’ impression about Victoria wrong, he doesn’t seem to be aware of the current policy about CSOs in Washington State or the extent to which they are an issue there. In 2010, after years of trying to get King County to address discharges of raw sewage from over a hundred CSOs in central Seattle, the US Environmental Protection Agency and the State of Washington referred the case to the federal Department of Justice. In 2013 the US District Court for the Western District of Washington entered a consent decree to address King County’s failure to implement a long term control plan to reduce CSOs to meet the state standard of no more than one overflow per outfall per year. At the time, Seattle and King County agreed to spend $860 million for upgrades that will address most of their CSO problem—about 5.6 billion litres of raw sewage dumped into Seattle-area waterways each year. In 2015, a final agreement was reached that will see CSOs in King County largely eliminated by 2030—14 years from now. According to Washington State’s Department of Ecology—the equivalent of BC’s Ministry of Environment—there are a total of 126 stormwater outfalls in King County and the City of Seattle that discharge raw sewage each year. Most of those discharge into Lake Washington, the Lower Duwamish Waterway, and Puget Sound’s Elliot Bay, all in the most highly urbanized areas of Seattle. The Department of Ecology’s records show there are 168 known CSOs discharging into the Puget Sound area, including 7 in communities in Morris’ own 40th District. For Morris to honestly claim that “In the USA it is a requirement that surface water be a separate collection system from the primary and secondary sewage treatment systems,” he would have to be ignorant of his state’s official policy on CSOs, know nothing about the seven CSOs in his own district and have missed the spectacular EPA lawsuit over the perennial mess in Seattle. Morris’ claim doesn’t pass his own “straight-face test.” Victoria’s decades-old treatment issue has long been an irresistible punching-bag for Washington politicians. When bad news about orca or harbour seals or salmon comes over their horizon, Victoria is within easy smacking distance. In 2014, current Washington Governor Jay Inslee and King County Executive Dow Constantine wrote BC Premier Christy Clark after Clark’s cabinet refused to override Esquimalt’s decision to limit the size of a treatment plant that could be built at McLoughlin Point. Inslee and Constantine told Clark, “We are very concerned by the lack of progress in treating wastewater and protecting the health and habitat of Puget Sound.” After listing the steps Washington was taking to “improve the health of our waters and restore habitat,” the two contrasted their efforts with Victoria’s: “However, the continued lack of wastewater treatment in Victoria—at the entrance of Puget Sound—means Greater Victoria is not doing its fair share. This is of significant health concern for the health of the rest of the region’s waterways.” Like Morris’ claims, the position of Inslee and Constantine is sorely challenged by the evidence. First, Puget Sound’s many sewage treatment plants may be providing secondary treatment, but they’re still dumping large volumes of partially-treated sewage—and the chemicals used to kill pathogens—into constricted Puget Sound. 
Those plants consistently fail to comply with the requirements of the US Clean Water Act and are of a type that is inadequate to address the Sound’s growing risk of eutrophication and hypoxia. Secondly, the American claim that Victoria’s discharge is adversely influencing the health of Puget Sound—and so American interference in internal Canadian politics is justified—is at odds with the state’s own science on where the vast majority of toxic anthropogenic chemicals going into Puget Sound comes from: stormwater runoff. Washington’s most recently published (2014) “Marine Waters Condition Index” showed a downturn in the environmental health of many Puget Sound basins. The most dramatic decline was in Bellingham Bay, part of which is in Morris’ 40th District. Morris, Inslee and past Washington politicians have pointed their fingers at Victoria’s toilets as the Sound’s health has declined, but the record shows they have been largely ineffective at addressing the fundamental issues driving the deterioration: too many people living on the shores of Puget Sound and inadequate regulations and infrastructure to support that population. 
Washington’s dismal sewage treatment 
According to the Department of Ecology there are “about 100” publicly-owned secondary treatment plants discharging sewage effluent into Puget Sound. The two largest plants, which serve approximately 1.5 million Seattleites, discharge nearly half of the 1.2 billion litres of sewage effluent that flows into Puget Sound every day. For perspective, that 1.2 billion litres is about 13 times Victoria’s 91 million litres per day. According to the EPA’s enforcement and compliance history database, the West Point and South plants didn’t have a single 3-month period during the last 3 years in which they fully complied with the requirements of the Clean Water Act. The largest, at West Point, was in “significant violation” of the Clean Water Act 75 percent of the time during the last three years. Even Brightwater, the new tertiary-level treatment plant serving northeast Seattle, has a solid record of either non-compliance or significant violation of the Clean Water Act over the past three years. The Anacortes plant, in Morris’ 40th District, hasn’t had a single quarter over the past three years in which it fully complied with the Clean Water Act. Likewise with Bellingham’s Post Point plant, also in Morris’ district. Even having a relatively good record with the EPA does not guarantee that a Puget Sound treatment plant will not have a serious negative impact on wildlife habitat. The Central Tacoma plant, the third largest on the Sound, achieved “no-violation” 60 percent of the time during the past three years—the best record of the Sound’s largest plants. Yet it was one of the plants where scientist James Meador found higher than expected levels of chemicals of concern in both the plant’s effluent and in the tissue of juvenile Chinook found immediately downstream from the plant. In previous research Meador had found that juvenile Chinook passing through such contaminated estuaries had an overall survival rate 45 percent lower than that for Chinook moving through uncontaminated estuaries. For this story Focus reviewed the operating permits of 77 Puget Sound treatment plants to determine their cumulative permitted discharge of suspended solids, a number that the Department of Ecology admitted it had never calculated. 
An Ecology spokesperson said “it would take many hours by several staff to pull all of the data for each wastewater treatment plant.” Our examination revealed that the 77 plants are permitted to discharge over 32.4 million kilograms of suspended solids each year—more than five times Victoria’s current (2015) annual discharge of 6.35 million kilograms. Most of that 32.4 million kilograms is for Seattle plants and those at the south end of the Sound. If that large mass of suspended solids isn’t expected to create a problem for the constricted waters of Puget Sound, why would one-fifth of that be a problem in the rapidly changing, highly-oxygenated waters off Victoria? Indeed, Canadian scientists have determined that the biological oxygen demand of all sewage solids being discharged into the Salish Sea is inconsequential compared with natural sources of oxygen demand. But those treatment-related solids do create a pathway for PBDEs to get into marine waters. PBDEs—polybrominated diphenyl ethers—are a family of chemicals commonly used as flame retardants in many objects in our homes and workplaces. PBDE molecules attach themselves to particles and any suspended solids passing through a sewage treatment plant allow PBDEs to enter an aquatic environment. Canadian scientist Peter Ross’ 2005 study “Fireproof killer whales” revealed the impacts of PBDEs and PCBs on the health and reproductive capacity of orca and other marine mammals in the Salish Sea. Washington’s Department of Ecology notes that PBDEs “can affect the development, reproduction, and survival of many species. They build up in the food chain and are found in people as well as other organisms including fish and orcas in Puget Sound.” Since Puget Sound secondary treatment plants are permitted to discharge five times as much suspended solids—and consequently five times as much PBDEs—they represent five times the risk of Victoria’s outfalls. But solids aren’t the only part of human sewage that’s impacting the health of the Sound. One of the most troubling sewage-related problems there is the trend towards diminished dissolved oxygen. If levels fall past a certain point, fish can’t breathe. Several of south Puget Sound’s basins already experience at least seasonal oxygen impairment as a result of the high level of dissolved inorganic nitrogen introduced by sewage treatment plants. That nitrogen provides a ready source of nutrients for phytoplankton and could lead to eutrophication, harmful algal blooms and hypoxia. According to data from the Department of Ecology, secondary treatment does little to lower the amount of dissolved inorganic nitrogen discharged from Puget Sound plants. The West Point and South plants that serve central Seattle have concentrations of nitrogen in their discharges just as high as the Clover Point and Macaulay Point outfalls in Victoria. But there are two big differences in the situations of the two cities that put the waters south of Seattle at greater risk. First, Puget Sound treatment plants are discharging over twelve times as much nitrogen as Victoria does. Secondly, Victoria’s outfalls discharge to rapidly replenished, highly-oxygenated waters in the Strait of Juan de Fuca, while Seattle’s plants discharge to highly-constricted waters that take two to three months to replenish and benefit little from the energetic tidal mixing that occurs in the Strait of Juan de Fuca. 
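To put those scale comparisons in one place, here is a minimal sketch using only the figures quoted above. Note that, as in the text, it compares Puget Sound’s permitted solids discharge against Victoria’s actual 2015 discharge, which is the most conservative comparison available from the published numbers.

```python
# Two ratios computed from figures quoted in the article.

puget_effluent_L_per_day = 1.2e9      # roughly 100 Puget Sound plants combined
victoria_effluent_L_per_day = 91e6    # Clover Point and Macaulay Point outfalls

puget_permitted_solids_kg_per_year = 32.4e6   # Focus review of 77 operating permits
victoria_solids_kg_per_year = 6.35e6          # CRD's actual 2015 discharge

print(f"Effluent volume ratio (Puget Sound : Victoria): "
      f"{puget_effluent_L_per_day / victoria_effluent_L_per_day:.0f} to 1")
print(f"Suspended solids ratio (permitted : actual):    "
      f"{puget_permitted_solids_kg_per_year / victoria_solids_kg_per_year:.1f} to 1")
```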
Unless Puget Sound communities spend heavily to add a higher level of treatment—such as biological nutrient removal—to existing plants, the already problematic level of anthropogenic nitrogen in the Sound will increase as fast as the population increases. That increase could be dramatic. In the last 10 years alone the population of Puget Sound communities grew by 420,000 to nearly 4 million. That increase is considerably more than the Capital Regional District’s entire population of 370,000. At 2015’s rate of growth, a new population as large as the CRD’s would be added to the shores of Puget Sound in just 6 years. Some Washington State projections put the population of the Puget Sound region at 8 to 9 million by 2070. By comparison, the CRD’s residential population in the area currently served by the Macaulay Point and Clover Point plants is expected to grow to a total of 457,000 by 2070, according to CRD estimates. While the rapid growth around Puget Sound is enriching that region’s economy, the growing impact of those people is threatening the long-term prospects of the southern resident orca. That population was protected by the US Endangered Species Act in 2005, which designated all of the waters of Puget Sound as critical habitat. With the orca’s nutritional health dependent on the abundance of Chinook salmon, and the abundance of Chinook salmon being challenged by the sewage treatment practices of Puget Sound communities, it’s hard to keep a straight face when Governor Inslee and Representative Morris wag their fingers at Victoria’s sewage treatment. Even more of a challenge to keeping normal Canadian politeness in play is Morris’ assertion that “chemical loading” caused by Victoria’s sewage treatment practices presents a serious “long-term risk” to the health of Puget Sound. Poisoned Waters The history and sources of chemical contamination in Puget Sound were laid bare in PBS’s 2009 Frontline documentary Poisoned Waters. That film took a hard look at Washington’s failure to clean up Puget Sound 44 years after enactment of the Clean Water Act and 36 years after creation of Superfund designations for the cleanup of sites where major chemical contamination has taken place. Poisoned Waters included only one Superfund site on the Lower Duwamish Waterway, but there are 22 Superfund sites around Puget Sound and some of the largest remain unremediated. The documentary also provided chilling insights about the source of the greatest ongoing chemical contamination of the Sound. Chilling because what Frontline found in Seattle also applies to Victoria. Journalist Hedrick Smith summed it up: “What’s making this water so sick is what scientists have now labelled the number one menace to our waterways—stormwater runoff.” At one point in the film Smith interviewed a diver incensed about a stormwater outfall across Elliot Bay from downtown Seattle. The diver told Smith: “The end of the pipe creates a brown noxious soup of nastiness that is unbelievable…” Smith responded, “Unbelievable because the water looks so good from up here. So we’re looking at something we think is clean and underneath, you can see, diving there…” The diver interrupted Smith and said, “It’s not clean. When we see that [outfall] running in full flow, we turn around and we swim the other way. Quickly. There is just this unbelievable gunk coming out the end of this pipe. This is our front yard. 
Would you allow your front yard to be sick?” Christine Gregoire, who was then Washington’s governor, quantified the “unbelievable gunk” for Smith: “We put in about 150,000 pounds a day of untreated toxics into Puget Sound. We thought all the way along that it was like a toilet, to be honest with you. What you put in you flush out, and it goes out to the ocean and gets diluted. We know that’s not true; it’s like a bathtub. So what you put in stays there.” Much of the “unbelievable gunk”—about 98 percent by weight—is oil, grease and petroleum, most of which is connected to the use of cars and trucks. In the film, Jay Manning, director of the Department of Ecology, tells Smith that as much oil as was spilled by the Exxon Valdez goes into Puget Sound via stormwater every two years. Gregoire’s estimate for the total weight of “untreated toxics” being flushed each year into Puget Sound from runoff works out to 23,277 metric tonnes. That’s a lot. Was she right? “Poisoned Waters” aired roughly in the middle of the five-year-long Puget Sound Toxics Loading Analysis (PSTLA) led by Washington’s Department of Ecology and the EPA. By the final report in 2011, the total estimated weight of “toxic chemical loading” of Puget Sound each year—from runoff, atmospheric deposition, sewage effluent and groundwater—was estimated to be in the range of 9,024 to 11,823 metric tonnes. That’s a lot less than Gregoire had told Hedrick Smith. What happened to the other half? For one thing, an accounting of industrial discharges—included in the first two phases of the process that began in 2007—had magically disappeared from the final analysis in 2011. With five petrochemical refineries, three pulp and paper mills, a metal smelter and hundreds of other industrial operations discharging either directly or indirectly to Puget Sound, this was an obvious shifting of responsibility away from industrial polluters. That raises questions about the integrity of the process. Were the Americans serious about understanding what’s happening to Puget Sound? Or was political influence exerted on the final numbers to protect the economic position of such operations as refineries? How extensively the figures were manipulated is unknown, but there are indicators that massaging went beyond removing numbers that could be used to focus blame on specific industrial sectors. The acknowledged loading for PBDEs and PCBs seems low considering the assessment’s findings for the total annual release of these two chemicals into the Puget Sound Basin (the “Basin” includes all watersheds draining into Puget Sound). The Department of Ecology estimates 2200 kilograms of PCBs are released into the Basin each year but decided that only 3 to 20 kilograms actually find their way into the Sound. That accounts for only a tiny percentage—0.1 to 1 percent—of the acknowledged release. What is the fate of the remaining 99 to 99.9 percent? Likewise, Ecology estimates 700 kilograms of PBDEs are released into the Basin each year but only 28 to 54 kilograms go into Puget Sound. Comparison with the release of PBDEs from Vancouver’s Annacis Island secondary treatment plant suggests Ecology’s estimate is less than half what it should be. In spite of those concerns, the PSTLA numbers do allow us to compare toxic chemical loading from Puget Sound communities with toxic chemical loading from Victoria’s outfalls. 
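Those concerns come straight from the arithmetic of Ecology’s own figures. The short Python sketch below restates it; the variable names are ours, and the inputs are the release and loading estimates quoted above.

# Delivery fractions implied by Ecology's published estimates, as quoted above
# (illustrative only; variable names are ours).
pcb_release_to_basin_kg = 2200
for kg in (3, 20):                 # Ecology's estimated range reaching the Sound
    print(round(kg / pcb_release_to_basin_kg * 100, 1), "percent")   # 0.1 and 0.9 percent

pbde_release_to_basin_kg = 700
for kg in (28, 54):                # Ecology's estimated range reaching the Sound
    print(round(kg / pbde_release_to_basin_kg * 100, 1), "percent")  # 4.0 and 7.7 percent

With that caveat about the acknowledged loadings noted, the comparison itself is still instructive.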
The relative amounts of every chemical of concern undermine the claim by Morris and other Washington politicians that Victoria’s treatment system is posing a significant risk to shared marine waters. Even if Victoria’s outfalls emptied their contents directly into the centre of Puget Sound, their contribution to chemical loading in the Sound would be relatively small (see table below). The PSTLA study also considered “ocean exchange” of toxic chemicals and estimated the net outflow from or inflow into Puget Sound of those chemicals. For copper, zinc and PBDEs, Puget Sound was a net exporter. On the other hand, the amounts of arsenic and lead coming into Puget Sound from ocean waters overwhelm the amounts deposited through all other pathways. According to the analyses, there is a small—approximately 1 kilogram per year—net inflow of PCBs into Puget Sound from ocean exchange, 0.3 kilograms of which is brought to Puget Sound by returning salmon. Coincidentally, Victoria’s outfalls produced a total of about 0.3 kilograms of PCBs in 2014. (By the way, the levels of PBDEs and PCBs reported by the CRD in its Marine Environment Program 2014 Annual Report were both down from levels reported in previous years.) With Washington planning for a large increase in Puget Sound’s human population over the next several decades, the main threat to the Sound’s environmental health—the amount of toxic chemicals entering it through stormwater runoff—seems likely to accelerate. If the state’s own numbers show Victoria plays no significant role in the Sound’s decline, why do Washington politicians continue to use the Sound’s deteriorating health as an excuse to involve themselves in Canadian politics? Is it to deflect attention away from their own apparently intractable problems?

The Governor’s concern

I asked Department of Ecology spokeswoman Sandy Howard what Governor Inslee’s exact concern was about Victoria’s impact on “the health and habitat of Puget Sound.” In marked contrast to Morris’ concern about “chemical loading,” Howard responded, “The letter from Governor Inslee cites a general concern for the health and habitat of our shared waters. Our studies indicate that water masses are highly connected and cross our shared border. Our position is that we all need to do our part, and we should not be sending the wrong message regarding environmental stewardship, especially in light of population growth.” The first part of that concern is that stuff from Victoria is crossing into Puget Sound waters. But the Department of Ecology’s own study of toxics shows the Sound is a net exporter of some chemicals of concern and a net importer of others, and those balances have little to do with Victoria’s outfalls. They are overwhelmingly determined by chemicals of concern—arsenic, lead and cadmium for example—already in the ocean or originating in Washington. As well, that 1994 report by a joint panel of three BC and three Washington marine scientists, mentioned earlier, had agreed that the discharge from Victoria’s outfalls would have far less impact on waters off Seattle than Seattle’s outfalls would have on waters off Victoria. I asked Howard if Washington now disagreed with that finding or had done any new analysis of the issue. “We have not revisited the relative impacts from Victoria, as was reported in this 1994 effort,” Howard replied.
“We are in the process of refining a more detailed computer model to address questions that focus on US impacts on our shared waters.” Then Howard repeated her previous point about environmental stewardship and population growth. “Our position is that we all need to do our part, and we should not be sending the wrong message regarding environmental stewardship, especially in light of population growth.” Absolute population growth around Puget Sound, as mentioned earlier, has been, and is expected to continue to be, far greater than Victoria’s. Washington may be projecting its fears about what is happening there onto Victoria, but the statistics don’t support that. Howard’s emphasis on “environmental stewardship,” though, will resonate with some Victorians who think that it’s a “no-brainer” that Victoria’s current system would be causing environmental harm compared with secondary sewage treatment. But that’s not the view of marine scientists in Victoria, who have endorsed the existing system—with the caveat that further studies on chemicals of emerging concern should be conducted. I told Howard about the Victoria scientists’ endorsement and asked her if the Department of Ecology agreed with the principle of making decisions about environmental stewardship using science-based information and knowledge. She responded, “Our treatment standards are based on science and require that all dischargers apply a basic level of treatment. That basic level of treatment has been defined as secondary treatment.” A comparison of how science is used in the operation of Victoria’s treatment system with how it is used in Washington is revealing. Victoria’s system appears to have two advanced features of environmental stewardship that are missing in Washington. Environmental stewardship—as practiced by governments—requires frequent measurement of chemicals of concern being discharged into the environment and transparent public reporting of the results of that monitoring. With sewage treatment, Victoria is doing this and Washington isn’t. One of the chemical groups of greatest concern in Puget Sound and the Strait of Juan de Fuca is PBDEs. I had noticed that the Department of Ecology had only published estimates for the total amount of PBDEs released to the Sound through wastewater treatment plants. Yet the CRD precisely measures, monitors and publishes the amount of PBDEs—and all other chemicals of concern—found in its discharges. For example, the CRD reports right down to a hundredth of a gram the total weight of each of the 40 different PBDEs it discharges from each of its two outfalls, every year. Accurately making those measurements, monitoring the results, and releasing that information to the public represents a high level of environmental stewardship. I asked Howard about Washington’s estimates for the release of PBDEs. Her answer confirmed the difference: “Washington treatment plants are not required to monitor and report PBDEs on an annual basis,” she said. An examination of the operating permits for the Sound’s wastewater treatment plants shows a low level of monitoring, in both frequency and the number of chemicals measured, compared with that done for Victoria’s two outfalls. The monitoring requirements for Seattle’s West Point plant—the largest secondary treatment plant on the Sound—don’t include PCBs, for example. There’s another vital difference between Victoria’s treatment system and those of Puget Sound: source control.
Source control refers to the practice of keeping chemicals of concern out of the sewers in the first place, through a program of regulation, registration, installation of collection equipment to isolate and store chemicals of concern, proper disposal, inspection and monitoring. In Puget Sound, institutions, businesses and industries that discharge toxic chemicals are required to self-report those releases to the EPA’s Toxic Releases Inventory only under certain conditions. For example, a business must have 10 or more full-time employees to be required to report. Many operations in Puget Sound come in under that threshold, are not required to register, and discharge toxic chemicals directly into sanitary sewers. Since treatment plants aren’t required to measure or report their release of many toxic chemicals, such as PBDEs, Washington has no hard evidence of the plants’ cumulative contribution to the chemical loading of Puget Sound. In Victoria, the CRD instituted a region-wide source control program in 1994 and since then has become a nationally-recognized leader in that practice. The CRD reports that 97 percent of region businesses whose activities fall within the program’s regulations have proper waste treatment systems installed that keep chemicals of concern out of Victoria’s sanitary sewers. Seattle does have a source-control program for the Lower Duwamish Waterway—a highly contaminated federal Superfund site—but otherwise has no city-wide source control program. While building a hundred sewage treatment plants on Puget Sound has allowed a reduction of suspended solids and reduced biological oxygen demand, the plants’ effectiveness at removing chemicals of concern is largely unmeasured and unknown. Given Washington’s failure to monitor chemicals of concern and employ source control, the argument that Governor Inslee is entitled to pull Victoria’s chain to avoid sending “the wrong message regarding environmental stewardship” doesn’t seem credible. But even a cursory examination of Washington’s internal politics shows there’s plenty of circumstantial evidence that Washington legislators simply use Victoria to cover their own asses during times of stress. The phony toilet war: politically-motivated scapegoating Following the airing of Poisoned Waters in 2009, Washington state legislators moved to increase the state’s hazardous substances tax to fund measures that would reduce toxic chemical loading from stormwater runoff. But the bill, the Washington Clean Water Act of 2010, was withdrawn in April 2010. Ironically, Jeff Morris was seen by some in Washington as influential in the bill’s demise. John Burbank, executive director of the Economic Opportunity Institute, a non-profit public policy research organization in Washington, linked the measure’s withdrawal to personal lobbying—wining and dining—of Morris on six different occasions by BP lobbyist William Kidd. According to Burbank the proposed legislation would have added at least $200,000 a day to BP’s cost of doing business in Washington. Not only were Morris and his fellow Puget Sound legislators unwilling or unable to deal with the stormwater issue, it’s easy to find specific cases where they continue to tolerate direct contamination of Puget Sound by their constituents. Consider Morris’ record, for example. A 2014 report by the Environment America Research & Policy Center described Puget Sound as having the third highest level of “toxicity-weighted” materials released into large watersheds in the USA on an ongoing basis. 
The study used data from the EPA’s Toxic Release Inventory. That report highlights a case of ongoing release of known carcinogenic substances, which just happens to be in Morris’ 40th District. A wood preservative company in Bellingham, registered with the EPA’s Toxic Release Inventory, reported in its last three filings an annual discharge of 0.5 kilograms of pentachlorophenol (PCP) and dioxins directly into Whatcom Creek, which flows into Bellingham Bay. By comparison, Victoria’s source-controlled outfalls discharged zero PCP in 2014, according to the CRD’s detailed report on chemicals released. Other corporate constituents of Morris’ district have records that raise questions about the legislator’s actual level of concern over chemical contamination. Tesoro, one of several fossil-fuel-related donors to Morris’ last election campaign, operates a refinery in Anacortes that, according to the EPA’s enforcement and compliance history database, has been in “significant violation” of the Clean Air Act every quarter for the last three years. It fully complied with the Clean Water Act only 17 percent of that time. Tesoro representatives openly testified against the Washington Clean Water Act of 2010 before it was withdrawn. Now Tesoro is planning a significant increase in the output of its Anacortes operation. Yet there’s been complete silence from Morris and other legislators about “chemical loading” from the refinery. Recall that about 98 percent of the toxic chemical loading of Puget Sound comes from petroleum, oil and grease, according to the Department of Ecology. Given the failure by Morris and his fellow legislators to protect the Sound from such impacts, their claim that “We can no longer tolerate the long-term risk that the chemical loading caused by Victoria CRD’s inaction has brought to our shared waters,” seems more like a line from Wonderland than Washington. Their threatened boycott of Victoria included other claims worthy of the rabbit hole. In his letter to Mayor Helps, Morris said “…we believe the long-term damage to marine mammals, in particular, but all marine wildlife does more long-term damage to ecotourism.” With about one whale-watching business based in Puget Sound for every three surviving orca—most operating out of Morris’ 40th District—the pressure these operations put on the whales they pursue was found by DFO scientist Christine Erbe in 2001 to be damaging the orcas’ prospects for survival. Yet for all its absurdity, Morris’ campaign has been effective. Just after BC Environment Minister Barry Penner’s approval in August 2010 of the CRD’s plan for a secondary treatment plant at McLoughlin Point, Penner told the Journal of the San Juan Islands: “I know there’s been a concern in Washington state about the lack of sewage treatment in the Victoria area. I certainly hear about it from time to time, particularly from Representative Jeff Morris, who has not been shy about letting us know that his constituents are concerned about that.” Morris hasn’t been the only Washingtonian deflecting attention away from the state’s dismal performance on reducing chemical contamination of Puget Sound and international waters. In a 2014 column, Vancouver Sun columnist Vaughn Palmer wrote: “While delivering a speech in Bellingham last fall, I fielded a question that comes up pretty much every time I address an audience south of the border.
‘When are you folks in Victoria going to start treating your sewage?’ The shame of my hometown—dumping millions of litres of untreated sewage into the Strait of Juan de Fuca every day. Or, as columnist Joel Connelly wrote in the Seattle Post-Intelligencer 25 years ago, ‘the BC capital believes in using an international waterway as its toilet.’” Palmer’s “shame” button has been pushed repeatedly by Connelly over all those years. In a March 30, 2016 column on Seattle PI, Connelly repeated, for the umpteenth time: “Victoria is still using an international waterway as its toilet.” Over the years Connelly had plenty of opportunities to write about the international toilet in his own front yard, but he never did. When contacted by Focus, Connelly said, “I keep returning to Victoria sewage because promises were made, promises have not been kept, and our political leadership is perplexed. Site your treatment plants and I will very gladly go on to something else.” That BC political figures like Mike Harcourt and Gordon Campbell made “promises,” however, seems a flimsy rationale for spending $1 billion on treatment if BC scientists are saying it will have a “negligible effect” on environmental conditions in the Salish Sea. Worse, Connelly’s focus on Victoria has stunted the growth of knowledge in both Victoria and in Puget Sound. By holding up Victoria’s system as shameful, Connelly has discouraged Victorians from learning about how it works, why it works, and how it could be made to work better. Instead, Victoria’s political discourse has been held hostage for 10 long years by what scientists say is a “non-problem.” Meanwhile, Connelly has provided cover for a poorly-functioning system of treatment plants in Puget Sound that are producing cocaine-spiked salmon smolts and fireproof orca. The lesson for Victoria? The opinions of Washington legislators about the Capital Region’s sewage treatment system are highly suspect. When challenged for details, they can’t provide them. The legislators’ uninformed portrayal of Victoria’s treatment system as “backward” is little more than an attempt to deflect attention away from their own inaction as Puget Sound deteriorates. Victoria’s political leaders shouldn’t take Washington politicians seriously on this issue. Instead, those tasked with deciding how to spend that “billion dollars” need to take their responsibility more seriously. They need to get outside the Where-to-put-it? box they’ve been stuck in since 2009 and allow themselves to be guided by local marine and human health scientists who have precise knowledge of the environmental and health impacts of the current system. In Washington, scientists say stormwater runoff is the most pressing threat to marine waters. Unless that’s solved, conditions in the Salish Sea will continue to deteriorate. In Victoria, scientists are saying additional sewage treatment here—and in Vancouver—will provide little or no environmental benefit. One initiative that would provide a benefit has been identified. Victoria’s stormwater runoff is likely as toxic as Seattle’s, albeit on a smaller scale. The deterioration of near-shore Victoria-area waters that local citizens have blamed on the deep-water outfalls is more likely due to deposition of the “unbelievable gunk” from storm drains that disgusted the diver interviewed in Poisoned Waters. That’s a problem that everyone agrees needs to be fixed. David Broadland is the publisher of Focus.
  12. May 2016 On the sewage treatment issue, Mayor Helps and the CRD seem to have lost sight of whom they are serving. I WROTE HERE LAST EDITION about my two-year battle with the CRD to get two sentences of a 2009 staff report released to the public. I believed the sentences would show that CRD staff greatly underestimated, either intentionally or by honest mistake, a significant cost related to the development and construction of a secondary sewage treatment system for Victoria. As ordered by the Office of the Information and Privacy Commissioner, the CRD released those two sentences in March. Since the cost to taxpayers of the CRD’s refusal to release the two sentences was significant—I estimate $20,000—I am going to provide you with every single word. The two sentences that the CRD—and the engineering consulting firm Stantec—were determined to keep secret were these: “The program management consultant service fees are estimated at 3 percent of the total project construction value and will be shareable under the federal and provincial planning agreement. Stantec’s hourly rates are up to 40 percent lower than the next ranked firm.” On the basis of that estimate, CRD directors decided to award a major 6-year contract to Stantec. The cost to the CRD was to be based on those very competitive rates. Here’s why the CRD and Stantec didn’t want to give up those two sentences. Stantec’s estimate for the cost to build a secondary treatment plant at McLoughlin Point was $783 million. Of that, about $410 million was for construction costs. Applying the “3 percent” from the two secret sentences to that cost, Stantec’s expected fee over the life of the project would have been just over $12 million. But by 2013 the CRD and the Seaterra Commission were estimating Stantec’s eventual cost would be $39.6–$43 million. That’s a 3-fold increase over Stantec’s 2009 estimate. This is significant for two reasons. First, those two sentences were written by two CRD employees, Dwayne Kalynchuk and Tony Brcic. Both just happened to be former Stantec employees. It’s evident in the second sentence that they promoted Stantec’s hourly rates as being considerably lower than those of Stantec’s two competitors for the contract. Yet once Stantec was awarded the contract, the expected cost to taxpayers ballooned. Naturally, one wonders if Kalynchuk’s and Brcic’s former employment with Stantec played any role in their recommendation of Stantec and the subsequent escalation of Stantec’s fee. I can make no comment on that. I would wager, though, that most CRD taxpayers would want to see more stringent care taken by the CRD to protect the interests of taxpayers in the awarding of contracts than was practiced in this case. Secondly, the underestimation was apparently based on information provided to the CRD by Stantec. Did it deliberately provide the CRD with a low estimate to get the contract? Or was it just bad at estimating costs? Either way, the three-fold underestimation raises questions about Stantec’s $783 million price tag for the project. Has Stantec been underestimating the overall project cost, too? Or, if Stantec couldn’t accurately estimate its own portion of the overall project cost, should anyone have confidence in its ability to estimate the overall cost? This is critical. The support for McLoughlin Point by some CRD representatives, including Nils Jensen, Ben Isitt and Geoff Young, has been based on the validity of Stantec’s estimate. 
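The fee arithmetic is easy to check against the figures just cited. The short Python sketch below is only an illustration using the numbers quoted above; the construction value is the approximate $410-million portion of Stantec’s $783-million estimate, and the variable names are ours.

# Stantec's expected fee under the 2009 estimate versus the later estimates,
# using the figures quoted above (illustrative only; variable names are ours).
construction_value = 410_000_000        # approximate construction portion of the $783-million estimate
fee_2009 = 0.03 * construction_value    # "3 percent of the total project construction value"
fee_2013_low, fee_2013_high = 39_600_000, 43_000_000

print(round(fee_2009 / 1_000_000, 1))   # 12.3 -- "just over $12 million"
print(round(fee_2013_low / fee_2009, 1), round(fee_2013_high / fee_2009, 1))   # 3.2 3.5

In other words, the 2013 estimates work out to roughly three to three-and-a-half times the fee implied by the two sentences the CRD fought to keep secret.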
Stantec’s 6-year contract with the CRD will expire in December, and the CRD seems determined to keep working with the firm. This is the most perplexing part of this story. Even though Stantec’s expected fee had inflated far beyond its initial estimate, the CRD argued against releasing those two sentences, partly on the basis that revealing them might result in Stantec’s competitors under-bidding it if the contract were reopened. That is, taxpayers might get a better deal. The CRD, funded by taxpayers, seems to be opposed to that. Instead of being appalled by the huge increase in Stantec’s expected bill, the CRD chose to protect the firm’s interests. Shouldn’t the CRD be focussed on the interests of the people in the community who fund it?

THAT RELEGATION OF THE PUBLIC INTEREST to second-class status is also playing out on a much larger scale in the sewage treatment issue. A peer-reviewed scientific study by DFO scientists showed that higher levels of treatment would have a “negligible effect” on environmental conditions in the Salish Sea. Detailed analyses have been made by marine scientists and public health officials showing that the current marine-based treatment facilities at Clover Point and Macaulay Point are safe and effective and are doing no harm to the environment. Those scientists have argued that new regulations being imposed by Ottawa do not take into account the physical circumstances that have allowed this system to function safely for many years. At the same time, crucial aspects of the system proposed to replace it have not been worked out—how biosolids would be safely disposed of, for example—so the environmental risks of those unknown details can’t be weighed. Since a replacement system will likely cost local taxpayers $1 billion or more, you would hope that the default position of Victoria’s elected officials would be to insist that Ottawa and BC provide exacting proof that the local marine scientists and health officers are wrong before the community is forced to go to such great expense. Sadly, the opposite is true. When the federal government recently sent a letter to the CRD reiterating its crude, formula-based determination that Victoria’s tidal-powered, organic and self-disinfecting approach was “high risk,” Victoria Mayor Lisa Helps said, “Very clearly, they’ve taken 2014 data that shows all the things which are not good for the marine environment are way over the threshold. So I am very happy we have received this letter. I hope this will completely quash the debate.” Helps, who led an expensive, year-long debate to a complete failure at Rock Bay, doesn’t know what she’s talking about. Under Ottawa’s point system, Victoria fails because the test puts so much emphasis on total suspended solids and oxygen demand, characteristics of sewage effluent that local marine scientists have said have negligible impact on the environmental health of the Strait of Juan de Fuca. Instead of accepting the wisdom and professional experience that scientists and health officers have obtained after years of observing Victoria’s marine-based system, Helps has accepted the authority of Ottawa to make an arbitrary and unreasonable decision. In doing that, she has put the community at greater risk of having to spend an unbelievably large sum of money unnecessarily. She’s failing to serve the public interest, just like CRD staff did when they chose to protect Stantec’s interest.
The part of the community that Helps supports, the part that wants to build something without even knowing whether it will provide a net environmental benefit, can’t find a place to put its project. No wonder. Without a proven need for replacing the existing treatment system—a system the community has already bought and paid for—any location that’s considered will always appear to have a higher value as something else. Even a parking lot has a higher value than an unneeded treatment plant. That’s been true for Haro Woods, McLoughlin Point, Viewfield Road, Rock Bay and now Clover Point. A bureaucratic formula that has nothing to do with the public interest is going to remain unconvincing to the other part of the community, the part that wants to see hard evidence that the scientists and health officers are wrong. David Broadland is the publisher of Focus.
  13. March 2016 Scientists recently confirmed an active seismic fault that could generate a large earthquake lies within 5 kilometres of downtown Victoria. LAST JUNE THE Geological Survey of Canada quietly released a report on a previously unexplored deformation in the bedrock below the Strait of Juan de Fuca—the Devil’s Mountain Fault. When I first read the report a few weeks ago, Sir James Douglas’ well-mythologized first impression of this place leaped to mind. On his arrival in 1842 Douglas had pronounced it “a perfect Eden.” It now appears he was profoundly mistaken. Lurking in the strait just 5 kilometres from downtown Victoria, according to scientists, is a physical imperfection so great that one day this “perfect Eden” could become—over the span of 10 or 20 seconds—hell on Earth. Before the scientists’ report was put on Natural Resources Canada’s website, John Cassidy thought it would be best to alert Emergency Management BC. Cassidy is the head of the Earthquake Seismology Section of the Geological Survey of Canada (GSC) at the Pacific Geoscience Centre in Sidney. What was in the report that Cassidy thought EMBC should know about? The alert to EMBC was titled “Discovery of potentially active submarine faults near Victoria, BC.” The senior scientist on the study, the GSC’s Dr Vaughn Barrie, assisted by marine scientist Dr Gary Greene, had analyzed sediment cores and multibeam bathymetry scans of bedrock below the strait in an area just southeast of Victoria. From this data Barrie was able to create a 3D map of a short section of the Devil’s Mountain Fault Zone. The Devil’s Mountain Fault is a deep crack in the Earth’s crust that runs for about 125 kilometres from near Darrington in the foothills of the Cascade Mountains in Washington to just south of Victoria. Before Barrie’s analysis, any estimate of the danger the fault might pose to Victoria would have been speculative. Previous calculations based on the presumed length of the fault had suggested that if it ruptured along its full length, an earthquake of magnitude 7.5 could be generated. But did the fault actually come close to Victoria? The work of Barrie and Greene confirmed a potentially grave risk for Victoria. In the summary of their report they stated, “Based on recently collected geophysical and sediment core data, the western extent of the active Devil’s Mountain Fault Zone has been mapped for the first time, offshore the city of Victoria. The occurrence of this active fault poses the real possibility of an earthquake, similar to the devastating 2011 Christchurch, New Zealand earthquake, occurring near the city of Victoria.” Barrie noted that the 2011 Christchurch earthquake killed 185 people and caused damage assessed at $40 billion (NZ). He observed that earthquake “had an effective magnitude of 6.7 and was approximately 5 kilometres from central Christchurch at its closest approach…” The scientist noted that “the Devil’s Mountain Fault Zone is less than 5 kilometres from central Victoria” and “appears to have the potential of producing a strong earthquake adjacent to Victoria, perhaps as large as magnitude 7.0 or greater.” Barrie's confirmation of the close proximity of an active fault to central Victoria was alarming news. 
Previous loss estimates for a hypothetical magnitude 7.0 earthquake on the Leech River Fault—located further away from Victoria than the section of the Devil’s Mountain Fault that Barrie had mapped—predicted 1500 fatalities in the CRD, with close to 20,000 injuries, many of which would require hospitalization. Over six million tons of debris would be generated by collapsed and damaged buildings and other structures. But the Leech River Fault has shown no signs of activity during the last 10,000 years. Living beside it is like having a bomb in our basement, but a bomb which we believe has had all the explosive material removed. Barrie, in effect, confirmed the Devil’s Mountain bomb is in our living room and could go off at any moment. To put Barrie’s finding in perspective, consider the so-called “Big One.” That’s the commonly-used expression to describe what geoscientists call a great-plate boundary earthquake; it’s also known as a “Cascadia Subduction Zone event.” Scientists estimate that a full-length rupture of the 1100-kilometre-long subduction zone along North America’s west coast could produce a magnitude 9.0 earthquake. Although a magnitude 9.0 earthquake off the west coast of Vancouver Island would release over 1000 times more energy than a magnitude 7.0 earthquake, because of its much greater distance away from Victoria, such an event would cause only about half the casualties and property damage that a magnitude 7.0 shallow crustal earthquake immediately adjacent to Victoria would cause. A rupture of the Devil’s Mountain Fault could have twice the impact on Victoria as the Big One. Although Focus has been unable to obtain a loss estimate (casualties and property damage) for Victoria from Natural Resources Canada for “the Big One,” a study done by BC geoscientist Martin Zaleski reported that Victoria would sustain much greater damage from a nearby magnitude 7 earthquake than from a magnitude 9 Cascadia subduction event. When asked about Zaleski’s finding, Cassidy said, “That’s about right.” So Cassidy and his colleagues at GSC knew that when Victorians heard about this nearby existential threat, there would be questions. Before Dr Barrie’s report was placed online, Cassidy contacted EMBC and outlined the potential threat—what was known and what wasn’t. EMBC then arranged a conference call with about 100 participants—primarily emergency management organizations and local governments from across the South Island and Lower Mainland regions. Cassidy explained to Focus that this step was taken because of “potentially significant interest.” The information Cassidy shared with those emergency management stakeholders included caveats about the scientists’ report. For example, the study has not been internationally peer-reviewed and more research would need to be done to confirm Barrie’s conclusions. The “potentially significant interest”—beyond that expressed by the 100 or so participants in the conference call—never materialized. That’s probably because no one outside of the conference call ever found out about the report—until now. Living with the bomb Cassidy told Focus that the results of additional research on the Devil’s Mountain Fault Zone will be ready “in about a year.” One of the unresolved issues Dr Barrie noted in his study was the possibility of a connection between the Devil’s Mountain Fault and the Leech River Fault. Barrie’s report stated, “The data here does not suggest any connection between these faults, though they are separated by only five kilometres. 
Further data are required to the west of our survey data set examined here to determine any relationship between these fault zones.” On a large-scale map the two faults appear to run directly into each other and the concern is that a longer fault could generate an even more powerful earthquake. But Cassidy said there is no on-the-ground evidence that the Leech River fault has been active since the last period of glaciation ended, about 9,000 years ago. That, he said, makes it less likely that the active fault Barrie confirmed would be connected to the Leech River Fault. Dr Barrie’s report includes excerpts from previous studies and one of the most interesting of his references is this: “Hyndman et al (2003) estimated a recurrence interval for large upper-plate fault earthquakes of magnitude 7.0 and greater in the Puget Lowland-Georgia Strait region to be about 200 years…They suggested that additional large earthquakes in the upper plate may occur in this region shortly after great-plate boundary earthquakes.” Translation: Magnitude 7.0 and greater crustal earthquakes in the region where we live are not that uncommon. And, a rupture of the Cascadia Subduction Zone could be followed “shortly” afterward by a rupture of the Devil’s Mountain Fault. Talk about a nightmare scenario. The Big One would cause widespread damage throughout the Pacific Northwest, including Vancouver and Seattle, so Victoria couldn’t expect help from nearby communities. Then comes the Really, Really Big One... The BC Earthquake Immediate Response Plan doesn’t cover that scenario. The plan is the responsibility of the previously-mentioned Emergency Management BC, a provincial agency that falls under the control of Attorney General Suzanne Anton. EMBC’s plan is contained in a 127-page document that lists actions to take when a damaging earthquake occurs, such as activating the “Mass Fatality Plan.” The plan’s scenario for a magnitude 7 shake in Victoria considers the “worst case” to be a mid-afternoon shallow crustal earthquake in January following a 3-day Pineapple Express. The ground would be saturated and prone to liquefaction. Here’s what the planners imagine would happen: “For many, the earthquake is heard before it is felt. The low, rumbling sound is similar to that of a freight train, immediately followed by 10 to 20 seconds of violent shaking that knocks people located closest to the epicentre from their feet—except for those who remember to ‘drop, cover, and hold on.’ Taller buildings sway with the high intensity shaking. Unsecured objects fall or fly through the air. Roads crack and the ground ruptures in some areas. Buildings on softer soils lose support through liquefaction. Landslides and rock falls are generated in many areas, cutting off transportation routes. Flooding is increased by the recent wet weather event with some dikes failing. Several fires start throughout the impact area from damaged electrical power and gas lines. Some buildings collapse, many shift and crack, and others are destroyed by fire. “Windows break and glass scatters across the pavement. Debris is strewn throughout roadways, cutting off access to areas. Entire walls from unreinforced masonry buildings fall into the streets. Many of those who try to run outside suffer extreme injury or death from falling and flying objects and thousands are trapped or injured. 
Dust, smoke and sirens fill the air.” Hopefully, The Plan itself won’t be buried under broken blocks of granite and marble at the seismically vulnerable legislative buildings on Belleville Street before the Mass Fatality Plan can be activated. But don’t count on it. The Province committed $1.5 billion in 2005 to an upgrade of vulnerable schools in Victoria and Vancouver but has taken no steps on the estimated $300 million seismic rehabilitation of the 118-year-old Parliament Buildings. Local governments may have plans, but they have little or no money for making this region more seismically safe and resilient. Although one might think that making the region more seismically safe and resilient would be a simple matter of applying science and common sense to the problem, recent events illustrate that there's not much of either at work.

Confused public-safety priorities

A genuine and concerted effort to reduce Victoria’s exposure to seismic risk—significantly higher than that of any other city in Canada (see table below)—will require public investment in education, prioritization, emergency response and infrastructure renewal. It’s astonishing, then, that Barrie’s dramatic and revelatory report on the Devil’s Mountain Fault was relegated to an obscure corner of Natural Resources Canada’s website even while Victorians struggled to respond to a controversial “high risk” classification of a different sort. Under federal regulations imposed on Victoria’s marine disposal of liquid waste, the region is contemplating spending upwards of $1 billion to convert to a land-based system. Yet six past and current public health officers have stated: “There is no measurable public health risk from Victoria’s current method of offshore liquid waste disposal.” Those health officers are Dr Richard Stanwick, Dr John Millar, Dr Shaun Peck, Dr Brian Emerson, Dr Brian Allen and Dr Kelly Barnard. A recent peer-reviewed study by DFO scientists found that spending billions of dollars upgrading sewage treatment in Vancouver and Victoria would have “negligible effect” on environmental conditions in the Salish Sea. If there’s no risk to public health or the environment from a marine-based treatment system—but the Devil’s Mountain earthquake could hit at any moment, killing or injuring thousands of Victorians—what are we to make of the federal government’s priorities? The challenge of making the region more seismically safe and resilient isn't just a question of correctly prioritizing seismic vulnerability in relation to other needs, however. The evidence accumulated during the Johnson Street Bridge Replacement Project is illustrative of how difficult it can be for local government to properly assess seismic risks for different structures, and then to ensure that the risk for any one structure is adequately addressed. In the case of the bridge project, where the City of Victoria was responding to the existing bridge's apparent seismic vulnerability, City Council chose to build a new bridge before it had addressed the known seismic vulnerability of its main fire hall. The City had been told the fire hall would collapse in an earthquake and trap emergency response vehicles. At the same time, councillors justified replacing the bridge partly on the basis of ensuring that emergency vehicles would be able to circulate immediately following an earthquake. Common sense eluded councillors: if rescue vehicles were trapped in a collapsed fire hall they wouldn't be able to use the bridge.
The political attraction of building a glamorous signature bridge designed by an internationally-renowned architect won the day. At the same time, a seismic assessment of 16 City-owned buildings showed many were potential death traps. The engineering assessment by Read Jones Christoffersen was kept secret during the bridge decision-making process and wasn’t made public until Focus obtained it in an FOI and published the details here. The assessment didn’t even look at some of the most vulnerable structures, including Downtown parkades and City Hall. The potential for loss of life from collapsed City-owned buildings was much greater than from a collapsed Johnson Street Bridge. Unbelievably, the bridge the council chose to build—in response to the existing bridge’s seismic vulnerability—will, it turns out, likely suffer “permanent loss of service” in an earthquake generated by the Devil’s Mountain Fault. The story of just how a project aimed at increasing seismic safety could, instead, produce a seismically vulnerable design is a warning that local governments are badly in need of an independent, trustworthy, publicly-funded agency to help guide the region toward seismic safety and resilience. In light of Barrie's warning about the Devil's Mountain Fault, the failure to create a seismically resilient bridge is a cautionary tale that needs to be thoroughly understood. In this case, the devil really is in the details.

How Victoria got a “less robust” bridge

In 2009, following receipt of an engineering report that recommended the Johnson Street Bridge be seismically upgraded, Victoria City council voted to replace the bridge instead. In seeking federal funding for the project, then-Mayor Dean Fortin wrote then-Federal Minister of Transportation and Infrastructure John Baird, telling him: “Any seismic event will bring it down.” In 2010, after being forced to seek approval for its proposed bridge in a referendum, the City was advised that, whether it chose to repair or replace the bridge, it should spend additional money to ensure both would withstand a “magnitude 8.5” earthquake. A presentation to City of Victoria councillors by MMM engineer Joost Meyboom on June 14, 2010 stated there was a “35 percent probability of a major quake (magnitude 7.0 to 7.9) in the next 50 years.” Meyboom recommended that a new bridge “be designed for a magnitude 8.5 earthquake.” He told councillors, “If you’re going to spend $100 million on a facility, the premium to pay for a very high seismic performance is a relatively low price for insurance.” The City agreed to buy this level of seismic protection. A few months later, just before a referendum on a new bridge, a study by scientist Chris Goldfinger noted that Cascadia subduction zone earthquakes could be as strong as magnitude 9.0. Meyboom wrote the City and suggested it should consider protecting the bridge to magnitude 9.0; he told the City that a higher level of protection would cost more money. This was a highly revealing moment. Meyboom equated the bridge’s maximum seismic hazard with the Cascadia subduction event, an assumption that was untrue. I’ll come back to this later. Shortly after Meyboom's attempt to upsell the City on higher seismic protection, the referendum on the new bridge passed. Fast forward to mid-2012. The City had needed to raise the bridge project's budget from $77 million to $93 million and was in the procurement phase. Councillors were adamant the budget would go no higher. There were three potential bidders.
As part of the procurement process, the City asked each company to provide an initial opinion on whether it could build the bridge to the specified criteria on the City’s budget. All three said “No.” A short time later, MMM Group provided the companies with a document that established a lower level of seismic performance for the new bridge, thereby reducing expected costs. This document eventually became part of the contract the City signed with PCL, the winning bidder, but its addition to the bid process was kept secret from all but senior managers. Councillors were not informed. The document described how the bridge was expected to perform in three different earthquake scenarios. The strongest earthquake covered by the design criteria was approximately equivalent to a magnitude 7.5 earthquake. For that event, the seismic performance specified was “possible permanent loss of service”—which implies damage so great that the bridge would need to be replaced. The time between when an earthquake occurs and when emergency response vehicles can circulate is critical to rescue and recovery operations. The revised seismic design criteria specified this time for earthquakes smaller than magnitude 7.5, but no level of access was described for a magnitude 7.5 shake. The document provided no information whatsoever about the allowable outcomes—damage or emergency vehicle access—for a magnitude 8.5 earthquake, the level of seismic protection the City had agreed to buy. (The Seismic Design Criteria document does not express earthquake strength as “magnitude.” Rather, it uses “return period.” But during the decision-making process on the bridge, engineering firm Stantec linked specific magnitudes with specific return periods. I am using Stantec’s conversions in this article.) The revised seismic design criteria were kept secret until Focus obtained the document in an FOI. About a year ago, I wrote a story about how the new bridge had been designed and constructed to a much lower level of seismic protection than had been recommended to the City by Meyboom in 2010. I related the fact that the bridge could suffer permanent loss of service in a magnitude 7.5 earthquake. Two months later, I filed an FOI for the communications between the City and MMM that resulted from my article. The documents obtained showed that the company that designed the lifting portion of the bridge, Hardesty & Hanover, did not deny the bridge could suffer permanent loss of service from a magnitude 7.5 earthquake. The record of communications showed no one seemed to know how the seismic design criteria had been lowered, or who had promised what. Confusion reigned over the project. One MMM official called the matter a “debacle.” (This is perhaps the most accurate statement made by an MMM employee about the seismic issue in the 7 years the project has run.) The City’s former director of engineering, Dwayne Kalynchuk, confirmed for Focus that Hardesty & Hanover had used the reduced seismic design criteria to design the bridge. Several months later, the bridge project director Jonathan Huggett informed councillors that the new bridge would be “somewhat less robust” than the existing, 93-year-old bridge. Recall that at the beginning of this account, then-mayor Fortin had described the existing bridge to John Baird, writing “Any seismic event will bring it down.” Taking Fortin and Huggett at their words, we would have to conclude that “Any seismic event” would bring down the “less robust” new bridge, too. 
So there’s strong evidence that, after spending $140 million (the most likely current price, based on information provided by the City about cost increases) on a new bridge because the old one was seismically vulnerable, there will be little or no increase in the seismic safety of the bridge.

Issue too complex for local politicians?

Dr Barrie’s discovery of a large, active fault just a few kilometres from the Johnson Street Bridge highlights the need for more effective management of Victoria's seismic vulnerability by elected officials and civil servants. A better understanding of seismic issues by everyone involved in making decisions about critical public infrastructure seems key. The primary misstep in the case of the bridge was the choice of a seismically-risky design. The section of the new bridge that lifts—the bascule leaf—is not permanently attached to the bridge’s foundation. It floats on steel rollers and depends on intermittently-engaged span locks to hold it in place during an earthquake. Who would build a house in Victoria without ensuring it was permanently anchored to its foundation? Although at least one of the companies competing to build the bridge red-flagged the seismically risky design to City staff in its bid, that company’s proposal was heavily penalized for not sticking with the flawed design concept. Those staff also hid that criticism from elected decision-makers. So bad design and failure to listen were factors in the debacle. What else? Several paragraphs back I mentioned that the City's consultant, Joost Meyboom, had suggested to the City that since a Cascadia Subduction Zone event could produce a magnitude 9.0 earthquake, the City should consider spending more money to protect against that. That incident seems to suggest Meyboom thought a subduction event—the Big One—was the defining seismic design consideration. But a couple of years after Meyboom’s magnitude 9.0 pitch, an MMM document noted: “Given the location of the bridge, the Cascadia Subduction Earthquake was also considered as an important event. A comparison of site specific response spectra, however, showed that the spectral acceleration for the Cascadia event are lower than the 1 in 475 year earthquake and this is therefore not a critical design consideration.” Meyboom misunderstood, apparently, what kind of seismic event the bridge needed to be designed to withstand. The problem was that when Meyboom first made recommendations to City councillors about seismic protection, they relied on his advice to make critical decisions. Unfortunately, councillors made no attempt to ensure they were getting solid advice. They should have asked for at least one other opinion from a source unaffected by whether the project went ahead or not. Once the lowered seismic design criteria were made public by Focus, the City ought to have pursued the matter with an independent investigation. Instead, it simply sought reassurances from Meyboom’s company. The City’s unwillingness to properly investigate whether the bridge had been built to a lower level of seismic performance than it had agreed to pay for could, one day, have serious consequences for public safety and economic recovery. In that respect, City managers and councillors abrogated their fiduciary responsibility to the public. Barrie’s report brings into public view the need for much more attention to be paid to seismic safety and resilience in the region.
One possible solution to the kinds of problems experienced with the bridge project would be for the region to develop its own seismic safety planning and prioritization agency. The work of such a body would need to be informed by science, not by engineering companies working in the construction industry. Until the region develops such capacity, it’s unlikely to make progress toward reducing the casualties and property losses that will come one day when the Devil’s Mountain Fault ruptures. David Broadland’s father Bob was on his parents’ farm 30 miles from the epicentre of the magnitude 7.3 Vancouver Island Earthquake in 1946. Bob’s mom thought an atom bomb had been dropped. Bob’s father Tom experienced the magnitude 7.2 Vancouver Island Earthquake in 1918. So far, David’s biggest earthquake was the magnitude 5.3 shake Victorians felt in 1976. But the party ain’t over yet.
  14. January 2016 A study by DFO scientists found that secondary sewage treatment will have a negligible effect on environmental conditions in our waters. THE CRD IS POISED to spend upwards of $1 billion on sewage treatment for Victoria in response to new Fisheries Act regulations aimed at protecting fish, yet a recent study led by DFO research scientist Sophie Johannessen says upgrading the level of treatment at two plants in Vancouver and at Victoria’s two outfalls will have a “negligible effect” on environmental conditions in the Strait of Georgia and Juan de Fuca Strait. Is a mistake of grand proportions about to be made? Reading between the lines, Johannessen’s peer-reviewed study challenges the narrow basis on which Victoria’s two outfalls were rated “high risk.” Environment Canada’s new regulations provide a laboratory-based formula by which the effluent from sanitary sewers can be assessed using four specific measurements. Municipal treatment plants that don’t meet the formulaic requirements are being forced to upgrade. The regulations do not provide any avenue for evaluating water conditions immediately after the effluent has been discharged from an outfall. The Johannessen study also raises the profile of one of the contaminants of concern in effluent from all the outfalls considered: polybrominated diphenyl ethers (PBDEs), otherwise known as flame retardants. The study’s authors predict that secondary treatment could significantly reduce the amount of PBDEs being discharged into the Straits “depending on how the sludge is sequestered.” As it turns out, though, this is a big if. So far the CRD hasn’t identified how it would deal with sewage sludge and if current practices for disposing of the sludge elsewhere were used in Victoria, the PBDEs could be recycled through the environment. Johannessen told Focus that unless the CRD found a way to remove the PBDEs from biosolids after treatment, they could eventually make their way to the Strait. THE NEW FEDERAL Wastewater Systems Effluent Regulations under the Fisheries Act that triggered a high risk rating for Victoria’s two outfalls are unrelated to the PBDE problem. The regulations are only intended to protect fish in the water immediately adjacent to the outfalls from effluent that is “acutely lethal.” Those regulations require measurement of chlorine, ammonia, total suspended solids, and biochemical oxygen demand at the point at which effluent is discharged from the diffuser ports on the outfalls. In Victoria’s case, it’s the combination of total suspended solids and biochemical oxygen demand that resulted in the outfalls at Macaulay Point and Clover Point being red-lined. The regulations require the effluent to be measured in its most highly concentrated form before it is discharged, and that measurement obviously doesn’t reflect actual conditions a short distance from the outfalls. Nor do the regulations have anything to say about other contaminants in the effluent, such as PBDEs, metals, plastic microbeads, or the thousands of chemicals in all sewage that derive from pharmaceuticals, detergents and other substances. The effect of the Regulations, then, is to protect a fish able to hold its position in the strong tidal currents, with its nose and gills stuck inside one of the outfall’s ports. Many Victorians will have seen images provided by the CRD that show fish swimming beside—and crabs clambering over—the Macaulay Point outfall, apparently happy to be there. These don’t appear to be conditions lethal to fish. 
Macaulay Point outfall

That is, the good health of ecosystems near the outfall doesn’t seem to be predicted by the allowable range of total suspended solids and biochemical oxygen demand set out in the regulations. How could Environment Canada have got this so wrong? The fundamental inadequacy of Environment Canada’s regulations as a tool for making sound decisions about sewage treatment is captured in a quote from the study: “To predict the likely effects of management action on any point source discharge into the coastal ocean, it is essential to understand both the composition of the effluent and the environmental conditions in the receiving waters.” While Environment Canada’s regulations consider the former, they completely ignore the latter. In stark contrast, the Johannessen study accounts for the actual differences between physical conditions in the Strait of Georgia or Juan de Fuca Strait and, for example, physical conditions in Lake Ontario or the North Saskatchewan River. The authors state: “In some parts of the world wastewater discharge has led to eutrophication [an excess of nutrients], harmful algal blooms, hypoxia, extinctions of bottom fauna and fish mortality. However, the effects of wastewater discharge are not the same everywhere. For example, phosphates in household wastewater can have dramatic effects on lakes, causing eutrophication and harmful algal blooms, while anthropogenic phosphate has little effect on marine ecosystems, where productivity is more often limited by nitrate. Similarly, wastewater can affect one coastal sea differently from another, depending on processes occurring in the receiving environment. Consequently, management actions that are developed for one area, such as introducing a particular level of wastewater treatment, might not have the anticipated effect when applied to another.” Johannessen and one of her co-authors, Rob Macdonald, are both research scientists with DFO’s Institute of Ocean Sciences in Sidney. Both are also adjunct professors at UVic. Regarding the danger of eutrophication and harmful algal blooms, the authors note, “The nitrogen discharged through all the municipal wastewater outfalls combined represents only approximately one percent of the total influx [of nitrogen from other sources]. In addition, for most of the year in most of the Strait, phytoplankton are limited by light, not by nutrients.” The authors discuss physical conditions particular to the Straits that limit growth of phytoplankton and conclude, “wastewater is unlikely to cause eutrophication or harmful algal blooms in the Strait of Georgia or Juan de Fuca Strait.” Impacts on the Straits from the discharge of organic carbon from wastewater are also quantified by the authors, who note “the municipal outfalls represent approximately 0.2 percent of the total of the sources. This is negligible in the context of the whole Strait.” The scientists acknowledge that the discharge of organic carbon does have “local effects in the area immediately surrounding each outfall.” Their description for the Macaulay Point outfall states: “[O]rganic deposition results in a high sediment concentration of organic carbon and greatly elevated sulphides, but no evident oxygen stress within sediments (due to high bottom currents and sandy substrates). 
Organic biomass appears to be normal relative to background in sediment around the Macaulay outfall.” The study’s authors note that there is some metal contamination in the Straits but attribute this to past mining activity, noting that core samples from the footprint of the Iona plant outfall show “little indication” of contamination by lead, zinc or copper. Elevated levels of cadmium at some outfall sites are attributed to sulphides in the footprint of the outfalls sequestering dissolved cadmium already in the water. In terms of biochemical oxygen demand, the study notes that wastewater represents only “one percent” of the demand. “On a basin scale, therefore, municipal wastewater does not add significantly to the pressure on oxygen in the Strait. In the sediment near the outfall, however, the biochemical oxygen demand of the effluent has measurable chemical and biological effects...” Even so, the authors single out the energetic ocean conditions and rapid mixing of the effluent with seawater that exist at the Macaulay and Clover outfalls as mitigating the effect of the effluent’s oxygen demand. Although the study did not consider various contaminants in wastewater that are present at very low concentrations in the Straits (detergents, pharmaceuticals, fragrances, pathogens, caffeine, etc.), the authors note that most of these substances would be expected to either break down or be consumed within one or two weeks of entering the ocean, after which time anything that remained of them would be exported out of the Straits by the net outflow of water produced by rivers flowing into the Straits. The authors note that secondary treatment could reduce those contaminants but offer no judgment on whether that would benefit the health of the Straits. The concentrations of these materials in local waters are known to be very low—below our ability to detect them. As mentioned above, the one category of contaminants in sewage that the Johannessen study predicts could be significantly reduced by secondary treatment is PBDEs. PBDEs are thought to be endocrine disruptors and may produce adverse reproductive, developmental, neurological, and immune effects in both humans and wildlife. There is broad concern that PBDEs, like PCBs, may be bioaccumulative. (See the 2014 US EPA fact sheet for more information on the language scientists are using regarding these effects.) Environment Canada and Health Canada have stated it’s their objective to reduce the concentration of PBDEs in the Canadian environment “to the lowest level possible.” Consequently, the manufacture and use of PBDEs have been banned in Canada. The Johannessen study notes, “Secondary treatment will decrease the direct input of PBDEs considerably, but it is not designed to break down persistent organic pollutants. Consequently, the effect of increasing the level of treatment will largely be to move PBDEs from marine effluent into sludge that will have to be further managed to prevent its potential re-entry into the aquatic environment.” This is a critical point. If the only significant environmental benefit of treatment is to divert PBDEs away from the ocean, but our management of sludge and biosolids then allows them back into the environment, what would be the point of spending any public money for that initial diversion? We can predict the likely fate of CRD biosolids by looking to the Annacis Island secondary treatment plant on the Fraser River that serves Metro Vancouver. 
The data in the study shows the Annacis Island plant removes approximately 80 percent of PBDEs, which, after the sludge is processed in anaerobic biodigesters, end up in biosolids. Those biosolids are, according to Metro Vancouver, used as “cover material at landfills, fertilizer on grasslands and hay fields, for land reclamation at copper and molybdenum mines, as soil for city parks and recreation areas.” Metro Vancouver also notes that “innovative” methods for disposing of biosolids include “deep ocean dumping” and “incineration.” Consider the possibility of biosolids being used for “landfill cover” at Hartland Landfill, for example. Rainfall would eventually wash the PBDEs into the landfill’s leachate collection pipeline, which feeds into Victoria’s sewer system. The PBDEs would then return to a sewage treatment plant where about 20 percent of them would escape to the ocean. The other 80 percent would then return, via sewage treatment, to become landfill cover at Hartland. Over time the amount of PBDEs circulating through the landfill would increase, as would the flow of PBDEs that escape to the ocean. An investigation of PBDEs in landfill leachate headed by UBC scientist Monica Danon-Schaffer shows some cities in Canada with greatly elevated levels of PBDEs emerging from their landfills. Similarly, if the biosolids were incinerated, the low temperatures at which municipal incinerators operate would result in the PBDEs passing through the incinerator into the atmosphere, only to be washed out later by rain onto land or water. For those who hope that gasification of the biosolids would destroy PBDEs, think again. There is only one facility in Canada—the controversial Swan Hills Treatment Centre in Alberta—that is licensed to process such hazardous materials as PBDEs, and it employs very high temperatures relative to municipal incinerators or gasifiers. What is the CRD’s plan for the biosolids produced by secondary or higher levels of treatment? In 2010 the CRD avoided providing the Province with its plan for biosolids when it submitted, and received approval for, its current wastewater treatment plan for McLoughlin Point. That plan stated that an environmental assessment of the CRD’s biosolids plan had been completed, but an FOI filed by Focus shows the environmental assessment wasn’t actually carried out until 2015. That document simply notes that biosolids “will be used in a beneficial manner consistent with CRD Policy.” So far, no one knows what that is. Let’s sum up what we know so far. The only environmental benefit of secondary sewage treatment—according to scientists who have considered the actual in-the-water situation for Victoria—is the opportunity to permanently divert PBDEs away from the ocean. Yet the CRD has not established a plan for how it will dispose of biosolids, let alone sequester the PBDEs they would contain. To explore this further I contacted the study’s lead author, Sophie Johannessen. Johannessen highlighted the complexity of the PBDE issue by describing to me a hypothesis she has developed. It’s surprising. Her research suggests that the benthic community—the creatures that live in and on the surface sediments of the ocean bottom—has a lower level of PBDEs at the Macaulay Point outfall than at the Iona Island outfall, which has a higher level of treatment. Johannessen thinks that’s because there’s more carbon available to eat at Macaulay compared to the amount of PBDEs. 
In terms of reducing the amount of PBDEs entering the marine food web, less treatment may be better. How could we expect regulators in far away Ottawa to know this? Johannessen agreed that unless the CRD found a way to remove the PBDEs from biosolids after treatment, they could eventually make their way to the Strait. “Moving persistent contaminants into sludge, which is then spread on land, might actually increase the length of time over which the environment—groundwater, streams and eventually the ocean—is exposed to the contaminants,” she explained. “Source control would be far more effective. For PBDEs, we have already undertaken source control by banning their import or manufacture, although it is going to take a long time for the existing stock of PBDEs in our furniture, toys and electronics to stop draining into the ocean. Source control is likely to be the most effective solution for any trace contaminants, because they do not make up the bulk of the effluent and because they tend not to be destroyed by sewage treatment.” I asked Johannessen if the CRD’s plan for sewage treatment—ill-defined as it is at the moment—seemed to her to be a good way to spend a billion dollars, especially in light of her finding that the opportunity to remove PBDEs was the only significant environmental effect that might be obtained from sewage treatment. “As a scientist,” she said, “it isn’t for me to say. That’s a political decision.” Fair enough, but is there anything we could do that would have a more positive effect on the marine ecosystem than upgrading Victoria’s sewage treatment system? “I think so, yes,” Johannessen said. “We could reduce our greenhouse gas emissions, enact source control for persistent contaminants, and reduce other local pressures on the marine biota.” How would reducing our emissions help? “Anything that would reduce our greenhouse gas emissions would help to slow the rate of change in the ocean, which would give marine biota more time to adapt to the changes,” Johannessen said. “Reducing carbon dioxide emissions, specifically, would also reduce the rate of ocean acidification, which is considered a major threat to a wide range of marine life, including shellfish.” What other ocean effects are scientists seeing that are related to climate change? “The local ocean is already changing fast, as a result of global-scale climate change,” Johannessen said. “The temperature of seawater and river water is increasing; the concentration of oxygen in the deep water of Juan de Fuca Strait and the Strait of Georgia is decreasing; the abundance and nutritional content of zooplankton—food for juvenile fish and seabirds—is decreasing; the timing of Fraser River flow—which drives the physical circulation in the Strait—is changing; and the frequency of short-term events such as windstorms and short intense rainstorms is increasing.” How ironic is it, then, that just as countries around the globe are marshalling the skills and knowledge of their best scientists to find a path to decarbonization, Victorians are poised to trade in their tidal-powered sewage treatment system (Johannessen likens it to “a giant washing machine”) for one that has an immense emissions burden attached? In 2013, for every million dollars of economic activity in Canada, 416 tonnes of CO2 were emitted. So the emissions burden associated with the $1 billion capital cost of this project is on the order of 416,000 tonnes. 
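That last figure is straightforward arithmetic, and readers can check it themselves. The short snippet below uses only the two numbers cited above, the 2013 emissions intensity of the Canadian economy and the project’s estimated $1-billion capital cost; it is a back-of-the-envelope proxy for the emissions tied to that spending, not a life-cycle assessment of any particular treatment plant design.

```python
# Back-of-the-envelope check of the emissions figure cited above. It uses only
# the two numbers reported in this story and is not a life-cycle assessment.
emissions_intensity = 416        # tonnes of CO2 per $1 million of Canadian economic activity (2013)
capital_cost_millions = 1000     # $1 billion capital cost, expressed in millions of dollars

implied_emissions = emissions_intensity * capital_cost_millions
print(f"Implied emissions burden: about {implied_emissions:,} tonnes of CO2")
# Prints: Implied emissions burden: about 416,000 tonnes of CO2
```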
Scientists have already made it clear that the current treatment system is not causing harm to the environment. This latest study can also be seen as a warning to all of us that a carelessly-conceived treatment system could end up doing considerably more harm than good. David Broadland is the publisher of Focus Magazine.
  15. December 2015 As the cost for a new bridge marches ever upwards, explanations from City Hall seem designed to distract rather than inform. IN A RECENT Times Colonist op-ed about the new Johnson Street Bridge project, ironworkers union spokesperson Eric Bohne stated, “Deficient steel fabricated in China helped lead to a $63-million project estimate in 2009 ballooning to $100 million today and counting.” Bohne’s message is compelling: Building the steel part of the new bridge in China has taken jobs away from Canadians. Defective steel has caused the cost to swell. He’s partly right. A union-friendly NDP-led council didn’t prevent a few bridge jobs from being shipped to China. On the second count, though—that deficient steel has caused project costs to balloon—Bohne is slicing pure baloney. At a November 19 meeting of Victoria City council, Project Director Jonathan Huggett said that City costs related to problems at the Jiangsu Zhongtai Steel Structure Co Ltd factory amounted to “at least $1 million.” For a project that appears headed for an eventual cost of at least $135 million, $1 million is the proverbial drop in the bucket. That fact hasn’t prevented Huggett from constantly highlighting the faraway Chinese problem ever since he was parachuted in to save the project in early 2014. Since then, Huggett’s choice to focus on Chinese fabrication has made it appear to be the central demon plaguing the project. Bohne’s op-ed indicates that strategy is working. On November 19, Huggett continued with that message when he told councillors that, because of new fabrication problems at the Chinese factory, PCL, the company contracted to build the bridge, had changed the project completion date to early 2018. But in a contradictory statement Huggett reassured councillors the steelwork in China would be completed in “three to four months.” According to other schedules previously provided by PCL, that would allow time for delivery of the six major steel parts of the bascule leaf to Victoria in September, 2016. Based on PCL’s schedule, that would put project completion in late 2017, which is the same general ballpark Huggett was batting into last July. Huggett’s amping-up of problems in China with crack-by-crack accounts of the welding—and the impact those cracks might have on delivery dates—has had the effect of distracting attention away from deeper, more troubling issues with the project. For example, one of the fabrication challenges Huggett related to councillors on November 19—an unsuccessful attempt to fit together large steel plates that form the 50-foot-diameter rings on which the lifting part of the bridge will rotate—actually stems from the strange design of the bridge, not the skill of Chinese welders. A conventional lifting bridge rotates on an easily machinable shaft. The existing Johnson Street Bridge, for example, which has operated reliably for 93 years, rotates on simple trunnion bearings that support a shaft about 10 inches in diameter. Nothing so elegantly simple and easily manufactured can be found in the new bridge. Its original designer, Sébastien Ricard, told Victorians in 2010 that he chose to mechanically rotate the bridge using 50-foot-diameter rings rolling on steel bearings placed beneath them because he wanted an observer of the bridge to be able to readily discern how the bridge mechanism worked. So, because of Ricard’s whimsical choice, the bridge that’s being built doesn’t have a fixed shaft through its axis of rotation. 
Instead, it has, in effect, a 50-foot-diameter “shaft” that rolls on 24 massive 4-foot-diameter steel rollers placed beneath it. Because of the extremely tight tolerances needed for this heavy machinery to function reliably over many years, the 50-foot-diameter “shaft” of the new bridge needs to be almost as perfectly circular as the 10-inch-diameter shaft of the current bridge. The result is that Ricard’s design doesn’t make much practical sense. It’s much more difficult and expensive to make a perfectly circular steel ring that’s 50 feet in diameter compared to one that’s only 10 inches. Hence the latest difficulty in China. If the cost to the City of the problems in China can be summed up as “at least $1 million,” as Huggett put it, then what actually accounts for the ballooning of the City’s “fixed-price” with PCL? The answer to this question is complex, but once again it works back to Ricard’s design. Ricard’s design was considered too risky to build—in terms of cost— by PCL’s two competitors for the construction contract, Kiewit and Walsh. Both rejected it outright in their bids and suggested more conventional designs. PCL, however, based its bid on Ricard’s wacky 50-foot-diameter shaft and got the job. Now here’s an all-important fact to remember about this project: The dissenting opinions of Kiewit and Walsh were never shared by senior City staff with City councillors. Why was this vital information withheld from them? When Councillor Ben Isitt asked, at an open council meeting in September, 2012, why councillors couldn’t be shown the contents of the three bids, including the critical design reviews that were a requirement of the bids, City Solicitor Tom Zworski wouldn’t even allow his explanation to Isitt to be heard in public. So we don’t know what Zworski’s reasoning was and Isitt isn’t allowed to tell us. All we know is that Zworski successfully thwarted councillors—and the public—from learning about Kiewit’s and Walsh’s concerns until Focus obtained the bids two years later through an FOI request. By then it was too late and the project was already stumbling over Ricard’s impractical concept. Not only were councillors kept in the dark about the engineering concerns with Ricard’s design, they weren’t told that Kiewit and Walsh had submitted significantly higher bids (for simpler bridges) than PCL did, which should have been a warning sign that PCL’s low bid price might not hold up as the project proceeded. But Zworski’s move to isolate councillors from vital information about the project kept them from knowing this and that kept them from making an informed decision. This sealed the project’s fate. The key moment for the project, at which the council’s lack of information fully asserted itself, occurred on December 31, 2012, at a closed council meeting at which councillors were shown the PCL contract and asked to approve it before they left. It was New Year’s eve. PCL’s low starting price in the contract presented to councillors was, in fact, the highest price the councillors had been willing to approve. Their approval of an increase in the project budget from $77 million to $93 million 10 months earlier had been made on the condition that the cost would not go up “a single penny more,” as Councillor Marianne Alto put it back then. This condition was to be implemented through a “fixed-price” contract. 
On that New Year’s eve, with the PCL contract in front of them and needing only their approval, councillors were given the impression that the contract with PCL was essentially “fixed-price” in nature, even though the term “fixed-price” doesn’t appear in the contract and the contract provides for change orders and increased costs. Again, councillors weren’t told about Kiewit’s and Walsh’s rejection of the design and they were told nothing about the companies’ significantly higher cost estimates for more conventional bridge designs. On very incomplete information, and under pressure to say “yes,” all of the councillors except Lisa Helps and Ben Isitt voted to approve the PCL contract. An eventual ballooning of costs was a certainty. Here’s why: PCL based its bid on a modified version of a barely-developed version of Ricard’s design that had been provided by the City’s project manager, MMM Group. Unfortunately for taxpayers, MMM’s design turned out to be little more than a sketch on a napkin. As PCL’s altered version of MMM’s preliminary design was re-engineered in preparation for construction in the real world, it changed. PCL knew this could happen. They had prepared for that possibility by negotiating a contract that put all of the financial risk for both material changes to the design and delays in delivering the design squarely on the City’s shoulders. One of the first significant changes to the design involved the need for perfectly circular 50-foot-diameter steel rings. Rather than accomplishing that by using precision-milled bearing surfaces on the rings—too champagne-y for the City’s beer budget—engineers had to rely on the untried concept of pumping 4000 gallons of epoxy grout between the rings and a series of small “support segments” to create a more circular bearing surface for the 50-foot-diameter rings. To this day those engineers have been unable, or unwilling, to identify a single moveable bridge that uses epoxy grout in such prodigious quantities, and so the long-term viability of the design is in question. The grout could fail long before the Chinese welding does. In any case, that and other changes to PCL’s bid design, all changes dictated by MMM which was responsible for engineering the design through to construction, have allowed PCL to change its price for building the bridge—considerably. The City made public the first big change order request from PCL in April 2014. This was $9.5 million for “design delay” and “increases to the scope of the project.” (The net amount of the change order was $7.9 million because PCL offered to reduce its request by $1.6 million if the City agreed to thinner highway deck steel.) Since then the City has refused to provide any details about subsequent change orders. Focus has learned that there have been at least two additional change orders, but attempts to obtain details of these through FOI requests have been rebuffed by the City. In early September, 2015, Focus filed an FOI with the City for the “Issued For Construction” (IFC) drawings that would show the final design of the bascule leaf (the section of the bridge that lifts), the main support pier, and the machinery that will be used to lift the bascule leaf. Photos of the anchor bolts in the bascule pier suggest that significant additions to the design of the bridge’s lifting machinery have been made. That would support PCL’s claim of “increases to the scope” in its first $9.5 million change order. 
It would also add to the evidence accumulated since mid-2012 that the bridge’s experimental design had only been minimally engineered by MMM during its first four years as the City’s project manager. Those IFC drawings will be critical in sorting out conflicting claims for more money, whether through mediation or in court. The City’s response to our FOI request reflects the chaos in which the project now finds itself. Although the IFC drawings are a strict requirement of the City’s contract with PCL, and the drawings should be in the custody and control of the City, it told us the drawings couldn’t be found. The City was unsure if they even existed. Huggett’s willingness to shift blame away from City officials to faceless Chinese welders is, no doubt, a relief to the City officials who hired him to find a way through what Isitt calls “this disaster.” But the project appears to have enough serious problems right here in Victoria without having to go to China. If even the most basic project documents can’t be found, then the City doesn’t have much hope of prevailing in any legal process. It is, of course, possible that the City does have the documents we requested but is willing to break BC information access law in order to hide the true nature of the mess it has got City taxpayers into and thus avoid accountability. We have an example of that, too. Back in July, Huggett told councillors that fendering on the north side of the new bridge would cost millions of dollars that weren’t included in the City’s agreement with PCL. At the time, Isitt asked Huggett, “Could you remind us why the fendering isn’t included in the scope of the contract with PCL?” Huggett told the councillor that the north side fendering had been “clouded out” in a contract drawing, indicating that north side fendering was not included in the agreement. Following that meeting, Focus filed an FOI for the “clouded out” contract drawing Huggett had referenced. Several weeks beyond the legal deadline for the City to respond, and only after serial prompting by Focus, the City’s FOI office told us that it couldn’t find Huggett’s drawing. When we asked that office, repeatedly, if it had asked Huggett for the drawing, it didn’t respond. After we sent a written complaint to Mayor Helps and councillors, the FOI office sent us a letter making it clear that there was no “clouded out” drawing of the fendering that was part of the contract with PCL, raising serious questions about Huggett’s version of the issue. Recently, Huggett told media the north side fendering would cost “upwards of $4 million.” That’s about four times the cost that he’s attributed to problems with Chinese steel and welding. With the bridge project now looking like it has a realistic chance of topping $135 million, councillors might want to consider whether Huggett should be spending so much time on China. David Broadland is the publisher of Focus Magazine.
  16. November 2015 Was the surveillance software installed on the newly-elected mayor’s computer by Saanich staff a case of tit for tat? LATE LAST MAY I received an interesting phone call from Dr Gerald Graham. Graham had made a presentation to an August 14, 2013 CRD Board meeting at which an extraordinary incident had occurred minutes before he spoke. When Graham phoned, he told me he had filed an FOI for whatever investigation of the incident had been undertaken by the CRD. He told me there was no doubt at the CRD about who was responsible for the incident and that the FOI records he obtained showed this. When I asked if he would share those records he was non-committal. In the end he didn’t share them. I’ll come back to Graham and draw a connection to the infamous installation of surveillance software on Saanich Mayor Richard Atwell’s computer, but first let me tell you about what happened at that 2013 CRD Board meeting. The matter being discussed was the CRD’s proposed $783-million sewage treatment plan. Twenty-one individuals had pre-registered to address the board on the merits of a motion by Saanich Councillor Vic Derman. Derman was proposing that the CRD “Initiate an extensive, independent review of the current [McLoughlin Point] project.” His motion set out specific objectives for that review. The very first presentation on the schedule of speakers was a video by East Sooke fisherman and diver Allan Crow, a proponent of sewage treatment. Crow’s video had been previously uploaded by CRD staff to a laptop used to include visual presentations from the public at such meetings. At that point in the meeting the presentations of all the participants who were going to use the overhead projection system had been loaded onto the laptop. Crow started his video. The CRD’s minutes for the meeting provide a brief outline of what happened: “During the presentation it became apparent that the video had been tampered with. The Chair asked that Mr Crow return at the end of the delegation list to play an original version of the video.” Times Colonist reporter Rob Shaw’s account of the incident was more fulsome: “A local diver tried to play a video for the board of underwater conditions near a sewage outfall. But unknown opponents secretly altered his file on the CRD computer, so the words ‘misleading’ were superimposed on the video.” Shaw went on to observe: “CRD chairman Alastair Bryson stopped the presentation and asked the perpetrator to step forward. But no one did.” Gerald Graham was scheduled to speak immediately after Crow’s aborted video and then his presentation was followed by 20 others, including Crow’s unaltered video at the end. In between, among the speakers who used the CRD’s laptop to provide a visual component to their presentation, was Richard Atwell. At the time, Atwell was a community activist well known for doggedly critiquing the CRD’s every move on its sewage treatment plan. Let me go back to Graham’s phone call. As I mentioned above, Graham told me he had filed an FOI for any investigation conducted by the CRD into the incident. Graham said the CRD knew who had tampered with the video. He volunteered this information after obliquely referring to the Saanich spyware stories I had written. I was intrigued. Was the incident involving Crow’s video somehow linked to the installation of spyware on the newly-elected mayor’s computer? I eventually filed my own FOI with the CRD for the records Graham had received. 
What the CRD released included an email sent on August 15, 2013 to several CRD staff that described what they believed had happened: “We suspect the person downloaded the video from Youtube ahead of time (they knew the video was there), made the edits, and then deleted the version we had on our laptop and replaced it with their version. Not nice.” The records also show that, two weeks later, a second CRD employee stated, “…the delegation simply came up to the laptop and did what he wanted to under the guise of getting ready, even though his presentation had already been placed on the laptop.” It should be mentioned that the CRD emails do not name who “he” was, but with careful consideration of the short list of people who made presentations involving the CRD’s laptop at that meeting, and knowing a little about each of those people, it would be challenging not to come to the conclusion that “he” was Richard Atwell. Given the context of Graham’s phone call to me, it was evident that Graham himself had come to that conclusion. How many other people believed that Atwell had tampered with Crow’s presentation? The email records show that upwards of 13 CRD employees were made aware of the details of the CRD’s investigation into the incident. I recently spoke with Mayor Atwell and told him about the CRD’s investigation and Graham’s FOI. I asked him if he’d switched Crow’s video files at that 2013 CRD board meeting. Atwell was unwilling to either confirm or deny that he was the person who made the switch. The records provided to me by the CRD also show that on the day after the video incident, Saanich Councillor Judy Brownoff asked CRD staff about what steps they would be taking to secure the presentation laptop. It’s not hard to imagine that other CRD directors made similar inquiries and that with so many CRD staff aware of the details of the investigation, Richard Atwell, citizen activist, had quickly gained a level of notoriety amongst local government civil servants and politicians as—to use Rob Shaw’s words—“the perpetrator.” Just 15 months after the video incident, Atwell was—to the astonishment of many—elected mayor of Saanich. Within six days of that election, employee monitoring software had been installed on the mayor-elect’s designated computer, ready to record every single keystroke he made. As well, his computer was configured to prevent him from accessing the District’s corporate intranet. On top of that, access to the departmental drives that were formerly available to Mayor Frank Leonard was denied to Mayor Atwell. Why did Saanich staff feel such an urgent need to isolate, confine and monitor the new mayor’s computer activity even before he’d spent a minute in office? Had they been warned about Atwell’s suspected involvement in the video tampering incident? The official answer to that question came after the spyware had been outed. The District’s Director of Corporate Services Laura Ciarniello was asked by BC’s Information and Privacy Commissioner Elizabeth Denham why the software had been installed. Denham reported: “According to Ciarniello, the motivation for this renewed focus on IT security was the perception by District Directors that the new mayor was experienced in the area of IT and would be able to identify and criticize current weaknesses in the District’s IT security.” But that rationale has been limp from the beginning. 
It requires a suspension of common sense to believe that such a hostile initiative—secretly installing spyware on the newly-elected mayor’s computer—was put in place to avoid criticism. On the other hand, with rumours about Atwell’s involvement in the 2013 video incident circulating from CRD staff to Saanich politicians and then to Saanich bureaucrats—well it’s not so difficult to understand that the real motivation could have been the fear that Atwell might exploit those “current weaknesses.” This theory is lent credence by the report of “Whistleblower,” a Saanich IT division employee involved in installing the surveillance software. Concerned about the unethical nature of such covert surveillance, he wrote down his recollection of what he’d been told by Saanich’s Assistant Manager of IT John Proc: “John Proc came to us…with a directive that had just come down to IT in regards to installing monitoring software on the mayor’s computer. He said, ‘They are nervous about the new mayor. We’re installing it on the directors’ computers as well to make it [look like] it is not targeted’…” After repeatedly asking his managers if the mayor had been informed about the spyware, and not receiving an affirmative response, Whistleblower took his concerns—and that recollection—to a former colleague, who then contacted Atwell. On January 12, 2015, the mayor announced at a press conference that, among other things, he was being spied on by his own staff. Atwell’s claim was immediately countered by a press release issued by Saanich councillors on January 13, which stated: “This installation was in response to the conclusions of a May 2014 independent, external audit of the District of Saanich computer system. Recommendations from the May 2014 audit included the installation of security software.” It was later revealed by Denham, however, that the security audit’s author “did not make any such recommendation nor did he intend to make any recommendation that could be interpreted to recommend the installation of monitoring software such as Spector 360.” Indeed, Denham’s investigation concluded that the installation of the monitoring software had likely lowered the security of the District’s computer network. All this raises questions about transparency on everyone’s part, including Atwell, but it also raises the question of whether any elected Saanich official played a direct, supportive role in Ciarniello’s decision to install spyware on the incoming mayor’s computer. If so, that would have represented the kind of politicization of a civil servant’s role that was, by all accounts, common in East Germany in 1984 but which most people, one would hope, would agree has no place at all in Saanich. David Broadland is the publisher of Focus Magazine.
  17. September 2015 The Johnson Street Bridge project director says the new bridge will be “somewhat less robust” than the existing bridge. Why? CITY OF VICTORIA TAXPAYERS are now facing a price tag of $130 million for the new Johnson Street Bridge project. That’s a tripling of the $35-40 million cost put on the project in 2009 when councillors first voted to build a new bridge instead of repairing the one city residents already owned. It’s more than double the $63 million that citizens were told a new bridge would cost when the City forced them, in the middle of winter, to counter-petition for a referendum on the project. It’s also $53 million above the price former City Manager Gail Stephens had in mind when she claimed the project was “on time and within budget” shortly before the 2011 civic election. And it’s almost $40 million above what “Fixed-Price” Fortin campaigned on just last November. That cost escalation is difficult for most people to understand, but the price tag is only one indicator of the whirlpool of confusion gripping the project. Consider this: Building a new bridge was initially justified on the basis of the existing bridge’s seismic vulnerability. Sure, there were other advantages touted for a new bridge, but the first and most compelling public interest rationale offered was that the existing bridge would collapse in a significant earthquake. Building a new bridge with a high level of seismic protection became the primary objective. The project has now generated enough data about itself—what the chosen objectives were and whether those will be included in the structure that’s been built—that we should have a clear picture of whether that seismic protection objective has been met. But do we?

Project director fails to explain document

Last March we published the story “Engineers ignored their own recommendation, a Council vote and a referendum” that considered the implications of a document created by the City’s project manager MMM Group in August 2012: Johnson Street Bridge Seismic Design Criteria. That document established the allowable physical outcomes for the new structure following earthquakes of different strengths. But those outcomes were very different from what was recommended to the City by MMM engineer Joost Meyboom in June 2010. At that time, Meyboom recommended that a new bridge “be designed for an M8.5 earthquake.” He told councillors, “If you’re going to spend $100 million on a facility, the premium to pay for a very high seismic performance is a relatively low price for insurance.” Meyboom put that premium at $10 million and characterized this level of performance as “Lifeline.” Yet MMM’s Seismic Design Criteria didn’t contain any provision at all for the outcome expected following an M8.5 earthquake, or, in the language used in the document, a 2500-year event. Moreover, the document stated that following a 1000-year event—a significantly less energetic earthquake—the allowable outcome was “possible permanent loss of service.” That wording suggests the bridge could be unrepairable. The document said nothing on the question of whether the bridge would be available for emergency services following that 1000-year event. In other words, MMM’s Seismic Design Criteria allowed a significantly lower level of seismic protection than Meyboom had recommended—and had apparently built into MMM’s cost estimates for the project. City council wasn’t consulted about this change in the project’s scope. 
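A note on the return-period language used in these documents may help here. A “2500-year event” is not an earthquake expected once every 2500 years in any everyday sense; it is a level of shaking with a small probability of being exceeded during a structure’s life. The snippet below is a generic illustration only, using the standard Poisson assumption found in design codes and the conventional 50-year exposure window, rather than anything taken from MMM’s documents. It shows why engineers describe the 2500-year motion as roughly a “2 percent in 50 years” earthquake and the 475-year motion as roughly “10 percent in 50 years.”

```python
# Generic illustration of how a "return period" translates into the chance of
# experiencing that level of shaking over a structure's life. This assumes the
# standard Poisson (memoryless) model used in seismic design codes; none of the
# numbers below are taken from MMM's documents.
import math

def exceedance_probability(return_period_years: float, exposure_years: float = 50) -> float:
    """Probability that shaking with the given return period is met or
    exceeded at least once during the exposure time."""
    return 1 - math.exp(-exposure_years / return_period_years)

for T in (475, 1000, 2500):
    p = exceedance_probability(T)
    print(f"{T}-year event: about {100 * p:.0f}% chance of being exceeded in 50 years")

# Prints roughly 10%, 5% and 2%. The gap between designing for a 1000-year
# event and a 2500-year event is the gap between about a 5 percent and a
# 2 percent chance of that shaking arriving within 50 years.
```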
Following our story’s publication, City councillors asked Project Director Jonathan Huggett to look into the implications of MMM’s Seismic Design Criteria. He returned to council on May 7 with a written report summarizing the seismic design of the bridge. His short report stated: “It was confirmed in writing by MMM and its subcontractor Hardesty & Hanover that the final design is based on the most comprehensive, onerous and relevant design requirements for bascule bridges in North America.” Furthermore, Huggett reported to councillors: “The new Johnson Street Bridge has been designed as a ‘Critical Bridge’—the equivalent definition of ‘Lifeline Bridge,’ which is the performance required by the City. The design of the new bridge will allow the bridge to be available to all traffic after a design earthquake of a [1000-year] return period. The bridge is expected to ‘be usable by emergency vehicles and for security/defense purposes immediately after a large subduction earthquake, e.g. a 2500-year return period event.’” Following his report, Focus filed an FOI for the confirmation “in writing” Huggett had obtained from Hardesty & Hanover (H&H), and we requested the source of the statement he had quoted in his report, specifically, that the bridge would “be usable by emergency vehicles and for security/defense purposes immediately after a large subduction earthquake, e.g. a 2500-year return period event.” We also asked for all the communications between Huggett, MMM and H&H on this issue. The email record shows Huggett scrambled for an explanation and couldn’t find one. At one point, in response to Huggett’s appeal for information on the seismic capacity of the bridge’s lifting mechanism, a senior MMM employee referred Huggett to a “briefing provided by H&H.” Huggett immediately wrote back to MMM stating: “Just to be clear—that briefing note was prepared by me last August…” [emphasis added] What the records released to us show is that H&H didn’t provide Huggett with the written confirmation he claimed, and they didn’t deny they had used MMM’s lower Seismic Design Criteria to design the bridge. H&H’s Keith Griesing wrote to Huggett and stated, “I think MMM would have to address the history and the decisions that were made to set the direction of the project. I don’t want to offer an opinion on matters that we were not involved with since it may lead to further confusion.” Worse, the City could find no record to support Huggett’s claim—which he had put in quotes as if to signify that he was quoting seismic experts at either H&H or MMM—that the bridge would “be usable by emergency vehicles and for security/defense purposes immediately after a large subduction earthquake, e.g. a 2500-year return.” It appears Huggett simply lifted a paragraph from a bridge design code and then added the word “subduction.” Records of Huggett’s billings to the City for his first year show he was paid $177,605. The report Huggett provided to councillors, then, was very expensive misinformation. Huggett hadn’t received written confirmation from H&H, and his conflation of a subduction event with a 2500-year event was pure fiction. In Victoria, the impact of a Cascadia subduction zone event would be minor compared to that of the 2500-year event for which Meyboom recommended the bridge be designed. Don’t take my word on this. Here’s what MMM said in its Project Definition Report: “Given the location of the bridge, the Cascadia Subduction Earthquake was also considered as an important event. 
A comparison of site specific response spectra, however, showed that the spectral acceleration for the Cascadia event are lower than the 1 in 475 earthquake and this is therefore not a critical design consideration.” In other words, in Victoria, the seismic threat posed by the Cascadia subduction zone is not a critical design consideration, at least when it comes to constructing new bridges. Its impact here, according to MMM’s analysis, would be less than an M6.5 earthquake. So what kind of earthquake is “a critical design consideration”? Sharlie Huffman, when I spoke with her a couple of years ago, was the Province’s Bridge Seismic Engineer. She identified shallow crustal earthquakes, like the M7.3 earthquake that occurred near Campbell River in 1946, as being particularly concerning. Could such an earthquake occur near Victoria? Yes. According to the scientists of Natural Resources Canada, the peak ground acceleration predicted for the Johnson Street Bridge site in a 2500-year return period event is 0.607g. That value of peak ground acceleration is similar to that measured in the vicinity of earthquakes having magnitudes in the range of M8.5. As Meyboom told City councillors back in 2010, Victoria has the highest level of seismic risk of any city in Canada. MMM’s seismic engineer, Jianping Jiang, provided Huggett, in writing, with his understanding of the bridge’s expected seismic performance at M8.5. Jiang told Huggett: “With respect to the bridge performance after a 2500-year return period seismic event, we wish to clarify that the 1:2500 year event is not part of the seismic design criteria specified in the JSB 2012 Project Definition Report and was not analyzed in the design.” Why, then, did Huggett report to councillors that the bridge would be available to emergency vehicles following a 2500-year event? We asked Dwayne Kalynchuk, the previous project director, whether MMM’s Seismic Design Criteria had been used in the design and construction of the bridge. Here’s the statement Kalynchuk gave Focus, in writing: “H&H Consultants are the Engineers of Record for the bascule design. They confirmed that the standards that are reflected in the August [17th, 2012] Seismic Design Criteria are still current and are incorporated in the final bascule design which is now in the process of construction.” Focus also asked the City for records that showed when and why MMM’s Seismic Design Criteria had been developed. The records provided show the new criteria were developed in the midst of the RFP process, after the City had learned that “indicative price” submissions from all three companies bidding to build the bridge were higher than the City’s affordability limit. MMM’s Seismic Design Criteria, which provided a lower level of seismic protection than Meyboom had recommended, would have allowed for a reduction in construction costs. The document was officially added to the RFP process on August 24, 2012 after an overnight consideration of its impact by Kalynchuk. Records obtained show that on December 12, 2012, as the City was trying to finalize a contract with PCL, MMM’s Meyboom prepared a list of urgent actions that needed to be undertaken “to finalize contract discussions.” At the top of Meyboom’s to-do list was “a letter from H&H stating that the design as developed during the bid with PCL is feasible from a seismic performance point of view (bridge needs to be Lifeline).” Focus filed an FOI request for that letter; the City determined that the letter was never written. 
The Seismic Design Criteria document was listed in the PCL contract as a “regulatory document,” which, the contract states, “forms part of the contract.” All of the records we have obtained are consistent with our original story’s contention that MMM’s Seismic Design Criteria were used by H&H to engineer the bridge, and that the bridge’s ability to withstand an earthquake is much reduced compared with what was originally recommended by MMM. If the City can provide hard evidence that’s not true, it should produce it. Hard evidence would include a complete explanation of why MMM’s Seismic Design Criteria are part of PCL’s contract.

New bridge “somewhat less robust” than existing bridge

If the new Johnson Street Bridge isn’t getting the full measure of seismic protection the experts said was needed, how has the project done on other objectives? One of those goals was a wider navigational channel. Way back, the project intended to expand the distance between the new bridge’s piers by 8 metres compared with the existing bridge, thus reducing the risk that passing barges and other vessels would collide with the bridge. That improvement was effectively eliminated—to reduce project costs—in 2011. But that saving is now being offset by the cost of more substantial fendering—the bridge bumpers that would cushion a blow from a passing vessel. At a meeting on July 16, Huggett told councillors that fendering for the north side of the bridge would add $3 million—more or less—to the cost of the bridge. Councillor Ben Isitt, who was in the room when the details of the PCL contract were supposedly laid out for councillors before they approved it back in December 2012, asked Huggett, “Could you remind us why the fendering isn’t included in the scope of the contract with PCL?” Huggett offered a complicated explanation involving a contract drawing that Isitt apparently hadn’t seen. My review of the contract’s details around fendering doesn’t support Huggett’s claim; the risk of additional fendering costs seems to have been covered in the contract’s list of allocated contingencies, and limited to $462,500. Design, although incomplete, was to be covered by PCL. It’s in the contract. MMM’s own estimate of the total cost for fendering in the Project Definition Report was $1.3 million. Huggett’s prediction of an additional $3 million would mean MMM’s estimate was off by a factor of three. Hopefully Isitt will recall whether or not he was shown Huggett’s mysterious drawing and, if he wasn’t, try to save taxpayers $3 million. In any case, Huggett explained why fendering was so vital: “The new bridge is somewhat less robust than the existing structure,” he told councillors (emphasis added). “The last thing I need is a barge to hit the rest pier and knock it two inches out of alignment. For one, I don’t know how I’d get it back again having knocked it out of alignment and then I’m faced with an inoperable bridge. You’ve got a $100 million invested in the water here and I’ve got to protect it.” The news that Huggett’s bridge will be “somewhat less robust” than the existing bridge ought to have come as a shock to councillors. After all, wasn’t the robustness of the bridge—its capacity to absorb the energy of a suddenly applied force without permanent damage—the very reason why the project had been undertaken in the first place? That capacity to absorb energy is the very same characteristic required to withstand an earthquake. 
Huggett was now telling councillors that his bridge had less capacity to absorb a blow than the existing bridge. If any councillors comprehended the disconnect between what Huggett told them back in May about the bridge’s seismic capacity and what he was telling them now, they kept it well hidden. What some of the councillors did seem to comprehend, though, is the way in which the escalating price is a measure of the project’s fundamental lack of integrity. First-term Councillor Jeremy Loveday complained, “I feel handcuffed by past decisions and bad contracts and contingencies that are too small. As a member of the public I feel that I was misled by politicians at the civic level.” Those politicians, the records show, were misled by City staff, who, in turn, were misled by MMM Group. Take Loveday’s concern about the small contingency, for example. At the time councillors were being asked to approve a contract with PCL they were told by City staff that the four percent contingency included in the contract had been recommended by MMM. Once the project started to go off the rails, though, MMM argued that PCL should have included a 40 percent contingency. Both must have known from the start the cost would escalate dramatically. For whatever reason, neither warned the City. MMM has been the City’s project manager since the summer of 2009 and in 2010 it told councillors that “project management and engineering” should cost 12 percent of the bridge’s construction cost. On a construction cost then estimated at $65 million, MMM said its fee would be $7.7 million. But according to documents obtained by Focus through FOI, the City has already paid MMM close to $15 million. The $1.842 million Huggett obtained for MMM from councillors on July 16, along with $2.4 million the company has previously claimed, will bring their take to $19 million—or 90 percent of the $21 million federal grant. Shortly before Councillor Geoff Young voted to give MMM more money, he spoke at length about what could be learned from this project: “I don’t think it’s helpful to reflect overmuch at this stage on the project or where it will come out or its degree of success. I think it may be worthwhile to draw lessons from the project because, indeed, we are now embarked on another major project at the regional level. I think it’s worthwhile thinking about where we’re going with that one.” Young’s unwillingness to “reflect overmuch” on the project is understandable. Politicians who make bad decisions, or even those who fail to persuade their colleagues from making a bad decision, have a natural preference for forgetting. He’s right, in one way, though. It’s not the politicians who have been sitting around the table while these bad decisions were made that should now be doing the reflecting on what lessons should be learned. Judging by what councillors Pam Madoff, Chris Coleman and Charlayne Thornton Joe said at the July 16 meeting, they are oblivious to the dimensions of the disaster they have facilitated, starting with the moment back in 2009 when they all voted to replace the existing bridge. When they made that decision, there wasn’t a single aspect of the project that had been sensibly or accurately evaluated. Each bad decision they made afterward was just piling dead weight onto a poorly constructed foundation. 
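For readers who want to check the fee arithmetic reported above, the tally is simple. The figures in the snippet below are the ones cited in this story, with “close to $15 million” rounded to $15 million, so the result is approximate:

```python
# Rough tally of MMM's project management and engineering fees, using the
# figures reported in this story. "Close to $15 million" is rounded to
# $15 million, so the totals are approximate.
original_estimate = 0.12 * 65.0   # 12 percent of a $65-million construction cost, in $ millions
already_paid = 15.0               # $ millions paid to date, per documents obtained through FOI
approved_july_16 = 1.842          # $ millions approved by councillors on July 16
previously_claimed = 2.4          # $ millions previously claimed by MMM

projected_total = already_paid + approved_july_16 + previously_claimed
federal_grant = 21.0              # $ millions

print(f"MMM's original fee estimate: ${original_estimate:.1f} million")
print(f"Projected total to MMM: ${projected_total:.1f} million")
print(f"Share of the federal grant: {100 * projected_total / federal_grant:.0f} percent")
# Roughly $7.8 million, $19.2 million, and a little over 90 percent of the
# $21-million federal grant.
```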
In my life as a designer and builder—of physical structures and machinery—I learned early on that as soon as I realized I had made a measurement error on a job, I had to go back and fix it, or the error could multiply into an even bigger problem. In politics, unfortunately, admitting to having made an error of judgment is rare. In the case of the bridge project, there were a number of points along the way where City staff’s failure to accurately assess, measure or understand some fundamental parameter—seismic risk, estimated cost, the risk associated with using an experimental design, the amount set aside for contingencies that might arise, the integrity of the City’s partners—should have been obvious to City councillors and set off alarm bells. But the majority of councillors, ill-advised by highly-paid staff, continued to make one bad judgment after another, and the foundation of the project ended up being based on ignorance and risk instead of knowledge and certainty. As a consequence, the taxpayer is getting a bridge that’s over three times as expensive as originally estimated and “somewhat less robust” than the bridge it will replace.

David Broadland is the publisher of Focus Magazine.
  18. July 2015 By David Broadland and Daniel Palmer News of a secret investigation involving Saanich interim CAO Andy Laidlaw may throw the District into more turmoil.

IN THE SAANICH SPYWARE DEBATE, either you believe that the senior manager who approved the installation of employee monitoring software on newly-elected Mayor Richard Atwell’s computer understood what she was approving, or you believe that a systemic disconnect from BC’s privacy law occurred and no one in particular was to blame. That latter position was all that could be found in a report to Saanich Council on the issue delivered by the District’s interim CAO Andy Laidlaw and made public on June 24. In his introduction to the report Laidlaw noted, “I am acutely aware that my report will be subject to criticism by those who believe it does not confirm their perceptions.” The “perceptions” Laidlaw acknowledged are that there was wrongdoing and that it’s being covered up.

The installation of the spyware was approved by Director of Corporate Services Laura Ciarniello. Laidlaw and Ciarniello worked together at the City of Campbell River. Their working relationship there was described to Focus by former Campbell River Mayor Walter Jakeway as “chummy.” The perception of a cover-up stems from the concern that Laidlaw’s investigation of Ciarniello’s actions would be a whitewash. Laidlaw’s introduction went on to say, “This situation has been infused with ‘politics’ from its origins and I am cognizant that my findings will be subject to that lens.” Laidlaw may have misjudged the “lens” through which his report would be viewed. The widely-held perception of a cover-up in Saanich apparently motivated some citizens to go looking for evidence to support that perception. As a result of one of those fishing expeditions, documents landed in Focus Magazine’s mailbox the day before Laidlaw made his report public. The information they daylight calls into question Saanich Council’s decision to hire Laidlaw in January this year and then appoint him to investigate the spyware question.

Those documents indicate that between November 12 and November 24, 2014, Laidlaw was the subject of an internal investigation by the City of Campbell River. The investigation considered whether Laidlaw’s business association with Jerry Berry Consultants Inc, and another unnamed entity, had constituted a conflict of interest. Jerry Berry was Nanaimo’s City Manager between 1987 and 2009. Berry now describes himself on his website as “a management consultant and educator specializing in local government issues.” Laidlaw worked at the City of Nanaimo between 1980 and 2011.

Campbell River retained private lawyer Richard Grounds to conduct the investigation. Grounds has done work for such organizations as the Civilian Review and Complaints Commission for the RCMP. Grounds’ investigation looked at whether Laidlaw performed paid work for Jerry Berry Consultants Inc, or another entity, while Jerry Berry Consultants or another entity had been engaged to provide services to the City of Campbell River. Additionally, the investigation considered whether Laidlaw had a business relationship with Jerry Berry Consultants Inc or other entities, and may have been paid for services provided by him to Jerry Berry Consultants Inc, or other entities, in circumstances where Laidlaw had a role in awarding or recommending the award of contracts by the City to Jerry Berry Consultants Inc or other entities. 
As well, the investigation considered whether Laidlaw, through his business activities with Jerry Berry Consultants, had disclosed the City’s confidential information to Jerry Berry Consultants Inc or another entity. For confirmation that an investigation had taken place, Focus contacted Jakeway, who agreed to speak on the record. According to Jakeway, who was narrowly defeated in last November’s election, the investigation found that Laidlaw’s activities did constitute a conflict of interest and that he had disclosed the confidential information of the City of Campbell River. Jakeway told Focus this finding was made following the election and was considered by the outgoing council. It decided to leave a decision on whether any action should be taken against Laidlaw to the incoming council. That council, sworn in on December 2, has sat on Grounds’ findings ever since. According to Campbell River’s 2014 statement of financial information, Grounds’ investigation cost taxpayers $25,523. This does not include the cost of a legal opinion by Campbell River’s legal counsel Dean Crawford. Laidlaw was paid $182,455 in remuneration and $6,392 for expenses by the City of Campbell River in 2014. Laidlaw declined to answer questions on this subject. Campbell River Mayor Andy Adams would only confirm that Laidlaw had retired from his position on January 16. This news—and Laidlaw’s unwillingness to answer questions about it—may feed the perception of wrongdoing and cover-up at Saanich.

Ciarniello was, until August 2013, Campbell River’s Director of Corporate Services. She was tasked by Saanich Council on December 8 last year with creating a shortlist of candidates for the CAO position to replace Paul Murray. Was Ciarniello aware of the investigation into Laidlaw and its outcome at the time she put him on that shortlist? In response to that question Ciarniello told Focus, “I am unable to answer your…question as it deals with personal information and if discussed would be subject to in-camera confidentiality.” The possibilities here aren’t endless. If Ciarniello wasn’t aware of the investigation, then Laidlaw didn’t inform her. What would that possibility suggest about Laidlaw’s suitability to conduct an investigation into the perceived malfeasance in the spyware case? If she was aware of the investigation, and informed councillors about it, why did they choose Laidlaw? Did councillors request a reference? Focus contacted Saanich Councillor Colin Plant and described in broad brush strokes what we knew about the investigation in Campbell River. Would Plant have voted to approve Laidlaw if he had known that an investigation in Campbell River had found Laidlaw in a conflict of interest position? “Unlikely,” Plant said. “However, to be fair, I would have needed to know more about the situation before answering that definitively.”

LAIDLAW’S CONTRACT WITH SAANICH is set to run out in August—unless a new CAO hasn’t been found by then. Whether he can survive a storm blowing down from Campbell River until then remains to be seen. In the meantime, will his report bring closure to the spyware issue? Laidlaw’s report referenced Privacy Commissioner Elizabeth Denham’s late-March Investigation Report, which found that Saanich had broken BC privacy law when it installed the spyware. 
In that report Denham stated, “One of the most disappointing findings in my investigation of the District of Saanich’s use of employee monitoring software is the near-complete lack of awareness and understanding of the privacy provisions of BC’s Freedom of Information and Protection of Privacy Act.” Parts of Laidlaw’s report read like an investigation of Denham’s investigation. His statement that “The Commissioner has provided new interpretations applying to security and the collection of personal information” could be read as: “She’s just making this stuff up.”

Current and former employees of Saanich’s IT division have recently brought forward evidence indicating that Atwell was the target of three IT initiatives that either intercepted communications made on his office computer or prevented him from accessing the District’s computer network. The thorny question of whether such actions by public servants are excusable, or should result in some disciplinary action, hasn’t been fully answered. The portion of Laidlaw’s report that addressed that question was attributed to “Brian Simmons, Labour Relations Consultant.” Simmons’ report stated: “Finding no evidence of malfeasance, I find no cause to terminate or discipline any employee based upon those considerations.” How did Simmons come to that conclusion? Focus was unable to get an answer from Simmons because he couldn’t be contacted. We asked Laidlaw for information about Simmons, but he refused to provide any biographical details or any way of contacting him. Invoking a concern for Simmons’ privacy, Laidlaw promised, “I will forward this request to him.” An extensive web search for “Brian Simmons, Labour Relations Consultant” provided no information. Atwell, interviewed for the report by Simmons, told Focus that he was only able to obtain an email address from Simmons—no business card, no telephone number, and no website. Repeated emails by Focus to the email address given to Atwell produced no response.

Similarly, Focus was unable to question Simmons about his statement that “I find that on the balance of probabilities, and considering all of the circumstances, the evidence does not support a claim that the Mayor’s computer was targeted.” How did Simmons arrive at such a conclusion? Did he interview “Whistle Blower,” whose story was published in the last two editions of Focus? It’s our understanding that Saanich threatened Whistle Blower with legal action unless he signed an agreement that limited his right to self-expression about the issue. Whistle Blower has said that the District’s Assistant IT Manager John Proc told him on November 20: “They are nervous about the new mayor. We’re installing it on the directors’ computers as well to make it [look like] it is not targeted.” With Simmons secreted away, the question of whether the actions of the employees involved in the spyware issue are excusable remains unanswered; the basis for his findings appears to be limited, and it can’t be examined or questioned.

Saanich resident Karen Harper, a vocal critic of the way the issue has been handled by the District, is a retired Chief Information Officer of BC Pension Corporation. Following the release of Simmons’ and Laidlaw’s reports she told Focus, “I can state with utter certainty that had I authorized the installation of spyware on our system, I would have been fired for cause. It would not have mattered whether I did so out of incompetence or if I had some other motivation. I would have been gone. Why? 
I would have brought disrepute onto the organization to a degree that cannot be ignored.” Harper called the reports a “whitewash,” and said, “As a concerned Saanich resident and taxpayer, I believe that a public inquiry is still needed—by truly independent persons—in order to restore any faith in the bureaucracy and council.” CIARNIELLO HAS BEEN AT THE CENTRE of the spyware story from day one and the details of that involvement are now well-documented. On November 17, 2014, just two days after Atwell was elected mayor, Ciarniello and the District's Manager of IT Forrest Kvemshagen met, they have said, to discuss the recommendations of a May 2014 audit of the District’s IT security system by Wordsworth and Associates. But Ciarniello and Kvemshagen have not provided any written evidence that the plan they developed was motivated by the security audit. Two days later, a meeting took place between high-level staff that included Ciarniello, then-CAO Paul Murray, Fire Chief Mike Burgess, Legislative Services Director Carrie MacPhee (also responsible for privacy compliance), Planning Director Sharon Hvozdanski, Parks and Recreation Director Doug Henderson and Finance Director Valla Tinney. According to the District, no minutes were recorded at this meeting. Denham reported, however, that a decision was made to install “protection and monitoring software” on the workstations of everyone at the meeting (CAO, directors, fire chief), on workstations used by two executive assistants, on computers used by councillors and on Mayor Atwell’s computer. Ciarniello and Kvemshagen told Denham this decision was made in response to the security audit. By December 2, Spector 360 software had been purchased, downloaded and silently deployed on 13 computers, including that of the incoming mayor. The IT technicians doing the work were given explicit instructions to enable the most privacy-intrusive features of the software, which included logging every keystroke made by Atwell and frequent automated screen shots. At the same time, a log that would have recorded when managers viewed the information collected by the software was left disabled. If someone did sift through Atwell’s private information, it can’t be proved. Before activating the software, Kvemshagen emailed Ciarniello: “In order to ensure there is appropriate authorization in place for this work, please reply to this email stating your approval.” Ciarniello replied, “I approve of this program and the machines it has been installed on.” There is no question, then, that Ciarniello approved the installation. The question that remains unanswered, by anyone at Saanich, is this: Did Ciarniello know she had approved the installation of spyware? In the investigation undertaken by Denham, Assistant IT Manager John Proc told investigators that he understood the software agreed on at that November 19 meeting was meant to have “forensic auditing capability… and ability to determine whether user accounts were accessing areas which they were not supposed to be accessing.” A January 14 press release from Saanich, however, used very different language to characterize Spector 360, describing the software as a means of monitoring “internal activity that may result from external threats,” and assisting Saanich by deterring theft, potential leaks of data and by “protecting high profile users.” This claim was dismantled by Denham’s report as well as several IT experts. 
Denham also noted that the author of the May 2014 security audit “confirmed that he did not make any such recommendation nor did he intend to make any recommendation that could be interpreted to recommend the installation of monitoring software such as Spector 360.” The claim that the software was a security fix was also challenged by former Saanich IT manager Jon Woodland. “You don’t buy a system like that to protect a network; you buy it to investigate someone or their activities,” said Woodland, who is now IT manager at the Township of Esquimalt. Woodland alerted Atwell to the spyware in December after speaking to former colleagues who were worried the process “was being rushed in.” “Any of the colleagues I’ve talked to have been shocked, as I was, that the municipality would install this type of software,” Woodland said. That the intention of installing the software wasn’t security, but spying, is also supported more directly by the testimony of the former Saanich IT department analyst, “Whistle Blower,” whom we mentioned above. His statement is powerful and bears repeating. He stated that Assistant IT Manager John Proc told him on November 20: “They are nervous about the new mayor. We’re installing it on the directors’ computers as well to make it [look like] it is not targeted.” There is reason to believe, then, that Ciarniello and Kvemshagen both knew—or should have known—they were setting up a system for spying on Atwell, not securing the network from external threats. Contacted for her side of the story, Ciarniello told Focus that she rejects the allegation made by Whistle Blower. From December 3 until January 21, Spector 360 was actively collecting all data from Atwell’s computer and from 12 other workstations. On January 12, of course, the affair spilled out into the open when Atwell announced at a press conference that he was being spied on by Saanich staff.

Focus has previously reported on additional records that show the installation of Spector 360 on Atwell’s computer was only one of three actions Saanich’s IT department was ordered to undertake that were aimed directly at the new mayor. Those actions included setting up Atwell’s computer without shared drive access, a privilege enjoyed by his predecessor Frank Leonard, and configuring Atwell’s computer to redirect to the public internet if he attempted to access the central corporate intranet—the heart of information exchange between Saanich employees. Focus asked Ciarniello if she had authorized the configuration of the new mayor’s computer so that he was unable to access the same departmental shared drives to which Mayor Frank Leonard had access. She said, “Mayor Atwell’s computer has been configured in the same manner as Mayor Leonard’s.” On this question Laidlaw told Focus, “Mayor Atwell has the same access as the previous mayor.” Whatever the case, what is the implication of Ciarniello’s decision that the mayor could be monitored without his consent? For one thing, it suggests she believed an elected mayor is subservient to bureaucratic managers, that a mayor is an “employee” subject to the same conditions under which rank-and-file District employees are required to operate. This is evident in Ciarniello’s subsequent justifications about the program of employee monitoring she approved. 
For example, in a January 12 press release, in an effort to prove that Atwell had been informed that his computer was under surveillance, Ciarniello stated, “Prior to being permitted access to the Saanich corporate computer network, employees are required to sign a Network Access Terms and Conditions form.” Atwell has consistently said he was never given the form. Councillor Colin Plant has confirmed that he, too, wasn’t given the form. So Ciarniello’s claim that Atwell was warned his computer would be under surveillance is unsettled. But since everyone agrees that Atwell didn’t sign the form, by logical deduction there is agreement by everyone, including Ciarniello, that Atwell didn’t consent to having his communications intercepted. Yet intercepted they were. PROFESSOR DAVID SIEGAL, an expert in Local Government and Public Policy and Administration at Brock University, said the relationship between staff and council is clear: staff works for council and not the other way around. “The actions of installing the spyware without telling people it had been installed is the sort of thing where mayor and council could discipline somebody. It’s almost the kind of thing where you could fire somebody with cause,” Siegal said. “This seems to be a pretty serious matter. I guess what it indicates is a complete breakdown in trust between staff and council … but it’s staff who leave, not councillors or the mayor.” Siegal said the designation of the mayor as “Chief Executive Officer” in the Community Charter is little more than a title, but he acknowledged that the mayor does have some exclusive powers, like the ability to suspend municipal staff. If Atwell did suspend an employee, it would then trigger a review by council to reinstate the employee, confirm or extend the suspension or fire the person in question. “The CAO or senior staff being disciplined is the kind of thing a mayor and council collectively, not the mayor by himself, can ultimately do,” said Siegal. If Ciarniello’s actions regarding the Spector 360 installation represented a “complete breakdown in trust,” as Siegal speculates, there’s been no explanation of how such a deterioration could have occurred. Atwell hadn’t even been sworn in by the time all the decisions about the spyware had been made. As well, subsequent events, as told by Atwell, suggest Ciarniello’s actions may have been guided somewhat by Atwell’s fellow councillors. This is illustrated by the fact that it wasn’t Atwell’s signature that appeared on Laidlaw’s contract. The authorizing signature came from Councillor Judy Brownoff, who was designated by staff as acting mayor in Atwell’s alleged absence the day the contract was signed. Yet Atwell hadn’t taken leave or left the District, and the only attempt Ciarniello made to contact him about the contract was through his executive assistant, Brandy Rowan. “I was asked in person by Laura Ciarniello on Friday [January 16] to sign the contract, but as it was the end of the day, I told her I would have to take the contract home over the weekend to read it before I signed it,” Atwell said. 
“She told me, ‘No, the contract has to stay within the walls of municipal hall.’ I was dumbfounded by this statement as I have a right as mayor to take confidential documents home, given that I’ve sworn an oath of office…I do this all the time.” On Thursday, January 22, Atwell emailed Ciarniello to ask again about accessing Laidlaw’s contract; on January 23, after inquiring again, Atwell was told by Ciarniello via email that Brownoff had already signed Laidlaw’s contract five days earlier. “There has to be a reasonable justification for staff to defer to the acting mayor,” Atwell said. “Simply stating that they were ‘eager to put out a press release’ isn’t one that stands up. I was shocked when I found out Brownoff signed the contract, as was every other elected official I have told.” Focus asked Ciarniello if she had gone around Atwell to obtain authorization on Laidlaw’s contract. She said, “No, the Mayor did not make himself available.”

CIARNIELLO’S POSSIBLE SIDE-STEPPING of Atwell to get Laidlaw’s contract signed wasn’t the only time she overstepped her authority, Atwell says. On January 27, Atwell emailed a New Year’s address to his assistant, Jennifer Downie, and asked that it be distributed to all Saanich staff through the E-link internal website, to which the Mayor was not given access. In that letter, Atwell wrote that he planned to schedule drop-in coffee sessions with staff in the coming weeks “at our various facilities throughout Saanich. I would love to share a coffee with you, hear about your role in this organization and your ideas for making our community a better place for all citizens.” According to Atwell, Downie blocked the letter. She told Atwell it touched on governance and operational lines and that she had forwarded it to Ciarniello for direction. Atwell said Ciarniello explained to him that the email to staff was blocked “as it contravenes council’s direction.” Focus asked Ciarniello if she refuted Atwell’s claim. She said, “Yes, I refute the allegation.” Atwell told Focus that many municipal hall staff have told him that they have been directed not to talk to him and that staff are afraid of retribution from their managers for disobeying this unwritten order. Focus has seen documentary evidence that Saanich IT division staff have been told not to talk with Atwell. Siegal, who recently published Leaders in the Shadows: The Leadership Qualities of Chief Administrative Officers in Canada, observed, “If the mayor was trying to give staff direction about fixing a pothole or something that council hasn’t authorized, then the mayor is clearly overstepping his bounds. But if the mayor is wanting to talk to people, to staff, I don’t know very many CAOs who would intervene in something like that.” Contacted for comment, Laidlaw told Focus, “District staff have not been told not to speak with the Mayor.”

Atwell, elected on a platform promising change at the District, appears to be having an experience similar to that of former Campbell River Mayor Walter Jakeway. Laidlaw was the city manager and Ciarniello was director of corporate services during Jakeway’s mayoral term. Jakeway—a mechanical engineer with an MBA who had cut his teeth in the pulp and paper industry—had been elected in 2011 on a platform of change and keeping tax increases at zero percent. He wanted to reform the budgeting process and make good on other election promises early in his term. But a strained relationship soon developed between Jakeway and senior staff. 
He remembers strong opposition from Laidlaw, his bureaucracy and incumbent councillors. “In a lot of ways, Andy was like the eighth member of council,” Jakeway said. “I did not have a positive relationship with either of them [Laidlaw and Ciarniello]… My idea was to make change happen… They did everything they could to try to block those ideas and gave me no options.” Both Laidlaw and Ciarniello told Focus that they disagreed with Jakeway’s characterization. Ciarniello said, “I took direction from the CAO and council.” Laidlaw said, “As city manager, I work for ‘council.’ The direction followed is subject to the collective decision making process of council. The mayor often had different viewpoints than council.” Jakeway said he believes that entrenched bureaucratic opposition exists across local governments to elected officials who try to upset the status quo. “There’s a bureaucratic code that at all costs you protect the bureaucracy, and when something goes wrong, hang onto that bureaucratic code…At all costs, protect the bureaucracy.”

FOLLOWING THE JANUARY 12 announcement by Saanich Police that the software had been installed as a security fix, Atwell complained to the Office of the Police Complaints Commissioner. On June 23, a brief statement from OPCC noted, “Based on the information we have received to date, we have determined Mayor Atwell’s complaints against members of the Saanich Police Department are inadmissible as they do not constitute misconduct as defined pursuant to the Police Act of British Columbia.” That, of course, did not settle the matter of whether Atwell’s communications had been intentionally intercepted. Saanich Police Chief Bob Downie rejected a criminal investigation on January 12 based on a legal opinion that Section 184 of the Criminal Code did not apply since “[The] software was put in place to protect from a computer breach from the outside or unauthorized access from within. It was not and is not being monitored and the information stored on the computer is only accessed by two persons, both of whom are managers in IT.” But Denham and IT experts have made clear Spector 360 is an employee monitoring tool, not a firewall, anti-virus program or external threat monitor. In fact, Saanich had to turn off some IT security measures for Spector 360 to work on the affected computers, and Denham said it may have made security less effective by creating a “honeypot” of passwords and other high-level information on the Spector 360 server. Downie also ruled out a criminal investigation based on the premise that employees have no expectation of privacy at the workplace. Denham said the opposite is true. Saanich Police Sgt. Steve Eassie confirmed the department is not reconsidering its previous conclusion, as there is no new information to evaluate. “Nothing has changed,” Eassie said. “The [Privacy] Commissioner did not assert any criminality.” A determination of criminality, however, has never been within the scope of the Privacy Commissioner’s authority.

The Province won’t be launching a public inquiry, either. Justice Minister Suzanne Anton’s deputy, Kurt J. W. Sandstrom, has told concerned Saanich residents that the issue is a matter of civic governance, so the Province won’t get involved. Neil Turley, one of several Saanich residents who wrote to Anton, said he believes the Saanich bureaucracy—and several councillors—just want the spyware scandal to go away. “Democracy at our municipal level seems to be broken. 
Laws were broken,” Turley said, “and the Privacy Commissioner’s report should have been enough red flags for consensus at the top and for council to come out and admit this is wrong. But instead, it just starts to rot. This issue is not going to go away until someone decides justice needs to be served. Once we lose a bit of democracy, you just don’t get it back.” David Broadland is the publisher of Focus Magazine. Daniel Palmer is the former editor of The Saanich News and is now a freelance writer.
  19. June 2015 The CRD is fighting to prevent release of a record that could show how badly it estimated one of the costs of sewage treatment.

SINCE AN INQUIRY CONDUCTED BY the Office of the Information and Privacy Commissioner is a quasi-judicial process, I suppose I’m breaking some quasi-law by disclosing the contents of the CRD’s and Stantec’s submissions before an adjudication is made. Maybe I’m headed for quasi-jail, but the information that the CRD and Stantec are trying to keep out of the public eye is central to a rational, community-based decision on the sewage treatment question. In 2009 the CRD contracted Stantec to provide engineering consulting services for the core area’s sewage treatment program. The OIPC inquiry was called to determine whether the CRD was entitled to withhold from Focus (and the public) a crucial, 80-word paragraph from a report presented to an in camera meeting of the CRD’s sewage committee in June 2009. The information in the report convinced the committee to approve that contract. The report was authored by two CRD employees, Dwayne Kalynchuk and Tony Brcic, both of whom are former Stantec employees. The missing paragraph in the report is believed to be the only written record of what Kalynchuk estimated hiring Stantec as the Program Management Consultant would cost. In 2009, however, Kalynchuk was reported by the Times Colonist to have stated publicly that Stantec’s fee would be “one percent of the project budget.” The project budget has remained pegged at $783 million since 2010. One percent of that would be $7.83 million. The most recent budget estimate of the cost of Stantec’s services over the life of the project, however, has been far higher. An estimate based on CRD payments to Stantec from 2009 through to the end of 2012, added to figures from a late 2012 CRD projection through to 2018, puts the total cost of Stantec’s services at $55 million. That’s a seven-fold increase—the current projection is roughly 700 percent of what Kalynchuk is believed to have estimated. My working theory is that neither the CRD nor Stantec wants Kalynchuk’s secret paragraph revealed for fear that public awareness of that seven-fold increase would get taxpayers wondering what other multipliers they can expect for the other parts of that “$783 million” budget.

The CRD’s initial submission to the inquiry makes two claims about releasing Kalynchuk’s paragraph: First, that releasing the information would damage the financial interests of Stantec. Second, that releasing Kalynchuk’s paragraph would result in a possible loss of $200,000 to the CRD, because that’s what it would cost to replace Stantec. The CRD, Seaterra and Stantec all claim that Kalynchuk’s paragraph has information in it that would allow Stantec’s competitors to deduce its hourly rates (a no-no in freedom of information law), and that knowledge would allow those competitors to outbid Stantec if the contract is re-bid (it expires in 2016). If you’re a taxpayer, you might be thinking, “Of course the CRD should get the most competitive bid. That’s the only way to get the most efficient use of my tax contribution.” You might think the public interest is best served if competition between businesses is encouraged. The CRD, apparently, doesn’t think that way. It’s strongly in favour of sticking with Stantec because that will save the bother of issuing another RFP, which it estimates would cost the above-mentioned $200,000. 
Posed against that, though, is my argument for why Kalynchuk’s paragraph should be released: If the estimate for Stantec’s fee has escalated from $7.8 million in 2009 to as high as $55 million in 2013, there’s either a tremendous amount of over-charging by Stantec going on, or Kalynchuk’s original estimate, which was apparently based on Stantec’s secret rates, was baloney. Since Kalynchuk and Brcic are both former Stantec employees, it’s a matter of public interest to know whether they grossly underestimated the probable cost of Stantec’s services or not. Similarly, if it turns out that Kalynchuk’s missing paragraph accurately estimated that Stantec’s total fee would be in the neighbourhood of $55 million, no damage will have been done. The public will be reassured that Kalynchuk and Brcic did not let their former employer’s interest get in front of the public interest, and Capital Region taxpayers can sleep more easily knowing they are in good hands with Stantec guiding the CRD forward.

By the way, if you search the CRD’s website for “Stantec fees” you’ll find at least a couple of relevant reports that are more current than Kalynchuk’s 2009 paragraph. Those reports provide far more explicit information from which Stantec’s hourly rates could be determined by its competitors. Either the CRD and Stantec have forgotten that they’ve already made Stantec’s rates public (doubtful) or they know they are there and simply want to delay, as long as possible, release of Kalynchuk’s crucial paragraph. Here’s why the community needs this information now: There’s an intense community effort underway to find an alternative to the McLoughlin Point project, and that process needs to be informed about the real costs lurking just over the horizon for the old plan. As it turns out, though, the position of the CRD’s brain trust seems to be that the project is going back to McLoughlin Point and they don’t want any surprises that might upset that plan. CRD staff’s hope for a return to McLoughlin is embedded in the documents submitted to the OIPC inquiry. Both the CRD’s submission and Seaterra’s affidavits read like a long expression of loyalty to Stantec. They go to great lengths, and, one supposes, legal expense, to explain how Stantec’s competitors might use its rates to outbid it on renewal of the project contract in 2016 or, in fact, any project. While their concern for Stantec’s well-being is touching, the irony is that it was Stantec that guided the project to the rocky shores of Esquimalt, along with an 18-kilometre twin pipeline to Hartland. The community rebelled. Why would the CRD be so loyal to a company that came up with a failed plan? Well, that’s because the CRD is intent on proceeding with that plan and McLoughlin equals Stantec. If CRD and Seaterra bureaucrats were truly committed to a reconsideration of the plan, they would be accepting of the need for a new RFP and they would release Kalynchuk’s paragraph.

THE CRD’S stand-by-your-man relationship with Stantec is like the bond that developed between the City of Victoria and MMM Group during the optimistic phase of the Johnson Street Bridge Replacement Project. MMM has been the City’s project manager since 2009, a role similar to Stantec’s gig as project management consultant for the CRD. In MMM’s case, though, it also provided some engineering services during the construction phase. If the sewage treatment project turns out to be anything like the bridge project, local taxpayers are in for a wild ride. 
The bridge experience provides insight into the level of optimism bias about cost that’s built into local political and governance cultures, and how that’s exploited by the commercial practices of big engineering and construction companies that operate in this market. The estimate for the cost of a new bridge started at $40 million, then rose to $63 million, $77 million, and $93 million. Each time it reached a new high, assurances were given—and believed by politicians—that it would rise no higher. Now everyone is optimistically hoping the cost of the bridge won’t reach $120 million. The same local optimism bias about cost is at play in the sewage treatment program. Looking at the key indicators—particularly the relative escalation of project management fees—a realistic pessimist would guess the cost of a sewage treatment system here will rise to $1.7 billion, once optimism bias has burned itself out. One sure way to make the cost outcome worse would be to let Stantec and the CRD keep Kalynchuk’s missing paragraph in the dark. That would send a strong signal to Stantec that the political and governance culture it’s working in will tolerate cost escalation without consequence. By unjustifiably delaying the release of Kalynchuk’s missing estimate, the CRD has already sent that signal.

Readers might be interested to know that the key player for the City of Victoria during the optimism burnout phase of the bridge project (June 2010 to April 2015) has been Dwayne Kalynchuk, formerly of the CRD, whose missing paragraph I’ve been seeking. He recently retired as director of the City of Victoria’s engineering department—a significant change in personnel at City Hall that, curiously, was acknowledged only indirectly through the issuance of a request for proposal that sought a head-hunter who would then find Kalynchuk’s replacement. The initial estimate for that project is unknown.

David Broadland is the publisher of Focus Magazine.
  20. June 2015 The spyware installed on Mayor-elect Richard Atwell’s computer was only one of three IT strategies that targeted him. NEW EVIDENCE BROUGHT FORWARD by current and former employees of the District of Saanich’s IT department may create additional pressure on BC’s Attorney General Suzanne Anton to investigate whether, on the direction of senior Saanich officials, the communications of Mayor Richard Atwell were wilfully intercepted. Section 184 of the Canadian Criminal Code provides for punishment of up to five years in prison for the “wilful” interception of private communications between parties unless at least one of the parties agrees to the interception. Atwell has said he was never informed by the District of the interception. Saanich has provided no proof he was. Before getting to that new information, let me remind you of what we already know. The public position of the District so far has been that Spector 360 employee monitoring software was installed on 13 District computers as a temporary network security upgrade in order to impress upon the newly-elected mayor that recommendations made in a May 2014 computer network security audit by Wordsworth & Associates had been acted upon. The District decided to install the monitoring software on November 19, only four days after Atwell defeated long-time incumbent Mayor Frank Leonard in a bitterly-contested election. Atwell became aware of the monitoring software on December 11 after a Saanich IT department employee (whom Focus has named “Whistle Blower”) expressed concerns to his former manager at Saanich, Jon Woodland, who is now the manager of IT at the Township of Esquimalt. Woodland contacted Atwell who then interviewed Saanich IT department employees until he found Whistle Blower. Atwell then requested that Saanich Police investigate. On January 12 Saanich Police provided an opinion to Saanich Council that no criminal code violation had occurred. That same day, Atwell went public with his concerns. Shortly afterwards, BC’s Information and Privacy Commissioner Elizabeth Denham announced she would conduct a formal investigation to determine whether BC privacy law had been broken. In late March Denham delivered a scathing report that found Saanich broke BC privacy law when it installed the software and then collected the personal information of Atwell and others. Her report challenged the District’s claim that the initiative was a response to the Wordsworth & Associates’ security audit. Denham reported that installation of the software likely weakened the District network’s security against external attacks. She also observed that access logs that would have recorded whether anyone had accessed the information collected from the mayor’s computer hadn’t been enabled. Denham’s report revealed that five District directors, the Fire Chief and CAO Paul Murray met on November 19, 2014 and discussed the use of “a security strategy focussed on high-profile users.” No written record of that meeting was kept, so it’s uncertain what action was actually agreed to by assembled directors. Records obtained by FOI show Spector 360 employee monitoring software was purchased on November 20. On December 2, Director of Corporate Services Laura Ciarniello gave approval to District IT Manager Forrest Kvemshagen to enable the software. An email between Ciarniello and Kvemshagen shows that Murray was aware that employee monitoring software had been installed. Last month I wrote here about an affidavit prepared by Whistle Blower on January 17. 
The document was created to protect Whistle Blower in case Saanich took disciplinary action against him—which it subsequently did. More on that later. In the affidavit, Whistle Blower stated that on November 20 he was told by the District’s Assistant IT Manager John Proc that monitoring software was to be installed on 13 District computers. Whistle Blower quoted Proc as saying, “They are nervous about the new mayor…we’re installing it on the directors computers as well to make it look like it is not targeted.” A spokesperson for the District of Saanich refused to comment on the allegation, saying only that such comment would be “premature” given that an internal report was being prepared by Saanich’s interim CAO Andy Laidlaw. Since then, Focus has obtained additional records that show the installation of Spector 360 monitoring software on Atwell’s computer was only one of three actions Saanich’s IT department was ordered to undertake that were aimed directly at Atwell. The records were provided by former and current Saanich IT department employees on condition of anonymity; they fear Saanich will retaliate if their names are known. No details that could harm the security of Saanich’s network were sought or shared. ON NOVEMBER 17, 2014, two days after Atwell’s victory over Leonard, the District’s IT department began preparing for the transfer of political power from Leonard to Atwell. This included emailed instructions from Assistant IT Manager John Proc to his staff for physical removal of Leonard’s computer to IT department offices for “secure archival storage” in a “locked environment.” The email also shows that Leonard’s access to departmental shared drives was to be set up on a home-based computer (this access ended on December 1). In the same email, IT staff were directed to configure a new computer for Atwell’s use in the mayor’s office with “no other shared drive access at this time.” This first initiative, then, served to prevent Atwell from accessing files to which Leonard had access. Atwell recently confirmed that, six months later, he still has no access to any departmental shared drives. The second initiative was undertaken a few days later. At 8:55 am on November 21—the same day that Proc purchased Spector 360 employee monitoring software for installation on Mayor-elect Atwell’s office computer—a Saanich IT division employee used an iPhone to photograph a series of actions mapped out on a whiteboard. The directions in the plan showed that Atwell’s office computer would be configured so that any attempt by him to access the District’s corporate intranet—the heart of information exchange between Saanich employees—would be redirected to the network that’s available in places like recreation centres. That is, unlike Leonard, who could access all the information and ideas on that intranet, Atwell was locked out. Atwell recently confirmed he still has no access to Saanich’s corporate intranet. This new information changes the story in two ways. First, the technical details of the two initiatives make it even more difficult to accept the security rationale for the Spector 360 software on which Saanich’s defence of its actions depends. Second, the basic nature of the two initiatives outlined above, which have isolated Atwell from information needed to function properly as mayor, and shut off his access to the conduit through which Saanich employees communicate amongst themselves, raises serious questions about the motivation behind all three schemes. 
Let’s examine the security rationale again, this time in light of the new information, and then circle back to the underlying nature of the managers’ actions. I recently asked Laidlaw by email if he was aware that Atwell’s computer had been isolated from the District’s network through initiatives undertaken by Saanich’s IT division. Laidlaw replied, “The set up on Mayor Atwell’s [computer] was identical to the set up on Mayor Leonard’s computer.” That assertion, obviously, is deeply at odds with the records Focus has obtained. In her report, Denham outlined the rationale the District had given her investigators for why they had chosen employee monitoring software to strengthen network security. She stated, “Proc understood that the goal was to have a forensic auditing capability. The software was also to have the ability to determine whether user accounts were accessing areas which they were not supposed to be accessing.” We now know that Atwell’s computer had been blocked from accessing anything but his Saanich email account, his own personal files, and the internet. The councillors’ computers could only access the internet. So I asked Laidlaw why the Spector software had been put on Atwell’s and the councillors’ computers, which had no access to the sensitive files the employee monitoring software was apparently intended to guard. Laidlaw wrote, “When council use these machines, they access their personal accounts; malware could be transfered to the network through emails or sharing of files on flash drives.” I asked Jon Woodland for his opinion of Laidlaw’s claim that putting employee monitoring software on computers would protect them from malware. Woodland said, “Let me reiterate that Spector 360 provides exactly zero protection against malware and virus attacks, and, that Saanich IT staff had to disable portions of Saanich’s existing anti-virus features to allow Spector 360 to be installed on the PCs. Otherwise, their existing security measures would have prevented the installation of Spector 360.” Even if Atwell and the councillors’ computers did have access to the District’s network, why wouldn’t Saanich simply use anti-malware and anti-virus software to protect them? I posed that question to Laidlaw, who then responded, “It was never the intent to have Spector act as an anti-virus or anti-malware software—Saanich has other programs that complete these tasks. Spector was implemented as a forensic analysis tool.” That response, a feat of circular logic, takes us back to my first question to Laidlaw: Why would a forensic analysis tool be put on computers that had no access to sensitive files? The District is unable to explain this. The absence of a reasoned explanation coincides with an absence of records: Saanich’s FOI office has been unable to provide a single written communication showing that Murray, Ciarniello, Kvemshagen or Proc considered how employee monitoring software related to any of the recommendations made in the Wordsworth & Associates audit. If there was no credible security rationale for the three initiatives managers ordered, then why were they ordered? Two of the initiatives limit Atwell’s access to information. Was there sensitive information someone was fearful Atwell would find if he were able to access departmental shared drives and the corporate intranet? Given the reputation of the mayor-elect, such a fear might be understandable. Atwell first came to public attention for his critique of CRD bureaucrats’ handling of the sewage treatment plan. 
He filed FOIs, analysed that information, and compared what he found to CRD claims—and soundly embarrassed CRD staff when they provided misinformation. Atwell was a highly effective community activist. But he then took his activism to the next level and campaigned for mayor on a platform of change and more open and transparent government. Against all expectations, he won. Perhaps those at the top in Saanich were caught by surprise and felt compelled to quickly circle whatever wagons they could rustle up. Take, for example, the position in which CAO Paul Murray found himself following Atwell’s surprise victory. Murray had made statements during the election that indicated he favoured Leonard. He told a gathering of Saanich managers that he couldn’t work with Atwell. Following Atwell’s unexpected win, Murray was suddenly in an awkward position, and, as it turned out, he had a lot to lose. Through an FOI we know that, earlier in 2014, Leonard and the previous council had agreed to an unusually generous compensation package for Murray in which he was given a retroactive salary increase from $16,151.50 per month to $18,432. A personal letter from Mayor Leonard informed Murray his annual salary would rise to $250,000. A previous contract had stipulated payment of 18 months severance if Murray were terminated without cause—a much more generous provision than other local municipal executive contracts allowed. Murray's settlement agreement with Saanich strictly reflected the terms of that contract. Atwell, after being advised by an expert on municipal law that Murray’s publicly-stated unwillingness to work with him presented a problem, met with Murray to determine whether he would be willing to consider leaving Saanich under the terms of his employment contract. Atwell has since said Murray agreed to leave. Unfortunately, Atwell hadn’t seen Murray’s contract. Saanich staff refused to provide him with a copy, and because of the IT initiative described above that prevented Atwell from accessing shared departmental drives, he was unable to access Murray’s contract directly. So Atwell didn’t understand the significant financial impact Murray’s leaving would have, and this mushroomed into a political fiasco for Atwell. The new mayor was roundly blamed for the outcome, the conditions for which had been set in place by the previous council’s generous compensation package and Murray’s stated unwillingness to work with Atwell. If the IT initiatives launched after Atwell’s election were a circling of the wagons, Saanich managers’ specific choice of wagons has since turned out to be a colossal embarrassment to the District: An arguably political decision was made to block Atwell’s access to departmental shared drives and the corporate intranet, access that Leonard had enjoyed. At the same time, a decision was made to install employee monitoring software on Atwell’s computer—which, according to Whistle Blower’s account of his conversation with Proc, was aimed only at Atwell. IT experts have ridiculed Saanich’s claim that Spector 360 was a legitimate response to the Wordsworth & Associates security audit. In any case, Atwell couldn’t access the files Saanich says it was trying to protect, so placing Spector 360 on his computer didn’t make sense on that basis alone. Moreover, there is no written record that even a cursory consideration of the merits of Spector 360 as a response to the security audit ever took place amongst senior managers. 
The big remaining question, one that Laidlaw is widely expected to avoid, is this: Who, ultimately, made these decisions and ordered the three initiatives? Saanich’s Information Technology division is part of the Department of Corporate Services, which is headed by Ciarniello. Ciarniello’s boss at the time was CAO Paul Murray. Although FOIs have shown that Ciarniello gave approval to Kvemshagen to purchase, install and enable the Spector 360 software, Focus has found no explicit approval from Murray. The records obtained, however, make it clear that Murray was aware that employee monitoring software had been installed on Atwell’s computer. Who approved the IT measures that were taken to isolate Atwell’s computer from departmental shared drives and the corporate intranet on November 17 and November 21? I put this question to Woodland, who worked in Saanich’s IT division for 16 years before moving in 2012 to manage the same division in Esquimalt. “I would think that Kvemshagen would only need Ciarniello’s approval,” Woodland said. “However, this would typically be a directive from the CAO. That doesn’t mean it has to happen that way.” COMMISSIONER DENHAM DETERMINED that Saanich broke BC privacy law when it collected the personal information of Atwell and others whose communications were intercepted by the program Ciarniello approved. Had it not been for Whistle Blower’s conviction that what Ciarniello approved was morally wrong, Denham would probably never have learned that Saanich managers were spying on an elected official. So it may come as a surprise to readers that the only person who has been punished as a result is Whistle Blower. Before I tell you about that, here’s what Section 30.3 (c), the “Whistle-blower protection” provision of BC’s Freedom of Information and Protection of Privacy Act, says about an employer disciplining an employee who acts to stop the employer from violating the Act: “An employer, whether or not a public body, must not dismiss, suspend, demote, discipline, harass or otherwise disadvantage an employee of the employer, or deny that employee a benefit, because the employee, acting in good faith and on the basis of reasonable belief, has done or stated an intention of doing anything that is required to be done in order to avoid having any person contravene this Act.” In order to avoid having his employer continue to contravene the Act by intercepting the mayor’s and others’ private communications, Whistle Blower did what he felt was required to be done. After repeatedly expressing concern to his managers and asking if they had informed Atwell that he was being monitored, and repeatedly receiving “wishy-washy” responses, he sought the advice of his former manager, Jon Woodland. As a result, Whistle Blower was required to appear before a disciplinary hearing chaired by Saanich’s Manager of Human Resources Jo MacDonald. Also attending the hearing were Proc and Kvemshagen. At the hearing, Whistle Blower repeated, in front of Kvemshagen and Proc, his recollection of his conversation with Proc in which Proc told him Atwell was the target of the monitoring software. MacDonald accused Whistle Blower of breaking two of the District’s “confidentiality” requirements. One of those, which appeared in Whistle Blower’s job description, stated that he “will not release or discuss non-routine municipal or departmental business without prior authorization.” From whom was Whistle Blower expected to get “prior authorization”? 
The entire management chain, from Ciarniello down to Kvemshagen and Proc, was part of the decision to install monitoring software on Atwell’s computer. This put Whistle Blower in an untenable position. The other District policy MacDonald claimed Whistle Blower defied was, ironically, the District’s Code of Ethics, which states: “A municipal employee shall not use information which is not available to the general public for his or her own personal profit or advantage and shall not provide such information to others unless it is in the course of the employee’s duties to do so.” Who could believe that Whistle Blower sought Woodland’s advice for “personal profit or advantage”? And surely Section 30.3 (c) made it his legal duty to do whatever he could to “avoid having any person contravene this Act.” A sworn affidavit describing what Whistle Blower told Woodland shows the only information divulged to Woodland was that Saanich’s IT division was ordered to install a key-logger on Mayor Atwell’s computer. Apparently unaware of the prohibition against discipline under FIPPA’s Whistle-blower protection, MacDonald suspended him without pay for two days. At that point, Whistle Blower was already looking for a better place to work. So he quit. He has since found that place. Still, that an individual who had a properly functioning moral compass would be disciplined for breaking a “code of ethics” is vexing. Saanich managers are clearly struggling with the basic principles of democratic governance.

As Focus went to press with this story, we received word that Laidlaw had called a meeting of the few remaining members of the District’s IT division—over half the staff have either quit or gone on sick leave since Saanich became Spyynich—where they were threatened with dismissal if they spoke with Focus; former employees, too, were threatened with legal action unless they divulged to Laidlaw “the particulars of any information you have provided to anyone not currently employed at the District of Saanich.” Focus to the District of Saanich: This is Canada, not North Korea.

David Broadland is the publisher of Focus Magazine.
  21. May 2015 Did Saanich staff conspire to spy on the newly-elected mayor? FOLLOWING RELEASE of BC Information and Privacy Commissioner Elizabeth Denham’s report on the controversial installation of employee monitoring software on 13 District of Saanich computers—including incoming Mayor Richard Atwell’s—many Saanich citizens expressed frustration that Denham had left fundamental questions unanswered: Did Saanich managers conspire to spy on Atwell? If so, who ordered the spying? And who, they asked, will now determine what actually happened? Saanich Council’s decision on April 13 to turn further investigation of the matter over to Interim CAO Andy Laidlaw did nothing to allay concern that these questions would be left unanswered. Indeed, shortly after Denham’s report was released, Laidlaw wrote to her and stated, “My purpose in writing you is to express our concern about how you have chosen to characterize the District…We believe your conclusions are only accurate to the limited number of interviews conducted and the narrow scope of material reviewed…” With Laidlaw already assuming a defensive position on behalf of his staff, it’s pretty much a foregone conclusion that his report will step backwards from where Denham left the central questions. And since councillors directed Laidlaw—on the recommendation of staff—to deliver his report at a meeting closed to the public, there’s little hope it will shed any light. In camera meetings are like black holes—no light can escape the gravitational clutches of the legal advice that will inevitably be attached to Laidlaw’s report. Although many people—including most Saanich councillors—want to leave this story behind, that’s unlikely to happen. The evidence Denham’s investigation gathered included interviews with all the key players in the software scandal. Their explanations for why they did what they did are, at times, a challenge to accept at face value. Denham’s report concluded what happened in Saanich was illegal—District employees broke BC privacy law. But her report also provides several key facts that are consistent with Atwell’s allegation that he was spied on following his election last November. The word “spying” simply implies secretly gathering information. If Saanich managers intended to covertly gather information from Atwell, that would be a serious erosion of the democratic principle that control of government must be kept in the hands of the people—through their elected representatives—and not the other way around. Focus has obtained a document created by a person we will call “WB”—for “Whistle Blower.” WB’s document outlines a much different rationale for installing the Spector 360 software than put forward by Saanich staff in Denham’s report. It documents WB’s recollection of what he was told by Assistant IT Manager John Proc on November 20, 2014. At that time WB was an IT analyst with Saanich, but he has since left the District’s employ. The document was apparently created by WB for the purpose of getting a fellow Saanich IT worker’s signature certifying that he agreed to the facts presented in WB’s document. In the document, WB noted that he was “present during initial talks about monitoring software on 2014-11-20.” WB went on to write, “This is my recollection of conversation between myself, [name redacted] and John Proc…John Proc came to us ([name redacted] and I) with a directive that had just come down to IT in regards to installing monitoring software on the mayor’s computer. 
He said, ‘They are nervous about the new mayor. We’re installing it on the directors’ computers as well to make it [look like] it is not targeted’ and ‘this won’t last long’.” In his commentary in the document WB noted that he had “made his concerns known that he was nervous about being involved in this and to move forward he would need some assurance that everybody subject to having a keylogger installed on their computer would be given [the opportunity for providing] proper consent above our normal end user agreement.” WB emailed this document to Jon Woodland on January 17, 2015, asking him “What do you think?” Woodland is the manager of information technology for the Township of Esquimalt. Before that he worked in Saanich’s IT department for 16 years. WB initially contacted Woodland in early December, 2014, and told him about the installation of surveillance software on Mayor Atwell’s computer. Woodland, after checking with his own employer, then told Atwell on December 11. Atwell eventually connected with WB independently. WB was interviewed by Denham, but his “recollection” did not appear in her report. (John Proc was asked by Focus to comment on WB’s recollection but did not respond to repeated emails.) The takeaway from WB’s recollection is that Atwell was targeted for surveillance and that installation of software on other computers was simply a ruse. This is a serious allegation. Does Denham’s report provide any evidence that could disprove this allegation? It does not. In fact, a careful read of the “Chronology of Events” section of the report offers much information that supports WB’s allegation. In the text below, I provide a part of Denham’s “Chronology of Events” from her investigation report. The part I’ve included covers Saanich’s explanation of who led the initiative and why they did what they did, up to the point when the software was enabled. Denham’s chronology is included verbatim—with one general exception. Wherever she identified a person simply as a staff position such as “CAO,” or “Director of Corporate Services,” I have identified that person by name. I have also provided additional information and analysis, identified as “Focus commentary.” All of the Saanich staff named below were sent emails by Focus that included questions specific to the role they played. Each person was invited to provide any commentary they might want included in this article. Saanich’s Director of Legislative Services Carrie MacPhee, acting on behalf of the holidaying Laidlaw, informed me that no Saanich employees would respond to my questions. “Given Council’s direction,” she explained, “it would be premature for staff to be commenting at this time and it is anticipated that any future comment will be provided by Mr Laidlaw or Council once they have considered Mr Laidlaw’s report.” Commissioner Denham’s “Chronology of Events” Denham: Through interviews and review of documents the following chronology of events was established relating to the selection and implementation of Spector 360. This chronology is the basis for my analysis of the District’s compliance with FIPPA. May 2014: The District contracted with an IT security consultant to perform an information security audit (“IT Audit”) on the District’s IT infrastructure. The IT Audit revealed security shortcomings which District IT staff have been working to address since that time. The District stated in a January 14, 2015 media release, Spector 360 was purchased in response to one of the recommendations in the IT Audit. 
My staff reviewed the IT audit report and it did not make any specific recommendation that could be interpreted to recommend the purchase and installation of employee-monitoring software. The Audit’s author, also interviewed by my Office, confirmed that he did not make any such recommendation nor did he intend to make any recommendation that could be interpreted to recommend the installation of monitoring software such as Spector 360. Focus commentary: The “audit report” Denham mentions was executed by Wordsworth & Associates. The Commissioner is clearly skeptical of the District’s claim that its decision to purchase Spector 360 employee monitoring software was a credible response to the recommendations of Wordsworth & Associates. That claim is further eroded by Saanich’s response to an FOI filed by Focus for communications between Director of Corporate Services Laura Ciarniello and Manager of Information Technology Forrest Kvemshagen that took place as a result of the recommendations of the audit report. Ciarniello and Kvemshagen were at the centre of the decision to purchase and install the employee monitoring software. But according to Saanich’s FOI office, Ciarniello and Kvemshagen left no written record that they ever communicated about the recommendations of the Wordsworth & Associates audit report; there’s also no written record that they ever communicated on whether or not Spector 360 software would address any of those recommendations. The absence of such records over a six-month period suggests little or no effort had been made to address the security issues raised by Wordsworth & Associates. A senior IT analyst at Saanich, whose name Focus is withholding at his request, has confirmed that other than quick and easy patches to the District’s computer network security provisions, none of the major recommendations made by Wordsworth & Associates have been acted upon. The takeaway from this is that Saanich’s claim that the Spector 360 initiative was a response to the Wordsworth & Associates security audit recommendations is not supported by any written evidence. Denham’s chronology: Nov. 15, 2014: Richard Atwell was elected as the mayor for the District of Saanich. Nov. 17 to 19, 2014: The Director of Corporate Services Laura Ciarniello continued discussions with the Manager of IT Forrest Kvemshagen about the need to remedy outstanding IT security issues, and the need to accelerate resolution of some of those issues prior to the new mayor taking office. According to Ciarniello, the motivation for this renewed focus on IT security was the perception by District Directors that the new mayor was experienced in the area of IT and would be able to identify and criticize current weaknesses in the District’s IT security. After discussions with Kvemshagen, Ciarniello decided to procure and install software which would provide for comprehensive monitoring and recording of all actions undertaken by key District employees and officers. Focus commentary: Motive is key. Two days after Atwell was elected, senior managers initiated a program that we have since learned may have left the District’s computer network more vulnerable to attack than before, but would have allowed intensive monitoring of Atwell’s computer use. What might have motivated this reaction? Ciarniello admitted to Denham that District Directors were aware of Atwell’s IT experience. Atwell also has a reputation for his ability to extract damning information from local government.
Did Ciarniello and her boss, Paul Murray, worry that Atwell might hack into the District’s information vault? Denham’s chronology: Ciarniello opted to secure the workstations used by employees and officers of the District who are deemed to be “high-profile” and therefore likely targets for an IT security breach. Ciarniello stated that this strategy was adopted so that the District Directors would be able to reassure Mayor Atwell that steps had been taken to secure the District’s IT infrastructure. Focus commentary: Did this specific direction make sense as an effective strategy to address the District’s security issues? In Denham’s “Conclusions” she notes, “[S]ecurity measures taken by the District may have resulted in a net reduction to IT security by concentrating the personal information of key employees and officers in one location, creating a ‘honeypot’ for external attackers.” If Ciarniello was intent on improving security, why would she have chosen a strategy that could actually weaken security? Was the choice of effective monitoring over effective security a mistake? Or was it deliberate? Whistle Blower’s understanding is that it was deliberate. Moreover, there is no record that any District director made any attempt to “reassure” Atwell that “steps had been taken to secure the District’s IT infrastructure.” He was kept in the dark until WB came forward. If not for that person’s initiative, Atwell might still be typing away, unaware he was being intensely monitored. Denham’s chronology: Assistant Manager of IT John Proc stated that deploying monitoring software only on the workstations of high-profile users was considered an interim measure until the District was able to install and configure a district-wide Intrusion Detection System (“IDS”) and Intrusion Prevention System (“IPS”) capability that would protect all District workstations. The Assistant Manager stated that this was considered an effective interim step because the district-wide IDS/IPS solution would be too expensive to rapidly implement. Focus commentary: The foundational logic of using an employee monitoring strategy is difficult to grasp. Does it follow that a course of action that would create a “honeypot” for external attackers would be deemed “effective” just because a real security system would be “too expensive to rapidly implement”? Alarm bells are ringing. Were the objectives of the initiative rapid implementation and low cost—or effective network security? Denham’s chronology: Nov. 19, 2014: Ciarniello met with Chief Administrative Officer Paul Murray, the Chief of the Fire Department, and the Directors of Legislative Services, Planning, Parks and Recreation, and Finance. The use of a security strategy focussed on high-profile users was discussed and the Directors were advised that protection and monitoring software would be installed on the following employee workstations: 1. the Mayor; 2. two shared workstations for Councillors; 3. the CAO; 4. the Directors of Corporate Services, Legislative Services, Planning, Parks and Recreation, Finance, and Engineering; 5. the Chief of the Fire Department; and 6. two executive administrative assistants. Focus commentary: Saanich’s FOI office confirmed that no minutes were made of the proceedings of that November 19 meeting. 
Prodded, Saanich did provide the only record made at the meeting by Ciarniello—a hand-written note to herself that contained 13 words: “Council machines”; “Paul + Directors”; “Assistants”; “Same protection from hacking put on all machines.” The absence of a substantial record of the meeting, other than that it took place, suggests an organization that has a keen grasp on the downside of accountability. Of the computers on which Spector 360 was installed, two were shared by several people—those used by councillors. All other computers were dedicated to a single individual, including the mayor’s. Of those individuals, records show that only Mayor Atwell was kept in the dark that monitoring software had been installed on his computer. Does this prove Atwell was being targeted? It comes close, but what about the councillors’ computers? They were to be monitored too. I asked Councillor Colin Plant if he had been informed that his computer use would be monitored. Plant said, “No,” and added, “I was never given a Network Access Terms and Conditions form to sign. When the story hit that the mayor had been given one and hadn’t signed it [Atwell maintains he wasn’t given this form] I asked the Director of Corporate Services why I had not been asked to sign a form. I was told that because the Council computers did not access the Saanich network we did not need to sign that form.” Let’s review that. The supposed purpose of installing the software was to protect the District’s network, but the councillors’ computers didn’t have access to that network. Why, then, were the councillors’ computers being monitored if they didn’t have access to the Saanich network? Was this part of a strategy to make it appear—in case of discovery—that Atwell had been treated the same as other elected officials? Noteworthy, too, is that Director of Legislative Services Carrie MacPhee was present at that November 19 meeting. MacPhee is the District’s officer responsible for compliance with FIPPA’s privacy provisions. In a recent letter to Laidlaw, Denham referred to MacPhee’s presence at that November 19 meeting and pointed out, “[I]n the documents provided to my staff by the District we can find no mention of any concerns being raised regarding the privacy implications of this course of action, or of the need for the District to consider its obligations under FIPPA before proceeding.” Denham has said that the whole affair could have been avoided if Saanich had completed a “Privacy Impact Assessment.” Saanich has, in the past, performed PIAs before instituting other IT initiatives. As Denham’s report makes clear, had one been undertaken in this case, Saanich would have needed to consult with her office before proceeding to implementation. Why did MacPhee fail to recommend completing a PIA before proceeding with the intensive monitoring strategy? (MacPhee did not respond to repeated emails.) The answer to that, at least in part, seems straightforward: Completing a PIA would have slowed implementation and would have required that Atwell and the councillors be notified that their personal information was going to be collected—before it was collected. If the objective had really been to impress the mayor as Ciarniello told Denham, wouldn’t notifying Atwell and the councillors have been the chosen course of action? If spying was the objective, though, what would be the purpose in warning Atwell that he would be spied upon? 
For whatever reason, MacPhee didn’t insist on a PIA and Kvemshagen and Ciarniello were able to proceed quickly and quietly. Denham’s chronology: Immediately after this meeting Ciarniello directed Kvemshagen to research and procure protection and monitoring software. Kvemshagen then directed Proc to research and source software that could be installed on selected workstations and record all user activity. Proc understood that the goal was to have a forensic auditing capability. The software was also to have the ability to determine whether user accounts were accessing areas which they were not supposed to be accessing. Nov. 20, 2014: After researching available options through an online search, Proc reported back to Kvemshagen, recommending that the District acquire Spector 360. Kvemshagen reported to Ciarniello that available alternatives had been researched and that he recommended Spector 360. Kvemshagen stated that this program would provide IT staff with information to assist in identifying and mitigating a security breach. Focus commentary: The speed with which decisions were made is breathtaking. After taking no action for six months, it took only three days to decide on a new “security” system and one day to do an online search for which system to buy. An interesting feature of the desired security system was that “The software was also to have the ability to determine whether user accounts were accessing areas which they were not supposed to be accessing.” This would have been particularly useful if the motivating concern was that Atwell would use his IT technology skills to look for skeletons in the closet. Elsewhere in her report, though, Denham points out “that the software didn’t restrict access to sensitive IT resources and could only provide information about a security breach after it had taken place.” Unless, of course, the information being collected by the system was being monitored more or less continuously. Then it could be known right away when someone went looking for skeletons in the closet. This was the point in the timeline at which the conversation in Whistle Blower’s recollection took place. Denham’s chronology: Nov. 21, 2014: Spector 360 was purchased. Nov. 26 to Dec. 3, 2014: District IT staff installed Spector 360 on 13 employee workstations. Spector 360 was installed with the default configuration, which provided for: 1. automated screenshots at 30-second intervals; 2. monitoring and logging of chat and instant messaging; 3. a log of all websites visited; 4. recording all email activity (a copy of every email is retained for 30 days); 5. a log of file transfer data to track the movement of files on and off the District network; 6. a log of every keystroke made by a user; 7. a log of program activity, recording which windows were open and which window had the focus of the user; 8. a log of when the user logged in and logged out; 9. tracking of every file created, deleted, renamed, or copied; and 10. a record of network activity including applications that are connecting to the internet, when the connections are made, the internet address they connect to, ports being used, and the network bandwidth consumed by those connections. Data collected by the Spector 360 tool was encrypted and stored on a virtual server located at Saanich City Hall. The virtual server is dedicated to Spector 360. The server was configured to retain the data for a period of three months. There is no backup copy of this information. 
Kvemshagen and Proc both described the implementation and configuration of Spector 360 as providing a reactive approach to IT security, helping to enable rapid remediation after a security breach. District IT staff were directed by Proc to use a “silent” installation, which refers to installation without any user input on the target computer and were specifically directed to configure the software to enable keystroke logging and timed screenshots. With regard to the specific direction to enable screenshots, Proc stated that there were concerns from IT staff that frequent screenshots could result in a possible drain on IT resources. However, in consultation with the vendor for Spector 360 it was determined that the software could be configured to enable screenshots with negligible effect on IT resources. With regard to the specific direction to enable keystroke logging, District IT staff had expressed concerns about the privacy implications of keystroke logging. Proc directed staff to enable keystroke logging because it had been specifically authorized by District management. Focus commentary: The specific direction to enable keystroke logging and frequent screenshots is one of the strongest pieces of evidence that the motivation of District staff was internal surveillance, not security from outside hackers. The IT analyst who was instructed by Proc to download the Spector 360 software on November 24 seemed to know that when she wrote Proc and asked, “So do I get filled in with what’s going on? [redacted]? Surveillance software. Sounds ominous.” Elsewhere in her report Denham noted that some aspects of the Spector 360 software were “at least minimally related to the securing of the District’s IT resources.” But a couple of points Denham made that I’ve already mentioned are worth reiterating. First, she noted that the software didn’t restrict access to sensitive IT resources and could only provide information about a security breach after it had taken place. Secondly, she observed that “Any tool that monitors network traffic or collects confidential information in one place is a primary target for attackers. This is particularly the case where, as with the District’s implementation of Spector 360, logs that monitor administrator access to the server are not enabled.” Saanich’s failure to enable the logging of administrative access to the information collected by the software led Denham to make a statement that she couldn’t rule on whether the information had been used. Why would those logs not have been enabled? Although Denham noted this was “a common security failing,” it’s also consistent with a plan to quietly monitor Atwell’s activity. Was the decision to not enable access logs made so that no one could later see who accessed the data collected, or how often they had accessed it? The surreptitious nature of the initiative is alluded to by Denham in her reference to Proc’s instruction to an IT analyst to perform a “silent installation” of the software once it had been purchased and downloaded. But that doesn’t fully capture the extent of the District’s culture of secrecy. Months later, the District tried to cover up the secretive circumstances in which the software had been deployed. FOI requests that captured Proc’s emailed instruction for a “SILENT deploy” all came back with the word “SILENT” unjustifiably blacked out. A small, but telling misuse of FIPPA by the District. 
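To put the collection scope in one place, here is a minimal sketch, written as a generic Python data structure, of the default settings listed in Denham’s chronology above. It is only an illustrative summary for readers; the field names are hypothetical and this is not Spector 360’s actual configuration file or interface.

# Illustrative summary only: field names are hypothetical and do not come from
# the vendor. The values restate the defaults listed in Denham's chronology.
default_monitoring_profile = {
    "screenshot_interval_seconds": 30,      # automated screenshots every 30 seconds
    "log_chat_and_instant_messaging": True,
    "log_websites_visited": True,
    "record_email_activity": True,          # a copy of every email retained for 30 days
    "email_copy_retention_days": 30,
    "log_file_transfers": True,             # files moving on and off the District network
    "log_keystrokes": True,                 # every keystroke made by a user
    "log_program_activity": True,           # open windows and which window has focus
    "log_logins_and_logouts": True,
    "track_file_operations": True,          # files created, deleted, renamed or copied
    "record_network_activity": True,        # applications, addresses, ports, bandwidth
}

# Server-side handling of the collected data, as described in the chronology
# and in Denham's observation that administrator-access logs were not enabled.
server_side_handling = {
    "storage": "dedicated virtual server at Saanich City Hall",
    "encrypted_at_rest": True,
    "retention_months": 3,
    "backups": False,
    "admin_access_logging": False,
}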
Denham’s “Chronology of Events” continued on right through to January 21, 2015, when the software was disabled by Saanich, but the key staff decisions and justifications for those decisions are captured in the text above. The striking differences between what Saanich staff said were their objectives and what Denham reported they achieved seem to all support Whistle Blower’s allegation that the software initiative’s objective was to monitor Atwell, and that other installations of the software were done to make it look like the mayor wasn’t being targeted. Sadness in Saanich At the April 13 meeting at which Saanich councillors voted to have Laidlaw give them an in camera report, Atwell asked his fellow councillors to support a motion that Saanich apologize to those people whose privacy rights had been violated. Citizens at the crowded meeting were given a chance to voice their opinions on Atwell’s motion. One of the speakers was Shellie MacDonald, who said, “I feel sad that someone in our staff for our community decided to spy on an elected official. I feel sad that that’s our culture. I feel sad that people on this council did not work as a team to find out what was going on, and talked freely about things like impeachment in the media afterwards. “We elected you folks to do your best and work together. You all promised to do your best for us, for the benefit of our community, and it’s just been sad for months. “In human culture, when people are sad, then the people who caused that need to apologize. And if there’s any doubt about who needs an apology…I do. And everyone who voted for you people because you promised to do your best and we haven’t seen it; and you promised to be responsible and we haven’t seen it. And you need to work as a team and you haven’t done that. We need that and we need an apology.” The majority of the councillors were unmoved. Apparently unwilling to fully accept Denham’s verdict that Saanich staff had broken privacy laws, and out of touch with the widely-held perception in their community that democratic representation had been eroded, if not insulted, most councillors weren’t feeling sad. They voted against any apology and then voted again to avoid even an expression of regret. David Broadland is the publisher of Focus Magazine.
  22. April 2015 Victoria City Council has been fooled again on the Johnson Street Bridge project. ONE OF THE GREAT PARADOXES of the Johnson Street Bridge Replacement Project is that as the costs go up and the benefits to taxpayers go down, the company managing the project for the City of Victoria makes more and more money. In a February 27 letter to the City, MMM Group asked for an additional $1.8 million. Although a precise account of MMM’s likely total take on the project is not yet available, the latest ask appears to push it close to $17 million. Yet in 2010 MMM estimated their services would cost $7.8 million. Since then, while MMM’s bill climbed, the project has undergone a continuous paring away of most of its original objectives. Since mid-2010 the following changes were made: rail service across the bridge was removed from the project scope; the width of the roadway was reduced and the safety zone for bicycles eliminated; the navigational channel was reduced to little more than its current width; the Wilkinson Eyre signature-bridge architectural quality was downgraded to Nanaimo Light Industrial; the material quality of the finished bridge was cheapened to the point where it will now be structurally reliant on 4000 gallons of epoxy grout; and, instead of being removed, the concrete piers of the existing bridge will be left in place, with unknown consequences. Will the narrow, unbraced concrete remnants fall into the navigational channel in an earthquake, and block it, hindering recovery? Even with all those reductions in scope, the overall cost of the project rose from $63 million in early 2010 to $92.8 million in 2012, and has since risen to between $113 million and $120 million today, when claims for more money from the various companies involved in the project are included. The City is in a “mediation” process involving all the parties asking for more money and would, naturally, prefer that everyone believe these claims are all just a big put-on and will vaporize into a cloud of goodwill between the builder of the bridge (PCL) and MMM, who are, right now, at each other’s throats. The latest loss in scope, which I wrote about last month, is the level of seismic performance MMM recommended to the City in 2010. Back then, MMM’s Joost Meyboom told City councillors the new bridge should be built to a “Lifeline” standard that would enable immediate access to emergency vehicles following an “M8.5” earthquake (read “magnitude 8.5”). Meyboom said that, compared to an “M7.5” earthquake, “M8.5” protection would cost an additional “$8.5 million.” Councillors then voted to include the “M8.5” standard in the project, and that level of seismic performance was widely promoted by the City during the referendum campaign. In fact, days before the referendum, Meyboom emailed City staff pointing out that a new study about the Cascadia subduction zone west of Vancouver Island suggested the zone could produce an M9.0 earthquake and so the City might want to consider—for more money—an even higher level of seismic protection. But last month I revealed here that an August, 2012 document authored by MMM showed it had secretly lowered the standard. The document, titled Johnson Street Bridge Seismic Design Criteria, stated that the bridge could experience “possible permanent loss of service” following an M7.5 earthquake. 
Just as surprising was the fact the document included no commentary whatever on the level of damage expected following a M8.5 earthquake, or whether emergency vehicles would be able to use the bridge following such a quake—a feature City councillors thought they had bought back in 2010. My article prompted a 50-minute back-and-forth discussion between City councillors and the project’s latest director, Jonathan Huggett, at a March 12 meeting. Huggett referred to MMM’s seismic design criteria only once at the meeting and then only referred to it as “a memo,” even though the document is listed in the construction contract for the project as a “Regulatory Document” that “forms part of the contract.” He avoided the contents of the document and instead expressed doubt that there would be any incentive to lower the seismic design criteria. “What would be the motive to reduce the design standard?” Huggett asked councillors. “Hardesty and Hanover are not responsible for the construction costs of this bridge. They designed it. So if it turns out it costs more money, it’s not their problem. And PCL didn’t design the bridge and they have a contract to build the bridge and they’ll build whatever they’re told to build. So I’m at a loss to understand who might have suggested reducing the standard and what possible advantage they would have got out of it.” As Huggett must have known, however, the contents of MMM’s document wouldn’t have had any input from either PCL or Hardesty and Hanover—it was written solely by MMM in August, 2012. MMM’s position at that time is easily understood. The company was trying to save the project. Before MMM published its seismic design criteria, all three companies bidding for the construction contract had indicated they couldn’t build MMM’s design on the City’s $66 million budget. MMM’s challenge was to find some way to lower those bids. Lowering the seismic standard for the project would have had exactly the same effect—increasing the likelihood that the project could proceed—as, say, advising the City to accept a bid that had only a four-percent contingency. Let me parse this point a bit, because it provides guidance on MMM’s credibility on the seismic issue. Why did MMM recommend that the City accept a bid with a four-percent contingency? Was it because MMM thought that was adequate? No. (I’ll provide proof for this later.) It was done to ensure that at least one bid was within the City’s affordability ceiling (the other two bids ended up $16 million and $26 million above the City’s budget). Otherwise the project likely would have been dead, and if it had died the City would not have signed—in November, 2012—a $9.2 million contract with MMM for additional project management and engineering. Recommending that very small contingency, though, isn’t the only proof that MMM were changing primary aspects of the project during the procurement process in 2012 to keep the project alive. Throughout the fall of 2012 they negotiated an agreement with Transport Canada to remove a significant cost from the project’s scope: removal of the existing bridge’s concrete piers. 
Although the only claim made for leaving the piers in place has been that they would provide “marine habitat,” an email from an MMM employee obtained by FOI shows that the move to leave the piers in place was done to reduce the scope of the project while the RFP was still open, in the hope of “maintaining a commercially competitive environment.” The takeaway from that is that MMM were actively reducing the scope in the hope of obtaining a viable bid. So MMM had a financial motive to save the project by reducing the physical scope, they engaged in that across a broad front, and this appears to have included lowering the seismic performance. Huggett, at the meeting, unable to see a motive, noted that the bridge had been designed using the most stringent codes. He spent much time listing these codes, but had apparently not noticed that MMM’s Seismic Design Criteria prominently stated that the provisions of all those codes were secondary to the stipulations of its own document. Although most of the councillors at the March 12 meeting readily accepted Huggett’s claim that there was nothing to be concerned about, Councillor Ben Isitt asked that MMM’s Johnson Street Bridge Seismic Design Criteria be projected on an overhead screen above the Council chambers. (Unbelievably, this had to be retrieved from Focus’ website.) When confronted with the actual document that was at the core of the issue, Huggett had no explanation. In a quick reversal of their earlier warm reception of Huggett’s comforting assurances, councillors passed Isitt’s motion asking Huggett to report back to them on why what he was telling them was at odds with what MMM’s Seismic Design Criteria stated. In an unusual motion, Huggett was directed by council to meet with me and answer questions I might have. But in the days that followed, Huggett declined to meet and refused to answer questions posed to him by email, stating that he would hold a technical briefing for all media once he had responded to the council’s request for an explanation. As this edition went to press, that technical briefing hadn’t taken place. Instead, Huggett sought an explanation from MMM, and on March 20 MMM responded by letter to Huggett. That letter was then made public. In part, it stated, “With respect to the bridge performance after a 2500-year return period seismic event, we wish to clarify that the 2500-year event is not part of the seismic design criteria specified in the JSB 2012 [Project Definition Report] and was not analyzed in the design.” That requires a little interpretation. A “2500-year return period seismic event” in Victoria has a rough equivalence to an earthquake with a magnitude of M8.5. That equivalence, in regards to this project, has been stated in writing by both MMM and Stantec. So MMM’s letter admitted that it could provide no documented evidence that engineers had considered what would happen to the bridge in an M8.5 earthquake. (It should be noted that in its critical review of MMM’s design, Kiewit Infrastructure’s engineers, who prepared a bid for the construction contract, rejected the mechanical concept, and instead proposed a design in which the moveable part of the bridge was firmly attached to the supporting bascule pier. Those engineers noted, “This method reduces seismic, mechanical and maintenance related technical challenges in the design.”) Huggett apparently had some doubt about MMM’s admission of having conducted no analyses for a 2500-year earthquake, because he then wrote back to MMM asking for an explanation. 
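Before getting to MMM’s response, it may help to pin down what a “2500-year return period” means in probability terms. The conversion below is the generic relationship used in probabilistic seismic hazard work, not something taken from MMM’s or the City’s documents; it is why design codes commonly describe the 2500-year event as ground shaking with roughly a 2 percent chance of being exceeded in any 50-year period.

\[
P(\text{exceedance in } t \text{ years}) = 1-\left(1-\tfrac{1}{T}\right)^{t} \approx 1-e^{-t/T}
\]
\[
T=2500\ \text{years},\ t=50\ \text{years} \;\Rightarrow\; P \approx 2\%,
\qquad
T=1000\ \text{years},\ t=50\ \text{years} \;\Rightarrow\; P \approx 5\%.
\]

In other words, the 1000-year and 2500-year analyses discussed here are not exotic scenarios; they are conventional benchmarks for rare, strong shaking over the life of a structure.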
MMM responded with a second letter which boiled down to this: because the bridge has been classified, on paper, as a “critical bridge,” there is an “inference” that “it is expected to be available for use by emergency and security/defense vehicles immediately following a 1:2500 year earthquake.” MMM’s letter continued on to state: “…it is not necessary (or required) to actually analyze the structure for a 1:2500 year earthquake for us to be able to confidently state that the JSB will be available for use by emergency and security/defense vehicles following a large earthquake.” Armed with these two letters, Huggett then made a presentation to councillors on the issue at a meeting on March 26. I’ll come back to that meeting later, but first I’d like to pose some questions that naturally arise from this situation, questions for which councillors serious about representing the public interest would want answers. First off, MMM is saying that they didn’t do an M8.5 analysis because they have written on paper that the bridge is a “Critical Bridge” and that, by definition, there is an “inference” that a “Critical Bridge” would provide the performance City council requested in 2010. But wouldn’t councillors ask themselves, “Since MMM cannot provide an actual set of seismic analyses for an M8.5 earthquake, why should I believe their assertion? Has the information they have provided me in the past been credible?” On the issue of credibility, MMM’s record is concerning. Let’s go back to the contingency issue as an example. When councillors were asked to approve a $66 million construction contract with PCL in December 2012, they were told: “The City’s Consultant, MMM Group, has reviewed the contract documents prepared by [the City’s legal advisor] and the City, including optimizations, contingency, project risks and the value engineering opportunities, and in their professional opinion recommend that the City proceed with the project and enter into a contract with PCL Westcoast.” That “recommended” contingency amounted to four percent. Then, in March 2014, after PCL had submitted a change order requesting an additional $9.5 million ($7.9 million net) as a result of delays and costs they attributed to MMM, MMM’s Joost Meyboom, in a letter to the City, stated that the design PCL submitted to the City in its bid was “at best 10 percent complete.” Meyboom then observed, “We note that it is not unreasonable for scope to vary by 30 percent from a 10 percent design and that this is normally accounted for with appropriate contingency.” So MMM first recommended that the City accept a four percent contingency, and then later, when it suited their purpose, suggested that it should have been 30 percent. Should councillors now trust MMM? Is its recent claim that the bridge will allow emergency vehicle access after an M8.5 earthquake to be trusted? Or should councillors trust what MMM said when it claimed there could be “permanent loss of service” following an M7.5 earthquake, which is the claim that’s included in the construction contract? The second question is this: We live in a region of high seismic hazard. What is the normal requirement for conducting seismic assessments when designing significant public infrastructure for our region? For guidance on this, councillors might want to look to the Port Mann Bridge Project in Vancouver. It has been built in an area that is considered to have a significantly lower level of seismic hazard than Victoria. 
Yet for that project the Province required four separate seismic analyses for the 2500-year return period earthquake—including a “damage assessment analysis.” Although both the Port Mann Bridge and the Johnson Street Bridge have the same “Lifeline Structure” designations, the Port Mann Project did four analyses; the Johnson Street Bridge Project did none. Councillors, no doubt, would want to know: Why weren’t these four analyses done for Victoria’s bridge? The third question is whether a set of 2500-year analyses would have represented significant additional cost or not. The only significant variables in such analyses are all related to the structure of the bridge itself. Those variables were all determined for the 1000-year analysis. Since these variables wouldn’t change between a 1000-year analysis and a 2500-year analysis—it’s the same bridge in each analysis—why would there be any significant cost to running both sets of analyses? Wouldn’t councillors want to know how much it would cost an engineer to enter a different value for spectral acceleration and then push the “analyze” button on the computer program? Since pushing the 2500-year button seems to be the normal practice in southwestern BC—witness the Port Mann Bridge Project—did the seismic engineers, in fact, push the button and later say they didn’t because they didn’t like what they found? By the way, the physical difference between a 1000-year event, for which MMM claims an analysis has been done, and a 2500-year event—which MMM admits it didn’t do—is very large. Last month I reported here that the energy released in the 2500-year event was 10 times that of a 1000-year event. That was incorrect. According to the US Geological Survey, an M8.5 earthquake releases 31.6 times as much energy as an M7.5 earthquake. Given that MMM’s Seismic Design Criteria state that the bridge could sustain “possible permanent loss of service” in an M7.5 earthquake, what would happen to it in an earthquake that was 31 times more energetic? Surely, councillors would want to know that, wouldn’t they? Although Huggett’s initial response to the issue was to tell councillors they didn’t need to be concerned because everything was being done according to code, at least having MMM’s letters in hand demonstrated that he did follow council’s direction to find an explanation for the discrepancy between his position that there was nothing to be concerned about and the actual stipulations of MMM’s Seismic Design Criteria. If I were a councillor, though, I would want to know if Huggett, as a paid representative of the chickens, went to anyone other than the fox for an opinion on whether the fox was having the chickens for lunch. So how did councillors do? On March 26, Huggett gave City council an update on the project’s escalating costs. I have written about these cost escalations in detail in previous stories and there’s nothing new on that front except that costs have gone up by an additional $4.8 million. Following Huggett’s presentation, Councillor Isitt noted that the project was “a disaster” and said, “I do have grave concerns about MMM’s performance.” When Isitt asked Huggett whether MMM could be replaced as project manager, Huggett told councillors that the one MMM employee working on the job site was putting in long hours and said MMM “was doing a good job.” Councillors’ refusal to approve the full $4.8 million requested by Huggett amounted to closing the chicken coop door after the hens had already been eaten by the fox. 
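For anyone who wants to check the 31.6 figure cited above, it follows from the standard energy–magnitude relation used by the US Geological Survey; this is textbook seismology, not something drawn from the project’s engineering documents.

\[
\log_{10}E = 1.5\,M + \text{constant}
\quad\Longrightarrow\quad
\frac{E_{M8.5}}{E_{M7.5}} = 10^{\,1.5\,(8.5-7.5)} = 10^{1.5} \approx 31.6
\]

The same relation explains the earlier slip: a one-unit jump in magnitude corresponds to roughly a 10-fold increase in recorded shaking amplitude, but about a 31.6-fold increase in the energy released.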
Although councillors still refuse to acknowledge that the current cost of the project to City taxpayers is in the range of $113 million to $120 million, councillors being out of touch with reality on this project isn’t news. The escalation update was followed by a long in camera meeting on the City’s legal difficulties with MMM and PCL. Claims that the City is in mediation with the various parties have been made for several months now, and that has been useful to City staff in preventing councillors from asking, in public, substantive questions about the project’s woes. Ostensibly this muzzle has been put on councillors to protect the City’s position in any legal action that might occur if mediation fails. At the same time, though, it prevents public discussion of who at City Hall is responsible for decisions made that seem to have left the City without any case for pursuing legal action against their project manager, including holding MMM to account for its verbal recommendation to the City on the four percent contingency in the contract with PCL. The record of several closed meetings on this project, obtained by FOI, shows that advice given to councillors by City staff at these hidden-from-the-public-eye meetings has usually led to decisions that later turned out to be based on misinformation. Following the March 26 closed meeting, Huggett presented his report on the seismic issue to councillors. It was short and to the point. Huggett blamed the issue on those raising it, calling media reports on the issue “irresponsible.” He invoked MMM’s two letters as proof there was nothing for councillors to be concerned about and expressed dismay over the amount of time he’d spent not answering questions. Only Councillor Isitt asked anything close to a substantial question, but he, evidently, didn’t comprehend that Huggett hadn’t provided him a substantial answer. What seemed evident to this observer is that the fox has now infiltrated the chicken house, and the chickens can’t tell the difference between a rooster and a fox. David Broadland is the publisher of Focus Magazine.
  23. March 2015 Engineers recommended a high level of seismic protection for the new bridge and then, as their cost estimates went south, they secretly cut that level of protection to the bone. A DOCUMENT OBTAINED THROUGH AN FOI shows that the new Johnson Street Bridge could experience “possible permanent loss of service” following a magnitude 7.5 earthquake that engineers have estimated has a “30-35 percent chance of occurring within the next 50 years.” This is a much lower level of seismic protection than was recommended to the City of Victoria in 2010. Engineering company MMM Group strongly advised the City to pay an extra $10 million for a level of seismic performance that would protect the bridge in a magnitude 8.5 earthquake. Councillors voted to fund the recommended level of seismic protection and the issue helped win a referendum that approved the bridge. By definition, a magnitude 8.5 (M8.5) earthquake releases 10 times as much energy as a magnitude 7.5 (M7.5) earthquake. Surprisingly, the document that specifies the lower seismic standard was written by MMM Group. Their Johnson Street Bridge Seismic Design Criteria is a set of standards to which the bridge has been seismically engineered. It delineates the allowable impacts different strengths of earthquakes can have on the bridge (far right column of table below). MMM’s seismic design criteria were created in August 2012, just as three construction companies were preparing bids for a contract to build the bridge. The design criteria were not included in the publicly-released Project Definition Report, a move that kept the lower level of seismic protection hidden from councillors and public view. Discarding the M8.5 seismic standard recommended by MMM would have cut costs and helped to keep the project within the council-approved budget. But that decision appears to represent a conflict of interest for MMM. If the City hadn’t received bids within its self-imposed limit of $66.1 million, the project likely would have failed. If that had happened MMM would have lost a $9.1 million contract for additional project management and engineering during the construction phase. So MMM had a financial incentive for dropping the seismic standard it had recommended. The role City engineers played in the lowering of the seismic standard is unknown. The City’s Director of Engineering and Public Works Dwayne Kalynchuk did not answer questions posed to him about the lower seismic standard. Kalynchuk confirmed the bridge is being built to MMM’s 2012 seismic design criteria. When City councillors were asked to approve a construction contract for the project in December 2012, City engineers didn’t warn councillors about concerns raised by two of the three bidders, Kiewit Infrastructure and WCC Construction. Kiewit had advised the City the design “may represent a fundamentally high risk and expensive design approach.” Instead, City staff recommended going ahead with the project on the basis that the third company, PCL Constructors Westcoast, had provided the only bid within the City’s budget for the project. 
Asked if City councillors had been advised about the lower level of seismic protection, Councillor Geoff Young said, “I would think I would have remembered if we had been told the seismic standards were being reduced.” Young added, “I would certainly say that if indeed the new bridge has been designed to a much lower level of seismic protection than engineers recommended, lower than council requested, and lower than was promised in the referendum campaign, then this is a serious departure from what was expected. If so, I think we should request information from Hardesty & Hanover [the company that engineered the bascule leaf and main piers] about the design standard achieved, if only for the purposes of emergency planning and for planning of any Bay Street Bridge renovations.” As Focus was going to press with this edition, City councillors met privately with Karen Martin, a partner in the law firm Dentons Canada, LLP. Martin’s resumé notes that she “practices in the areas of construction/infrastructure” and has “experience as counsel on large construction trials.” Presumably the City is hiring Martin to represent it during coming litigation involving MMM, Hardesty & Hanover, and/or PCL. Since March, 2014 the City and the companies have been in dispute over delays caused by unresolved design issues, cost increases, and steel fabrication problems in China. Attempts to obtain records about these issues through FOI have been unsuccessful. Whether or not the significantly diminished seismic protection of the bridge is at issue in the preliminary legal maneuvering is unknown, but the low level of protection implied by MMM’s Seismic Design Criteria raises questions about whether the new bridge is being constructed to adequate seismic standards. In 2009 City councillors voted to replace the current bridge after being told it would collapse in an M6.5 earthquake. In June, 2010 councillors met twice with engineers to consider what level of seismic protection the project should include. A presentation to City of Victoria councillors by MMM engineer Joost Meyboom on June 14, 2010 stated there was a “35 percent probability of a major quake (M7.0 to M7.9) in the next 50 years.” Meyboom recommended that a new bridge “be designed for an M8.5 earthquake.” He told councillors, “If you’re going to spend $100 million on a facility, the premium to pay for a very high seismic performance is a relatively low price for insurance.” Much the same information was in a written report to councillors at a meeting on June 17, 2010. Signed by Kalynchuk, the report again warned councillors there was a “30-35 percent probability of experiencing a major earthquake (in the range of M7.0 to M7.9) in the next 50 years as per Natural Resources Canada.” The report added, “Staff agree with [MMM’s] recommendations that the seismic design should be at the highest level under the current bridge design code, which is for an M8.5 seismic event…” At the June 17 meeting Meyboom told councillors: “The premium you pay to go from 7.5 to 8.5 is not a big number when you’re talking about spending in excess of $80 million dollars on a project.” That premium, Meyboom advised councillors, was $8.5 million. 
“I wouldn’t call myself an expert in seismic,” Meyboom said, “but I’m very knowledgeable about it…The risk of earthquake in Victoria, just to put it in perspective, is the highest in Canada, and it’s comparable to the highest in North America.” Following the staff presentation, councillors asked questions and expressed their understanding of the seismic issue. The City and others videotaped the meeting, so statements made by the engineers were recorded. The engineers’ recommendations—and the way in which that advice was understood by councillors—are unambiguous. Then-mayor Dean Fortin told the meeting that one factor that convinced him the M8.5 standard was essential was “getting the emergency vehicles back and forth—and not only on that day [of the earthquake]—but for the next year or two or three or however long it takes.” Fortin also emphasized the need for the M8.5 standard to insure protection of the taxpayers’ “investment.” “Do you spend $70 to $80 million on a bridge and not get the insurance and then it falls down?” he asked. “That’s a bit of a penny-wise and pound-foolish approach.” The mayor summed up: “That’s the lens I’m putting it in. Do we go to that 8.5?” A motion by then-Councillor John Luton to “approve the seismic design of both the rehabilitation or replacement options at the M 8.5 (‘lifeline’) level” was passed, with only Councillor Geoff Young opposed. The imposition of this standard made the option of rehabilitating the existing bridge to the same seismic standard more expensive than replacement—according to MMM—making it easy for councillors to then decide to put the borrowing of money for a replacement bridge, rather than rehabilitation, to a referendum. In that referendum, the City informed voters that it was essential to build the bridge to withstand an M8.5 earthquake. For example, brochures sent to every household stated: “The safety of the travelling public is top of mind. The bridge will be upgraded to a lifeline structure able to withstand an 8.5 magnitude earthquake—the highest standard of earthquake protection—to ensure the safety of users, disaster response capability, protection of investment and post-disaster recovery.” Those same brochures stated: “Victoria is located in the most active seismic zone in Canada and recent studies have indicated that there is a 30-35 percent probability of a major earthquake occurring in Victoria within the next 50 years.” The City’s referendum campaign material defined “major earthquake” as one having a “magnitude of 7.0 to 7.9.” A majority of electors voted “Yes” to borrow for a new bridge in the referendum held on November 20, 2010. That appears to be the last time anyone on City council or in the City’s engineering department showed interest in the issue of seismic protection or how much it was costing. By August 2012, at the time MMM produced its Seismic Design Criteria, that company’s concern seemed to have shifted from preventing the bridge from collapsing in an M8.5 earthquake to preventing the bridge project from collapsing under the weight of underestimated costs. A key question is this: Has a significant risk to the public’s investment been imposed by lowering the level of seismic protection, as MMM warned against back in 2010? Since ten times as much energy is released in an M8.5 earthquake as compared with M7.5 (see graphic below), MMM appears to have lowered the level of protection to only 10 percent of what it had previously recommended. 
Moreover, the expected outcomes for the new bridge following even an M7.5 earthquake are much worse than the City expected following an M8.5 earthquake. Comparisons with the seismic design criteria used for Seattle’s South Park Bridge are telling (see table below). That structure, which has a section that opens for marine traffic like the Johnson Street Bridge, was completed in 2014. The design chosen was a tried-and-true double-leaf bascule; it spans about the same channel width as the Johnson Street Bridge. For the South Park Bridge, the expected impact following an M7.5 earthquake is “minor to moderate damage with some loss of operation.” The Johnson Street Bridge, on the other hand, can expect to experience “possible permanent loss of service” in an M7.5 earthquake. “Permanent” implies the bridge would not be repairable. The two sites have similar expected peak ground acceleration values, which are a measure of the expected ground motion for different magnitudes of seismic events. Given that, comparing seismic design criteria for each of the major elements of the two bridges gives the distinct impression that there is something fatally flawed about the Johnson Street Bridge’s design. One wee flaw: Although the bascule leaf is expected to weigh close to 2300 metric tonnes, it won’t be firmly attached to anything—no anchor bolts will keep it from jumping around in an earthquake. Instead, its two 15-metre-diameter rings will float on steel rollers. Moveable span locks at the west end of the leaf and at the east end of the counterweight—both of which will need to fit loosely so they can be easily operated several times a day when the bridge opens and closes—are all that will hold the bascule leaf in place during an earthquake. If they fail, the bridge could experience “permanent loss of service.” MMM’s seismic design instruction for these span locks in an M7.5 earthquake is: “failure may occur but this should not lead to global structural collapse.” The words “should not” are not particularly reassuring. Moreover, MMM’s seismic design criteria say nothing at all about what the span locks “should do” in an M8.5 earthquake. Curiosity about what the bridge engineers had discovered from seismic analyses carried out on the design prompted me, in November 2013, to file an FOI for any analyses done on the design. For the South Park Bridge, engineers did extensive analysis of what was needed before they chose the mechanical design. Their analyses were made public even before construction began. For the Johnson Street Bridge, nothing had been released. In response to my FOI the City advised me they had “no records.” Lisa Helps, a councillor at the time, offered to make a public request for these records, which she did at an April, 2014 council meeting. At that meeting Kalynchuk told Helps, “I believe that information has been requested under freedom of information and has been released.” Then-Mayor Fortin, who has always been quick to believe that his staff were doing a terrific job on the bridge project, prompted Kalynchuk: “And posted to our website...?” Kalynchuk provided a reassuring response: “Oooh, it’s binders of material, so I’m not sure that’s available. We’ll see if there’s a summary that can be posted.” I filed a second FOI, this time for the “binders of material.” I also asked Kalynchuk if he’d seen the actual seismic analyses. Kalynchuk replied, “…the staff from [Hardesty & Hanover] stated at their April presentation a full seismic review was conducted on the bridge. 
While I assumed some data was provided to the City, this was incorrect.” Meanwhile, my second FOI for seismic analyses worked its way through the system. Once again, the City’s response came back: “No records.” I then stepped into the long queue at the Office of the Information and Privacy Commissioner and many months passed. Then, finally, an investigator from the Office of the Information and Privacy Commissioner told me that MMM had informed the City the seismic analyses might contain “trade secrets,” so MMM wasn’t obligated to release them. The City admitted to the OIPC investigator that the company that performed the seismic analyses—Hardesty & Hanover—had provided only a verbal account of the results of the analyses to MMM, and MMM had provided only a verbal account to the City. No written communications discussing the seismic analyses by any of the parties had, apparently, taken place, because my FOI for those records also proved fruitless. If you’re getting the impression that MMM and Hardesty & Hanover didn’t want anyone to see those seismic analyses, then you’re reading this the way I am. A couple of months ago, though, I asked Mayor Helps to intervene. She agreed, and between her assistance and the threat of dragging the City to an inquiry at the Office of the Information and Privacy Commissioner, the word got back to Hardesty & Hanover’s New York office to produce something. I am reporting to you that as Focus went to press, 3183 pages of scanned computer printouts were delivered by courier to our office. Most of the pages contain long strings of numbers—the raw data that would be used to do the seismic analyses like those released for Seattle’s South Park bridge. A rough guess is that there are 687,096 of them—most seven digits long, and as soon as I figure out what they are and enter each of them into some—at this moment—unknown computer program, I’ll report back to you on what I discover. Don’t hold your breath, though. Back in 2010, when the City was deciding whether to rehabilitate the existing bridge or build a new one, MMM Group led the City to believe it could build a new “signature” bridge for $77 million. That included a $10 million premium for seismic protection of the City’s investment to an M8.5 standard. The City expected this level of protection to provide immediate access to emergency vehicles following a catastrophic earthquake. Although the cost of the bridge is now hovering around $110 million, the seismic standard has been significantly lowered. It would appear MMM never consulted with the City, officially, on lowering that standard. That will leave the City in the position of having no disaster response route across the harbour, a situation that Meyboom himself said back in 2010 the City needed to address. As Councillor Young has noted, this means the City may have to re-examine the extent to which the Bay Street Bridge is upgraded, which could mean expenditure of millions more than the City has budgeted. Any doubts about the level of seismic protection included with the new bridge could be cleared up by the City insisting that MMM and Hardesty & Hanover release comprehensive seismic analyses like those released for the South Park Bridge—not just raw data—that can be independently verified by someone not associated with the project. David Broadland is the publisher of Focus Magazine.
  24. February 2015 In their coverage of two stories, was the local daily concocting a case for an overturn of November’s election in Saanich? BILL CLEVERLEY, municipal affairs reporter for the Times Colonist, described his “favourite news story of 2014” in a December 20 piece called A gotcha moment on April Fool’s Day: “Working with Saanich Mayor Frank Leonard and Oak Bay Mayor Nils Jensen, we concocted a story about them approaching the Province to rename the University of Victoria to the University of Saanich Oak Bay—USOB—to better reflect where the campus is located.” Two weeks later, Cleverley wrote a short story that, like his April Fool’s joke with Leonard and Jensen, was a thin concoction of inaccurate information and imagined public interest. This time, though, the laughs were on newly-elected Saanich Mayor Richard Atwell, whose private life Cleverley exposed to public ridicule, from coast to coast to coast. That public shaming began with Cleverley's January 5 piece titled, “Police called after Saanich mayor involved in altercation.” Cleverley’s complete description of the actual events went like this: “Saanich police were called to an altercation involving newly elected Mayor Richard Atwell, who is the chairman of the police board, in December. Sources say that police were called to the home of an Atwell campaign supporter about 11 pm on December 11. Atwell, who had been sworn in as mayor 10 days earlier, had apparently been in the home with the woman when her fiancé arrived. Sources say an altercation between Atwell and the man ensued and police were called.” Cleverley’s story then quoted Saanich police spokesperson Sgt Steve Eassie as saying, “I know that our relationship from the past is one that I would normally be able to share details with you, but I’m not able to share anything.” The way most people would understand that story is something like this: Atwell and a woman were in the woman’s home late at night; the woman’s partner came home, caught Atwell and the woman in a compromising situation, and a fight ensued between the two men. The fight might have attracted the attention of neighbours, or a passerby on the street, because police were called. Later, police were unexpectedly constrained about talking to Cleverley about the case by someone—possibly the new chair of the police board, Mayor Atwell. The story clearly had prurient interest, but it’s not news that politicians have private lives. What was it that made this incident important to report on and not just another opportunity for scandal-mongering? Before I consider that question, let’s look at how the TC dealt with another opportunity for exploiting a prurient-interest story involving a mayor of Saanich. In June of 2009, Frank Leonard’s divorce from Elaine Leonard was finalized. About the same time, Leonard and former Saanich Councillor Jackie Ngai conceived a child. Back then, rumours of an affair between Ngai and Leonard had circulated for some time. During that time, the Times Colonist didn’t publish any stories about the Leonards’ divorce, Frank Leonard’s marriage to Ngai, the birth of their child or how the two politicians had become a couple in the first place. There were no questions raised about whether Ngai’s and Leonard’s relationship had conflicted in any way with their duties as elected officials. That is to say, in Leonard’s case, what happened behind closed doors was off limits. That was fine with everybody. So why was the December 11 incident different? 
The difference is that the Atwell affair involved a 911 call, but whether that fact alone makes this a public matter depends entirely on the exact circumstances of that call. In the January 5 story, the only rationale provided for why the paper was exposing Atwell’s private life to public ridicule came in the form of comments from Integrity BC’s Dermod Travis. Cleverley quoted Travis, who said Atwell’s role as chairman of the Saanich Police Board “puts him in a very difficult position and it also, frankly, puts the police in a very difficult position because, in the future, it could raise questions going both ways.” Travis’ comments in the story read more as advice to Atwell that he ought to contain the damage than as a reasoned explanation of a significant public interest at stake. The paper offered no other explanation for why this was any different from the Leonard-Ngai affair. Noteworthy is that Travis articulated no case for a “conflict of interest.” He has since confirmed to me that he never used the term “conflict of interest” in talking with media about the Atwell incident.

Two days after Cleverley’s story was published, Atwell provided information about the incident that differed substantially from the sparse details in Cleverley’s 80-word account of what had happened that winter night. Atwell said he was invited to the home and both the man and woman were present when he arrived. After deciding he wasn’t welcome at the home, Atwell started to leave but the man pushed him and then grabbed him from behind. Atwell recently told me he didn’t hit back, he left the house, and once on the street—out of concern for the safety of everyone in the house—he called 911. It was around 8 pm. (Not 11 pm, as the TC story had it.)

There are two troubling problems with Cleverley’s January 5 story. The first problem is that Cleverley didn’t mention that Atwell placed the call to police. This is such a central piece of the story that if Cleverley was aware Atwell had placed the call, he had a journalistic duty to include that information in the story. Atwell placing the call puts a completely different flavour on the story: he had nothing to hide. On the other hand, if Cleverley knew Atwell had placed the call but intentionally kept that out of the story, Cleverley would have been hiding from readers an important fact about what happened. Why would he do that?

Giving Cleverley the benefit of the doubt, let’s assume the story was published without anyone at the paper knowing that Atwell himself had placed the call to police. We then collide with the second troubling problem: the paper’s sole rationale, offered after the fact, for publishing the story in the first place. The day after Atwell gave his first account of what happened, a TC editorial explained why the paper published Cleverley’s January 5 story. An unidentified editorial writer stated: “What happened in a private residence in Saanich on the evening of December 11 was none of the public’s business—until the police were called. At that point, it became a public matter, especially given that the chairman of the police board was involved in the incident.” The writer went on to restate that assertion in a slightly different way: “While we accept that what occurred was a minor incident, that isn’t the issue. Politicians’ private lives are by and large ignored by the media, but that changes once those private lives overlap with public duties.
When the head of the police board calls his police department to resolve a dispute, there’s an obvious potential for conflict.” Let me distill that down a bit. The paper’s position now was that whatever went on in that house was no one’s business until Atwell called the police “to resolve a dispute.” Then it became the public’s business. I’ll address the validity of that argument in a moment, but first let me draw your attention to a serious inconsistency in the only rationale the paper has offered for publishing the story. As mentioned earlier, the paper’s January 5 story contained no mention that Atwell called police, yet it ran the story anyway. Only after Atwell revealed that he had called police did the paper make his placing of the call the reason why it had published the story two days before.

Let’s consider the paper’s claim that because Atwell called police to the house that evening “to resolve a dispute,” an “obvious potential for conflict” arose. To be clear, it’s Atwell’s position as chairman of the Saanich Police Board that makes his case special, in the paper’s opinion, but the paper didn’t describe any mechanism by which Atwell’s phone call to police could create a “conflict.” Atwell, however, says that he called 911 out of concern for the safety of those in the home, not “to resolve a dispute.”

Two days later, TC Editor-in-Chief Dave Obee made another attempt at justifying why the paper had published the January 5 story. Obee wrote, “Atwell’s private life became public news because a line was crossed. When private matters might affect an elected person’s ability to do the job, the public has a right to know.” Again, the TC’s read on this is questionable. If a mayor gets the flu and has to remain at home for a few days, does the public need to know? Where does Obee draw the line on what private matters might affect performance?

If Obee seemed unable to clearly articulate exactly how “a line was crossed,” we might presume he meant it had something to do with a potential “conflict of interest” because his editorial included this line: “The key point: Atwell is the chairman of the Saanich police board. All of us, elected to public office or not, have to avoid any perception of conflict of interest in our careers. And yes, these considerations apply to the media as well.” Obee then gave an example of a recent case of a conflict of interest involving a media personality in eastern Canada. Notably, he made no attempt to explain how a conflict of interest could arise from Atwell’s 911 call.

Obee’s concern for a perception of conflict of interest is at odds with his record of apparent lack of interest in the subject, especially in regard to mayors and police. Just before the November election, the Saanich Police Association endorsed Leonard, who, if re-elected as mayor, would have again served as chairman of the Saanich Police Board. Obee and Cleverley apparently missed the implication. Election endorsements from police associations have the appearance of a quid pro quo arrangement. In exchange for the endorsement—and presumably members’ electoral support—the Saanich Police Association would expect something back from Leonard. Yet Cleverley and Obee overlooked that story entirely. Not one word. The same thing happened in the 2011 election: Saanich Police Association endorses Leonard, TC looks the other way.
I asked Dermod Travis about these endorsements; he called them “highly inappropriate.” I sent Travis the BC Police Board Handbook section on “Conflict of Interest” and asked him to comment on how Atwell’s 911 call and his position as Police Board chairman might constitute a conflict of interest. Travis declined to tackle that riddle. If Travis can’t make a connection, who can? Travis had been a frequent expert commenter in both the TC’s stories and the media frenzy that followed publication of Cleverley’s story.

Recently, I asked Obee by email what his thinking on this story had been. He had the ultimate say in whether the story was published or not, and his decision to run that January 5 story has led to tremendous damage to Atwell’s reputation. Does Obee stand by the accuracy of the story? Obee didn’t answer that question directly. Instead, he said, “Mayor Richard Atwell gave a different account after the initial story ran. We have no reason to disbelieve him.” I asked Obee why, if he knew Atwell had made the call to police on December 11, that fact had been left out of the first story. “We believed that information to be true, but could not confirm it,” Obee said. Does this ring true? Cleverley got other significant details wrong—the time and the circumstances. Next to nothing about Cleverley’s story was truly “confirmed,” but Obee okayed the story anyway.

If he was so concerned about conflicts of interest, why hadn’t Obee covered the Police Association’s endorsement of Leonard just before the election? “We were covering all the municipalities, and we had to be selective,” Obee said. “The endorsement did not seem that significant at the time. We mentioned it for the first time in an editorial after the Atwell incident became major news.”

Consider how differently the Saanich News covered both the Police Association’s political endorsement of Leonard and the 911 call story. In the case of the endorsement, reporter Daniel Palmer’s story included comment from the Police Association, Leonard and Atwell. His story was straightforward election reporting: accurate information fairly presented on the day it happened. In a January 9 editorial the Saanich News explained why it had refrained from joining the media-mobbing that followed Cleverley’s January 5 story: “Greater Victoria media outlets were falling over one another this week chasing a story published by the Times Colonist that quoted sources as saying Saanich Mayor Richard Atwell was involved in a December 11 incident where police were called to a private residence…The News chose not to publish Tuesday’s story of a story because we received no confirmation of facts, no first-person accounts of the incident, no evidence nor a police report to independently corroborate the TC’s story. No one agreed to go on record. In short, it didn’t meet our basic requirements for publication.”

Cleverley’s use of unnamed “sources” for his story didn’t raise a yellow flag for other media, though. Even though the information was inaccurate, most mainstream media ran with the TC’s story. That paper’s policy on using unnamed sources was explained to me by Obee: “We allow unnamed sources when careers or lives or family relationships might be put at risk if the identities are known. Even then, we need to know and trust the people involved, and we need confirmation from at least one other source.” As any Journalism 101 textbook would confirm, that’s a very low bar for using unnamed sources.
At the least the TC should have sought corroboration from two other sources and included an explanation in their story about why the source had sought anonymity. Was the source a Saanich politician with an axe to grind? Was it a member of the Saanich Police Association ticked off at the public’s repudiation of its candidate? Moreover, the use of unnamed sources ought to be confined to significant stories involving an important public interest—which is entirely different from a story that would interest the public because of its sensational innuendo. Cleverley and Obee didn’t demonstrate this story’s public-interest legitimacy; not in the story itself or in the shifting, defensive editorials that came later. Cleverley’s use of unnamed “sources” figured prominently in another series of stories that also served to discredit Atwell: what’s come to be known as the Paul Murray affair. ON DECEMBER 11, THE TC PUBLISHED A STORY by Cleverley titled, “Mayor tries to oust Saanich’s top bureaucrat.” Cleverley stated: “In a move that could cost taxpayers hundreds of thousands of dollars in severance, Saanich chief administrative officer Paul Murray is being pushed out the door by newly elected Mayor Richard Atwell, sources say. Atwell, elected on a platform of change, met Murray before being sworn in as mayor to tell the administrator he was done and to begin negotiating a ‘resignation package,’ the sources said.” Cleverley went on to report that Atwell had not returned calls for comment. He noted, “If Murray leaves, it won’t be cheap. According to Saanich’s latest Statement of Financial Information, Murray was paid $199,881 and was reimbursed for $9,193 in expenses in 2013. Murray’s contract calls for a minimum payout of 18 months’ salary or about $300,000 for early departure. Benefits could push the cost into the $400,000 to $500,000 range.” On December 17, a second story by Cleverley appeared: “Saanich mayor forces out CAO; council condemns action.” The story noted: “Actions taken by Saanich Mayor Richard Atwell to force chief administrator Paul Murray out the door have been condemned by his council and will cost the municipality $480,000.” Cleverley repeated a claim made in the earlier story: “Sources told the Times Colonist last week that prior to being sworn in, Atwell, accompanied by lawyer Troy DeSouza, who does not work for the municipality, met with Murray to tell him Atwell wanted him gone.” His story included paragraphs from a Saanich Council media release, including these: “‘The actions taken by Mayor Atwell left council with no viable options other than to proceed to end the employment relationship with Mr. Murray,’ the statement says. ‘Council is also concerned about the financial impact the mayor’s actions have had on the citizens of Saanich—a total payment of $476,611 [inclusive of accrued vacation of $55,448].’” Most people would read those stories something like this: Atwell, for no good reason, and without consulting Saanich councillors, tried to fire CAO Paul Murray. His council attempted to intervene but Atwell prevailed and councillors were forced to go along with Atwell’s firing of Murray. Atwell’s actions cost taxpayers $476,611. He alone is responsible for that cost. Atwell wouldn’t respond to calls for comment. Once again, the unnamed “sources” seem to have provided Cleverley with inaccurate, incomplete information. Atwell recently described to me the chain of events that led to Saanich Council’s decision to terminate Murray’s employment contract. 
I asked him if the Times Colonist had ever made a serious attempt to get his side of the story. “No,” Atwell said. He recalled Cleverley leaving telephone messages asking for a single piece of information, but never an offer to have a dialogue. So let’s consider Atwell’s version of the events that led to Murray leaving Saanich. After his election, but before he had been sworn in as mayor, Atwell had informal meetings with Saanich employees in order to familiarize himself with his new workplace. One manager he met told Atwell he had heard Murray say before the election that he (Murray) wouldn’t work with Atwell if he was elected. Atwell told me he has a written statement from the manager to this effect. The manager also told Atwell that other Saanich employees had told him that Murray had informed another meeting of Saanich employees that he (Murray) would not work with Atwell if he was elected. At the same time, Atwell had been put in contact with lawyer Troy DeSouza as an advisor on local government. Amongst other subjects, Atwell and DeSouza discussed the information provided by the manager mentioned earlier. DeSouza advised Atwell that this was “essentially a vote of no confidence” on Murray’s part. DeSouza then contacted Saanich Municipal Solicitor Chris Nation to advise him that Atwell and he were going to have a meeting with Murray. On November 26 or 27, Atwell told me, “I invited Paul into my office. I thanked him for his service and told him I thought I needed a new CAO to go forward and I asked him if he would be willing to take a dignified exit in line with his contract. He asked me if I was asking him to resign and I said ‘No.’ He asked me if I was recording the meeting and I said ‘No.’ He and DeSouza conversed about how this could be handled, then Paul went away to talk with his lawyer.” “He seemed keen to leave [employment with Saanich] when we had this conversation,” Atwell said. Later, DeSouza informed Atwell that Murray had come back to him and said something like: I want to leave on December 1. Make it happen. At that point Atwell had not been sworn in as mayor and had no official capacity at Saanich Municipal Hall. “I wasn’t forcing him out; he really wanted to leave. And so it all seemed like there was going to be some kind of agreement and it could be done quickly. Whether council was going to approve it was still up to council.” Atwell continued his account of what happened next: “Through the municipal solicitor, and working with the municipal clerk, I was trying to set up a council meeting to talk about Paul’s exit. Nothing had been negotiated and council had not authorized negotiation.” At that time Atwell was also talking with councillors one-on-one about the Murray situation, including Councillors Vic Derman, Vicki Sanders and Judy Brownoff. But councillors were unable, or unwilling, to meet officially, Atwell said. “Most of the councillors had a heads-up on this and I was planning to go into the meeting [that Nation was organizing] with the rest of the councillors.” At that point, Atwell said, “Nation decided he wasn’t going to be involved because Murray was a friend of his. DeSouza was then engaged by Nation to represent Saanich in this negotiation with Murray, who was to have his own lawyer. 
But [Councillor] Susan Brice had gone behind the scenes and she’d sent a letter to council…stating that DeSouza wasn’t the regular legal counsel that dealt with personnel issues, and that she was uncomfortable with this [arrangement of lawyers].” Atwell feels Brice is still loyal to Leonard and noted that they had shared the same campaign office for many elections. Regardless, as a result of the letter, Atwell says, DeSouza was “kicked out” of an in camera council meeting held on December 8. “By the time council met on December 8, they had put themselves in a position where Mr Murray could have sued for constructive dismissal,” Atwell said. As Cleverley pointed out in one of his stories, Atwell had no legal authority to dismiss Murray; that required a vote of at least two-thirds of councillors, and as Atwell told me, councillors could have chosen to give Murray a vote of confidence and invite him to stay. They didn’t do that and Murray didn’t want to stay. This was, obviously, a much more complex story than “Saanich mayor forces out CAO; council condemns action.” Atwell was willing to tell what he could to Cleverley, but says he was not given the chance. Another aspect of this story that never made it into Cleverley’s brief stories was a comparison of the terms of Murray’s contract with the contracts of other recently-departed top municipal officials. As noted in Cleverley’s first story on the Murray affair, his contract stipulated “a minimum payout of 18 months’ salary.” Murray had been CAO for 2.5 years and was paid $199,881 annually. Former Victoria City Manager Gail Stephens received $240,346 in the last full year of her employment (2012). Her employment contract stipulated “12 months written notice or payment of salary and benefits in lieu thereof” in case she was terminated without cause. She had been City Manager since 2009 and her contract had been renewed through to 2017 with the same “12 months” stipulated for severance. Stephens resigned in 2013, receiving no severance. The City of Victoria’s second in command, Operations General Manager Peter Sparanese, was terminated without cause a month later. His severance agreement, obtained by FOI, provided payment of “10 months salary plus 13 percent in lieu of benefits.” He had received $217,965 in salary in 2012. His payout was $205,249. Murray, who was paid $18,000 a year less than Sparanese, received $421,163. Even if Murray had accepted the contract-stipulated minimum, he still would have got $100,000 more than the more highly-paid Sparanese. Where did Murray get such a sweetheart deal? The terms were negotiated and approved by the previous Frank-Leonard-led council, which included Councillors Brice, Derman, Sanders, Brownoff, Dean Murdock and Leif Wergeland—all of whom, according to Cleverley, had “condemned” and “censured” Atwell for the payout. The councillors even blamed Atwell for Murray’s accrued vacation pay—all $55,448 worth—even though this was owed to Murray regardless of whether he left or stayed. I asked Obee why the Times Colonist had used those exact words—“condemned” and “censured.” The actual wording from Saanich councillors on their expression of distaste for having to follow through on the contract they had previously approved was “does not support.” Obee said, “The words we used were fair and accurate.” In the TC’s coverage of this story, it portrayed Atwell as a danger to the public purse carelessly overstepping the boundaries of his office. 
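For readers who want to check the contract arithmetic cited above, here is a minimal sketch using only the figures already reported. The one interpretive assumption is that the 13 percent in lieu of benefits is applied on top of the 10 months’ salary in Sparanese’s agreement, an assumption that reproduces his reported $205,249 payout almost exactly:

```python
# Rough comparison of the severance figures cited above (all amounts in dollars).
murray_salary = 199_881                               # Murray's 2013 salary
murray_contract_minimum = murray_salary * 18 / 12     # contract floor: 18 months' salary
murray_actual_payout = 421_163                        # reported payout, excluding accrued vacation

sparanese_salary = 217_965                            # Sparanese's 2012 salary
sparanese_payout = sparanese_salary * 10 / 12 * 1.13  # 10 months' salary plus 13% in lieu of benefits

print(round(murray_contract_minimum))                     # about $300,000: the contract minimum
print(round(sparanese_payout))                            # about $205,250: matches the reported $205,249
print(round(murray_contract_minimum - sparanese_payout))  # roughly $95,000: the basis for the "about $100,000 more" claim
```

Even at the contract minimum, in other words, Murray would have left with roughly $95,000 more than the more highly paid Sparanese; at the actual payout, the gap was more than $215,000.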
The paper made no serious attempt to get Atwell’s side of the story; it provided no context for the reader to understand fully why Murray was eligible for the generous payout he received, or how that payout compared with other recent local cases. Cleverley’s use of unnamed sources—who likely were one or more councillors breaching their public oath of confidentiality—is in itself dangerous to the public interest. When a politician with an axe to grind provides the TC with selective, confidential information that politician wants the public to know, isn’t there an expectation of a favour in return? The resulting news story might be biased in the direction of that informant’s position, in the hope that more such confidential information would be made available later on. That bias is evident in the TC’s coverage of the Murray affair. By practising this form of journalism, Cleverley and Obee are, in fact, in a conflict of interest. Their primary responsibility is to their readers, not to the hidden agenda of some politician or policeman with an axe to grind.

IN BOTH THESE STORIES, the Times Colonist seemed to be treating Atwell unfairly. The stories lacked context, used loaded language, included erroneous information while leaving out key facts, and were published without including Atwell’s side of the story. In the case of the 911 call story, the paper has provided no believable rationale for why Atwell’s private life should be exposed to public ridicule. It clearly applied different standards to Atwell’s privacy than it did in the case of the previous mayor. Critically, the stories relied on information from people who would not go on the record, leaving the stories vulnerable to the criticism that they were driven by hidden agendas. In doing this, the paper has severely damaged Atwell’s reputation in this community.

When I challenged Editor-in-Chief Obee on these issues, he responded: “If you had simply asked if the Times Colonist had some motivation other than accurate news coverage, I would have said no. We strive to give readers information they should have. We don’t have an agenda.”

The “agenda” that many people fear is at work, and is being aided by the Times Colonist’s biased coverage, is perhaps best represented by a line from one of Cleverley’s stories, which quoted Councillor Wergeland: “All I can say is: ‘Who is going to be leading council in Saanich? The jury is still out.’ But someone will lead.” The “jury is still out”? Didn’t the real jury deliver its decision on this question in November?

David Broadland is the publisher of Focus Magazine.
  25. January 2015 Will Oak Bay Mayor Nils Jensen and Victoria’s Dwayne Kalynchuk lead the region’s big issue back to a gunfight at McLoughlin Point? THE EFFORT TO LOCATE a central sewage treatment plant at Esquimalt’s McLoughlin Point has shifted into a new phase. After being temporarily shut down by Environment Minister Mary Polak’s refusal to force Esquimalt to host the facility, the McLoughlinuts now seem intent on a campaign to eliminate any other possibility. By “McLoughlinut” I mean a person or organization that has repeatedly expressed the belief that any solution to Victoria’s treatment deficit must include a large secondary treatment plant at McLoughlin Point. The McLoughlinut mantra is that anything else is “too expensive.” Before November’s election campaigns began, the community’s attention was riveted on the apparent failure of the CRD to locate a $783-million central treatment plant on the rocky point at the entrance to Victoria Harbour. Pro-McLoughlin politicians and power-brokers in the region hoped the election would bring a broad repudiation of the Barb Desjardins-Richard Atwell-Lisa Helps-Cairine Green alliance. None of these mayoral candidates were McLoughlinuts. If that had happened it seems likely the CRD’s plan for central sewage treatment would have been quickly reactivated and a delegation of re-elected mayors sent to Ms Polak to seek reversal of her April decision. Instead, Desjardins received a bigger vote and Atwell and Helps defeated two of the most powerful supporters of the McLoughlin plan. But a number of staunch McLoughlinuts were re-elected as CRD directors, along with David Screech as mayor of View Royal. Screech has, in the past, supported the McLoughlin plan. Indeed, at the first CRD board meeting attended by newly-elected mayors and councillor-directors, McLoughlinut Nils Jensen, the re-elected mayor of Oak Bay, defeated Desjardins in a secret-ballot election for chair of the board. In an interview with CBC’s Gregor Craigie the following day, Jensen outlined his idea of a process to find a solution to the sewage treatment issue. He said the CRD was encouraging “people to come forward if they have a proposal for their community for a single plant. That’s one track. The other track that’s being contemplated is eastside and westside committees looking at a two-plant solution.” Jensen, who famously missed most of the CRD’s sewage committee meetings in his first term, subsequently anointed himself chair of that committee. The fundamental difference between the region’s two sewage factions is based on two factors. First, whether or not the burden of hosting sewage treatment should be shared equitably: the McLoughlinuts say the burden should be forced on Esquimalt, but Esquimalt says it will take them to court if they try. Secondly, the quality of treatment—should it be lower quality secondary treatment or higher quality tertiary treatment? The McLoughlinuts say tertiary treatment is too expensive, their opponents say that has never been proven. Mixed into both positions are claims of potential resource and energy recovery. But neither position depends on including or excluding that possibility. The westside mayors (Esquimalt, Colwood, View Royal, Langford), along with First Nations participants, have begun a public engagement process to consider their options. The fact those mayors have agreed there might be an option to McLoughlin Point eliminates them as McLoughlinuts, at least for now. 
But on the eastside, there’s been no real progress toward a non-McLoughlin solution. A shout-out from the CRD last September for all communities to put forward a possible site for a treatment plant in their community produced no response from Victoria, Oak Bay or Saanich. Victoria decided in October to at least go through the motions of considering a plan B, but a December 18 meeting of its council suggests that process has been designed to lead right back to McLoughlin Point. At that meeting councillors received a progress report on the engineering and public works department’s exploration of a sub-regional treatment system. Urban Systems has been awarded a contract to design a public engagement process that will allow the public to make its preferences about sewage treatment known. Invoking the example of the Johnson Street Bridge, Councillor Pam Madoff suggested council should make it clear to the public that councillors would not necessarily act on those preferences. After addressing the issue of how to politely ignore public input, councillors voted to initiate investigation of potential sites for sewage treatment. In doing that they seemed not to have comprehended that Director of Engineering and Public Works Dwayne Kalynchuk had already outflanked any move to a non-McLoughlin solution by appointing a proven, reliable soldier to kill that possibility at the outset. Here’s how that happened: Last October, after the apparent failure of the CRD’s McLoughlin plan and the splitting off of the westside group, the City of Victoria realized it might need to find its own way. Overseeing exploration of that fell to Kalynchuk’s department. It issued an RFP for an engineering study that would make recommendations on the City’s treatment options. The RFP cited a 2009 Kerr Wood Leidal study and instructed responding engineering companies to use the same costing assumptions that Kerr Wood Leidal had used in 2009. At that December 18 meeting councillors learned that the contract had been awarded to Kerr Wood Leidal. That should have triggered an alarm in council chambers and here’s why: Back in 2009, Kalynchuk was at the CRD heading the effort to develop regional sewage treatment. Under his leadership, the engineering firm Kerr Wood Leidal (in partnership with two other engineering firms) undertook a study of distributed treatment plants. Its findings have provided the entire basis for the pro-McLoughlin origin story that claims, as a recent Times Colonist editorial put it, “The CRD’s waste-treatment committee did extensive professional studies into the options, including distributed treatment, which was deemed to be too expensive, and came up with the proposal for the central plant.” The Kerr Wood Leidal study’s findings, though, have been dismissed by the Sewage Treatment Action Group and other knowledgeable critics as having little or no applicability to the network of tertiary treatment plants STAG envisioned in its RITE Plan. Some of the criticisms are easy to understand. The Kerr Wood Leidal study’s cost estimates, for example, were for secondary treatment. That form of treatment would have required construction of nine new marine outfalls, but a system of strategically-located tertiary treatment plants could use existing outfalls. Another criticism has been that Kerr Wood Leidal used population growth projections that have since proven to be too high, and so its cost estimates were based on building plants that would provide greater capacity than is currently required or even projected. 
RITE Plan proponents want a cost estimate based on addressing current capacity requirements, and suggest small plants could be added later as needed. One finding of the study that’s hard to comprehend involves the energy recovery estimates it developed. Kerr Wood Leidal used a methodology and assumptions that led it to conclude there would be more demand for energy in both Colwood and Royal Bay than in downtown Victoria. Since the study produced a result so transparently flawed, its critics say, it can’t be trusted. Yet the study has been used by the Times Colonist, CRD bureaucrats—in fact all McLoughlinuts—as proof that any form of distributed treatment would be “too expensive.” Was the 2009 study designed to produce a predetermined result, that distributed plants would be too expensive? This appears to be what happened with another pricey infrastructure project, the Johnson Street Bridge. Soon after Kalynchuk left the CRD and became head of the City’s engineering department, a study was done that compared the cost of rehabilitation of the bridge with the cost of replacement. That comparison was fudged to favour replacement. Unrealistic assumptions were imposed (a repaired bridge must last 100 years) that seemed designed to deliver a predetermined outcome. History now seems to be repeating itself. There’s little doubt that a new Kerr Wood Leidal study that uses the same costing assumptions as their 2009 study will lead to the same recommendation: a central plant at McLoughlin Point. That property is owned by the CRD, is large enough and has the required zoning in place for a secondary sewage treatment facility that would serve Victoria, Oak Bay and a portion of Saanich. Some Victoria taxpayers might experience fainting spells at the thought of the City’s engineering department overseeing development of a sewage treatment project whose starting price is likely to be in the neighbourhood of $300 million. The department, under Kalynchuk’s leadership, has taken the new Johnson Street Bridge project from an original estimate of $63 million to a currently unknown cost—estimated by Focus at about $108 million. Ironically, the December 18 council meeting was scheduled to include a quarterly update on the bridge project. That report, dropped from the agenda without explanation, would have included a synopsis of the project’s legal problems, an updated cost estimate and a new completion date. Construction of the main elements on the project’s critical path—the bascule leaf and the bascule pier—was halted last July and October, respectively. That long-anticipated update has been “delayed until the New Year for a fulsome and complete as possible report,” according to Mayor Helps. On December 20, a high tide unexpectedly flooded the bascule pier cofferdam raising more questions about the planning and execution of the project. Yet the same folks who shepherded this project are now in charge of sewage treatment. Victoria residents who wish to be politely ignored might want to mark their calendars. Councillors requested that a public engagement strategy for sewage treatment options be ready by the end of January. FOR INQUIRING MINDS that would like to know if the McLoughlinuts might be wrong, let me introduce you to Oscar Regier. A retired civil engineer, Regier has some 40 years’ experience in the investigation, design, construction and project management of municipal and industrial infrastructure projects. 
He was the design project manager on the award-winning Dockside Green wastewater treatment plant and he has been giving technical advice to Richard Atwell on the potential for a distributed enhanced-tertiary sewage treatment system for the Victoria region. Regier’s tertiary-level sewage treatment plant at Dockside Green sits below Café Fantastico and Fol Epi Bakery between Tyee and Harbour roads. A visit will confirm there’s no odour produced by the plant, and the water it reclaims circulates to a series of lush water gardens immediately adjacent to Dockside’s residential towers.

The Atwell-Helps-Desjardins alternative to a central plant at McLoughlin Point—or anywhere—would rely on adaptation of the Dockside Green technology to a larger scale, something the brain trust at the CRD has convinced the majority of its board members isn’t possible. The CRD dismissed Dockside’s potential with a simple calculation: multiplying Dockside’s cost per unit of treatment capacity by the total treatment capacity the CRD needed. It decided that simple calculation was proof enough it would cost $2 billion to use distributed tertiary treatment.

No one at the CRD has ever spoken with Regier. Perhaps someone should. He recently researched the costs of some 40 tertiary treatment facilities built in North America during the last 10 years, and adjusted their final costs so they could be compared with the estimated cost of McLoughlin Point’s treatment plant. He says, “Several tertiary treatment facilities with a wide range of capacities have unit costs in the same range [around $2 million per million litres per day of maximum sustained capacity] as the McLoughlin Point proposal, which provides only secondary treatment. These are existing plants so the costs are real and final—not class C estimates.”

Let me give you a sense of what that cost translates to for Jensen’s municipality, Oak Bay. The CRD says Oak Bay would need treatment capacity to process about 12 million litres of sewage each day. Using Regier’s figures, that would cost $24 million. That’s just for the treatment plant and doesn’t include conveyancing or biosolids treatment. By comparison, Oak Bay’s share of the expected construction cost of a secondary plant at McLoughlin Point is about $14 million. Regier’s research shows that a 12-million-litre-per-day tertiary treatment plant would require a site area of about 4500 square metres, roughly equivalent to the area occupied by six tennis courts. Oak Bay’s Windsor Park, for instance, has three tennis courts at its west end that occupy roughly the area required to treat half the municipality’s sewage.

There are two other large costs associated with sewage treatment for both the CRD’s McLoughlin secondary treatment scheme and distributed tertiary treatment: conveyancing (pumps and pipelines) and biosolids treatment—the process that reduces the solids the treatment plants take out of the sewage. The CRD’s McLoughlin plan would spend hundreds of millions on each. What about distributed tertiary? Again, let’s look at Jensen’s Oak Bay and use Windsor Park as an example. Right across Currie Road from the tennis courts, the CRD owns two residential properties that house a sewage pumping station disguised as single-family homes.
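Before going further, the plant-cost comparison above reduces to one line of arithmetic. This is a minimal sketch using only the figures already cited: Regier’s roughly $2 million per million litres per day of capacity, the CRD’s 12-million-litre-per-day estimate for Oak Bay, and Oak Bay’s approximately $14-million share of McLoughlin’s construction cost:

```python
# Back-of-the-envelope plant-cost comparison for Oak Bay, using the figures cited above.
unit_cost_tertiary = 2.0   # $ millions per million litres/day of capacity (Regier's range)
oak_bay_demand = 12        # million litres of sewage per day (CRD estimate)
mcloughlin_share = 14      # $ millions: Oak Bay's share of the McLoughlin secondary plant

tertiary_plant_cost = unit_cost_tertiary * oak_bay_demand
print(tertiary_plant_cost)                     # 24.0 ($ millions), treatment plant only
print(tertiary_plant_cost - mcloughlin_share)  # 10.0 ($ millions) more than the McLoughlin share,
                                               # before conveyancing and biosolids are counted
```

As the conveyancing comparison below shows, that gap narrows considerably once the cost of pipes and pumps is included.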
To hook up a 12-million-litre-per-day underground tertiary plant that could treat the equivalent of Oak Bay’s daily production of sewage, the CRD would need to run two pipes under Currie Road; the current input to the Currie Road pumping station would become the input to the Currie Road café-bakery, er, sewage treatment plant, and the output from the plant would go back under the road and be pumped to the Clover Point outfall, exactly as it is now. Oak Bay’s share of conveyancing construction costs in the McLoughlin scheme is about $4 million. That compares with an estimated cost of $500,000 to connect an underground Windsor Park treatment plant with the Currie pumping station. The extra cost of constructing tertiary treatment for Oak Bay is now less than $7 million above secondary treatment. Over the 50-year life expectancy of these plants, that crunches down to an extra cost of $140,000 per year. That works out to $20 per Oak Bay household per year. Is that “too expensive”?

The other big cost, biosolids treatment, would require either an on-site gasifier or a truck pulling into the café once a day to haul solids to a gasifier located where plenty of energy could be used. Saanich Councillor Vic Derman showed his fellow sewage committee directors years ago that using gasifiers instead of pumping everyone’s poop 18 kilometres to a biodigester at Hartland Road would save the CRD “$200 million plus $3-4 million in annual operating costs.”

Would it be possible to execute a systematic adaptation of distributed tertiary plants to the CRD’s existing system of forcemains and pumping stations—like that suggested above for the Currie Road pumping station? “Yes, I think so, at a number of locations along or near the trunk mains leading to Clover Point and Macaulay Point, including some of the pump stations,” Regier said. He described to me the differences between a system that utilized “independent” plants and one with “inter-related” plants, and outlined how that might work. When I expressed difficulty in understanding what he meant, he said, “Think about a big picture puzzle. If you only have one or two pieces of the puzzle without the picture, you have no idea what you are dealing with. If you have most or all of the pieces, you can start sorting them out and get a much better idea of what the final result will look like.” So choosing actual locations for plants in a larger system is difficult until decisions have been made about how the larger system will work.

Of course, when it comes to sewage treatment plants, no one—except the folks who live in Dockside Green—wants one built near them. Regier was cautious on this issue: “I hesitate to name specific sites because there will be an instant knee-jerk uproar and rejection without sober thought and analysis.” Pressed, though, he provided some possibilities: “Penrhyn, Currie and Marigold could be suitable for larger DT [distributed tertiary] plants; Trent for a smaller DT plant. Clover and Macaulay should probably have DT plants to serve the adjacent areas and possibly some backup/standby capability in case of failure at an upstream DT plant, instead of pumping their sewage back up to another DT.
If the ‘westside’ develops something on their side then Craigflower might become redundant and it could be modified to pump reclaimed water to Central Saanich to irrigate a large agricultural area.”

Even though these numbers suggest a distributed tertiary system could break the siting stalemate in which the region is now locked, there are two good reasons why CRD bureaucrats and local politicians don’t want to see a cost estimate for such a system. For one thing, CRD bureaucrats decided a distributed system would cost too much and then went on to spend over $85 million preparing for a central treatment plant. A study demonstrating that Regier’s distributed system would cost less would show that those bureaucrats screwed up. Why would they risk that? They could lose their jobs.

For the politicians, there’s the problem of the “knee-jerk uproar.” Once a few possible locations are named—like Windsor Park in south Oak Bay or Clover Point in Victoria—only elected officials with great personal courage would be able to stand up to the blow-back. No one in Saanich, Victoria or Oak Bay has yet shown they possess that courage. McLoughlinism depends on no politician having that courage. But going ahead with a central plant at McLoughlin will likely mean a protracted legal battle between the CRD and Esquimalt, and taxpayers losing all promised senior government funding.

The solution? Jensen ought to visit Café Fantastico and stroll through the water gardens.

David Broadland is the publisher of Focus Magazine.