Sometimes in the pediatric ophthalmology clinic, things just don’t go my way. While we all love our patients, some of them can be a little (OK, a lot) more draining than others: the uncooperative 6-year-old who could read the smallest line of the eye chart in 10 seconds last time, but decides to stubbornly dig in his heels and make me beg for every Snellen letter over 15 agonizingly slow and oppositionally defiant minutes. Next door is the 4-year-old with supposed low tone who suddenly has more strength (and appendages) than John Cena when I try to put eyedrops in. Should it really take four of us to put eyedrops in this kiddo? Yet it does, and we could have used a fifth. Next door is the 3-year-old who was having a tantrum when she walked in the office, continued it all during our “less-than-reliable-but-we-gave-it-a-really-good-try” dilated exam, and is still screaming on the way out the door. My eardrums are ringing. And those were just the first three patients this morning.
As I have gotten older and a little less patient, I may catch myself taking a deep breath while rubbing kid snot off my pre-pandemic tie and stopping the bleeding on my arm from what felt like a dog bite, but was likely more human in origin, whilst silently musing to myself, “Pav, you should have been an architect.” I used to love to draw floor plans as a kid, and I often wonder if I have some sort of latent maniacal tendency toward architecture that went unsatisfied, deep down in repressed parts of my soul that I have long since buried. Right now, though, architecture is a job, any job, that isn’t trying to refract this screaming, tantrummy 3-year-old currently in my exam chair, winning the battle of my draining medical resolve.
But are architecture and medicine so far apart?
Currently, two of the leading architectural building certifications are LEED (Leadership in Energy and Environmental Design) and the WELL Building Standard (surprise, WELL isn’t an acronym, to my understanding). LEED ratings tell us how friendly a building is toward the environment. How big a carbon footprint does the building cast? How energy efficient is it? Maybe it even generates some extra energy for the 1970s building next door? Is wastewater repurposed? Were the building materials created sustainably?
Conversely, but adjunctively, the more recently devised WELL certification ratings tell us how friendly a building is toward its occupants. How does your building or office affect your mood? Your nutrition? Your comfort? Your fitness? (Example: Are stairwells placed where occupants will preferentially choose them over the elevator?) WELL theory is sort of like the Danish concept of “hygge” (defined on Wiki as a quality of coziness and comfortable conviviality that engenders a feeling of contentment or well-being). I didn’t get the feeling of hygge this morning from any of the three patients above, but when they get older, we will laugh about it together. Not now, though. And I think I need a Band-Aid for my arm.
But with so much emphasis on architectural health and wellness (both for the environment and, more recently, for a building’s occupants), it is striking that it is human “un-wellness” that has driven most architectural change in the modern world.
Prior to Pasteur’s Germ Theory in 1861, which postulated that microorganisms could cause a lot of bad stuff, Earthlings really had no idea what started certain pandemics, or how they miraculously ended. The Black Plague/Bubonic Plague (Y. pestis) killed 30-60% of Europe’s population between 1346-1353. No one knew how it came or why it ended. Most assumed it was God’s will. Similarly, the Yellow Fever Epidemic of Philadelphia (1793) killed 5,000 of the then-national capital’s (now cheesesteak capital’s) 50,000 inhabitants, in what was the biggest city in America at the time. An estimated 20,000 of those who could, including President George Washington, evacuated steamy Philadelphia that summer, while 15% of those unable to leave died. Again, nobody knew how the fevers came or why they went away. If you don’t know about germs, you cannot prevent them. So no changes were made to the general population’s living conditions, and more pandemics came and went.
Even in 1858, during the London Cholera Outbreak (just three years before Pasteur’s Germ Theory), no one had a clue that the excreta of 3 million Londoners was making them sick. Fortunately, however, with all that poop came a lot of stink, as it all ended up in the River Thames … and it was the Thames stench in 1858 that got Parliament to take action. Evidently, the smell in London was so bad that summer that Parliament, after just three days of debate, approved 2.5 million British pounds (about $416 million in today’s U.S. dollars) to fund the London Sewer System.
Granted, the bricklayers for the brick underground sewers were only paid a maximum rate of 25 pence (33 American cents) a day, so by today’s standards it was quite a bargain for a city-saving system. It wasn’t fear of V. cholerae that created the impetus for the architectural wonder that was the London Sewer System of 1858; as the London Times reported, it was the “force of sheer stench” that supplied the political and economic drive to design and build it. Either way, an architectural answer (unknowingly) helped end the London cholera epidemic by improving sanitation. The fact that the new sewer system merely transported the excretory goo 10 miles downstream from London, where it was discharged into the same River Thames, is a story for a different day. It was someone else’s problem downstream, and London’s cholera epidemic ended. Cheerio!
New Yorkers at that time were dealing with cholera, too. Immigrants were arriving by the thousands and living in squalor, 10 or more to a room in tenement housing, with apartments crammed tightly together. Nobody knew that the stagnant water from the poor sanitary conditions, lack of sewage systems, poor drainage and unpaved alleyways where water accumulated was the breeding ground of cholera, which is estimated to have killed 15,000 New York City inhabitants in 1849 alone (2.5% of the 1849 population of 600,000)! Compare that to COVID-19, which has killed about 30,000 NYC residents out of a 2020 city population of roughly 8.8 million (about 0.3%). Cholera in 1849 was roughly seven times more deadly than COVID-19 in 2020.
Fortunately, NYC had an architectural breakthrough that helped. Frederick Law Olmsted, regarded as the founding father of American landscape architecture, proposed that NYC needed more protected green space. In his words, these parks would be the “lungs” of the city, where citizens could breathe easily, separated from their cramped tenement lives. Hence, Olmsted designed what became Central Park (and later Prospect Park) in NYC. Countless cities, including Pittsburgh (in 1931, with North and South Parks), followed suit. Another example of an architectural leap forward helping to fight medical disease. And a place for Macaulay Culkin to catch the bad guys in “Home Alone 2.”
I likely wouldn’t have been a successful architect, because all my designs would have been collegiate Gothic or church Gothic. Those building styles soothe my “hygge,” but, unfortunately, not everyone else’s. Different architectural styles have evolved over the years, many surprisingly for medical reasons. For example, the Modernist architectural movement of the early 1900s was pioneered by Le Corbusier in France, the Bauhaus movement (Walter Gropius and Mies van der Rohe) in Germany, Alvar Aalto in Finland and Frank Lloyd Wright in America. Modernists used innovative construction technologies, building with steel, glass and tons of concrete in structures distinctive for their functionalism, minimalism and total lack of ornamentation. If you see a concrete slab with windows, that’s a modernist building.
But architectural historians have postulated that many of these modernist innovators may have had mental health diagnoses that favored the particular building style they designed and fostered. Indeed, psychiatrists have posthumously diagnosed Le Corbusier with Autism Spectrum Disorder, citing his preference for social isolation, avoidance of visual stimuli (lots of plain gray concrete) and minimalism. His buildings are “comforting” to someone seeking to avoid interpersonal contact and interaction. Gropius, of Bauhaus fame, was a German veteran of WWI who was badly wounded in the trenches and later thought to suffer from PTSD, a term not officially coined until 1980, many years after his death, but a condition that has namelessly existed since people learned to fight. Gropius’ buildings after WWI often resembled military battlefield pillboxes: concrete slabs with narrow windows and hidden doorways. Like a concrete bunker, but above ground … safe for the battlefield or the office. Maybe not the most comforting and convivial, but functional, nonetheless. That is Modernist architecture.
These modernist buildings turned out to be just what the doctor ordered in the fight against tuberculosis in the early 1900s. Before Waksman and the discovery of streptomycin in 1943, the only “treatment” for TB was convalescence in a sanatorium. TB patients were sent off to isolate, rest and sit in the sun (one of the few things known to kill TB germs at the time). In Finland in 1933, Aalto designed a modernist sanatorium with long walls of windows, large terraces for falling asleep outside in the sun, and light-colored ceilings to promote peace and quiet. Empty white walls, bare floors, everything clean and sterile, wide-open spaces. Patients “felt” clean thanks to the open architectural space and open floor plans. It is no accident that most hospitals since that time have followed these modernist principles, such that today’s hospital buildings conjure up ideas of cleanliness, order, functionality, safety and improved health for the inpatient occupant. No hygge here, though one could argue there ought to be. But modernist hospitals are designed such that the architecture itself is part of the treatment plan, or at least makes the patient feel that way.
COVID-19, quarantining, social distancing, masking and remote learning/working are all new lifestyle modifications born of survival this year. American architecture will have to respond, in many ways reversing course. Will urbanization, at full steam until March 2020, give a rebirth to suburbanization and ruralization? The wide-open floor plans of the 1980s and 1990s are suddenly not conducive to six people on separate Zoom calls simultaneously. Walls and “places to isolate oneself” are in, both at home and at work. Shared break rooms are out. Plexiglass dividers are in. School superintendents across the country note one positive to come out of the pandemic: the end of the school cafeteria and the “cafeteria anxiety” that middle and high schoolers face. Outside is “in,” with more outdoor space for learning and working. Another reason for that: one of the major weapons against the exploding myopia epidemic is “time spent outdoors,” letting the eyes “defocus” on the horizon.
Anyway, the list goes on. It will be interesting to see how American architecture changes due to COVID-19, and the realization that there may be a different pandemic around the corner with a different mode of spread. Perhaps the “Metabolism” architectural movement of the 1960s and 1970s will regain prominence. These are buildings that are instantly changeable, with pods and cubes that can be added to or subtracted from a building as needs arise, such as the infamous Nakagin Capsule Tower in Tokyo, which may have been ahead of its time. Maybe future buildings shouldn’t be “forever” but should be planned as changeable and adaptable, just like human metabolism. As always, while we might not like the change, architects will rise to the challenge that our changed medical status has created. Alas, it’s time for me to put away my “wanna-be” architectural hat and hand it over to my daughter, my “gonna-be” architect. I will have to live vicariously through her. Time to go find another Band-Aid. My next patient is waiting.