Visual impairment has been a major issue for humankind throughout its history, but it has become more pressing as society has evolved into a world that revolves around visual acuity. Whether navigating a busy city or interacting with the countless screens that fill modern life, coping with reduced or no vision is a challenge. For countless individuals, braille and accessibility technology such as screen readers are essential to interact with the world around them.
For refractive visual impairment we currently have a range of solutions, from glasses and contact lenses to more permanent options like LASIK and similar procedures, which seek to fix the refractive problem by ablating part of the cornea. When the eye’s lens itself has been damaged (e.g. by cataracts), it can be replaced with an artificial lens.
But what if the retina or optic nerve has been damaged in some way? For individuals with such nerve damage, the concept of restoring vision, whether through biological or technological means, has been a tempting and seemingly futuristic prospect for decades. Recently, a number of studies have explored both approaches, with promising results.
Setting the Scene
In the developed world, the leading causes of blindness are age-related macular degeneration (AMD), diabetic retinopathy and glaucoma. Of note here is that the treatment of cataracts and refractive issues has massively decreased the number of total blindness cases compared to the developing world, leaving behind the types of visual impairment which are hard to treat.
In the three aforementioned causes of blindness, the retina is damaged by a variety of mechanisms, destroying either part of the retina (with macular degeneration, mostly the macula) or the entire retina, often in a slow decline of visual acuity until no functional retinal structure remains. In these cases, as well as in conditions where the retina becomes detached from the back of the eye (retinal detachment, e.g. due to blunt trauma), the optic nerve and the processing centers of the brain remain intact and functional.
As most types of vision loss, including those from childhood blindness, feature an undamaged visual cortex, a lot of the focus on restoring vision has consequently been on this part of the brain. Many studies have focused on developing prostheses that replace the functionality of the eye, including the retina and optic nerve. More recently, the possibility of restoring the functionality of a damaged retina and optic nerve by having the tissue regrow itself has been examined as well.
The Genetic Time Machine
It’s a poorly kept secret that human cells are essentially immortal. Unfortunately, the bits that make them immortal and capable of infinite regeneration are toggled off once a cell reaches a certain point in its drive towards becoming a specific type of tissue (for example: muscle or liver tissue, or part of the spinal cord or retina). Yuancheng Lu et al. recently studied the reversal of ageing and injury-induced vision loss through epigenetic reprogramming (bioRxiv preprint version).
In mouse models, they showed that re-enabling the expression of three genes (Oct4, Sox2 and Klf4: OSK) in the cells of the eye via an adeno-associated virus (AAV) vector (a small virus commonly used as a gene delivery vehicle) resulted in the regeneration of injured axons, regrowing a damaged optic nerve and recovering from retinal damage due to glaucoma. In addition, after 4 weeks of OSK expression the age of the cells (as indicated by DNA methylation levels) had been reset to a more youthful state.
OSK, along with c-Myc (OSKM), is known from previous experiments to be involved in the ability of cells to regenerate tissues. The reason why OSK rather than OSKM was used in this particular experiment is that ectopic c-Myc expression has been shown to result in tissue dysplasia: basically abnormal cell development, with predictably negative results. Yet even though the mice in this study regained a significant amount of their lost vision, it is important to remember that all of these experiments serve to fill in the blanks where our understanding is still lacking.
Key to all this is our understanding of two mechanisms: the regenerative capabilities of cells and the epigenetic ‘clock’ which underlies the process of ageing. DNA methylation appears to play a major role in both, with its role in the latter causing a gradual change and faltering of biochemical processes. Methyl groups can bind to the DNA molecule, where they alter the expression of genes. The resetting of these methylation patterns is a standard feature of the mammalian reproductive system (reprogramming), occurring after the fertilization of an ovum by a sperm cell. Without this mechanism, the resulting embryo would have the same genetic age as the parents, which was a concern with Dolly the (cloned) sheep.
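To make the notion of an epigenetic ‘clock’ a bit more concrete: such clocks are typically simple linear models that predict age from the methylation fraction measured at a set of CpG sites. The sketch below is purely illustrative, with made-up site names, weights and intercept; real clocks such as Horvath’s are fitted over hundreds of sites on thousands of samples.

```python
# Sketch of a methylation-clock age estimate: predicted age is a
# weighted sum of methylation fractions (0..1) at selected CpG sites.
# Site names, weights and intercept are invented for illustration.

CLOCK_WEIGHTS = {
    "cg_site_A": 35.0,   # hypothetical site that gains methylation with age
    "cg_site_B": -20.0,  # hypothetical site that loses methylation with age
    "cg_site_C": 12.5,
}
INTERCEPT = 30.0

def predicted_age(methylation: dict[str, float]) -> float:
    """Estimate 'epigenetic age' from per-site methylation fractions."""
    return INTERCEPT + sum(
        weight * methylation[site] for site, weight in CLOCK_WEIGHTS.items()
    )

# An 'old' methylation profile...
old_profile = {"cg_site_A": 0.9, "cg_site_B": 0.1, "cg_site_C": 0.8}
# ...and the same sites after (hypothetical) OSK reprogramming has
# shifted methylation back towards a youthful pattern.
young_profile = {"cg_site_A": 0.3, "cg_site_B": 0.6, "cg_site_C": 0.4}

print(predicted_age(old_profile))    # → 69.5
print(predicted_age(young_profile))  # → 33.5
```

The point of the sketch is merely that ‘resetting the clock’ means shifting the methylation pattern, which the model then reads out as a lower age.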
Obviously, the goal of this kind of gene therapy is to erase the effects of ageing and injuries, but not to turn every cell into a stem cell. If, however, damaged tissues such as nerves and organs could be reset to a more youthful state using OSK or similar, it might mean that a person could not only regenerate a damaged optic nerve and retina, but also reverse the effects of ageing, including macular degeneration and so on.
The Age of Cybernetics
Sadly, regenerating tissues through epigenetic reprogramming as a regular or even experimental treatment in humans is still a long while off. However, the use of implants and human-computer interfaces to restore lost senses is further along, to the point where retinal implants like the Argus II have already been approved for the treatment of retinitis pigmentosa and other conditions which leave the transmitting and possibly processing layers of the retina intact.
When the retina is too badly damaged, however, and possibly the optic nerve as well, one quickly ends up in the experimental area of direct brain stimulation. Here the retinotopic mapping property of the visual cortex is exploited: the routing of the retinal signals onto the visual cortex essentially forms a 2D map of the visual field. This makes the job of figuring out which part to trigger in order to ‘light up’ a target part of a person’s vision significantly easier. The main difficulty lies in figuring out ‘how’.
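This retinotopic map is often approximated with the classic ‘complex logarithm’ model: a point in the visual field at eccentricity r and polar angle θ lands on the primary visual cortex at roughly w = k·log(z + a), with z = r·e^(iθ). The constants in the sketch below are illustrative choices, not fitted values, but they show the model’s key property, cortical magnification: a degree of visual field near the fovea covers far more cortex than a degree in the periphery.

```python
import cmath

# Illustrative constants for the 'complex log' retinotopy model.
K = 15.0  # cortical scale factor in mm (assumed)
A = 0.7   # foveal offset in degrees (assumed)

def visual_to_cortex(r: float, theta: float) -> tuple[float, float]:
    """Map a visual-field point (eccentricity in degrees, angle in
    radians) to approximate cortical (x, y) coordinates in mm."""
    w = K * cmath.log(cmath.rect(r, theta) + A)
    return (w.real, w.imag)

# One degree of visual field near the fovea vs. in the far periphery:
foveal_step = visual_to_cortex(1.0, 0.0)[0] - visual_to_cortex(0.0, 0.0)[0]
peripheral_step = visual_to_cortex(31.0, 0.0)[0] - visual_to_cortex(30.0, 0.0)[0]
print(f"{foveal_step:.1f} mm vs {peripheral_step:.1f} mm")  # → 13.3 mm vs 0.5 mm
```

For an implant designer this magnification is a mixed blessing: central vision gets generous cortical real estate to place electrodes in, while peripheral vision is crammed into comparatively little tissue.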
As I touched upon in last year’s article on Neuralink, a major issue with wiring up the brain with electrodes is that neurons in the brain are by all definitions minuscule. This means that the best we can do is essentially to jam probes into roughly the right area and hope that an electric pulse hits at least some of the right neurons to cause the intended effect. The sobering conclusion is that ‘high-tech’ for retinal implants means hundreds of pixels, with prospective visual cortex implants presumably in the same ballpark. Not to mention the low visual fidelity one can expect: at best the equivalent of a grainy, black and white image.
Since any brain implant using today’s technology stimulates many thousands of neurons simultaneously, the best result one can hope for in the visual cortex is the production of a phosphene: the experience of ‘seeing’ a bright spot in one’s field of view that was not caused by light stimulating the retina. Another way to cause a phosphene is through mechanical stimulation, e.g. by pushing (lightly) on one’s eyes, or from an impact on the head (‘seeing stars’).
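To get a feeling for how little of a scene survives such a display, one can simulate it: downsample a grayscale image onto a small grid and threshold each cell into an on/off phosphene. The grid size and threshold below are arbitrary illustrative choices (roughly a thousand electrodes would correspond to a 32×32 grid; an 8×8 grid is used here to keep the demo small).

```python
# Sketch of reducing a grayscale image (values 0..1) to a coarse grid
# of binary phosphenes by block-averaging and thresholding.

def to_phosphenes(image: list[list[float]], grid: int = 32,
                  threshold: float = 0.5) -> list[list[bool]]:
    """Reduce a 2D grayscale image to a grid x grid map of on/off
    phosphenes: average each pixel block, then threshold it."""
    h, w = len(image), len(image[0])
    out = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            # Average the pixel block that falls into this grid cell.
            y0, y1 = gy * h // grid, (gy + 1) * h // grid
            x0, x1 = gx * w // grid, (gx + 1) * w // grid
            block = [image[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) / len(block) > threshold)
        out.append(row)
    return out

# A 64x64 test image: dark background with a bright vertical bar.
img = [[1.0 if 24 <= x < 40 else 0.0 for x in range(64)] for _ in range(64)]
dots = to_phosphenes(img, grid=8)
print(sum(row.count(True) for row in dots))  # → 16 lit phosphenes out of 64
```

Even a simple shape like a bar reduces to a handful of dots, which is roughly the level of detail the experiments below work with.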
A recent article published in Nature by Chen et al. titled ‘Shape perception via a high-channel count neuroprosthesis in monkey visual cortex’ details an experiment where a 1024-channel prosthesis was implanted in the visual cortex of non-human primates (NHP, in this case macaque monkeys). From similar experiments on human subjects, we know that these perceived dots seem to range in size from a pinpoint to a few centimeters at arm’s length and can differ in perceived color, presumably depending on which neurons in the visual cortex get stimulated more strongly.
Unique in this experiment was the use of intracortical electrodes (Utah arrays) where previous experiments would usually use conductors on the surface of the visual cortex. This allowed for lower currents to induce the desired response in the target area (V1) of the visual cortex, the effect of which was measured in a higher cortical area (V2).
The goal of the experiment was to determine whether the monkeys could recognize the shapes that showed up for them as a grouping of phosphenes. If they pointed out the right shape afterwards, they were rewarded. The same was done with the determination of motion: here the monkeys had their eye motions monitored to see which direction the phosphenes were perceived to be moving.
A major limitation of a study like this one is that it involves human researchers interpreting the actions of monkeys who are in turn interpreting input from said researchers. As noted by Chen et al., the occasional drops in accuracy could very well have been due to a lack of motivation on the part of the monkeys, especially at the end of a recording session.
Despite the relatively promising results of the study, with generally above-chance performance during recording sessions, moving such studies to human subjects in order to turn them into a medical product would be highly complicated. Not only due to the need to cover the full visual cortex surface (25 to 30 cm² mean area per hemisphere), but also due to the need to further increase the resolution of the array and to develop a wireless version with electrodes that can stay in place for decades without damaging the surrounding tissue.
The End of the Proverbial Tunnel
Seeing these results from different studies, each following a different path towards ultimately the same goal, would seem to be cause for cautious optimism. As with all scientific studies, there is no guarantee that a particular approach will turn into a viable therapy within a matter of years. Some will never make it out of the laboratory, but may spawn new ideas and new approaches.
In the case of phosphenes, they have been known since the 1920s and were experimented with in the second half of the last century, but the technology to (safely) create brain implants has taken much longer to mature. Similarly, the concept of epigenetics in its current definition, along with its reprogramming, has been around for a while, but has seen major advances in the past few years.
Regardless, thanks to the tireless efforts of countless scientists around the globe, it seems that we may actually reach a point in the near future where blindness is a thing of the past.