For any enquiries, please email me at:



erlendprendergast@gmail.com

Drop me a line if you want to chat or see my CV.

14/10/19

Dezeen: “Six Designs that protect your digital data”

https://www.dezeen.com/2019/09/14/privacy-roundup-digital-data-technology-surveillance-ai-facial-recognition/




03/07/19

Design Thinkers Academy London on Medium: “Designing to Disrupt Amazon’s Alexa”

https://medium.com/@DTAcademyLondon/designing-to-disrupt-amazons-alexa-a570c1006912


19/06/19

Interesting Engineering: “A creator has designed a satirical device to counter digital surveillance”

https://interestingengineering.com/a-creator-has-designed-a-satirical-device-to-counter-digital-surveillance



10/06/19

Dezeen: “CounterBug is a digital self defence device that eases cyber paranoia”

https://www.dezeen.com/2019/06/10/counterbug-erlend-prendergast-technology/





Approaches for Digital Self Defence

As of January 2019, over 100 million Alexa-enabled devices had been sold. That’s 100 million users relying on Alexa to tweak their thermostats, stream their music and schedule their appointments. Whilst many view these devices simply as helping hands, others consider them to be Trojan horses in the age of digital surveillance. This project centres on those who view Alexa as both of these things: those who use Alexa, but do so with a looming paranoia about where their data might end up and the purposes for which it might be used.

The outcome of the project, CounterBug, is a family of satirical accessories intended to confuse the algorithms of Amazon’s Alexa. CounterBug uses disinformation as a form of guerrilla data security, bombarding Alexa with false data and, in turn, protecting the user from the threats of state spying, vested corporate interests and potential criminal hackers.

Each accessory responds to a different form of surveillance paranoia: one censors the user’s language so that they don’t get in trouble with the NSA, one disrupts Amazon’s tailored advertising algorithms, and one chats to Alexa about virtuous topics whilst the user is out of the house.






-----------------------------------------------------------------------------------------------


Everyday Entanglements: big data in clinical trials

This project was completed during my studies at the Glasgow School of Art, working alongside the Institute of Cancer Sciences at Glasgow University to produce a vision of the future based on current trends relating to Precision Medicine (PM) and cancer treatment.

The focus of this project was whether data about a patient’s lifestyle and environment, gathered by technological devices, could or should play a role in the delivery of precision medicine. Personal data is captured constantly and passively, and it becomes increasingly detailed and intimate as technological products dematerialise and integrate more closely with the body. The shift towards wearable technologies that continuously gather data about our physical state, such as heart rate, body temperature and precise location, carries great opportunities for healthcare, but it also brings an array of ethical implications.


Diagram: how might AI impact clinical trials?


Data collection devices - prototypes


People respond differently to the same treatments, depending on factors such as sex, age, race, ethnicity, lifestyle and genetic background. As clinical trials form the basis of evidence from which the safety and efficacy of new medicines and treatments are evaluated, it is crucially important that they represent a diverse cross-section of society. However, this is often not the case: both patients and clinicians face many barriers to diversity in trials.

Trials are complex, with long lists of inclusion and exclusion criteria for eligibility. Eligibility depends largely on the patient’s medical history, but it also takes into account lifestyle and environmental factors such as whether the patient smokes, their weight and their level of literacy. Furthermore, it is difficult for clinicians to keep track of every currently active trial and the stage each one has reached.



TrialSeek is a speculative service which democratises the clinical trial process by gathering and analysing an individual’s lifestyle and environmental data in order to match them with a suitable trial. It consists of two devices: a wearable tracker that monitors the user’s bodily data, and a second tracker, carried in a pocket or bag, that records data about the user’s environment, including pollution levels, spending habits and social interactions.


Object Sympathy


Experiments using Cinema 4D.