News and Awards

    • We have started a new project on automatic scene tagging for TV content, funded by Viacom Inc./NYC Media Lab, Oct. 2018 – Jan. 2019.

    • We have received an NSF award (jointly with Prof. Yong Liu) on Dynamic Predictive Streaming of 360-Degree Video. Funding period: Sept. 2018 – Aug. 2021.

    • We have received an NIH R01 award (jointly with Dr. Jeffrey Ketterling of the Lizzi Center for Biomedical Engineering, Riverside Research, and Professor Dan Turnbull of the Skirball Institute of Biomolecular Medicine, NYU School of Medicine) for a project titled “In utero mouse embryo phenotyping with high-frequency ultrasound,” Sept. 2016 – June 2020.
      Our team will focus on developing advanced image analysis and machine learning methods for analyzing brain development in mouse embryos and characterizing defects caused by mutations, based on high-frequency ultrasound images of mouse embryos.

    • We have received an NIH R01 award (jointly with Professor Mei Fu of the NYU College of Nursing) for a project titled “SCH: EXP: Improving early detection and intervention of lymphedema,” Sept. 2016 – Aug. 2019. The project develops a “Kinect-enhanced Lymphedema Intervention Training System,” which investigates the use of machine learning and video analysis for the detection and intervention of breast cancer-related lymphedema. This project is part of the NSF/NIH joint program on Smart and Connected Health.

    • IEEE Communications Society Multimedia Communication Technical Committee Best Paper Award, 2011: Zhengye Liu, Yanming Shen, Keith W. Ross, Shivendra S. Panwar, and Yao Wang, “LayerP2P: Using Layered Video Chunks in P2P Live Streaming,” IEEE Trans. on Multimedia (TMM), pp. 1340–1352, Nov. 2009.