About me
It is my pleasure to meet you here. I am Zhaoji Zhang (张兆骥 in Simplified Chinese), a graduate student at Peking University. After four years of hard work there, I earned two Bachelor of Science degrees, in Chemical Biology and in Intelligence Science and Technology (also known as AI). I am pursuing my Ph.D. in Integrated Life Science at the Center for Life Sciences (CLS) @ PKU-THU (2025-). Currently, I am still in rotation and eager to explore different aspects of neuroscience across labs.
My curiosity about neuroscience, especially circuit, computational, and systems neuroscience, led me to intern at Yatang Li’s lab at the Chinese Institute for Brain Research, Beijing (CIBR, Beijing), and finally into the Integrated Life Science program at the Center for Life Sciences, Academy for Advanced Interdisciplinary Studies, Peking University.
My resume and my undergraduate academic transcript are available for a deeper look into my academic journey, and you can find my research profile on ORCID.
If you want to contact me, please email me at 2501111571@stu.pku.edu.cn or zhang-zj@stu.pku.edu.cn. Checking my student e-mail is part of my daily routine, so I usually respond within about 24 hours. (However, the mailbox is not always stable. December 2025: I’m glad to say that our mailbox broke again, and for the first time since this website was established, I have saved the evidence. If your email is really important, please also send it to my personal email as a backup, or resend it after a few hours or workdays.)
Research Interests
- Neuroscience: circuit and systems neuroscience, dendritic computing, saliency detection and perception, visual encoding and decoding, BMI/BCI.
- Currently, I am exploring a novel idea of dendritic logical computing. Following Rosenblatt’s classic perceptron mechanism (y = f(Σwx + b)), artificial neural networks have opened a new era. However, as David Beniaguev and Michael London proposed in Single cortical neurons as deep artificial neural networks, a neuron is not just a point with a single non-linear function, as the perceptron assumes; dendritic computation is far more complex. They used a DNN to mimic a single neuron, but the model might be further simplified into a few logic gates plus a single somatic non-linearity. Since synapses can be modelled as AND, AND-NOT, and OR gates, we want to develop a graph-based heuristic search algorithm (including genetic algorithms) to reconstruct, or even design, a neuronal dendritic morphological model.
- Another, more biological idea is to ask whether synapse-level position shifts or exchanges occur. Position exchanges between OR gates matter little (and there is almost no evidence that spines shift much), whereas exchanging an OR gate with an AND or AND-NOT gate would have a large effect at relatively low cost (and inhibitory synapses might be more ‘flexible’ than excitatory ones). Figuring out the underlying determination and plasticity mechanisms in real biological systems seems very interesting to me.
- I am also looking forward to joining a lab that values computation and experiment equally, for computation helps us understand mechanisms, while experiments let us test our ideas and guide our approach.
- Computer Vision: Saliency Detection, Video Saliency Detection, segmentation.
- A novel approach to computing plausible fixation points from a saliency map could use contour integration on pre-computed saliency maps. We hypothesise that there may exist a 1D-projection-integration circuit, similar to what is observed in large flocks. The contour may be determined by saliency maps and Markovian transitions (similar to the algorithm in GBVS, by Harel et al.).
- I want to build a giant dataset of dashboard-camera recordings of traffic accidents. Although a lot of effort has been put into this field since Anticipating Accidents in Dashcam Videos (2016), we still have a chance to build a dataset of tens of thousands of videos with fine-grained language labels at near-zero cost, if we use online Danmaku (弹幕) comments as language labels. If you are not familiar with Danmaku, visit a video that has some; the coloured texts floating across the screen are what people call Danmaku.
- Another interesting problem is training a model for new jobs such as segmenting an octopus and estimating its pose (which is quite complex due to its flexibility).
- Multi-Agent Reinforcement Learning:
- Researchers have done a lot of work in MARL, including AlphaStar. Although MARL is nowadays sometimes referred to as a technique for building ‘agentic’ LLMs, the key findings (to me) are RL algorithms (including natural gradients, GRPO, PPO, etc.), self-play, reservoir computing, and insights from game theory.
- And it is very cool to discover how herds, octopuses, and other multi-body/multi-agent dynamical systems behave. How do they perform such complex behaviours with very simple computational primitives, much like what happens in our brains, in ANNs, and in superconducting materials? Studies of natural messaging and learning might give us new insights for designing reinforcement learning algorithms.
- Recently, I have learned some basic electronics skills, including Arduino and ESP programming, soldering, and PCB design. Maybe I can put them to use in our lab in the future. Molecular biology (and organic chemistry) experiments are often boring, time-consuming, highly repetitive, and tedious for human experimenters. What if we did a little coding here?
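To make the dendritic-logic idea above concrete, here is a toy sketch (entirely hypothetical names and morphology, not code from any paper) of a neuron whose dendritic branches act as OR / AND / AND-NOT gates over binary synaptic inputs, followed by a simple threshold non-linearity at the soma:

```python
# Toy model: each dendritic branch is a logic gate over a subset of
# binary synaptic inputs; the soma thresholds the sum of branch outputs.
# (Hypothetical sketch; the real idea would search over morphologies,
# e.g. with a genetic algorithm.)

def gate(kind, inputs):
    """Evaluate one dendritic gate over a list of 0/1 inputs."""
    if kind == "OR":
        return int(any(inputs))
    if kind == "AND":
        return int(all(inputs))
    if kind == "ANDNOT":  # first input vetoed by any of the rest (inhibition)
        return int(inputs[0] and not any(inputs[1:]))
    raise ValueError(f"unknown gate kind: {kind}")

def dendritic_neuron(branches, x, threshold=2):
    """branches: list of (gate_kind, input_indices); x: binary input vector."""
    branch_outputs = [gate(kind, [x[i] for i in idx]) for kind, idx in branches]
    return int(sum(branch_outputs) >= threshold)  # somatic non-linearity

# Example morphology: three branches reading a 4-bit input vector
branches = [("OR", [0, 1]), ("AND", [2, 3]), ("ANDNOT", [0, 2])]
print(dendritic_neuron(branches, [1, 0, 1, 1]))  # branches give 1, 1, 0 -> prints 1
```

A search algorithm would then mutate the `branches` list (gate kinds and wiring) and score each candidate morphology against a target input-output function.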
News
- 2025 December: The SUPERChem benchmark developed by the College of Chemistry and Molecular Engineering, Peking University, was published on arXiv. I am listed as one of the contributors (7th of 62). The benchmark is not ideal for testing LLMs, because the answers are limited to multiple choice and the questions are not very difficult; each was designed for high-school students to solve in under 20 minutes. However, it is far better than older chemistry benchmarks, including Humanity’s Last Exam. Frankly speaking, the questions in HLE are not ‘difficult’ to reason about, but ‘difficult’ to memorize and retrieve, i.e., to store and search in a human brain.
- 2025 December: I’m glad to say that our mailbox broke again, and for the first time, I saved the evidence. If your email is really important, please also send it to my personal email as a backup.
- 2025 October: After the first rotation in Prof. Fang Fang’s lab, I entered [Prof. Mu Zhou’s lab](https://zhoulab.pages.dev/) for the second rotation.
- 2025 September: I attended the 18th Annual Meeting of the Chinese Neuroscience Society (CNS) in Xi’an. We presented a poster (P15-46) and published our abstract online; the full work has not been published yet. We cannot share private data now, but it may be available soon. During the conference, I met a couple of old friends and (ex-)colleagues and attended several inspiring sessions, including those given by Songting Li, Kexin Yuan, Xinyu Zhao, Zengcai Guo, Huihui Zhang, Athena Akrami, Qianli Yang, Jiayi Zhang, and other researchers. Thank you for your excellent talks and great work.
- 2025 September: I entered Prof. Fang Fang’s lab at the School of Psychological and Cognitive Sciences, Peking University, for my first rotation, doing some computational modeling of rhythmic attention sampling.
- 2025 September: I had a great time talking with Prof. Kexin Yuan at Tsinghua University and Yixiao Gao, the PhD student who developed the VIVIT technique. We had a lot of fun; I hope we can collaborate more in the future.
- 2025 September: I was trained in the 4th Training Course on Neural Modeling and Programming held by the Si Wu Lab and earned the certificate for modeling neural dynamics with BrainPy.
- 2025 August: Attended the first neuromorphic computing summer camp (第一届类脑计算特训营) held by the Guangdong Institute of Intelligence Science and Technology (GDIIST) in Hengqin district, Zhuhai, Guangdong, China. We talked a lot about neuromorphic computing, the BrainPy framework, and advanced neuroscience topics. Prof. Songting Li from SJTU gave a talk about dendritic computing and small-world networks, which made me realize that logic computing on dendrites could be a complex and powerful tool for neural computation. I also talked with Dr. Tao Tang and Dr. Yan Chen at GDIIST. They are both very successful researchers and are willing to talk with any dedicated student, so we had a great time.
- 2025 August: Our poster on biologically plausible saliency detection will be displayed at the CNS 2025 conference in Xi’an, Shaanxi, China, in late September 2025.
- 2025 July: I graduated from Peking University on 02/07/2025. Currently, I am continuing research at CIBR. Maybe after August I will have more personal time to travel around the world (or just within China). (It turned out that I spent the whole summer there, working about two full months with their 4 × A100 GPUs on some extra deep-learning work, but I nearly completed the whole project, especially the traffic-accident-anticipation part.)
- 2025 April: We recently published our research on saliency detection in the mouse superior colliculus (SC) in Communications Biology, under the title Preference-independent saliency map in the mouse superior colliculus. It might be underrated, because saliency detection is such an important problem, and mouse SC may offer an alternative way to think about it, beyond Zhaoping’s traditional V1 saliency hypothesis.
Powered by Jekyll and Minimal Light theme.