Ming Jin
I am currently an assistant professor at the School of Information and Communication Technology (ICT), Griffith University.
Prior to this, I obtained my Ph.D. from the Faculty of Information Technology at Monash University in 2024.
I specialize in time series analytics and spatio-temporal data mining, with a proven track record of publishing high-impact papers in top-ranked venues, including NeurIPS, ICLR, ICML, KDD, and TPAMI.
My research outputs have been selected as Most Influential Papers (x2), ESI Hot Papers (top 0.1%; x5), and ESI Highly Cited Papers (top 1%; x6). Several have become widely used baselines, such as Time-LLM (2024), Time-MoE (2025), TimeMixer 2.0 (2025), and the TimeOmni series (2026), gaining substantial recognition in the open-source community.
I am a committee member of the IEEE CIS Task Force on AI for Time Series and Spatio-Temporal Data.
I also serve as an associate editor for Neurocomputing (Q1, IF 6.5) and actively contribute as an area chair or senior program committee member for prestigious AI and data mining conferences.
My research interests are in (1) time series intelligence, (2) spatio-temporal data mining, and (3) multimodal learning and agentic AI, with a focus on temporal settings, addressing both fundamental and real-world problems.
📌 Long-term recruitment: PhD students & research interns. I am looking for (1) well-matched (in terms of research interests), (2) qualified, and (3) highly self-motivated candidates to work with me. If you are interested, please fill out this Google Form and I will be in touch. Scholarships may be available in 2027. I have no quota for 2026 CSC visiting PhD students, but I am open to applications for next year.
Due to the overwhelming volume of emails I receive daily, I apologize in advance for not being able to respond to each one individually.
TimeOmni-1: Incentivizing Complex Reasoning with Time Series in Large Language Models
ShapeX: Shapelet-Driven Post Hoc Explanations for Time Series Classification Models
T2S: High-resolution Time Series Generation with Text-to-Series Diffusion Models
Time-MQA: Time Series Multi-Task Question Answering with Context Enhancement
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis
Towards Neural Scaling Laws for Time Series Foundation Models
A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection
Time-LLM: Time Series Forecasting by Reprogramming Large Language Models
Towards Expressive Spectral-Temporal Graph Neural Networks for Time Series Forecasting
Position: What Can Large Language Models Tell Us about Time Series Analysis
Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook
Neural Temporal Walks: Motif-Aware Representation Learning on Continuous-Time Dynamic Graphs
Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs
Graph Self-Supervised Learning: A Survey
ANEMONE: Graph Anomaly Detection with Multi-Scale Contrastive Learning
Generative and Contrastive Self-Supervised Learning for Graph Anomaly Detection
Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning