Ming Jin
I am currently an assistant professor at the School of Information and Communication Technology (ICT), Griffith University. Prior to this, I obtained my Ph.D. degree from the Faculty of Information Technology at Monash University in 2024.
I specialize in time series analytics and spatio-temporal data mining, with a strong track record of publishing high-impact papers in top-ranked venues, including NeurIPS, ICLR, ICML, KDD, and TPAMI, among others. My research outputs have been selected as Most Influential and Highly Cited Papers, and some, such as Time-LLM, have become widely used baselines with substantial recognition in the open-source community.
I am a committee member of the IEEE CIS Task Force on AI for Time Series and Spatio-Temporal Data. I also serve as an associate editor for Neurocomputing (Q1, IF 6.5) and regularly contribute as an area chair or senior program committee member for prestigious AI and data mining conferences. I am dedicated to conducting high-impact research and am open to collaboration.
My research interests are in (1) time series intelligence, (2) spatio-temporal data mining, and (3) multimodal learning, with a special focus on temporal settings (e.g., physical AI on time series and spatio-temporal data) for solving both fundamental and real-world problems.
📌 Ongoing Recruitment: PhD Students & Research Interns. I am looking for candidates who are (1) well matched (w.r.t. research interests), (2) qualified, and (3) highly self-motivated to work with me. If you are interested, please fill out this Google Form and I will be in touch. Scholarships may be available in 2026/2027. I have no quota for 2026 CSC visiting PhD students, but I am open to applications for next year. Due to the overwhelming volume of emails I receive daily, I apologize in advance for not being able to respond to each one individually.
TimeOmni-1: Incentivizing Complex Reasoning with Time Series in Large Language Models
ShapeX: Shapelet-Driven Post Hoc Explanations for Time Series Classification Models
T2S: High-resolution Time Series Generation with Text-to-Series Diffusion Models
Time-MQA: Time Series Multi-Task Question Answering with Context Enhancement
Time-MoE: Billion-Scale Time Series Foundation Models with Mixture of Experts
TimeMixer++: A General Time Series Pattern Machine for Universal Predictive Analysis
Towards Neural Scaling Laws for Time Series Foundation Models
A Survey on Graph Neural Networks for Time Series: Forecasting, Classification, Imputation, and Anomaly Detection
Time-LLM: Time Series Forecasting by Reprogramming Large Language Models
Towards Expressive Spectral-Temporal Graph Neural Networks for Time Series Forecasting
Position: What Can Large Language Models Tell Us about Time Series Analysis
Large Models for Time Series and Spatio-Temporal Data: A Survey and Outlook
Neural Temporal Walks: Motif-Aware Representation Learning on Continuous-Time Dynamic Graphs
Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs
Graph Self-Supervised Learning: A Survey
ANEMONE: Graph Anomaly Detection with Multi-Scale Contrastive Learning
Generative and Contrastive Self-Supervised Learning for Graph Anomaly Detection
Multi-Scale Contrastive Siamese Networks for Self-Supervised Graph Representation Learning