
From July 1 to July 5, 2025, Renmin University of China Law School hosted the fourth session of its Advanced Training Workshop on Electronic Evidence and Intelligent Case Handling. Over 60 legal and IT professionals from law firms, public security agencies, procuratorial bodies, universities, and other relevant enterprises and institutions across the country participated in the training.

During the five-day immersive learning program, Renmin University of China’s Electronic Evidence Research Team organized 12 high-quality lectures, three in-depth academic salons, and a special event titled "Criminal Defense Night," offering rich content delivered in innovative formats. Taking AI-powered judicial case handling as their starting point, the team systematically explored cutting-edge topics such as AI-assisted review techniques for electronic data and advanced methods for analyzing massive financial datasets. From dissecting real-world case studies to hands-on training in defense strategies, they meticulously crafted a comprehensive theoretical and practical framework that bridges electronic evidence with intelligent, technology-driven case management.


On the morning of July 1, Zou Jinpei, Associate Professor in the Department of Computer Science at the University of Hong Kong and Director of the Research Center for Information Security and Cryptography, served as the keynote speaker, delivering a lecture titled "Applications of Machine Learning in Fraud Detection—Case Studies" to the participants. Drawing on the Enron case—one of the largest corporate frauds in history—Professor Zou introduced the participants to the definition and various types of financial fraud. She also shared her approach to analyzing financial-fraud data, emphasizing that, given the sheer volume and complexity of fraud cases, it is essential to leverage machine learning and data mining techniques. These tools can help identify red flags associated with fraudulent activities and uncover credible evidence of wrongdoing, ultimately supporting informed investigative decisions. Professor Zou noted that while there are diverse models for fraud data analysis—broadly categorized into classification and regression approaches—there is no one-size-fits-all rule in practice. Instead, she highlighted exploratory data analysis (EDA) as the go-to technique. She explained that EDA follows a systematic process: attribute differentiation, then univariate and multivariate analyses, error and missing-value detection, outlier identification, and feature engineering. By combining these steps with visual aids such as heatmaps, investigators can effectively pinpoint potential red flags and reliably detect suspicious patterns.
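The EDA pipeline Professor Zou described can be sketched with pandas. The toy transaction table, column names, and thresholds below are illustrative, not from the lecture:

```python
import pandas as pd

# Toy transaction table; columns and values are fabricated for illustration.
df = pd.DataFrame({
    "amount":  [120.0, 95.5, 5000.0, 110.0, None, 130.0],
    "n_daily": [2, 3, 40, 1, 2, 3],          # transactions per day
    "channel": ["web", "web", "api", "web", "app", "web"],
})

# 1. Attribute differentiation: separate numeric from categorical columns.
numeric = df.select_dtypes("number").columns.tolist()

# 2. Univariate analysis: per-column summary statistics.
stats = df[numeric].describe()

# 3. Error and missing-value detection.
missing = df.isna().sum()

# 4. Outlier identification via the interquartile-range (IQR) rule.
q1, q3 = df["amount"].quantile([0.25, 0.75])
iqr = q3 - q1
outliers = df[(df["amount"] < q1 - 1.5 * iqr) | (df["amount"] > q3 + 1.5 * iqr)]

# 5. Multivariate analysis: the correlation matrix that feeds a heatmap.
corr = df[numeric].corr()
```

Each step's output (summary table, missing-value counts, flagged outliers, correlation matrix) becomes the input for the next round of questions an investigator asks of the data.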

Dr. Qin Shengzhi, Honorary Lecturer of the Electronic Investigation and Forensics Course at the University of Hong Kong, provided supplementary explanations for this session, outlining the fundamental principles of artificial intelligence technologies such as linear regression, machine learning, and neural networks. He also explored practical pathways for applying machine learning techniques to fraud detection and identification. Dr. Qin further shared real-world examples, including the use of blockchain for delivering court notifications and leveraging AI agents to enhance forensic investigations, highlighting the significant potential of smart IT solutions in judicial applications—as well as proven methods for their effective implementation.
Following this, Professor Zou elaborated on her research findings, which utilized EDA and machine learning techniques to uncover fraudulent patterns in USDT transactions on the TRON blockchain. She emphasized that cryptocurrencies have become a hotspot for money-laundering activities due to their peer-to-peer nature, inherent anonymity, and the fact that some platforms fail to enforce robust customer-verification processes. However, with the aid of advanced AI technologies, it’s now possible to detect emerging fraud patterns—such as small but frequent transactions, large-scale trades within short timeframes, rapid transfers of funds to low-regulation jurisdictions, instant withdrawals bypassing intermediary steps, and deposits into wallets holding known stolen funds—enabling more efficient identification and mitigation of financial fraud.
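The fraud patterns listed above—small but frequent transfers, large single trades, and rapid successive movements—can each be expressed as a simple rule over a transaction log. A minimal pandas sketch, with fabricated addresses, amounts, and thresholds:

```python
import pandas as pd

# Toy transfer log; senders, amounts, and cutoffs are illustrative only.
tx = pd.DataFrame({
    "sender": ["A", "A", "A", "A", "B", "C"],
    "amount": [9.0, 8.5, 9.9, 9.2, 50000.0, 120.0],
    "ts": pd.to_datetime([
        "2025-07-01 10:00", "2025-07-01 10:05",
        "2025-07-01 10:09", "2025-07-01 10:15",
        "2025-07-01 11:00", "2025-07-01 12:00",
    ]),
})

# Pattern 1: small but frequent transfers from a single address.
per_sender = tx.groupby("sender").agg(n=("amount", "size"),
                                      mean_amt=("amount", "mean"))
smurfing = per_sender[(per_sender.n >= 4) & (per_sender.mean_amt < 10)]

# Pattern 2: large single trades.
large = tx[tx.amount >= 10000]

# Pattern 3: rapid successive transfers (short gap between a sender's txs).
tx = tx.sort_values(["sender", "ts"])
tx["gap_min"] = tx.groupby("sender")["ts"].diff().dt.total_seconds() / 60
rapid = tx[tx.gap_min <= 5]
```

In practice such rules are a triage layer: they shortlist suspicious addresses for the heavier machine-learning models rather than replace them.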

Finally, during the Q&A session, participants asked questions about how to locally deploy AI agents while ensuring case information remains confidential, the traceability of Bitcoin, and the relationship between an algorithm’s accuracy and its overall effectiveness. Professor Zou and Dr. Qin provided detailed responses to these inquiries. Professor Liu Pinxin then offered additional insights, noting that lawyers could leverage Bitcoin’s inherent traceability to uncover the flow of funds involved in cases, potentially leading to new sources of legal work. At the same time, Professor Liu cautioned attendees that an algorithm’s reported accuracy does not necessarily equate to its real-world effectiveness or correctness. He urged practitioners not to be misled by theoretical accuracy figures but instead to actively broaden their analytical approaches—carefully scrutinizing algorithm-based evidence such as financial analysis reports and mounting robust challenges against them in court.
On the afternoon of July 1, Liu Pinxin, a professor and doctoral supervisor at Renmin University of China's School of Law, delivered a lecture titled "Principles and Examples of Electronic Evidence and Intelligent Case Handling" to the participants. Professor Liu, adopting a cutting-edge, interdisciplinary perspective and drawing from numerous real-world case studies, demonstrated that digital evidence consists of vast amounts of electronic data messages, accompanying metadata, and interconnected forensic traces—essentially forming a "digital landscape." This insight provided participants with fresh approaches to handling cases. Professor Liu also introduced the groundbreaking theory of the "Virtual Field of Digital Evidence," emphasizing that storage devices like USB drives, hard disks, and cloud services are not isolated pieces of evidence but rather "artificial fields" composed of digital signals. Within these fields, every single data interaction trace could potentially serve as a critical clue for reconstructing the truth behind a case.

Professor Liu further introduced the theory of the "First Data Field—Second Data Field—Third Data Field," providing a practical analytical framework for judicial practice. He explained that the First Data Field refers to the initial electronic evidence encountered by investigators, typically corresponding to the digital crime scene that mirrors the physical crime scene. This includes not only the data field belonging to the suspect but also associated data systems prepared specifically for investigative purposes. The Second Data Field is the dataset meticulously extracted and organized by investigators during their work. Meanwhile, the Third Data Field emerges when the data from the Second Field undergoes advanced analysis, resulting in structured information such as feature-based time-series data fields or trajectory-based sequence data fields.
Professor Liu emphasized that, in addition to deeply analyzing the First Data Field and extracting the Second Data Field, it’s crucial to proactively build the Third Data Field to enable smarter, more efficient case handling. Drawing from real-life cases he has personally managed, he guided participants on how to strategically shape field spaces from a defense-oriented perspective. Specifically:
1. Begin by categorizing vast amounts of heterogeneous data according to their types, transforming them into structured datasets.
2. Next, identify key elements as relational nodes, systematically organizing, retrieving, and analyzing the data based on these connections.
3. Finally, cross-reference and validate the retrieved results to synthesize new evidence tailored to the defense perspective, ultimately supporting the proof of critical case facts.
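The three steps above can be sketched in a few lines of Python. The record fields ("phone" as the relational node) and values are fabricated for illustration:

```python
from collections import defaultdict

# Toy heterogeneous records; fields and values are illustrative.
records = [
    {"type": "chat",     "phone": "P1", "text": "meet at 9"},
    {"type": "transfer", "phone": "P1", "amount": 5000},
    {"type": "chat",     "phone": "P2", "text": "done"},
    {"type": "transfer", "phone": "P1", "amount": 7000},
]

# Step 1: categorize heterogeneous data by type into structured datasets.
by_type = defaultdict(list)
for r in records:
    by_type[r["type"]].append(r)

# Step 2: treat a key element (here the phone number) as a relational node
# and index every record that touches it.
by_node = defaultdict(list)
for r in records:
    by_node[r["phone"]].append(r)

# Step 3: cross-reference a node's records across categories; a phone that
# appears in both chats and transfers links the two datasets together.
linked = {
    phone: rs for phone, rs in by_node.items()
    if len({r["type"] for r in rs}) > 1
}
```

Here `linked` surfaces only the nodes corroborated by more than one evidence type, which is the kind of cross-validated result the defense can then develop into new evidence.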
Chen Jie, a doctoral student at Renmin University of China, presented the team's latest research findings and shared how the team leverages advanced technologies like large-scale models to process vast amounts of data from real-world case studies. She demonstrated live how to efficiently sift through massive datasets, identify relevant evidence, build personalized knowledge bases, and deploy and query local large models—offering practical insights into these cutting-edge approaches.
Finally, Professor Liu emphasized that we are entering a triadic data space of "human—machine—object," where field theory will become the existential framework of the future. Grounded in the principles of the electronic evidence field, legal professionals must not only establish three distinct data fields but also forge analogical connections between these data fields and the physical world.
On the morning of July 2nd, Professor Liu Pinxin of the Law School of Renmin University of China, serving as the moderator, shared three reflections inspired by the course. First, as Song Ci wrote in *Xi Yuan Ji Lu*, "No judicial matter is more critical than the death penalty; among cases punishable by death, none is more important than the initial investigation; and of all initial investigations, none is more crucial than forensic examination." Therefore, the legal community—represented by lawyers—must enhance their investigative skills and strengthen their expertise in electronic data inspection and analysis, striving to become modern-day counterparts of Song Ci, the celebrated judicial commissioner.
Second, there are currently instances in which certain investigation firms collude with public security authorities out of profit-driven motives, effectively "manufacturing" criminal cases. In this context, lawyers must fully embrace the opportunities presented by big data and integrate advanced techniques for reviewing electronic evidence, alongside intelligent case-handling approaches, into every stage of a legal proceeding.
Finally, defense attorneys should master the art of conducting "chain-based reviews" of the evidence-handling process, systematically identifying vulnerabilities in the prosecution’s chain of custody. By doing so, they can pinpoint weaknesses in the opposing side’s evidence, enabling precise cross-examination and delivering effective, targeted defense strategies.

Following this, Zhao Xianwei, Director of the Prosecution Research Division at the Prosecution Technology and Information Center of the Supreme People's Procuratorate, delivered a lecture to the trainees on "The Prosecutorial Perspective on Electronic Data Examination."

During the Q&A session, participants raised several key questions: the feasibility of AI-based voice forgery detection, strategies for addressing concerns that USB drives involved in a case might have been swapped by investigators, scenarios where the checksum algorithms of forensic documents differ from those of extracted files, the reliability of data recovery tools, best practices for extracting online data, and the heavy reliance in practice on expert opinions as decisive pieces of evidence.
Regarding AI-generated voice forgery detection, Director Zhao Xianwei noted that current technological capabilities are still insufficient to accurately identify fabricated audio, suggesting that alternative approaches may be needed to tackle this challenge effectively. On the issue of USB drives allegedly swapped by investigators, Professor Liu Pinxin recommended analyzing the drive’s disk image to determine whether any data traces were left behind after the device was seized, potentially indicating unauthorized access.
When it comes to discrepancies between the checksum algorithms of forensic documents and extracted files, Director Zhao emphasized that such inconsistencies do not automatically imply tampering—instead, he advised verifying the original data by recalculating the checksum using the same algorithm. Meanwhile, Professor Liu urged lawyers to question the reliability of the data under these circumstances, urging prosecutors to provide corrective explanations. He also highlighted that when electronic evidence is collected without proper legal procedures—often treated merely as investigative leads rather than formal evidence—the risk of substitution increases significantly.
On the topic of data recovery tool reliability, Director Zhao stressed the importance of maintaining a cautious yet trusting attitude toward these tools, acknowledging that while different software may yield slightly varying results in terms of file formats, they typically do not affect the actual content of the recovered data. In contrast, Professor Liu encouraged defense attorneys to reduce their over-reliance on specialized forensic tools, instead leveraging intelligently deployed digital agents to rigorously scrutinize evidence—a move that could help debunk the "software myth."
Finally, regarding precautions for extracting online data, Professor Liu underscored the need to adhere to the principle of proportionality during the evidence-collection process, emphasizing that evidence should ideally be collected without altering its original state whenever possible, reserving modifications for when absolutely necessary. He further advised prioritizing the extraction of evidence based on its security level—from the most secure to the least secure—depending on the specific context.
Lastly, concerning the overreliance on expert opinions as critical evidence in legal proceedings, Director Zhao acknowledged that this trend has been effectively curbed in practice, with prosecutors exercising greater caution when dealing with evidence directly tied to convictions. Meanwhile, Professor Liu pointed out that if critical evidence is indeed found to be falsified, defense attorneys can present corroborating materials proving the deception and clearly communicate to prosecutors the severe legal consequences of improperly relying on such evidence—including lifelong accountability for those involved. By doing so, lawyers can draw attention to these issues, ultimately helping to elevate the overall quality of case handling.
On the afternoon of July 2nd, Gao Feng, Deputy Director of the Prosecution Support Department at the Shanghai People's Procuratorate and Head of the Shanghai People's Procuratorate Forensic Science Center, served as the keynote speaker, delivering a lecture titled "Big Data Thinking, Artificial Intelligence Technology, and Reconstruction of Case Facts" to the participants. Professor Gao Feng showcased the groundbreaking application of 3D reconstruction technology at crime scenes in several major and high-profile cases. This innovative technique, which integrates drone-based mapping, laser scanning, and virtual reality, is revolutionizing forensic evidence collection—shifting it from "2D documentation" to "spatiotemporal restoration."
In the 2024 Shanghai Songjiang shopping mall stabbing case, Professor Gao’s team employed 3D reconstruction technology to precisely reconstruct the suspect’s sequence of actions—18 stabs in just 1 minute and 15 seconds across a venue of nearly 20,000 square meters. Even the direction of blood spatter was accurately simulated through dynamic virtual analysis.
Moreover, this technology has already turned the tide in a 20-year-old cold case. In that intentional-injury homicide case, the team successfully recreated the 9.45-square-meter bedroom where the crime occurred, relying only on blurry crime-scene photos and architectural blueprints. By cross-referencing critical details such as window spacing, they were able to corroborate the testimony of a key witness, ultimately leading the Supreme People’s Procuratorate to approve reopening the investigation and pursuing prosecution.

“Traditional on-site sketches are often rough and prone to distortion, but 3D reconstruction allows evidence to ‘speak for itself.’” Professor Gao Feng used the example of a public interest lawsuit in a certain province involving illegal wastewater discharge, demonstrating via animation how a company secretly installed hidden pipelines to divert tap water. The clear, visual evidence dramatically improved courtroom efficiency. He emphasized that this technology not only aids in presenting evidence during trials but also helps uncover logical inconsistencies by integrating evidence into a cohesive spatial framework. In a recent traffic accident dispute, the team employed a 3D perspective to debunk the flawed conclusion that “pixel distance equals actual distance.” Today, this cutting-edge technique has already been standardized and widely applied in various types of cases, including intentional injury and drug smuggling cases. Professor Gao urged, “Public security agencies should prioritize the comprehensive collection of 3D data at crime scenes—this is the key to addressing the inevitable dispersion of evidence over time.” As Shanghai’s forensic technology department opens up its support network nationwide, crime scene reconstruction technology is poised to become an invaluable tool for solving even the most complex and challenging criminal cases.
On the morning of July 3, Wang Ran, an associate professor at the School of Discipline Inspection and Supervision of Renmin University of China, served as the keynote speaker, delivering a lecture titled "Techniques for Reviewing Electronic Evidence" to the participants. Teacher Wang used "cases" as a guiding thread, sharing case-file review techniques to help participants unlock a new perspective on examining electronic evidence.
First, what exactly are these "cases" in the context of electronic evidence? We can start by examining relevant laws and regulations governing electronic data, such as the Supreme People’s Court, Supreme People’s Procuratorate, and Ministry of Public Security’s *Provisions on Several Issues Concerning the Collection, Extraction, and Examination of Electronic Data in Criminal Cases*, the Ministry of Public Security’s *Rules on Electronic Evidence Collection in Criminal Investigations*, and the National Supervisory Commission’s *Implementation Regulations of the People’s Republic of China Supervision Law*. Specifically, there are various evidence-gathering measures, including seizure and sealing, on-site extraction, online extraction, and remote forensic examination—each corresponding to distinct types of "records" paired with "lists," collectively forming what we might call "cases."
Next, how do we identify these "cases" within electronic data? The comprehensive system for handling electronic evidence consists of three key components: "Identification," "Data," and "Collection." These correspond to the "expert opinion (Identification)," the "evidence materials (Data)," and the "evidence collection record (Collection)," respectively. To systematically organize this information, we can align the expert opinion, evidence materials, and actual data associated with the same piece of electronic evidence, then meticulously compare them item by item—checking hash values, file names, data sizes, serial numbers, modification times, file paths, and other relevant details.
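The item-by-item comparison described above can be sketched as a field-wise diff. The record structures and field names below are illustrative, not a prescribed schema:

```python
# Fields to compare item by item between the expert opinion and the
# evidence collection record; names are illustrative.
FIELDS = ["hash", "file_name", "size", "serial_no", "modified", "path"]

def diff_records(expert_opinion: dict, collection_record: dict) -> dict:
    """Return every field on which the two records disagree.
    Fields absent from both records are treated as matching."""
    return {
        f: (expert_opinion.get(f), collection_record.get(f))
        for f in FIELDS
        if expert_opinion.get(f) != collection_record.get(f)
    }

opinion = {"hash": "ab12", "file_name": "ledger.xlsx", "size": 4096}
record  = {"hash": "ab12", "file_name": "ledger.xlsx", "size": 5120}
discrepancies = diff_records(opinion, record)  # {'size': (4096, 5120)}
```

An empty result means the "Identification," "Data," and "Collection" documents are mutually consistent on these fields; any non-empty entry is a concrete starting point for cross-examination.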
Finally, how do we thoroughly examine these "cases"? Specific methods include comparative analysis of similar evidence, chronological scrutiny, screenshot verification, comparison against verbal testimony, and screen-recording analysis. Additionally, Teacher Wang introduced a set of "exclusion rules" for electronic evidence, covering scenarios where electronic data is deemed flawed or unreliable, where electronic storage devices (physical or documentary evidence) were illegally obtained or fail to meet authenticity standards, and even extending to the exclusion of expert opinions derived from such questionable evidence. Together, these form an integrated "Identification-Data-Collection" framework for rigorous and reliable evidence evaluation.
To wrap up, Teacher Wang demonstrated practical case studies, illustrating step-by-step approaches to evidence review.

In the second half, Teacher Wang delivered the course "From Big Data Case Handling to Big Data Platform Development."

On the afternoon of July 3, Senior Engineer Deng Changzhi from the Institute of Software, Chinese Academy of Sciences, served as the keynote speaker, delivering a lecture titled "A Penetrative Review Method and Application of Electronic Evidence Related to Funds" to the participants.

First, Professor Deng Changzhi pointed out that cash flows are the DNA of economic cases. He reviewed the evolving research hotspots in the field of fund analysis over the past decade, covering topics such as fund-data acquisition techniques, fund-data analysis methods, challenges in evidencing fund data, and the process of transforming fund-data analysis reports into admissible evidence. He then outlined the essential knowledge required for scrutinizing fund analysis reports, including China’s financial regulatory framework, fund-flow networks, account structures, fund types, and the structural characteristics of fund data.

Second, Professor Deng explained the core principles of the funds-penetration review method. Starting with the logic of economic-crime investigation, he demonstrated—through practical examples—how to approach each stage: data acquisition, data cleansing, data analysis, and evidence transformation. He detailed the underlying logic, business rules, computational algorithms, and AI models used at each step, while also outlining the relevant national and industry standards currently being developed. In addition, he described the specific objects of scrutiny at each stage, along with the technical principles, tools, and strategic approaches employed, focusing particularly on identifying initial funding sources, uncovering transaction behavior patterns, and tracing account holders to reveal broader money-flow dynamics.

Finally, using three real-world cases—involving multi-party illegal fundraising schemes, online gambling operations, and misappropriation of funds—he illustrated key aspects of fund nature, cash-flow patterns, and critical issues in financial investigations. By integrating visual aids, case studies, and hands-on insights from his own experience, Professor Deng helped participants deepen their understanding of advanced techniques for fund tracking, control, and analysis.

At the end of the course, Professor Deng outlined future trends in fund review, such as leveraging advanced AI model technologies to streamline fund-data table structures and advancing automated review processes and report generation.
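Part of the penetration logic—aggregating pairwise flows, then distinguishing originating accounts from pass-through accounts—reduces to set and groupby operations. A toy pandas sketch with fabricated accounts and amounts:

```python
import pandas as pd

# Toy bank-transfer ledger; account names and amounts are illustrative.
flows = pd.DataFrame({
    "src": ["S", "S", "M1", "M1", "M2"],
    "dst": ["M1", "M2", "D", "M2", "D"],
    "amount": [100_000, 50_000, 60_000, 30_000, 70_000],
})

# Aggregate pairwise flows (data cleansing would precede this in practice).
edges = flows.groupby(["src", "dst"], as_index=False)["amount"].sum()

inflow  = flows.groupby("dst")["amount"].sum()
outflow = flows.groupby("src")["amount"].sum()

# Candidate initial funding sources: accounts that only send, never receive.
sources = sorted(set(flows["src"]) - set(flows["dst"]))        # ['S']

# Pass-through (intermediary) accounts: both receive and send.
intermediaries = sorted(set(flows["src"]) & set(flows["dst"]))  # ['M1', 'M2']
```

Real fund analysis layers timing, counterparty identity, and business rules on top of this skeleton, but the source/intermediary/sink decomposition is the first cut at "penetrating" a flow network.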
On the morning of July 4, Teacher Gao Xiansong delivered a session titled "Database and Log Analysis in Criminal Cases," providing an in-depth explanation of knowledge and analytical techniques related to electronic data examination, helping participants enhance their skills in handling electronic data during case proceedings.

First, Teacher Gao Xiansong focused on the review steps and methods across the electronic data lifecycle. He pointed out that electronic data has long been integral to both our daily lives and professional environments, yet for lawyers, electronic data forensics still presents numerous challenges. He also introduced fundamental concepts such as the methods of electronic data forensics and the four main categories of electronic data examination, emphasizing that forensics serves as the first layer of filtering, while examination acts as the second, more refined stage in the process.

Next, regarding the electronic evidence review process, Teacher Gao emphasized the importance of carefully examining case files for documents related to electronic evidence, including seizure records, inspection reports, forensic analysis reports on digital data, statements from complainants, and witness testimonies. He urged participants to pay special attention to the details in expert opinion reports—such as QR codes, the date of examination, the scope of practice, the analysis methods used, and the specific requirements outlined. Examination criteria that are either overly detailed or excessively simplistic, biased guidance, work outside the practitioner’s authorized scope, incomplete findings that fail to fully clarify the case, or stated requirements that contradict the final conclusions can all indicate potential issues. Additionally, the attachments accompanying expert opinions—storage media such as CDs, USB drives, and hard disks, as well as screenshots or recordings obtained through online extraction or remote inspections—are critical areas for review.

These visual records often reveal crucial insights, such as whether the quantities and names of items listed in the database match those described in the expert report during the remote download process, and whether the inspection procedures themselves complied with legal standards. Finally, he advised participants not to overlook the original evidence materials—including database images, server images, hard-drive images, USB-drive copies, mobile phone and computer extraction reports, and compressed files—and to scrutinize them for details such as intrusion logs, attack logs, malware traces, backdoor activities, and webshell indicators, which can provide vital clues for the investigation.
In the section on basic methods and approaches for acquiring electronic data, Teacher Gao provided a detailed introduction to various practical methods, including burning CDs, creating USB drives or hard disks, generating disk images, and directly copying content to produce screen recordings, video clips, and screenshots—offering学员清晰 guidance for acquiring electronic data in real-world scenarios. In terms of database analysis methods and tools, Teacher Gao first explained the fundamental concepts and data models used in databases. He highlighted common data models such as network, hierarchical, and relational databases, emphasizing that relational databases are the most widely adopted. Additionally, he introduced NoSQL databases as an emerging trend, noting their unique advantages in handling complex logical processes within big data and AI applications. Finally, he briefly touched on the basics of SQL statements to give participants a solid foundation for working with databases. During the database analysis methods and presentation session, Teacher Gao emphasized the importance of filtering out invalid data, duplicate records, test data, and other such operations. For SQL database analysis, Teacher Gao introduced the database’s purposes, explained the meanings of various fields, and demonstrated practical application techniques for professional tools like Navicat as well as helpful auxiliary tools. Through scenario-based presentations, she reinforced participants' understanding of the data modeling and analysis process. In terms of log analysis methods and tools, Teacher Gao introduced the types of logs, including system logs, server web logs, and more. Following this, the log analysis tool LogParser was introduced, and its ability to identify key elements—such as IP addresses, timestamps, and GET/POST request types mapped to URL paths—was demonstrated through real-world business case examples. 
Meanwhile, Teacher Gao also shared practical tips for using lightweight log analysis tools, empowering participants to develop comprehensive log-analysis skills—from basic data extraction all the way to advanced business insights.
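LogParser itself is a Windows tool that queries logs with SQL-like syntax; as a cross-platform illustration of the same extraction step, the sketch below regex-parses the Apache/nginx common log format (an assumption for this example; IIS logs, LogParser's usual target, use a different space-delimited layout) to pull out IP, timestamp, method, and URL:

```python
import re
from collections import Counter

# Regex for the common/combined web log format (assumed here for illustration).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>GET|POST|PUT|DELETE|HEAD) (?P<url>\S+)[^"]*"'
)

def parse_log(lines):
    """Yield (ip, timestamp, method, url) tuples from raw log lines."""
    for line in lines:
        m = LOG_LINE.search(line)
        if m:
            yield m.group("ip"), m.group("ts"), m.group("method"), m.group("url")

def top_requesters(lines, n=5):
    """Rank source IPs by request count, a first step toward spotting attack traffic."""
    return Counter(ip for ip, _, _, _ in parse_log(lines)).most_common(n)
```

From these extracted fields, the business-level analysis (unusual request bursts, probes against admin URLs, and so on) follows naturally.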
On the afternoon of July 4, Zhang Peijun, the technical director and senior forensic expert at the Beijing Guodun Information Center Forensic Science Institute, presented a talk titled "Exploring Evidence Collection and Defense Strategies for Instant Messaging Chat Logs." The presentation dissected the core technical principles behind storing, deleting, recovering, and even forging WeChat chat records, aiming to enhance legal professionals' ability to scrutinize digital evidence.

Professor Zhang Peijun first presented the conclusion that, under today’s mainstream mobile operating systems, it is virtually impossible to fully recover deleted files such as photos, voice recordings, and documents using standard forensic techniques. He pointed out that what many people perceive as successful recovery often involves merely locating thumbnails or partial data fragments—content invisible to users in normal view—rather than restoring the files themselves. This limitation stems from the widespread adoption of "File-Based Encryption (FBE)" technology in modern smartphones.
Professor Zhang went on to explain the evolution of smartphone encryption methods: Before Android 9.0, most devices relied on "Full-Disk Encryption (FDE)," which still left room for brute-force attacks. However, with the introduction of FBE in Android 9.0 and later versions, cracking these systems became nearly impossible. In contrast, recovering text-based chat logs remains a viable option. The key distinction lies in how these two types of data are stored: Unlike standalone files, text messages aren’t saved individually but are instead consolidated into an encrypted SQLite database file named EnMicroMsg.db. Accessing this database requires specialized decryption methods.
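For older Android WeChat builds, forensic literature widely reports that the SQLCipher passphrase protecting EnMicroMsg.db is derived from the first seven hex characters of MD5(IMEI + uin). The sketch below shows only that derivation step; it is illustrative of the reported scheme, not a statement about current WeChat versions, whose parameters differ:

```python
import hashlib

def enmicromsg_key(imei: str, uin: str) -> str:
    """Derive the 7-character SQLCipher passphrase for EnMicroMsg.db.

    This derivation (first 7 hex chars of MD5(IMEI + uin)) is the one
    commonly reported for older Android WeChat builds; treat it as an
    illustration of the principle, not a universal recipe.
    """
    digest = hashlib.md5((imei + uin).encode("utf-8")).hexdigest()
    return digest[:7]
```

The point for reviewers is the principle: the database is not encrypted with an unrecoverable hardware key but with a passphrase derived from device and account identifiers, which is why text chat logs remain far more recoverable than FBE-protected files.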
He also highlighted a critical issue: Since WeChat version 7.0, the app has increasingly implemented SQLite’s official "secure deletion" mechanism across its databases, effectively making it impossible to retrieve content directly from its original storage location. So, where does the hope for recovery lie? Professor Zhang revealed the existence of an "FTS5 full-text search index" database, which plays a crucial role in enabling WeChat’s fast chat history search feature. Even if the primary database has been wiped clean, fragments or copies of the deleted messages may still persist within this FTS5 index file. By carefully analyzing this overlooked FTS5 database, investigators can potentially uncover and extract previously deleted chat records—though this process amounts more to "discovery" than true "recovery."
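WeChat's internal schema is not public, so the toy sqlite3 session below (which assumes Python's bundled SQLite was compiled with FTS5, as is typical) only illustrates the general mechanism Professor Zhang described: an index populated separately from the main table can retain message text after the source row has been securely deleted.

```python
import sqlite3

# Conceptual demo, not WeChat's actual schema.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE msg(id INTEGER PRIMARY KEY, content TEXT)")
# A standalone FTS5 index, populated separately from the main table:
con.execute("CREATE VIRTUAL TABLE msg_fts USING fts5(content)")

con.execute("INSERT INTO msg(content) VALUES ('transfer 50000 tomorrow')")
con.execute("INSERT INTO msg_fts(content) VALUES ('transfer 50000 tomorrow')")

# Delete from the primary table only, as secure deletion of the main store would:
con.execute("DELETE FROM msg")

# The message is gone from the main table...
assert con.execute("SELECT COUNT(*) FROM msg").fetchone()[0] == 0
# ...but the search index still yields the text:
row = con.execute(
    "SELECT content FROM msg_fts WHERE msg_fts MATCH 'transfer'"
).fetchone()
print(row[0])  # the "deleted" message survives in the index
```

This is why Professor Zhang characterized the technique as "discovery" rather than "recovery": nothing is undeleted; a second copy is found where deletion was never applied.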
To underscore the importance of technical precision, Professor Zhang shared a real-world case involving the authentication of fabricated chat logs. He emphasized that while the fake records might appear identical to genuine ones at the user interface level, their underlying "digital fingerprints"—the unique metadata and structural characteristics embedded in the data—can be meticulously examined by experts to expose inconsistencies and verify authenticity.
Teacher Zhang Peijun concluded by emphasizing that the review of electronic evidence must not stop at the automated reports generated by commercial software; legal professionals should dig deeper to understand the underlying principles of how data is created and stored. This in turn requires legal practitioners to develop a certain level of technical literacy. Only by truly grasping both "what happens" and "why it happens" can we effectively challenge and verify electronic evidence in judicial practice in an increasingly digital age.
In the second half, Teacher Zhu Tonghui, an associate professor at Nankai University Law School, delivered a talk titled "Morphemes, Morpheme Tables, Word Vector Calculations, and Intelligent Evidence," sharing, through multiple real-world case studies, a comprehensive strategy that spans from surface-level formal examination all the way to in-depth, substantive verification of the underlying technical mechanisms.

Faced with what appears to be a flawless fake screenshot of WeChat chat logs, must lawyers simply stand by and wait for tech experts to step in? Teacher Zhu Tonghui's answer was no. Using the case of fabricated chat records shared by Teacher Zhang Peijun as an example, he recounted how Teacher Liu Pinxin, during the lecture itself, keenly spotted numerous suspicious details in the chat screenshots—details that were both significant and enlightening. Zhu pointed out that even without deep knowledge of databases or encryption principles, one can still challenge the validity of screenshot evidence by identifying logical inconsistencies in the timing, discrepancies in friends' statuses over time, and visual elements that lack consistency—features detectable with the naked eye. Teacher Zhu underscored that this "bottom-up" review strategy aims not to outright dismiss the evidence but rather to accumulate enough formal red flags. By doing so, it plants seeds of doubt in the judge's mind, creating favorable conditions for requesting professional forensic analysis and accessing original data—ultimately helping avoid being put on the defensive due to insufficient evidence.
As artificial intelligence and big data analytics are increasingly applied in the legal field, examining algorithms and AI-generated evidence has become a new challenge. Teacher Zhu Tonghui provided a clear and concise explanation of the underlying logic, helping legal professionals develop a sound understanding. He pointed out that the core capability of large language models is not "understanding" per se, but rather a probability-based form of "word association." At their most fundamental level, these models break human language down into small semantic units called tokens, assign each a unique mathematical identifier, and then use word vector embeddings and computations to predict the probability of these token numbers appearing in sequence. Precisely because of this, large language models, in their effort to maintain the "persona" of answering questions quickly and fluently, may sacrifice accuracy by fabricating nonexistent legal provisions or facts. Legal professionals must therefore never place blind trust in such systems.

At the heart of scrutinizing big data evidence lies the critical examination of the underlying algorithmic rules. Professor Zhu pointed out that, in an online gambling case in which Professor Liu conducted technical cross-examination, the algorithm used by the forensic institution to identify key groups (gambling communities) was riddled with serious flaws and even outright falsification. This underscores the importance of challenging an algorithm's inherent logic and accuracy, which is the linchpin of evaluating big data evidence. Professor Zhu Tonghui further illustrated, through database analyses of multiple multi-level marketing cases, how meticulous data scrutiny can fundamentally undermine the prosecution's evidentiary foundation.
For instance, in a case involving overseas servers linked to a pyramid scheme, Professors Zhu and Gao Xiansong, acting as expert witnesses, uncovered numerous issues during an in-depth review of the database analysis process outlined in the forensic report—problems that could have been easily avoided with careful, data-driven investigation. Moreover, Professor Zhu highlighted that, in cases like these involving foreign servers, the legality of the evidence-gathering process itself deserves close attention. When forensic institutions or cyber police directly conduct remote investigations or inspections on servers located abroad, such actions inherently risk infringing upon national cyber sovereignty, raising serious doubts about the legitimacy of their procedures—and creating valuable opportunities for defense strategies outside the courtroom. Finally, Professor Zhu mentioned how he and Professor Gao, through their detailed examination of remote inspection reports, hash values of collected evidence, login records, backup documentation, server logs, and control panel logs, uncovered instances of extensive hacking activity and clear signs of violent tampering. Combined with the fact that the complainant’s information couldn’t be matched in the database, these findings strongly suggest that the case may involve not only offshore fishing-style manipulation but also deliberate evidence fabrication—a scenario that poses grave concerns for both procedural and substantive justice. Professor Zhu Tonghui’s insights serve as a vital bridge, connecting legal reasoning with the rapidly evolving landscape of digital technology. He emphasized that while legal professionals don’t need to become technical experts, they must cultivate a robust awareness of evidence in the digital age. 
By applying legal logic and keen analytical skills—coupled with a fundamental understanding of technological principles—they can thoroughly scrutinize evidence across its form, content, and procedural aspects. Such an approach not only helps uncover critical breakthroughs in complex cases but also strengthens the defense of procedural fairness and substantive justice, ultimately reducing the likelihood of wrongful convictions within the criminal justice system.
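The "probability-based word association" Teacher Zhu described can be illustrated in miniature with a toy bigram model: the code below (a pedagogical sketch, in no way the mechanism of any real large language model) predicts the next token purely from co-occurrence counts, with no grasp of meaning.

```python
from collections import Counter, defaultdict

# Tiny illustrative corpus; tokens are just whitespace-split words here.
corpus = "the court admitted the evidence and the court excluded the testimony"
tokens = corpus.split()

# Count how often each token follows each other token.
follows = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    follows[prev][nxt] += 1

def next_token_probs(prev: str) -> dict[str, float]:
    """Relative frequency of each continuation: a crude 'probability of the next token'."""
    counts = follows[prev]
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

print(next_token_probs("the"))
# In this corpus "the" is followed by "court" twice and by
# "evidence" and "testimony" once each.
```

Even at this scale the failure mode Zhu warned about is visible: the model will fluently emit whatever continuation is statistically likely, whether or not it is true.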
On the morning of July 5, Liu Pinxin, a professor and doctoral supervisor at Renmin University of China's Law School, delivered a lecture to the participants on the topic of "Intelligent Review of Electronic Evidence." Against the backdrop of the global digital wave, Professor Liu, drawing on localized practical experience, introduced the core concept of "intelligent review," tailored specifically to the massive volumes of electronic data encountered in case handling. He pointed out that traditional review methods face significant challenges, making the shift toward intelligent review the modern choice for effectively managing electronic evidence.

Professor Liu, drawing from representative examples, has identified the key elements of intelligent review for electronic evidence: First, "examine electronic materials rather than paper-based ones"—meaning that reviewing electronic evidence requires direct engagement with the raw data, avoiding the pitfalls of paper conversion, which often limits analysis to only a subset of the original data while missing out on crucial ancillary information and contextual traces. Second, "relying on model-based thinking"—which involves abstracting algorithmic analysis models from typical cases and behavioral patterns to streamline the review process. Finally, "employing a two-step approach: machine processing followed by human verification." Here, intelligent algorithms handle the initial screening and correlation analysis of vast amounts of data, while investigators then step in to conduct deeper scrutiny and filtering—ultimately creating a collaborative review model that combines the strengths of both humans and machines.
Based on this, Professor Liu recommends adopting an evidence system built around the "Identify—Analyze—Retrieve—Document" framework, complemented by three levels of intelligent review methods: beginner, intermediate, and advanced. He then shared practical and actionable review techniques with the participants.
At the beginner level, the focus is on "technical tactics," where techniques like spatiotemporal analysis and the reconstruction of first-, second-, and third-level data fields significantly enhance the efficiency and organization of electronic evidence extraction.
Moving to the intermediate level, participants learn to strengthen their "logic-driven" approach, employing methods such as timestamp verification, targeted time-period screening, case-related contextual analysis, keyword reviews, dictionary-based matching, and high-frequency pattern recognition—all designed to efficiently sift through massive volumes of digital data.
Finally, at the advanced level, Professor Liu highlights the potential of "AI-powered solutions," encouraging attendees to explore AI-based case-analysis platforms and large-scale legal models. By leveraging intelligent case retrieval and comparison tools, these technologies provide robust support for evidence evaluation and legal application.
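The intermediate-level techniques above (time-period screening plus dictionary-based keyword matching) reduce to a simple filter over exported records. The sketch below uses entirely hypothetical records and a made-up keyword dictionary to show the shape of that step:

```python
from datetime import datetime

# Hypothetical records, standing in for rows exported from a forensic tool.
records = [
    {"ts": "2024-03-01 09:15:00", "text": "membership fee received, level up"},
    {"ts": "2024-03-01 23:40:00", "text": "dinner at seven?"},
    {"ts": "2024-03-02 10:05:00", "text": "recruit two downlines for rebate"},
]

# Assumed case-specific dictionary; real reviews build this from the case file.
KEYWORDS = {"membership fee", "downline", "rebate"}

def screen(records, start: str, end: str):
    """Keep records inside the time window that hit at least one dictionary keyword."""
    lo = datetime.fromisoformat(start)
    hi = datetime.fromisoformat(end)
    hits = []
    for r in records:
        ts = datetime.fromisoformat(r["ts"])
        if lo <= ts <= hi and any(k in r["text"] for k in KEYWORDS):
            hits.append(r)
    return hits
```

In the "machine first, human second" model Professor Liu described, output like this is only the initial screen; the surviving records still go to an investigator for substantive review.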
However, given the prosecution’s increasingly sophisticated AI-assisted review capabilities, defense teams inevitably face an objective technological gap. To counter this, Professor Liu suggests shifting defense strategies toward a deeper examination of the probative value of electronic evidence—specifically through both straightforward text searches and more complex textual analyses.
Drawing from real-world cases he has handled involving charges of organizing and leading pyramid schemes, Professor Liu demonstrated cutting-edge investigative tools in action, guiding participants step-by-step through the meticulous review of vast amounts of digital data. Through hands-on practice and insightful case studies, he imparted invaluable lessons that left the audience thoroughly equipped to tackle similar challenges in their own work.
Dr. Chen Jie and Teacher Mao Zijian demonstrated advanced methods for intelligent review, including leveraging a specialized legal large-model platform for similar case retrieval, utilizing a general-purpose large-model platform to perform extensive text summarization, retrieval, and analysis, as well as conducting financial data analysis.

On the afternoon of July 5, Guo Hong, a senior-level engineer and director of the Audiovisual and Electronic Data Identification Laboratory at the Institute of Forensic Science, served as the keynote speaker, delivering a lecture titled "An Exploration of Forensic Examination and Review of Electronic Data and Audiovisual Materials in the Age of Artificial Intelligence." Professor Guo pointed out that as AI technology increasingly permeates the judicial field, forensic examinations of electronic data and audio-visual materials are facing challenges such as the diversification of evidence formats and the growing sophistication of forgery techniques. Therefore, establishing a robust and scientific review system has become particularly urgent. Professor Guo delved into three core frameworks for reviewing electronic data: First, the "Technology-Law" Collaborative Theory, which underscores the need for forensic assessments to balance technical standards with legal regulations—for instance, ensuring that electronic data extraction adheres to both the *General Rules on Forensic Procedures* and relevant technical guidelines. Second, the "Multimodal Evidence Chain" Theory, which treats electronic data, audio-visual materials, and traditional evidence as an integrated, cohesive whole—such as assembling a comprehensive evidentiary framework by combining EDR data, sensor logs, and on-site investigation records. Finally, the "AI Countermeasure" Theory addresses emerging issues like deepfakes and algorithmic biases by proposing a dual-review mechanism: "human verification combined with model validation."

During the practical session, Teacher Guo highlighted the importance of focusing on four key dimensions when conducting reviews, drawing from multiple real-world cases—including deepfake identification and data verification for intelligent connected vehicles. First, she emphasized **qualification review**, ensuring that both institutions and personnel possess the necessary judicial appraisal credentials. Second, she stressed **tool review**, paying close attention to whether appraisal tools are certified and verifying the authenticity of any open-source software used. Third, she called for **material review**, scrutinizing the provenance of evidence, tracing its chain of custody, and confirming the consistency of hash values. Finally, she underscored **standard review**, confirming that appraisal methods adhere to current national standards such as GB/T 29360.
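The hash-consistency check at the heart of material review is mechanically simple: recompute the digest of the evidence file and compare it with the value recorded at seizure. A minimal sketch (the function names are illustrative; real workflows also log who verified what and when):

```python
import hashlib

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large disk images need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_custody(path: str, recorded_hash: str) -> bool:
    """Compare the recomputed digest with the hash recorded in the chain of custody."""
    return file_sha256(path) == recorded_hash.lower()
```

A mismatch does not by itself prove tampering, but it does break the chain of custody and shifts the burden of explanation to the party that produced the evidence.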
Teacher Guo also pointed out that when dealing with AI-generated evidence, it’s crucial to employ a combination of techniques—such as manual identification, metadata tracing, algorithmic analysis, and cross-modal comparison—to detect and expose signs of forgery. Looking ahead at the industry’s future direction, she noted that as emerging technologies like the metaverse and blockchain continue to evolve, electronic evidence review will increasingly shift toward a more advanced "intelligent review platform + expert think tank" model.
In particular, she urged judicial professionals to continuously update their technical knowledge systems, staying vigilant in upholding the integrity of evidence amid cutting-edge challenges like AI hallucinations and algorithmic "black boxes." Together, they can strengthen the technological safeguards essential for maintaining judicial fairness in the digital age.
After each thematic session, participants eagerly engaged with the instructor, actively seeking clarification on challenging practical issues and delving deeper into the theoretical principles covered in the course. The instructor provided thorough explanations, ensuring that learners received highly practical, job-relevant guidance tailored to real-world scenarios. This meaningful interaction and exchange not only significantly enhanced the participants' learning outcomes but also created a valuable platform for knowledge sharing and experience collaboration. At the successful conclusion of the workshop, attendees unanimously praised the five-day program as packed with actionable insights—combining solid theoretical foundations with practical tactical advice—and expressed that it was instrumental in strengthening their expertise in electronic evidence review.

(The exciting highlights from the salon seminar and the special evening on criminal defense will be featured in our upcoming report—stay tuned!)
The future has arrived—legal professionals must embrace the opportunities and challenges posed by vast amounts of data and AI-powered case management, working together to shape a brighter tomorrow!

Beijing Headquarters Address: Floor 17, East Section, China Resources Building, No. 8 Jianguomen North Avenue, Dongcheng District, Beijing
Wuhan Branch Office Address: Room 1001, 10th Floor, Huangpu International Center, Jiang'an District, Wuhan City, Hubei Province

Layout: Wang Xin
Review: Management Committee
