1. Introduction
Nutrition is a pivotal determinant of human health, and effective dietary interventions have emerged as a predominant strategy in the prevention and treatment of non-communicable chronic diseases (NCDs) [1], especially obesity [2], Type-2 diabetes (T2D) [3], and cardiovascular disease (CVD) [4]. In recent years, nutritional research has expanded rapidly; however, the attributable risk of inadequate dietary intake contributing to the onset of these diseases remains substantial [5], [6]. This circumstance compels nutrition research to actively explore potential solutions to bridge this gap. “Precision nutrition” has been considered a possible solution for nearly a decade [7], predominantly based on evidence showcasing inter-individual variability in response to dietary interventions [8], [9], [10]. Therefore, current studies on precision nutrition mainly focus on revealing individual variability in response to diets. While understanding such variability provides important knowledge for achieving precision nutrition, it is imperative not to disregard other pivotal factors integral to its realization. Notably, current review articles on precision nutrition frequently concentrate on findings of individual variation in diet response based on different biological mechanisms [11], [12], [13], while a comprehensive review encapsulating the entire landscape of precision nutrition remains conspicuously absent.
In fact, the area of precision nutrition extends beyond the exploration of individual variability; it encompasses a comprehensive continuum spanning from the measurement of dietary intake to the development of intervention strategies based on individual variability. Concurrently, the pursuit of precision nutrition necessitates the exploration and establishment of pathways that effectively translate personalized intervention strategies to broader population levels. Furthermore, achieving the practical integration of precision nutrition within healthcare systems requires rigorous research and the establishment of policies. This review summarizes current progress in the aspects mentioned above and discusses the future directions of precision nutrition.
2. The history of nutritional research and current challenges in contemporary nutrition research
Significant accomplishments were made in nutrition science throughout the 19th century in preventing and treating nutritional deficiencies, leading that period to be remembered as the golden years of nutrition [14]. During that period, the classical nutrition experiment model was established, in which purified nutrients were deliberately removed from and then reintroduced into the diets of experimental animals in order to discern their implications for health and disease. This approach proved instrumental in effectively combating a diverse spectrum of diseases, including scurvy, rickets, beriberi, pellagra, night blindness, and xerophthalmia [15]. However, from the early 20th century to the present day, there has been an immense paradigm shift in the landscape of human diseases, with the focus transitioning from nutritional deficiency diseases to NCDs, including obesity, T2D, CVD, and cancer [16]. At this stage, nutrition science faces serious challenges and has produced many conflicting conclusions [17], [18], [19], [20]. Therefore, it is urgent to understand why the achievements of nutrition science—as observed in the context of mitigating nutritional deficiency-related diseases—fail to be replicated when addressing the spectrum of NCDs and to explore how nutrition science should evolve in order to tackle these contemporary challenges.
The challenges encountered in nutrition research regarding the prevention of NCDs can be attributed to several possible reasons, among which the methodologies and experimental models that historically evolved from the study of nutritional deficiency diseases may be a primary contributor. In the context of deficiency-related diseases, a straightforward cause-effect relationship is often apparent between a particular disease and a specific nutrient. Furthermore, there is usually a well-defined physiological mechanism linking the nutrient to the disease. Moreover, the disease can be prevented or reversed by providing the nutrient in the diet. Some nutritional scientists have even termed this methodology and experimental approach the “single-nutrient model,” underscoring the significance of individual nutrients in disease prevention [21]. The single-nutrient model played a leading role in the response of nutrition scientists to the global rise in the incidence of NCDs. Consequently, the focus of nutrition research became fixated on identifying the pivotal nutrient responsible for the occurrence of NCDs. Within this research framework, observational studies found a variety of micronutrients that were hypothesized to play a role in the development of NCDs, such as calcium, magnesium, zinc, vitamin D, and phytochemicals. However, intervention studies yielded inconsistent results, as the majority of randomized controlled trials (RCTs) failed to replicate the findings observed in observational studies. For example, a number of observational studies found that a dietary calcium intake below 800 mg·d⁻¹ is associated with a greater risk of obesity [22], [23], metabolic syndrome [24], and T2D [25]. However, RCTs investigating calcium supplementation failed to demonstrate any beneficial effects in mitigating these diseases [26], [27]. Some studies even found that calcium supplementation above 800 mg·d⁻¹ may result in a greater risk of CVD [28], [29]. In addition, nutritional deficiency diseases generally have a single, specific pathway responsible for their occurrence, whereas the pathogenesis of NCDs is relatively complicated, with various pathways involved. This results in high heterogeneity within any one type of NCD across different individuals, meaning that the intervention target of an NCD can vary from one individual to another.
Secondly, most current evidence linking diet and health comes from observational studies. The measurement of dietary intake in these studies predominantly hinges on memory-based dietary assessment methods, which are susceptible to inherent biases. This methodological limitation contributes to the ambiguity surrounding findings in nutritional research [30]. In fact, researchers have contended that no valid scientific foundation exists for any of the conclusions drawn from self-reported dietary data over the past five decades. This emphasizes the substantial limitations inherent in memory-based dietary assessment methods and calls into question the reliability of conclusions drawn from such data [31], [32], [33].
Thirdly, another prevailing limitation in contemporary nutritional research pertains to its predominant focus on the “what to eat” aspect of nutrition. Some nutritional scientists have proposed that nutrition science should break with the traditional methodology of the single-nutrient research model, and have developed theories that may be suitable for the prevention and treatment of NCDs, such as nutrition synergy and nutritional geometry [21], [34], [35]. Although these theories represent a progressive shift from the traditional research paradigm by accentuating the collective and synergistic impacts of dietary intake on health, they remain rooted in the same dimension as the traditional model—that is, they retain their primary focus on dietary content, or “what to eat.” Indeed, in the complex landscape of the real world, nutrition science encompasses far more dimensions than “what to eat” alone. It is crucial to recognize that delving into these often-overlooked dimensions holds the potential to yield a more effective array of precision nutrition intervention strategies.
Fourthly, a prevailing issue is the disconnection between much of nutritional research and primary healthcare practice. Only a handful of population-based nutritional findings have effectively transitioned into actionable public health or clinical guidelines. Consequently, there is a pressing need to establish an implementation pathway that ensures the consistent translation of scientific insights into tangible healthcare practices.
3. The landscape and framework of precision nutrition
The ultimate objective of nutrition research in the new era is to enhance population health and alleviate the burden of NCDs, and many processes are required to achieve these goals. As shown in Fig. 1, the fundamental basis of the precision nutrition framework is accurately depicting the individual response to foods and nutrients. On this basis, the other components of precision nutrition should be established; these generally include accurate measurement of dietary exposures and body nutritional status; the creation of multidimensional nutritional intervention strategies that address the aspects of what, how, and when to eat; and an implementation pathway for precision dietary intervention methods at the population level, as well as policymaking regarding precision nutrition.
4. The concept of “individualized nutritional requirement phenotype omics”
The foundation of precision nutrition is the understanding of individualized nutritional requirements, as only a precise comprehension of these requisites can facilitate accurate and effective nutrition interventions or guidance. Within this review, we introduce the concept of “individualized nutritional requirement phenotype omics.” While this term overlaps somewhat with related terms such as personalized nutrition, nutritional genomics, and nutrigenetics [36], [37], [38], those concepts predominantly emphasize distinct components within the broader context of determining accurate nutritional requirements. However, it is important to acknowledge that the nutritional requirements of the body constitute an intricate systemic process, influenced by a myriad of biological mechanisms. The notion of individualized nutritional requirement phenotype omics is predominantly built upon three crucial biological facets: the genome, epigenetics, and the gut microbiota (Fig. 2). Single-nucleotide polymorphisms (SNPs) were the first biological mechanism demonstrated to regulate individualized nutritional requirements. The first rigorous evidence was obtained from the Preventing Overweight Using Novel Dietary Strategies (POUNDS) trial [39], which was designed to dissect the potential roles of different macronutrient compositions in weight loss. No significant differences in weight loss were found among the various diets, suggesting that macronutrient composition was probably not involved in the development of obesity. However, after taking genetic polymorphism into consideration, the study found that individuals who carried variations in the insulin receptor substrate 1 (IRS1), fat mass and obesity-associated (FTO), and versican (VCAN) genes lost more weight on a low-fat diet than other individuals [40], [41], suggesting that variations in these genes may play a determining role in regulating the effect of fat intake on weight loss. In practical terms, individuals carrying variations in these genes should avoid a high-fat diet to prevent obesity and other obesity-induced NCDs. During the past two decades, a series of SNPs have been identified that modulate individuals’ responses to various nutrient intakes, including calcium, vitamin D, folic acid, and unsaturated fatty acids [42], [43], [44], [45]. Therefore, these SNPs can be used to establish accurate individualized nutritional guidelines for these macronutrients and micronutrients.
With accumulating knowledge from gene sequencing, a number of genome-wide association studies have shown that variation in SNPs can explain only a small fraction of the differences in individuals’ phenotypes [46], [47], leaving most metabolic differences at the individual level unexplained. Epigenetics has been demonstrated to be another important mechanism in regulating individual nutrient requirements. Epigenomic modifications encompass heritable and reversible alterations in gene function that transpire without modifications to the underlying DNA sequence. These changes include processes such as DNA methylation, histone modification, and regulation by non-coding RNAs. Among them, DNA methylation is the most extensively studied epigenetic mechanism [13]; it involves a covalent chemical modification of DNA, wherein a methyl group replaces the hydrogen atom at the fifth carbon (C5) of cytosine. This reaction is catalyzed by enzymes known as DNA methyltransferases. Crucially, DNA methylation plays a pivotal role in regulating gene expression, either by obstructing the binding of transcriptional proteins or by recruiting proteins involved in gene repression. Evidence regarding this issue mainly comes from studies on early-life nutrition—a critical stage for the epigenetic determination of nutritional programming. Studies of famine have provided substantial support for this hypothesis, finding that individuals exposed to famine in early life were more likely to develop NCDs during adulthood [48], [49], [50]—an association that was more obvious among individuals with a low-quality diet [51], [52]. Some studies demonstrated that exposure to famine shaped the characteristics of gene expression through epigenetic mechanisms, resulting in a so-called “thrifty genes phenotype” [53]. This phenotype is characterized by a lower energy metabolism rate and a greater propensity to store fat at the same energy intake [53], suggesting that individuals with this phenotype should avoid excessive energy intake during adulthood in order to maintain health. Hence, when contemplating individual nutritional requirements, it is imperative to consider the influence of epigenetic mechanisms. In fact, certain studies have identified specific genes, such as insulin-like growth factor II (IGF2), insulin receptor (INSR), and carnitine palmitoyltransferase 1A (CPT1A) [54], [55], [56], as potential epigenetic targets that are notably sensitive to early-life nutrition exposure. Epigenetic modifications to these genes can play a substantial role in shaping an individual’s nutritional requirements during adulthood.
The gut microbiota has been postulated as an additional mechanism determining individual nutrient requirements. It serves as an important player in modulating nutritional responses through the production of co-metabolites. Research has shown that the currently identified gut microbiota can account for approximately 10% of the inter-individual variance observed in responses to dietary intake [57]. A cohort study involving 800 participants validated the efficacy of a personalized postprandial glucose prediction model based on serum parameters, eating habits, lifestyle behaviors, and gut microbiota [58]. This model enhanced the effectiveness of dietary interventions aimed at reducing postprandial glucose levels, emphasizing that microbiota composition and function are crucial components in determining individualized nutritional requirements. In line with this finding, it proved possible to accurately predict individual glycemic responses to different types of bread using only baseline microbiome features, with the relative abundances of Coprobacter fastidiosus and Lachnospiraceae bacterium 3_1_46FAA playing a contributory role [59]. Furthermore, a comprehensive study encompassing three distinct cohorts of obese adults from Belgium, Finland, and Britain shed light on the impact of varying baseline abundances of Firmicutes species on responses to dietary interventions [60]. Another study revealed that individuals initially low in Blautia wexlerae and Bacteroides dorei exhibited enhanced weight loss on a calorie-restricted diet [61]. Moreover, a recent RCT revealed that a higher baseline Prevotella-to-Bacteroides (P/B) ratio was associated with a greater loss of body weight and body fat in individuals with obesity exposed to an intervention rich in dietary fiber [62]. Coherently, a high Prevotella abundance in overweight adults was identified as a predictive factor for successful weight loss when consuming a diet rich in whole grains and fiber [63]. There is also a noteworthy link between increased microbial diversity and the stability of microbial ecosystems. In this context, a study discovered an inverse relationship between the responsiveness of an individual’s microbiota to weight-loss diets and that microbiota’s diversity. Therefore, the unique characteristics of an individual’s gut microbiota can serve as predictive factors for individualized nutritional requirements. Nevertheless, it is important to recognize that the enduring effects of long-term dietary changes on the gut microbiota should not be underestimated: dietary modifications over an extended period can substantially alter the composition and functioning of the gut microbiota [57].
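The baseline P/B ratio described above is a simple computation on genus-level relative abundances. The sketch below shows how it might be derived and used to flag likely responders to a fiber-rich intervention; the `threshold` value is purely illustrative and is not taken from the cited trials, which typically dichotomize on detectability or cohort medians.

```python
def pb_ratio(abundances):
    """Prevotella-to-Bacteroides ratio from genus-level relative abundances."""
    prevotella = abundances.get("Prevotella", 0.0)
    bacteroides = abundances.get("Bacteroides", 0.0)
    if bacteroides == 0.0:
        return float("inf")  # no detectable Bacteroides
    return prevotella / bacteroides

def likely_fiber_responder(abundances, threshold=0.1):
    """Flag a participant whose baseline P/B ratio exceeds a hypothetical cutoff."""
    return pb_ratio(abundances) >= threshold
```

In practice such a rule would only be one feature among many; the 800-participant model cited above combined microbiome features with serum parameters and lifestyle behaviors.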
5. Accurate measurement of dietary exposure and nutritional status
Precision nutrition research has predominantly focused on unraveling the intricate interactions between gene expression and dietary intake in the context of health maintenance and disease prevention. However, certain other crucial facets of nutrition often remain overlooked. The foundation of nutrition research lies in dietary surveys, as these constitute the fundamental element for gathering basic nutritional data. Accurate dietary surveys are pivotal in furnishing robust evidence to comprehend the health impacts of nutrition. Over the past few decades, advancements in the measurement of dietary exposure have been relatively constrained, creating a substantial gap in the evolution of precision nutrition. Dietary assessment methods typically encompass 24 h dietary recall, dietary records, and food frequency questionnaires (FFQs). These methods generally rely on self-reporting, which introduces the potential for recall bias, making them less than ideal for precision nutrition research. Efforts to enhance the accuracy and precision of dietary assessment methods are crucial for advancing the field [64]. Moreover, a few studies have indicated that some participants may unconsciously modify their less healthy eating habits or even opt not to disclose their actual food consumption. Social desirability biases can lead individuals to under-report their consumption of unhealthy foods or exaggerate their intake of healthy foods [65]. In addition, self-report-based methods often result in inaccurate estimates of portion sizes [66]. Estimating portion sizes can prove to be a challenging task for participants, and variations in how different individuals interpret these sizes can introduce discrepancies and inaccuracies in reported dietary intake. These challenges highlight the need for improved and more objective methods of dietary assessment in precision nutrition research.
To close this important gap and address the issues of time constraints and low adherence associated with traditional dietary surveys, researchers have developed Internet-based dietary surveys [67], [68]. This approach allows participants to record their dietary intake whenever and wherever it is convenient for them. It also provides visual aids such as pictures to assist participants in estimating portion sizes for each food item, potentially enhancing the accuracy of dietary surveys. However, it is important to note that this method still relies on self-reporting and does not eliminate the classic survey biases previously documented.
With advances in artificial intelligence (AI), researchers have devised image-recognition methods for measuring dietary intake [69], [70]. In this approach, participants take and upload photos of their food before consumption, and algorithms are employed to calculate portion sizes and nutrient content from these images. This method offers a relatively objective and accurate means of measuring food intake. However, it imposes notably stringent requirements on the quality of the food photos taken by participants and may not effectively distinguish mixed foods containing various types of ingredients.
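Once the recognition algorithm has labeled a food and estimated its portion, the nutrient calculation typically reduces to a lookup-and-scale step against a per-100 g reference table. A minimal sketch of that final step is shown below; the two-item table, its food labels, and its values are illustrative stand-ins, not data from any cited system.

```python
# Hypothetical per-100 g reference values (kcal, and macronutrients in grams)
NUTRIENTS_PER_100G = {
    "apple":       {"kcal": 52.0,  "protein": 0.3, "carb": 14.0, "fat": 0.2},
    "cooked_rice": {"kcal": 130.0, "protein": 2.7, "carb": 28.0, "fat": 0.3},
}

def estimate_intake(food_label, portion_g):
    """Scale per-100 g reference values by the estimated portion size.

    Both food_label and portion_g would come from the image-recognition step.
    """
    reference = NUTRIENTS_PER_100G[food_label]
    scale = portion_g / 100.0
    return {nutrient: round(value * scale, 1) for nutrient, value in reference.items()}
```

The hard part, as the paragraph above notes, is upstream of this step: producing a reliable label and portion estimate from the photo itself.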
A handful of wearable devices have also been developed for the purpose of measuring dietary exposure [71], [72], [73]. These devices are typically implanted under the skin. Once worn by participants, the devices can selectively monitor nutrient levels in sweat, urine, or blood using electrochemical sensors. These sensors are often enhanced with membrane immobilization of nanostructures, conductive polymers, and enzymes, enabling regular monitoring of human nutrient levels to facilitate nutritional measurement. While this method can offer a higher degree of accuracy than other dietary exposure measurement approaches, it is relatively invasive and can lead to lower participant adherence due to the discomfort associated with implantation. Balancing accuracy with participant comfort and adherence remains a challenge in the development of such wearable devices for dietary assessment.
In the area of precision measurement of dietary intake, identifying biomarkers in urine or blood using metabolomics is currently emerging as a dominant and promising method. Several well-established biomarkers have been used for dietary exposure assessment, including those for salt, protein, sucrose, and fructose intake [74]. For example, 24 h urinary nitrogen is a widely recognized biomarker of protein intake that is often employed to validate self-reported dietary protein intake [75]. Urinary concentrations of sucrose and fructose have also been identified as dose-responsive and predictive biomarkers of dietary sugar consumption [76], [77]. However, it is notable that, while these biomarkers are accepted as accurate and valuable, they primarily reflect dietary nutrient patterns rather than the consumption of specific foods. This underscores the ongoing need for the development of food intake biomarkers that can provide more granular information about an individual’s dietary choices.
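The use of 24 h urinary nitrogen rests on simple stoichiometry: dietary protein is roughly 16% nitrogen by mass, so protein intake is commonly back-calculated as 6.25 × total nitrogen, where total nitrogen is the urinary measurement plus an allowance for extra-renal losses via feces and skin. A sketch of this arithmetic follows; the 2 g·d⁻¹ default for extra-renal losses is a conventional approximation, not a value from the cited studies.

```python
def estimated_protein_intake(urinary_n_g_per_day, extrarenal_n_g=2.0):
    """Back-calculate daily protein intake (g) from 24 h urinary nitrogen (g).

    Protein is ~16% nitrogen by mass, hence the 6.25 conversion factor;
    extrarenal_n_g approximates nitrogen lost via feces and skin.
    """
    return 6.25 * (urinary_n_g_per_day + extrarenal_n_g)

# e.g., 12 g of urinary nitrogen per day implies roughly 87.5 g of protein
```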
As our understanding of dietary biomarkers continues to grow, recent studies have categorized the main subclasses of food biomarkers [78] as follows: ① food compound intake biomarkers, which are substances from foods that cannot be absorbed or metabolized by the body (exogenous metabolites); ② food intake biomarkers, which are compounds from foods that can be absorbed or metabolized by the body (endogenous metabolites); ③ dietary pattern intake biomarkers, which encompass both exogenous and endogenous metabolites and reflect broader dietary patterns; and ④ nutritional status biomarkers, which provide information about long-term nutritional status. These biomarkers, which may be single metabolites or combinations of metabolites, can effectively indicate the consumption of either a specific food or a food group. Importantly, they exhibit clear time- and dose-response patterns following intake, making them valuable tools in dietary assessment and precision nutrition research. To date, a range of biomarkers for specific food intake have been identified, enabling accurate reflection of the consumption of foods such as whole grains, coffee, fish, red meat, sugary beverages, and more, as outlined in Table 1 [79], [80], [81], [82], [83], [84], [85], [86], [87], [88], [89], [90], [91], [92], [93], [94]. Moreover, the field of dietary biomarkers has advanced to establish biomarkers that reflect the intake of dietary patterns, as diets are often composed of a mixture of various foods. These dietary pattern biomarkers may prove more useful in assessing the overall quality of individuals’ diets than biomarkers of individual foods. Currently, several dietary pattern biomarkers have been identified, including those for the Western dietary pattern and the Mediterranean or healthy eating pattern [95], [96], [97].
However, it is important to note that, while the identification of biomarkers offers a more precise method for assessing dietary intake, a consensus protocol in this field has yet to be firmly established. Nevertheless, criteria have been developed to guide the search for biomarkers of dietary intake [98]:
(1) Dose-response relationship: There should be a clear and demonstrable dose-response relationship between the levels of biomarkers and dietary intake.
(2) Time-response relationship: Understanding the time-response relationship between the biomarker and dietary intake is crucial in knowing how quickly the biomarker responds after food consumption and how long it remains detectable.
(3) Stability: Dietary biomarkers should exhibit stability, with minimal degradation, in order to accurately assess the long-term intake of specific foods or diets.
(4) Performance parameters: The performance of dietary biomarkers—including accuracy, sensitivity, and specificity—should be well-documented. These parameters provide insights into biomarkers’ reliability and precision.
(5) Reproducibility: The reproducibility of dietary biomarkers must be robust. Results should be highly repeatable in different settings and populations.
6. Effective dietary intervention strategies
A dietary intervention is a pivotal component in the realm of precision nutrition, particularly for the prevention of NCDs. However, effective and precise dietary intervention strategies for NCD prevention remain elusive. Many dietary intervention approaches tend to emphasize the supplementation of individual nutrients. Observational studies have linked a range of minerals, vitamins, and phytochemicals to NCD incidence, providing a basis for establishing dietary intervention methods. Nonetheless, a growing body of conflicting results between intervention studies and observational research has emerged [99], [100]. This disparity suggests that dietary intervention strategies centered solely on individual nutrients may prove ineffective when translating evidence from observational studies into real-world intervention strategies. A few studies have also found that the association of a single food or nutrient with NCD incidence can be modified by other dietary components, resulting in different associations [101], [102]. The complexities of dietary interactions and the multifaceted nature of NCDs call for more comprehensive and tailored dietary intervention approaches.
Because a diet generally includes different foods and nutrients, making up a complicated mixture of chemical compounds, the concept of a “dietary pattern” has been proposed, with the aim of capturing the synergistic effects of foods and nutrients. A few dietary patterns have been established for use in the prevention and treatment of NCDs. Patterns such as the Alternative Healthy Eating Index (AHEI), Dietary Approaches to Stop Hypertension (DASH), and the Mediterranean diet have been developed and demonstrated to be effective in the prevention and treatment of NCDs [103], [104], [105], [106]. However, such dietary patterns often require individuals to meticulously select their daily food choices, which can result in relatively low adherence rates [107], [108], [109].
More importantly, precision nutrition interventions have traditionally concentrated on regulating dietary intake, essentially addressing the question of “what to eat.” However, there are other crucial dimensions that should be considered in the development of more accurate dietary intervention strategies [110]. One such dimension is the cooking method, which pertains to the question of “how to eat.” Surprisingly, this dimension has often been overlooked in existing strategies. A substantial portion of the evidence regarding the health impacts of specific foods is derived from observational studies that do not take cooking methods into account. A recent randomized controlled clinical trial shed light on the importance of cooking methods, demonstrating that—even when participants consumed the same foods and nutrients—the choice of cooking method for foods such as meat could produce different health impacts [111]. This finding highlights the significance of considering “how to eat” in precision nutrition interventions for more accurate and effective strategies.
In addition, studies have emphasized the importance of not only the quantity and quality of daily dietary intake but also the timing of intake in maintaining health [112], introducing yet another dimension for establishing dietary intervention methods. Interventions targeting intake timing may offer a more accessible approach to maintaining health, as they do not necessarily require individuals to change the quantity and quality of their daily intake but rather focus on modifying their timing-related behaviors. One emerging dietary intervention approach related to intake timing is time-restricted feeding (TRF) [113], which aims to extend the daily fasting period. An increasing number of intervention trials have shown that prolonging the daily fasting duration can significantly improve metabolism and mitigate inflammation and oxidative stress [114], [115], [116]. This suggests that interventions targeting intake timing can have a substantial impact on health outcomes and offer an alternative strategy for precision nutrition.
Beyond TRF, another promising dietary intervention approach related to intake timing is chrono-nutrition, which has been established based on the principles of the circadian rhythm. Unlike TRF, chrono-nutrition does not primarily focus on the duration of the daily fasting period but instead emphasizes the timing and order of daily food intake in alignment with the circadian rhythm. The greater the alignment between food intake and the circadian rhythm, the more favorable the impact on the body’s metabolism and overall health [101]. Conversely, a misalignment between food intake and the circadian rhythm can increase the risk of various diseases. Observational studies have revealed that different intake times for dietary patterns, specific foods, macronutrients, minerals, vitamins, and phytochemicals can have varying health effects [117], [118], [119], [120], [121], [122], [123], [124], [125], [126], [127]. These findings underscore the significance of considering the timing of food intake as a key dimension in precision nutrition interventions for optimal health outcomes. Building upon these studies, current guidelines for optimal intake times for specific foods and nutrients have been summarized, as illustrated in Fig. 3. In summary:
(1) Daily energy intake should be distributed in a 4:4:2 ratio among breakfast, lunch, and dinner [118], [119], [120].
(2) Animal foods and fruits are recommended for consumption during the daytime [122].
(3) Whole grains, vegetables, dairy products, and vitamin and mineral supplements are best consumed at dinner [124], [125], [126], [127].
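The 4:4:2 distribution above is straightforward to operationalize. A minimal sketch of the arithmetic for turning a daily energy target into per-meal targets follows; the 2000 kcal figure in the comment is an illustrative example, not a recommendation from the cited studies.

```python
def meal_energy_targets(total_kcal, ratio=(4, 4, 2)):
    """Split a daily energy target across breakfast, lunch, and dinner
    according to the recommended 4:4:2 distribution."""
    parts = sum(ratio)
    meals = ("breakfast", "lunch", "dinner")
    return {meal: total_kcal * share / parts for meal, share in zip(meals, ratio)}

# e.g., a 2000 kcal day -> 800 kcal breakfast, 800 kcal lunch, 400 kcal dinner
```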
It has been reported that insulin secretion, insulin sensitivity, lipolysis, glucagon secretion, and the secretion of other hormones involved in energy metabolism gradually increase during the daytime and decrease at night [
128], [
129], [
130], [
131], [
132]. Therefore, high energy intake during the daytime and low energy intake at night may align with the rhythm of these hormones, maintaining the body’s homeostasis. In contrast, low energy intake during the daytime and high energy intake at night may misalign with the rhythm of these hormones, resulting in a great risk of NCDs. Moreover, during the nighttime, there is an increasing activity among bacteria utilizing dietary fiber from vegetables or whole grains to create short-chain fatty acids, which gradually diminishes during the day [
133]. This implies that consuming more vegetables or whole grains at dinner aligns better with the circadian rhythm of these bacteria and can thereby increase the production of short-chain fatty acids. Furthermore, the synthesis of serotonin and melatonin is generally activated in the evening [
134]. Dairy products are rich in dietary tryptophan [
135], so consuming more dairy products after dinner might provide additional tryptophan for the synthesis of serotonin and melatonin, matching the circadian pattern of these hormones and thereby supporting good sleep quality.
The beneficial health impacts of consuming dietary minerals and vitamins in the evening could be supported by the following mechanisms. Firstly, the suprachiasmatic nucleus (SCN), in which the clock gene Period 1 (
Per1) is rhythmically expressed, is the master pacemaker controlling circadian rhythmicity [
136]. A reduced extracellular concentration of potassium may cause membrane hyperpolarization, thereby reversibly abolishing the rhythmic expression of
Per1 [
137], [
138]. Secondly, parathyroid hormone (PTH) exhibits a biphasic circadian rhythm, with a small but significant increase in the late afternoon and a larger, broader peak extending from the late evening into the early morning [
139]. Thirdly, human capacity for DNA excision repair shows an ascending pattern during the night [
140], [
141], [
142], and it has been documented that the nutritional status of various minerals and vitamins is involved in these mechanisms.
Therefore, these eating patterns are believed to align with the circadian rhythm of the whole-body metabolism and the gut microbiota, potentially contributing to the maintenance of good health and significantly reducing the risk of developing NCDs. By considering the timing of food intake as part of precision nutrition, individuals can potentially enhance the effectiveness of dietary interventions for better health outcomes.
In summary, precision nutrition intervention strategies should extend beyond the traditional focus on the quantity and quality of dietary intake to encompass considerations of cooking methods and intake timing as well. This more comprehensive approach holds the potential to be highly effective in achieving NCD prevention through nutrition. By taking into account these additional dimensions, precision nutrition can offer more tailored and nuanced dietary interventions that optimize health outcomes.
7. An implementation pathway for precision nutrition at the population level
The ideal setting for implementing precision nutrition interventions is within community-based primary healthcare institutions. However, despite the progress that has been achieved in precision nutrition research, most of its achievements remain within scientific research institutions, making it challenging to translate them into practical applications within primary healthcare. Furthermore, while research into understanding individualized nutrition requirements is advancing, few of these findings have successfully made their way into primary healthcare practices. A predominant reason for this situation is that the concept of individualization is not easily adaptable to the population level. Community doctors or primary healthcare staff cannot feasibly guide every resident in crafting personalized cooking menus. Therefore, to achieve the ultimate goal of making precision nutrition accessible to everyone, one of the most critical tasks is to establish an implementation pathway for precision nutrition at the population level. This pathway should allow for the effective dissemination and application of precision nutrition principles across larger and more diverse communities, as outlined in
Fig. 4.
In recent years, there has been rapid development in the field of AI, offering a potential pathway for translating precision nutrition from personalized nutrition requirements to the population level. Combined with medical devices, mobile computing, and sensor technology, AI approaches are being used to simulate clinician diagnostics, enable early intervention, and ultimately focus on disease prevention. This shift has the potential to move healthcare from a model of “disease care” to one of “preventive care,” which can alleviate the burden of managing large patient populations.
Some research efforts have already developed AI algorithms capable of accurately predicting the development of diseases based on electronic health records [
143], [
144], [
145], [
146]. Moreover, studies have shown the utility of AI-based decision support systems in enhancing the effectiveness of managing NCDs in primary care [
147], [
148]. However, there has been limited exploration of AI’s potential in the context of precision nutrition intervention.
It is crucial to develop AI algorithms in the future that can capture individualized nutritional requirements based on genomics, epigenetics, and gut microbiota variations. Furthermore, AI-based decision support systems should be designed for precision nutrition interventions, considering factors such as dietary structure, cooking methods, and intake timing. This development is essential for achieving precision nutrition at the population level, making personalized nutrition recommendations more accessible and effective for larger and more diverse communities.
8. Future direction of precision nutrition
In conclusion, precision nutrition holds promise for addressing various unanswered questions and achieving health benefits at the population level. Advances in high-performance technologies have opened up opportunities to explore the roles of various omics data in understanding the pathophysiology and management of complex diseases. These rich and personalized data, combined with access to electronic health records and the rapidly evolving power of computational analysis and bioinformatics, are laying the foundation for the development of individualized nutritional requirements.
It is imperative to establish methods for accurately measuring dietary intake and evaluating nutritional status. This can be accomplished by integrating wearable and mobile devices with AI and robust, stable biomarkers. Current measurement of dietary exposures, especially in prospective cohort studies, should take advantage of both traditional and newly developed approaches to obtain robust evidence in terms of nutrition and health. Meanwhile, comprehensive dietary intervention strategies that consider what, when, and how to eat should be developed for each individual, taking into account each person’s unique nutritional requirements. Also, future nutritional research should consider different eating habits and food styles across populations of different ethnicities. The different metabolic phenotypes associated with different ethnic groups may result in different biomarkers when the same food is consumed, and different eating habits may require different intervention strategies.
Regarding the long-term impact of diet on health, well-conducted prospective cohort studies have greatly contributed to nutritional science and will remain a very important tool for investigating evidence for precision nutrition in the future. When combined with the other abovementioned key components of precision nutrition, future well-conducted prospective cohort studies with long-term follow-up, which include accurate measurements of diet and endpoints with repeated measurements, may provide important knowledge.
Lastly, the dissemination of information on precision nutrition to residents in community settings through primary healthcare staff can bridge the gap between research and practice. This integrated approach has the potential to significantly contribute to the prevention of the rapidly increasing burden of NCDs and reduce the strain on healthcare systems. Notably, compared with traditional nutritional studies, we recognize some limitations in terms of the four components of precision nutrition; for example, it is difficult to obtain omics data at the population level, it is impractical to repeatedly collect biospecimens, and long-term follow-up of dietary intervention trials can be hard to maintain. We believe that, with advancements in the measurement of omics data, the collection of biospecimens, and the digitalized management of dietary intervention trials, great progress can be made in nutrition science. Therefore, we envision this as the future landscape of precision nutrition, offering hope for the prevention and treatment of NCDs and improved quality of life.
Acknowledgments
This research was supported by funds from the National Natural Science Foundation (U21A20398 to Changhao Sun; 82173498 to Tianshu Han; 82204016 to Wei Wei; and 82204017 to Wenbo Jiang).
Compliance with ethics guidelines
Tianshu Han, Wei Wei, Wenbo Jiang, Yiding Geng, Zijie Liu, Ruiming Yang, Chenrun Jin, Yating Lei, Xinyi Sun, Jiaxu Xu, Chen Juan, and Changhao Sun declare that they have no conflicts of interest or financial conflicts to disclose.