public inbox for devel@edk2.groups.io
From: "Christian Rodriguez" <christian.rodriguez@intel.com>
To: "Shi, Steven" <steven.shi@intel.com>,
	"devel@edk2.groups.io" <devel@edk2.groups.io>
Cc: "Gao, Liming" <liming.gao@intel.com>,
	"Feng, Bob C" <bob.c.feng@intel.com>,
	"Johnson, Michael" <michael.johnson@intel.com>
Subject: Re: [PATCH v4 1/5] BaseTools: Improve the cache hit in the edk2 build cache
Date: Wed, 14 Aug 2019 18:33:47 +0000
Message-ID: <3A7DCC9A944C6149BF832E1C9B718ABC01F9FDCC@ORSMSX114.amr.corp.intel.com>
In-Reply-To: <20190814181130.8020-2-steven.shi@intel.com>

For all 5 patches in the patch set: Acked-by: Christian Rodriguez <christian.rodriguez@intel.com>

>-----Original Message-----
>From: Shi, Steven
>Sent: Wednesday, August 14, 2019 11:11 AM
>To: devel@edk2.groups.io
>Cc: Gao, Liming <liming.gao@intel.com>; Feng, Bob C <bob.c.feng@intel.com>;
>Rodriguez, Christian <christian.rodriguez@intel.com>; Johnson, Michael
><michael.johnson@intel.com>; Shi, Steven <steven.shi@intel.com>
>Subject: [PATCH v4 1/5] BaseTools: Improve the cache hit in the edk2 build cache
>
>From: "Shi, Steven" <steven.shi@intel.com>
>
>BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1927
>
>The current cache hash algorithm does not parse and generate
>the makefile to get the accurate dependency files for a
>module. Instead, it uses the platform and package meta files
>to get the module dependencies in a quick but over-approximate
>way. These meta files are monolithic and pull in many redundant
>dependencies for the module, which easily causes module build
>cache misses.
>This patch introduces one more cache checkpoint and a new
>hash algorithm besides the current quick one. The new hash
>algorithm leverages the module makefile to obtain more
>accurate and precise dependency info for a module. When the
>build cache misses with the first quick hash, BaseTools
>calculates the new hash after the makefile is generated and
>then checks the cache again.
>
>Cc: Liming Gao <liming.gao@intel.com>
>Cc: Bob Feng <bob.c.feng@intel.com>
>Signed-off-by: Steven Shi <steven.shi@intel.com>
>---
> BaseTools/Source/Python/AutoGen/AutoGenWorker.py |  21 ++
> BaseTools/Source/Python/AutoGen/CacheIR.py       |  28 ++
> BaseTools/Source/Python/AutoGen/DataPipe.py      |   8 +
> BaseTools/Source/Python/AutoGen/GenMake.py       | 223 +++++------
> BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 639 ++++++++++++-----
> BaseTools/Source/Python/Common/GlobalData.py     |   9 +
> BaseTools/Source/Python/build/build.py           | 129 +++----
> 7 files changed, 863 insertions(+), 194 deletions(-)
>
>diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>old mode 100644
>new mode 100755
>index e583828741..a84ed46f2e
>--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>@@ -182,6 +182,12 @@ class AutoGenWorkerInProcess(mp.Process):
>             GlobalData.gDisableIncludePathCheck = False
>             GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
>             GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
>+            GlobalData.gBinCacheSource = self.data_pipe.Get("BinCacheSource")
>+            GlobalData.gBinCacheDest = self.data_pipe.Get("BinCacheDest")
>+            GlobalData.gCacheIR = self.data_pipe.Get("CacheIR")
>+            GlobalData.gEnableGenfdsMultiThread = self.data_pipe.Get("EnableGenfdsMultiThread")
>+            GlobalData.file_lock = self.file_lock
>+            CommandTarget = self.data_pipe.Get("CommandTarget")
>             pcd_from_build_option = []
>             for pcd_tuple in self.data_pipe.Get("BuildOptPcd"):
>                 pcd_id = ".".join((pcd_tuple[0],pcd_tuple[1]))
>@@ -193,10 +199,13 @@ class AutoGenWorkerInProcess(mp.Process):
>             FfsCmd = self.data_pipe.Get("FfsCommand")
>             if FfsCmd is None:
>                 FfsCmd = {}
>+            GlobalData.FfsCmd = FfsCmd
>             PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
>                                              self.data_pipe.Get("P_Info").get("WorkspaceDir"))
>             libConstPcd = self.data_pipe.Get("LibConstPcd")
>             Refes = self.data_pipe.Get("REFS")
>+            GlobalData.libConstPcd = libConstPcd
>+            GlobalData.Refes = Refes
>             while True:
>                 if self.module_queue.empty():
>                     break
>@@ -223,8 +232,20 @@ class AutoGenWorkerInProcess(mp.Process):
>                         Ma.ConstPcd = libConstPcd[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
>                     if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path) in Refes:
>                         Ma.ReferenceModules = Refes[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
>+                if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
>+                    Ma.GenModuleFilesHash(GlobalData.gCacheIR)
>+                    Ma.GenPreMakefileHash(GlobalData.gCacheIR)
>+                    if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
>+                        continue
>+
>                 Ma.CreateCodeFile(False)
>                 Ma.CreateMakeFile(False,GenFfsList=FfsCmd.get((Ma.MetaFile.File,Ma.Arch),[]))
>+
>+                if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
>+                    Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
>+                    Ma.GenMakeHash(GlobalData.gCacheIR)
>+                    if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
>+                        continue
>         except Empty:
>             pass
>         except:
>diff --git a/BaseTools/Source/Python/AutoGen/CacheIR.py b/BaseTools/Source/Python/AutoGen/CacheIR.py
>new file mode 100755
>index 0000000000..2d9ffe3f0b
>--- /dev/null
>+++ b/BaseTools/Source/Python/AutoGen/CacheIR.py
>@@ -0,0 +1,28 @@
>+## @file
>+# Build cache intermediate result and state
>+#
>+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
>+# SPDX-License-Identifier: BSD-2-Clause-Patent
>+#
>+
>+class ModuleBuildCacheIR():
>+    def __init__(self, Path, Arch):
>+        self.ModulePath = Path
>+        self.ModuleArch = Arch
>+        self.ModuleFilesHashDigest = None
>+        self.ModuleFilesHashHexDigest = None
>+        self.ModuleFilesChain = []
>+        self.PreMakefileHashHexDigest = None
>+        self.CreateCodeFileDone = False
>+        self.CreateMakeFileDone = False
>+        self.MakefilePath = None
>+        self.AutoGenFileList = None
>+        self.DependencyHeaderFileSet = None
>+        self.MakeHeaderFilesHashChain = None
>+        self.MakeHeaderFilesHashDigest = None
>+        self.MakeHeaderFilesHashChain = []
>+        self.MakeHashDigest = None
>+        self.MakeHashHexDigest = None
>+        self.MakeHashChain = []
>+        self.PreMakeCacheHit = False
>+        self.MakeCacheHit = False
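
These records are shared between the AutoGen worker processes through a
multiprocessing Manager dict keyed by (module path, arch), as the later
hunks show. A minimal usage sketch (the path and arch are illustrative):

    from multiprocessing import Manager
    # from AutoGen.CacheIR import ModuleBuildCacheIR

    gCacheIR = Manager().dict()    # shared across worker processes
    IR = ModuleBuildCacheIR("MdeModulePkg/Core/Dxe/DxeMain.inf", "X64")
    IR.CreateMakeFileDone = True
    # A Manager dict does not observe in-place mutation of a stored
    # object, which is why the patch always writes gDict[key] = IR
    # back after updating a record.
    gCacheIR[(IR.ModulePath, IR.ModuleArch)] = IR
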
>diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
>old mode 100644
>new mode 100755
>index 2052084bdb..84e77c301a
>--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
>+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
>@@ -158,3 +158,11 @@ class MemoryDataPipe(DataPipe):
>         self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
>
>         self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
>+
>+        self.DataContainer = {"BinCacheSource":GlobalData.gBinCacheSource}
>+
>+        self.DataContainer = {"BinCacheDest":GlobalData.gBinCacheDest}
>+
>+        self.DataContainer = {"CacheIR":GlobalData.gCacheIR}
>+
>+        self.DataContainer = {"EnableGenfdsMultiThread":GlobalData.gEnableGenfdsMultiThread}
>\ No newline at end of file
>diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
>old mode 100644
>new mode 100755
>index 499ef82aea..ce047e7f64
>--- a/BaseTools/Source/Python/AutoGen/GenMake.py
>+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
>@@ -906,6 +906,11 @@ cleanlib:
>                                     self._AutoGenObject.IncludePathList + self._AutoGenObject.BuildOptionIncPathList
>                                     )
>
>+        self.DependencyHeaderFileSet = set()
>+        if FileDependencyDict:
>+            for Dependency in FileDependencyDict.values():
>+                self.DependencyHeaderFileSet.update(set(Dependency))
>+
>         # Get a set of unique package includes from MetaFile
>         parentMetaFileIncludes = set()
>         for aInclude in self._AutoGenObject.PackageIncludePathList:
>@@ -1115,7 +1120,7 @@ cleanlib:
>     ## For creating makefile targets for dependent libraries
>     def ProcessDependentLibrary(self):
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 self.LibraryBuildDirectoryList.append(self.PlaceMacro(LibraryAutoGen.BuildDir, self.Macros))
>
>     ## Return a list containing source file's dependencies
>@@ -1129,114 +1134,9 @@ cleanlib:
>     def GetFileDependency(self, FileList, ForceInculeList, SearchPathList):
>         Dependency = {}
>         for F in FileList:
>-            Dependency[F] = self.GetDependencyList(F, ForceInculeList, SearchPathList)
>+            Dependency[F] = GetDependencyList(self._AutoGenObject, self.FileCache, F, ForceInculeList, SearchPathList)
>         return Dependency
>
>-    ## Find dependencies for one source file
>-    #
>-    #  By searching recursively "#include" directive in file, find out all the
>-    #  files needed by given source file. The dependencies will be only searched
>-    #  in given search path list.
>-    #
>-    #   @param      File            The source file
>-    #   @param      ForceInculeList The list of files which will be included forcely
>-    #   @param      SearchPathList  The list of search path
>-    #
>-    #   @retval     list            The list of files the given source file depends on
>-    #
>-    def GetDependencyList(self, File, ForceList, SearchPathList):
>-        EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
>-        FileStack = [File] + ForceList
>-        DependencySet = set()
>-
>-        if self._AutoGenObject.Arch not in gDependencyDatabase:
>-            gDependencyDatabase[self._AutoGenObject.Arch] = {}
>-        DepDb = gDependencyDatabase[self._AutoGenObject.Arch]
>-
>-        while len(FileStack) > 0:
>-            F = FileStack.pop()
>-
>-            FullPathDependList = []
>-            if F in self.FileCache:
>-                for CacheFile in self.FileCache[F]:
>-                    FullPathDependList.append(CacheFile)
>-                    if CacheFile not in DependencySet:
>-                        FileStack.append(CacheFile)
>-                DependencySet.update(FullPathDependList)
>-                continue
>-
>-            CurrentFileDependencyList = []
>-            if F in DepDb:
>-                CurrentFileDependencyList = DepDb[F]
>-            else:
>-                try:
>-                    Fd = open(F.Path, 'rb')
>-                    FileContent = Fd.read()
>-                    Fd.close()
>-                except BaseException as X:
>-                    EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
>-                if len(FileContent) == 0:
>-                    continue
>-                try:
>-                    if FileContent[0] == 0xff or FileContent[0] == 0xfe:
>-                        FileContent = FileContent.decode('utf-16')
>-                    else:
>-                        FileContent = FileContent.decode()
>-                except:
>-                    # The file is not txt file. for example .mcb file
>-                    continue
>-                IncludedFileList = gIncludePattern.findall(FileContent)
>-
>-                for Inc in IncludedFileList:
>-                    Inc = Inc.strip()
>-                    # if there's macro used to reference header file, expand it
>-                    HeaderList = gMacroPattern.findall(Inc)
>-                    if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
>-                        HeaderType = HeaderList[0][0]
>-                        HeaderKey = HeaderList[0][1]
>-                        if HeaderType in gIncludeMacroConversion:
>-                            Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
>-                        else:
>-                            # not known macro used in #include, always build the file by
>-                            # returning a empty dependency
>-                            self.FileCache[File] = []
>-                            return []
>-                    Inc = os.path.normpath(Inc)
>-                    CurrentFileDependencyList.append(Inc)
>-                DepDb[F] = CurrentFileDependencyList
>-
>-            CurrentFilePath = F.Dir
>-            PathList = [CurrentFilePath] + SearchPathList
>-            for Inc in CurrentFileDependencyList:
>-                for SearchPath in PathList:
>-                    FilePath = os.path.join(SearchPath, Inc)
>-                    if FilePath in gIsFileMap:
>-                        if not gIsFileMap[FilePath]:
>-                            continue
>-                    # If isfile is called too many times, the performance is slow down.
>-                    elif not os.path.isfile(FilePath):
>-                        gIsFileMap[FilePath] = False
>-                        continue
>-                    else:
>-                        gIsFileMap[FilePath] = True
>-                    FilePath = PathClass(FilePath)
>-                    FullPathDependList.append(FilePath)
>-                    if FilePath not in DependencySet:
>-                        FileStack.append(FilePath)
>-                    break
>-                else:
>-                    EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
>-                                    "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
>-
>-            self.FileCache[F] = FullPathDependList
>-            DependencySet.update(FullPathDependList)
>-
>-        DependencySet.update(ForceList)
>-        if File in DependencySet:
>-            DependencySet.remove(File)
>-        DependencyList = list(DependencySet)  # remove duplicate ones
>-
>-        return DependencyList
>
> ## CustomMakefile class
> #
>@@ -1618,7 +1518,7 @@ cleanlib:
>     def GetLibraryBuildDirectoryList(self):
>         DirList = []
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
>         return DirList
>
>@@ -1754,7 +1654,7 @@ class TopLevelMakefile(BuildFile):
>     def GetLibraryBuildDirectoryList(self):
>         DirList = []
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
>         return DirList
>
>@@ -1762,3 +1662,108 @@ class TopLevelMakefile(BuildFile):
> if __name__ == '__main__':
>     pass
>
>+## Find dependencies for one source file
>+#
>+#  By searching recursively "#include" directive in file, find out all the
>+#  files needed by given source file. The dependencies will be only searched
>+#  in given search path list.
>+#
>+#   @param      File            The source file
>+#   @param      ForceInculeList The list of files which will be included forcibly
>+#   @param      SearchPathList  The list of search path
>+#
>+#   @retval     list            The list of files the given source file depends on
>+#
>+def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList):
>+    EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
>+    FileStack = [File] + ForceList
>+    DependencySet = set()
>+
>+    if AutoGenObject.Arch not in gDependencyDatabase:
>+        gDependencyDatabase[AutoGenObject.Arch] = {}
>+    DepDb = gDependencyDatabase[AutoGenObject.Arch]
>+
>+    while len(FileStack) > 0:
>+        F = FileStack.pop()
>+
>+        FullPathDependList = []
>+        if F in FileCache:
>+            for CacheFile in FileCache[F]:
>+                FullPathDependList.append(CacheFile)
>+                if CacheFile not in DependencySet:
>+                    FileStack.append(CacheFile)
>+            DependencySet.update(FullPathDependList)
>+            continue
>+
>+        CurrentFileDependencyList = []
>+        if F in DepDb:
>+            CurrentFileDependencyList = DepDb[F]
>+        else:
>+            try:
>+                Fd = open(F.Path, 'rb')
>+                FileContent = Fd.read()
>+                Fd.close()
>+            except BaseException as X:
>+                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
>+            if len(FileContent) == 0:
>+                continue
>+            try:
>+                if FileContent[0] == 0xff or FileContent[0] == 0xfe:
>+                    FileContent = FileContent.decode('utf-16')
>+                else:
>+                    FileContent = FileContent.decode()
>+            except:
>+                # The file is not txt file. for example .mcb file
>+                continue
>+            IncludedFileList = gIncludePattern.findall(FileContent)
>+
>+            for Inc in IncludedFileList:
>+                Inc = Inc.strip()
>+                # if there's macro used to reference header file, expand it
>+                HeaderList = gMacroPattern.findall(Inc)
>+                if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
>+                    HeaderType = HeaderList[0][0]
>+                    HeaderKey = HeaderList[0][1]
>+                    if HeaderType in gIncludeMacroConversion:
>+                        Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
>+                    else:
>+                        # unknown macro used in #include, always build the file by
>+                        # returning an empty dependency
>+                        FileCache[File] = []
>+                        return []
>+                Inc = os.path.normpath(Inc)
>+                CurrentFileDependencyList.append(Inc)
>+            DepDb[F] = CurrentFileDependencyList
>+
>+        CurrentFilePath = F.Dir
>+        PathList = [CurrentFilePath] + SearchPathList
>+        for Inc in CurrentFileDependencyList:
>+            for SearchPath in PathList:
>+                FilePath = os.path.join(SearchPath, Inc)
>+                if FilePath in gIsFileMap:
>+                    if not gIsFileMap[FilePath]:
>+                        continue
>+                # If isfile is called too many times, the performance is slow down.
>+                elif not os.path.isfile(FilePath):
>+                    gIsFileMap[FilePath] = False
>+                    continue
>+                else:
>+                    gIsFileMap[FilePath] = True
>+                FilePath = PathClass(FilePath)
>+                FullPathDependList.append(FilePath)
>+                if FilePath not in DependencySet:
>+                    FileStack.append(FilePath)
>+                break
>+            else:
>+                EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
>+                                "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
>+
>+        FileCache[F] = FullPathDependList
>+        DependencySet.update(FullPathDependList)
>+
>+    DependencySet.update(ForceList)
>+    if File in DependencySet:
>+        DependencySet.remove(File)
>+    DependencyList = list(DependencySet)  # remove duplicate ones
>+
>+    return DependencyList
>\ No newline at end of file
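
As the doc comment above says, GetDependencyList() recursively follows
"#include" directives. A stripped-down sketch of that walk (standard
library only; the real function additionally handles UTF-16 sources,
the EFI include macros, the per-arch DepDb, and the shared FileCache):

    import os, re

    INCLUDE_RE = re.compile(r'^[ \t]*#[ \t]*include[ \t]+["<]([^">]+)[">]',
                            re.MULTILINE)

    def scan_includes(path, search_paths, seen=None):
        seen = set() if seen is None else seen
        try:
            text = open(path, 'r', errors='ignore').read()
        except OSError:
            return seen
        for inc in INCLUDE_RE.findall(text):
            # resolve against the including file's dir, then search paths
            for d in [os.path.dirname(path)] + search_paths:
                cand = os.path.normpath(os.path.join(d, inc))
                if os.path.isfile(cand):
                    if cand not in seen:
                        seen.add(cand)
                        scan_includes(cand, search_paths, seen)
                    break
        return seen
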
>diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>old mode 100644
>new mode 100755
>index 9ecf5c2dbe..613b0d2fb8
>--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>@@ -26,6 +26,8 @@ from Workspace.MetaFileCommentParser import UsageList
> from .GenPcdDb import CreatePcdDatabaseCode
> from Common.caching import cached_class_function
> from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo
>+from AutoGen.CacheIR import ModuleBuildCacheIR
>+import json
>
> ## Mapping Makefile type
> gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
>@@ -252,6 +254,8 @@ class ModuleAutoGen(AutoGen):
>         self.AutoGenDepSet = set()
>         self.ReferenceModules = []
>         self.ConstPcd                  = {}
>+        self.Makefile         = None
>+        self.FileDependCache  = {}
>
>     def __init_platform_info__(self):
>         pinfo = self.DataPipe.Get("P_Info")
>@@ -1608,12 +1612,37 @@ class ModuleAutoGen(AutoGen):
>
>         self.IsAsBuiltInfCreated = True
>
>+    def CacheCopyFile(self, OriginDir, CopyDir, File):
>+        sub_dir = os.path.relpath(File, CopyDir)
>+        destination_file = os.path.join(OriginDir, sub_dir)
>+        destination_dir = os.path.dirname(destination_file)
>+        CreateDirectory(destination_dir)
>+        try:
>+            CopyFileOnChange(File, destination_dir)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to copy file:%s to folder:%s" % (File, destination_dir))
>+            return
>+
>     def CopyModuleToCache(self):
>-        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        self.GenPreMakefileHash(GlobalData.gCacheIR)
>+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        self.GenMakeHash(GlobalData.gCacheIR)
>+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
>+        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName, MakeHashStr)
>+        FfsDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name, MakeHashStr)
>+
>         CreateDirectory (FileDir)
>-        HashFile = path.join(self.BuildDir, self.Name + '.hash')
>-        if os.path.exists(HashFile):
>-            CopyFileOnChange(HashFile, FileDir)
>+        self.SaveHashChainFileToCache(GlobalData.gCacheIR)
>         ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
>         if os.path.exists(ModuleFile):
>             CopyFileOnChange(ModuleFile, FileDir)
>@@ -1631,38 +1660,73 @@ class ModuleAutoGen(AutoGen):
>                 CreateDirectory(destination_dir)
>                 CopyFileOnChange(File, destination_dir)
>
>-    def AttemptModuleCacheCopy(self):
>-        # If library or Module is binary do not skip by hash
>-        if self.IsBinaryModule:
>+    def SaveHashChainFileToCache(self, gDict):
>+        if not GlobalData.gBinCacheDest:
>+            return False
>+
>+        self.GenPreMakefileHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        self.GenMakeHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>             return False
>-        # .inc is contains binary information so do not skip by hash as well
>-        for f_ext in self.SourceFileList:
>-            if '.inc' in str(f_ext):
>-                return False
>-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>-        HashFile = path.join(FileDir, self.Name + '.hash')
>-        if os.path.exists(HashFile):
>-            f = open(HashFile, 'r')
>-            CacheHash = f.read()
>-            f.close()
>-            self.GenModuleHash()
>-            if GlobalData.gModuleHash[self.Arch][self.Name]:
>-                if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
>-                    for root, dir, files in os.walk(FileDir):
>-                        for f in files:
>-                            if self.Name + '.hash' in f:
>-                                CopyFileOnChange(HashFile, self.BuildDir)
>-                            else:
>-                                File = path.join(root, f)
>-                                sub_dir = os.path.relpath(File, FileDir)
>-                                destination_file = os.path.join(self.OutputDir, sub_dir)
>-                                destination_dir = os.path.dirname(destination_file)
>-                                CreateDirectory(destination_dir)
>-                                CopyFileOnChange(File, destination_dir)
>-                    if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>-                        CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>-                    return True
>-        return False
>+
>+        # save the hash chain list as cache file
>+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
>+        CacheDestDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        CacheHashDestDir = path.join(CacheDestDir, MakeHashStr)
>+        ModuleHashPair = path.join(CacheDestDir, self.Name + ".ModuleHashPair")
>+        MakeHashChain = path.join(CacheHashDestDir, self.Name + ".MakeHashChain")
>+        ModuleFilesChain = path.join(CacheHashDestDir, self.Name + ".ModuleFilesChain")
>+
>+        # save the HashChainDict as json file
>+        CreateDirectory (CacheDestDir)
>+        CreateDirectory (CacheHashDestDir)
>+        try:
>+            ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+            if os.path.exists(ModuleHashPair):
>+                f = open(ModuleHashPair, 'r')
>+                ModuleHashPairList = json.load(f)
>+                f.close()
>+            PreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
>+            MakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
>+            ModuleHashPairList.append((PreMakeHash, MakeHash))
>+            ModuleHashPairList = list(set(map(tuple, ModuleHashPairList)))
>+            with open(ModuleHashPair, 'w') as f:
>+                json.dump(ModuleHashPairList, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save ModuleHashPair file in cache: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            with open(MakeHashChain, 'w') as f:
>+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save MakeHashChain file in cache: %s" % MakeHashChain)
>+            return False
>+
>+        try:
>+            with open(ModuleFilesChain, 'w') as f:
>+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save ModuleFilesChain file in cache: %s" % ModuleFilesChain)
>+            return False
>+
>+        # save the autogenfile and makefile for debug usage
>+        CacheDebugDir = path.join(CacheHashDestDir, "CacheDebug")
>+        CreateDirectory (CacheDebugDir)
>+        CopyFileOnChange(gDict[(self.MetaFile.Path, self.Arch)].MakefilePath, CacheDebugDir)
>+        if gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
>+            for File in gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
>+                CopyFileOnChange(str(File), CacheDebugDir)
>+
>+        return True
>
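
The net effect of SaveHashChainFileToCache() is a per-module cache
entry recording which (PreMakefileHash, MakeHash) pairs have been seen,
plus the hash chains for diagnosis. The .ModuleHashPair file is a JSON
list of pairs, e.g. (digests shortened here purely for illustration):

    [
      ["1f0c8e7d...", "9a42be03..."],
      ["1f0c8e7d...", "55d1c7aa..."]
    ]

so one quick PreMakefileHash can map to several precise MakeHash
variants, each stored in its own MakeHashStr-named subfolder together
with the .MakeHashChain, .ModuleFilesChain, and CacheDebug files
created above.
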
>     ## Create makefile for the module and its dependent libraries
>     #
>@@ -1671,6 +1735,11 @@ class ModuleAutoGen(AutoGen):
>     #
>     @cached_class_function
>     def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
>+        gDict = GlobalData.gCacheIR
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+          gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+            return
>+
>         # nest this function inside it's only caller.
>         def CreateTimeStamp():
>             FileSet = {self.MetaFile.Path}
>@@ -1701,8 +1770,8 @@ class ModuleAutoGen(AutoGen):
>             for LibraryAutoGen in self.LibraryAutoGenList:
>                 LibraryAutoGen.CreateMakeFile()
>
>-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
>-        if not GlobalData.gUseHashCache and self.CanSkip():
>+        # CanSkip uses timestamps to determine build skipping
>+        if self.CanSkip():
>             return
>
>         if len(self.CustomMakefile) == 0:
>@@ -1718,6 +1787,24 @@ class ModuleAutoGen(AutoGen):
>
>         CreateTimeStamp()
>
>+        MakefileType = Makefile._FileType
>+        MakefileName = Makefile._FILE_NAME_[MakefileType]
>+        MakefilePath = os.path.join(self.MakeFileDir, MakefileName)
>+
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.MakefilePath = MakefilePath
>+        MewIR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
>+        MewIR.CreateMakeFileDone = True
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.MakefilePath = MakefilePath
>+                IR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
>+                IR.CreateMakeFileDone = True
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>     def CopyBinaryFiles(self):
>         for File in self.Module.Binaries:
>             SrcPath = File.Path
>@@ -1729,6 +1816,11 @@ class ModuleAutoGen(AutoGen):
>     #                                       dependent libraries will be created
>     #
>     def CreateCodeFile(self, CreateLibraryCodeFile=True):
>+        gDict = GlobalData.gCacheIR
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+          gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
>+            return
>+
>         if self.IsCodeFileCreated:
>             return
>
>@@ -1744,8 +1836,9 @@ class ModuleAutoGen(AutoGen):
>         if not self.IsLibrary and CreateLibraryCodeFile:
>             for LibraryAutoGen in self.LibraryAutoGenList:
>                 LibraryAutoGen.CreateCodeFile()
>-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
>-        if not GlobalData.gUseHashCache and self.CanSkip():
>+
>+        # CanSkip uses timestamps to determine build skipping
>+        if self.CanSkip():
>             return
>
>         AutoGenList = []
>@@ -1785,6 +1878,16 @@ class ModuleAutoGen(AutoGen):
>                             (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
>
>         self.IsCodeFileCreated = True
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.CreateCodeFileDone = True
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.CreateCodeFileDone = True
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>         return AutoGenList
>
>     ## Summarize the ModuleAutoGen objects of all libraries used by this module
>@@ -1854,46 +1957,468 @@ class ModuleAutoGen(AutoGen):
>
>         return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
>
>+    def GenModuleFilesHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict:
>+            if gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain:
>+                return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        DependencyFileSet = set()
>+        # Add Module Meta file
>+        DependencyFileSet.add(self.MetaFile)
>+
>+        # Add Module's source files
>+        if self.SourceFileList:
>+            for File in set(self.SourceFileList):
>+                DependencyFileSet.add(File)
>+
>+        # Add module's include header files
>+        # Search dependency file list for each source file
>+        SourceFileList = []
>+        OutPutFileList = []
>+        for Target in self.IntroTargetList:
>+            SourceFileList.extend(Target.Inputs)
>+            OutPutFileList.extend(Target.Outputs)
>+        if OutPutFileList:
>+            for Item in OutPutFileList:
>+                if Item in SourceFileList:
>+                    SourceFileList.remove(Item)
>+        SearchList = []
>+        for file_path in self.IncludePathList + self.BuildOptionIncPathList:
>+            # skip the folders in platform BuildDir which have not been generated yet
>+            if file_path.startswith(os.path.abspath(self.PlatformInfo.BuildDir)+os.sep):
>+                continue
>+            SearchList.append(file_path)
>+        FileDependencyDict = {}
>+        ForceIncludedFile = []
>+        for F in SourceFileList:
>+            # skip the files which have not been generated yet, because
>+            # the SourceFileList usually contains intermediate build files, e.g. AutoGen.c
>+            if not os.path.exists(F.Path):
>+                continue
>+            FileDependencyDict[F] = GenMake.GetDependencyList(self, self.FileDependCache, F, ForceIncludedFile, SearchList)
>+
>+        if FileDependencyDict:
>+            for Dependency in FileDependencyDict.values():
>+                DependencyFileSet.update(set(Dependency))
>+
>+        # Calculate the hash of all the dependency files above
>+        # Initialize the hash object
>+        FileList = []
>+        m = hashlib.md5()
>+        for File in sorted(DependencyFileSet, key=lambda x: str(x)):
>+            if not os.path.exists(str(File)):
>+                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
>+                continue
>+            f = open(str(File), 'rb')
>+            Content = f.read()
>+            f.close()
>+            m.update(Content)
>+            FileList.append((str(File), hashlib.md5(Content).hexdigest()))
>+
>+
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.ModuleFilesHashDigest = m.digest()
>+        MewIR.ModuleFilesHashHexDigest = m.hexdigest()
>+        MewIR.ModuleFilesChain = FileList
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.ModuleFilesHashDigest = m.digest()
>+                IR.ModuleFilesHashHexDigest = m.hexdigest()
>+                IR.ModuleFilesChain = FileList
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenPreMakefileHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+          gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+            self.GenModuleFilesHash(gDict)
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+           EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHashDigest for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+           return
>+
>+        # Initialize the hash object
>+        m = hashlib.md5()
>+
>+        # Add Platform level hash
>+        if ('PlatformHash') in gDict:
>+            m.update(gDict[('PlatformHash')].encode('utf-8'))
>+        else:
>+            EdkLogger.quiet("[cache warning]: PlatformHash is missing")
>+
>+        # Add Package level hash
>+        if self.DependentPackageList:
>+            for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
>+                if (Pkg.PackageName, 'PackageHash') in gDict:
>+                    m.update(gDict[(Pkg.PackageName, 'PackageHash')].encode('utf-8'))
>+                else:
>+                    EdkLogger.quiet("[cache warning]: %s PackageHash needed by %s[%s] is missing" %(Pkg.PackageName, self.MetaFile.Name, self.Arch))
>+
>+        # Add Library hash
>+        if self.LibraryAutoGenList:
>+            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
>+                if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \
>+                   not gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest:
>+                    Lib.GenPreMakefileHash(gDict)
>+                m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest)
>+
>+        # Add Module self
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.PreMakefileHashHexDigest = m.hexdigest()
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenMakeHeaderFilesHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+          gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
>+            if self.IsLibrary:
>+                if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path) in GlobalData.libConstPcd:
>+                    self.ConstPcd = GlobalData.libConstPcd[(self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)]
>+                if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path) in GlobalData.Refes:
>+                    self.ReferenceModules = GlobalData.Refes[(self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)]
>+            self.CreateCodeFile()
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+            self.CreateMakeFile(GenFfsList=GlobalData.FfsCmd.get((self.MetaFile.File, self.Arch),[]))
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+           EdkLogger.quiet("[cache warning]: Cannot create CodeFile or Makefile for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+           return
>+
>+        DependencyFileSet = set()
>+        # Add Makefile
>+        if gDict[(self.MetaFile.Path, self.Arch)].MakefilePath:
>+            DependencyFileSet.add(gDict[(self.MetaFile.Path, self.Arch)].MakefilePath)
>+        else:
>+            EdkLogger.quiet("[cache warning]: makefile is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+
>+        # Add header files
>+        if gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet:
>+            for File in gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet:
>+                DependencyFileSet.add(File)
>+        else:
>+            EdkLogger.quiet("[cache warning]: No dependency header found for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+
>+        # Add AutoGen files
>+        if self.AutoGenFileList:
>+            for File in set(self.AutoGenFileList):
>+                DependencyFileSet.add(File)
>+
>+        # Calculate the hash of all the dependency files above
>+        # Initialize the hash object
>+        FileList = []
>+        m = hashlib.md5()
>+        for File in sorted(DependencyFileSet, key=lambda x: str(x)):
>+            if not os.path.exists(str(File)):
>+                EdkLogger.quiet("[cache warning]: header file: %s doesn't exist for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
>+                continue
>+            f = open(str(File), 'rb')
>+            Content = f.read()
>+            f.close()
>+            m.update(Content)
>+            FileList.append((str(File), hashlib.md5(Content).hexdigest()))
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.AutoGenFileList = self.AutoGenFileList.keys()
>+            IR.MakeHeaderFilesHashChain = FileList
>+            IR.MakeHeaderFilesHashDigest = m.digest()
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenMakeHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+          gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+            self.GenModuleFilesHash(gDict)
>+        if not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest:
>+            self.GenMakeHeaderFilesHash(gDict)
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashChain:
>+           EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHash or MakeHeaderFilesHash for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+           return
>+
>+        # Initialize the hash object
>+        m = hashlib.md5()
>+        MakeHashChain = []
>+
>+        # Add hash of makefile and dependency header files
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest)
>+        New = list(set(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashChain) - set(MakeHashChain))
>+        New.sort(key=lambda x: str(x))
>+        MakeHashChain += New
>+
>+        # Add Library hash
>+        if self.LibraryAutoGenList:
>+            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
>+                if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \
>+                   not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain:
>+                    Lib.GenMakeHash(gDict)
>+                if not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest:
>+                    print("Cannot generate MakeHash for lib module:", Lib.MetaFile.Path, Lib.Arch)
>+                    continue
>+                m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest)
>+                New = list(set(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain) - set(MakeHashChain))
>+                New.sort(key=lambda x: str(x))
>+                MakeHashChain += New
>+
>+        # Add Module self
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
>+        New = list(set(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain) - set(MakeHashChain))
>+        New.sort(key=lambda x: str(x))
>+        MakeHashChain += New
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.MakeHashDigest = m.digest()
>+            IR.MakeHashHexDigest = m.hexdigest()
>+            IR.MakeHashChain = MakeHashChain
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
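
Put together, GenMakeHash() folds three inputs into the final digest:
the makefile-plus-header digest, each dependent library's MakeHash
(computed recursively), and the module's own file digest. Roughly (an
illustrative sketch, with the three digest variables standing in for
the gDict record fields used above):

    import hashlib

    m = hashlib.md5()
    m.update(make_header_files_digest)       # makefile + header set
    for lib_digest in library_make_digests:  # sorted by library name
        m.update(lib_digest)
    m.update(module_files_digest)            # module sources + meta file
    make_hash = m.hexdigest()                # the MakeHashHexDigest

A change anywhere in that set - a library source, the generated
makefile, or a referenced header - therefore invalidates the module's
cache entry at this checkpoint.
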
>+    ## Decide whether we can skip the left autogen and make process
>+    def CanSkipbyPreMakefileCache(self, gDict):
>+        if not GlobalData.gBinCacheSource:
>+            return False
>+
>+        # If Module is binary, do not skip by cache
>+        if self.IsBinaryModule:
>+            return False
>+
>+        # .inc files contain binary information so do not skip by hash as well
>+        for f_ext in self.SourceFileList:
>+            if '.inc' in str(f_ext):
>+                return False
>+
>+        # Get the module hash values from the stored cache and the current build,
>+        # then check whether the cache hits based on the hash values;
>+        # if the cache hits, restore all the files from the cache
>+        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
>+
>+        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
>+        if not os.path.exists(ModuleHashPair):
>+            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            f = open(ModuleHashPair, 'r')
>+            ModuleHashPairList = json.load(f)
>+            f.close()
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        self.GenPreMakefileHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: PreMakefileHashHexDigest is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = None
>+        CurrentPreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
>+        for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairList):
>+            if PreMakefileHash == CurrentPreMakeHash:
>+                MakeHashStr = str(MakeHash)
>+
>+        if not MakeHashStr:
>+            return False
>+
>+        TargetHashDir = path.join(FileDir, MakeHashStr)
>+        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
>+
>+        if not os.path.exists(TargetHashDir):
>+            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
>+            return False
>+
>+        for root, dir, files in os.walk(TargetHashDir):
>+            for f in files:
>+                File = path.join(root, f)
>+                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
>+        if os.path.exists(TargetFfsHashDir):
>+            for root, dir, files in os.walk(TargetFfsHashDir):
>+                for f in files:
>+                    File = path.join(root, f)
>+                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
>+
>+        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>+            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.PreMakeCacheHit = True
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+        print("[cache hit]: checkpoint_PreMakefile:", self.MetaFile.Path, self.Arch)
>+        #EdkLogger.quiet("cache hit: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+        return True
>+
>+    ## Decide whether we can skip the make process
>+    def CanSkipbyMakeCache(self, gDict):
>+        if not GlobalData.gBinCacheSource:
>+            return False
>+
>+        # If Module is binary, do not skip by cache
>+        if self.IsBinaryModule:
>+            print("[cache miss]: checkpoint_Makefile: binary module:", self.MetaFile.Path, self.Arch)
>+            return False
>+
>+        # .inc files contain binary information so do not skip by hash as well
>+        for f_ext in self.SourceFileList:
>+            if '.inc' in str(f_ext):
>+                with GlobalData.file_lock:
>+                    IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                    IR.MakeCacheHit = False
>+                    gDict[(self.MetaFile.Path, self.Arch)] = IR
>+                print("[cache miss]: checkpoint_Makefile: .inc module:", self.MetaFile.Path, self.Arch)
>+                return False
>+
>+        # Get the module hash values from the stored cache and the current build,
>+        # then check whether the cache hits based on the hash values;
>+        # if the cache hits, restore all the files from the cache
>+        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
>+
>+        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
>+        if not os.path.exists(ModuleHashPair):
>+            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            f = open(ModuleHashPair, 'r')
>+            ModuleHashPairList = json.load(f)
>+            f.close()
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        self.GenMakeHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: MakeHashHexDigest is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = None
>+        CurrentMakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
>+        for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairList):
>+            if MakeHash == CurrentMakeHash:
>+                MakeHashStr = str(MakeHash)
>+
>+        if not MakeHashStr:
>+            print("[cache miss]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
>+            return False
>+
>+        TargetHashDir = path.join(FileDir, MakeHashStr)
>+        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
>+        if not os.path.exists(TargetHashDir):
>+            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
>+            return False
>+
>+        for root, dir, files in os.walk(TargetHashDir):
>+            for f in files:
>+                File = path.join(root, f)
>+                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
>+
>+        if os.path.exists(TargetFfsHashDir):
>+            for root, dir, files in os.walk(TargetFfsHashDir):
>+                for f in files:
>+                    File = path.join(root, f)
>+                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
>+
>+        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>+            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.MakeCacheHit = True
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+        print("[cache hit]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
>+        return True
>+
>     ## Decide whether we can skip the ModuleAutoGen process
>-    def CanSkipbyHash(self):
>+    def CanSkipbyCache(self, gDict):
>         # Hashing feature is off
>-        if not GlobalData.gUseHashCache:
>+        if not GlobalData.gBinCacheSource:
>             return False
>
>-        # Initialize a dictionary for each arch type
>-        if self.Arch not in GlobalData.gBuildHashSkipTracking:
>-            GlobalData.gBuildHashSkipTracking[self.Arch] = dict()
>+        if self in GlobalData.gBuildHashSkipTracking:
>+            return GlobalData.gBuildHashSkipTracking[self]
>
>         # If library or Module is binary do not skip by hash
>         if self.IsBinaryModule:
>+            GlobalData.gBuildHashSkipTracking[self] = False
>             return False
>
>         # .inc is contains binary information so do not skip by hash as well
>         for f_ext in self.SourceFileList:
>             if '.inc' in str(f_ext):
>+                GlobalData.gBuildHashSkipTracking[self] = False
>                 return False
>
>-        # Use Cache, if exists and if Module has a copy in cache
>-        if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy():
>+        if not (self.MetaFile.Path, self.Arch) in gDict:
>+            return False
>+
>+        if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit:
>+            GlobalData.gBuildHashSkipTracking[self] = True
>             return True
>
>-        # Early exit for libraries that haven't yet finished building
>-        HashFile = path.join(self.BuildDir, self.Name + ".hash")
>-        if self.IsLibrary and not os.path.exists(HashFile):
>-            return False
>+        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
>+            GlobalData.gBuildHashSkipTracking[self] = True
>+            return True
>
>-        # Return a Boolean based on if can skip by hash, either from memory or from IO.
>-        if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]:
>-            # If hashes are the same, SaveFileOnChange() will return False.
>-            GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = not SaveFileOnChange(HashFile, self.GenModuleHash(), True)
>-            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
>-        else:
>-            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
>+        return False
>
>     ## Decide whether we can skip the ModuleAutoGen process
>     #  If any source file is newer than the module, then we cannot skip
>     #
>     def CanSkip(self):
>+        # Don't skip if any cache feature is enabled
>+        if GlobalData.gUseHashCache or GlobalData.gBinCacheDest or GlobalData.gBinCacheSource:
>+            return False
>         if self.MakeFileDir in GlobalData.gSikpAutoGenCache:
>             return True
>         if not os.path.exists(self.TimeStampPath):
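
A note on the reworked skip logic above: a cache miss is deliberately not
memoized in gBuildHashSkipTracking, since a module that misses the quick
pre-makefile checkpoint can still hit the later makefile checkpoint. A
minimal standalone sketch of that decision order, using a namedtuple as a
stand-in for the CacheIR record (these names are illustrative, not the
BaseTools types):

    from collections import namedtuple

    CacheEntry = namedtuple('CacheEntry', ['PreMakeCacheHit', 'MakeCacheHit'])

    skip_memo = {}  # plays the role of gBuildHashSkipTracking

    def can_skip_by_cache(module_key, is_binary, has_inc_source, cache_ir):
        if module_key in skip_memo:
            return skip_memo[module_key]
        # Binary modules and modules with .inc sources are never hash-skipped.
        if is_binary or has_inc_source:
            skip_memo[module_key] = False
            return False
        entry = cache_ir.get(module_key)
        if entry and (entry.PreMakeCacheHit or entry.MakeCacheHit):
            skip_memo[module_key] = True
            return True
        return False  # a miss is not memoized; a later checkpoint may still hit

    cache_ir = {('Pkg/Mod.inf', 'X64'): CacheEntry(True, False)}
    print(can_skip_by_cache(('Pkg/Mod.inf', 'X64'), False, False, cache_ir))  # True
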
>diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Source/Python/Common/GlobalData.py
>old mode 100644
>new mode 100755
>index bd45a43728..452dca32f0
>--- a/BaseTools/Source/Python/Common/GlobalData.py
>+++ b/BaseTools/Source/Python/Common/GlobalData.py
>@@ -119,3 +119,12 @@ gModuleBuildTracking = dict()
> # Top Dict:     Key: Arch Type              Value: Dictionary
> # Second Dict:  Key: Module\Library Name    Value: True\False
> gBuildHashSkipTracking = dict()
>+
>+# Common dictionary to share module cache intermediate result and state
>+gCacheIR = None
>+# Common lock for the file access in multiple process AutoGens
>+file_lock = None
>+# Common dictionary to share platform libraries' constant Pcd
>+libConstPcd = None
>+# Common dictionary to share platform libraries' reference info
>+Refes = None
>\ No newline at end of file
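
Worth calling out for future readers: gCacheIR is backed by a Manager().dict(),
and a Manager dict proxy does not observe in-place mutation of an object stored
under a key; only reassignment of the key propagates to other processes. That
is why the patch consistently reads an entry out, mutates it, and writes it
back under file_lock. A small self-contained sketch of the pattern (FakeIR is
a placeholder for the CacheIR record):

    import multiprocessing as mp

    class FakeIR:
        def __init__(self):
            self.MakeCacheHit = False

    def mark_hit(shared, lock, key):
        with lock:
            ir = shared[key]          # proxy returns a copy of the stored object
            ir.MakeCacheHit = True    # mutate the local copy
            shared[key] = ir          # reassign so the change propagates

    if __name__ == '__main__':
        mgr = mp.Manager()
        shared = mgr.dict()
        lock = mp.Lock()
        shared[('Pkg/Mod.inf', 'X64')] = FakeIR()
        p = mp.Process(target=mark_hit, args=(shared, lock, ('Pkg/Mod.inf', 'X64')))
        p.start(); p.join()
        print(shared[('Pkg/Mod.inf', 'X64')].MakeCacheHit)  # True
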
>diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
>old mode 100644
>new mode 100755
>index 4bfa54666b..d7c817b95c
>--- a/BaseTools/Source/Python/build/build.py
>+++ b/BaseTools/Source/Python/build/build.py
>@@ -595,7 +595,7 @@ class BuildTask:
>     #
>     def AddDependency(self, Dependency):
>         for Dep in Dependency:
>-            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyHash():
>+            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyCache(GlobalData.gCacheIR):
>                 self.DependencyList.append(BuildTask.New(Dep))    # BuildTask list
>
>     ## The thread wrapper of LaunchCommand function
>@@ -811,7 +811,7 @@ class Build():
>         self.AutoGenMgr = None
>         EdkLogger.info("")
>         os.chdir(self.WorkspaceDir)
>-        self.share_data = Manager().dict()
>+        GlobalData.gCacheIR = Manager().dict()
>         self.log_q = log_q
>     def StartAutoGen(self,mqueue,DataPipe,SkipAutoGen,PcdMaList,share_data):
>         try:
>@@ -820,6 +820,13 @@ class Build():
>             feedback_q = mp.Queue()
>             file_lock = mp.Lock()
>             error_event = mp.Event()
>+            GlobalData.file_lock = file_lock
>+            FfsCmd = DataPipe.Get("FfsCommand")
>+            if FfsCmd is None:
>+                FfsCmd = {}
>+            GlobalData.FfsCmd = FfsCmd
>+            GlobalData.libConstPcd = DataPipe.Get("LibConstPcd")
>+            GlobalData.Refes = DataPipe.Get("REFS")
>             auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data,self.log_q,error_event) for _ in range(self.ThreadNumber)]
>             self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q,error_event)
>             self.AutoGenMgr.start()
>@@ -827,14 +834,28 @@ class Build():
>                 w.start()
>             if PcdMaList is not None:
>                 for PcdMa in PcdMaList:
>+                    if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                        PcdMa.GenModuleFilesHash(share_data)
>+                        PcdMa.GenPreMakefileHash(share_data)
>+                        if PcdMa.CanSkipbyPreMakefileCache(share_data):
>+                            continue
>+
>                     PcdMa.CreateCodeFile(False)
>                     PcdMa.CreateMakeFile(False,GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[]))
>
>+                    if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                        PcdMa.GenMakeHeaderFilesHash(share_data)
>+                        PcdMa.GenMakeHash(share_data)
>+                        if PcdMa.CanSkipbyMakeCache(share_data):
>+                            continue
>+
>             self.AutoGenMgr.join()
>             rt = self.AutoGenMgr.Status
>             return rt, 0
>-        except Exception as e:
>-            return False,e.errcode
>+        except FatalError as e:
>+            return False, e.args[0]
>+        except:
>+            return False, UNKNOWN_ERROR
>
>     ## Load configuration
>     #
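
To summarize the control flow this hunk adds for the PCD driver modules (and
which the per-module loop below repeats): the quick pre-makefile hash is tried
first, and only on a miss do we pay for CreateCodeFile/CreateMakeFile and the
precise makefile-based hash. A flattened sketch, with a stub in place of
ModuleAutoGen so the ordering is runnable on its own:

    class StubMa:
        # Stand-in for ModuleAutoGen; only call order and exit points matter here.
        def __init__(self, premake_hit, make_hit):
            self.premake_hit, self.make_hit = premake_hit, make_hit
        def GenModuleFilesHash(self, ir): pass
        def GenPreMakefileHash(self, ir): pass
        def CanSkipbyPreMakefileCache(self, ir): return self.premake_hit
        def CreateCodeFile(self, flag): pass
        def CreateMakeFile(self, flag): pass
        def GenMakeHeaderFilesHash(self, ir): pass
        def GenMakeHash(self, ir): pass
        def CanSkipbyMakeCache(self, ir): return self.make_hit

    def autogen_one_module(ma, cache_ir, target, cache_source):
        use_cache = cache_source and target in (None, '', 'all')
        if use_cache:
            ma.GenModuleFilesHash(cache_ir)      # hashes of the module's own files
            ma.GenPreMakefileHash(cache_ir)      # quick hash from platform/package meta
            if ma.CanSkipbyPreMakefileCache(cache_ir):
                return 'premake-hit'             # checkpoint 1: skip codegen entirely
        ma.CreateCodeFile(False)
        ma.CreateMakeFile(False)
        if use_cache:
            ma.GenMakeHeaderFilesHash(cache_ir)  # deps discovered from the makefile
            ma.GenMakeHash(cache_ir)             # precise makefile-based hash
            if ma.CanSkipbyMakeCache(cache_ir):
                return 'make-hit'                # checkpoint 2: skip the compile
        return 'build'

    print(autogen_one_module(StubMa(False, True), {}, 'all', True))  # make-hit
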
>@@ -1199,10 +1220,11 @@ class Build():
>                 mqueue.put(m)
>
>             AutoGenObject.DataPipe.DataContainer = {"FfsCommand":FfsCommand}
>+            AutoGenObject.DataPipe.DataContainer = {"CommandTarget": self.Target}
>             self.Progress.Start("Generating makefile and code")
>             data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
>             AutoGenObject.DataPipe.dump(data_pipe_file)
>-            autogen_rt, errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
>+            autogen_rt, errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR)
>             self.Progress.Stop("done!")
>             if not autogen_rt:
>                 self.AutoGenMgr.TerminateWorkers()
>@@ -1799,6 +1821,15 @@ class Build():
>                 CmdListDict = None
>                 if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
>                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
>+
>+                # Add Platform and Package level hash in share_data for module hash calculation later
>+                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
>+                    GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
>+                    for PkgName in GlobalData.gPackageHash.keys():
>+                        GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
>+                GlobalData.file_lock = mp.Lock()
>+                GlobalData.FfsCmd = CmdListDict
>+
>                 self.Progress.Stop("done!")
>                 MaList = []
>                 ExitFlag = threading.Event()
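
The seeding above mixes two key shapes in the same dict: a bare 'PlatformHash'
string (the parentheses around it do not make a tuple) and (PkgName,
'PackageHash') tuples. A hedged sketch of how a consumer might fold those
seeds plus a module's own file hash into a pre-makefile digest; md5 mirrors
what BaseTools uses for hashing elsewhere, but the authoritative recipe is the
patch's GenPreMakefileHash:

    import hashlib

    cache_ir = {}
    cache_ir['PlatformHash'] = 'abc123'              # ('PlatformHash') is just a str key
    cache_ir[('MdePkg', 'PackageHash')] = 'def456'
    cache_ir[('MdeModulePkg', 'PackageHash')] = '789aaa'

    def premakefile_digest(cache_ir, depex_pkgs, module_files_hash):
        m = hashlib.md5()
        m.update(cache_ir['PlatformHash'].encode('utf-8'))
        for pkg in sorted(depex_pkgs):               # stable order keeps the digest stable
            m.update(cache_ir[(pkg, 'PackageHash')].encode('utf-8'))
        m.update(module_files_hash.encode('utf-8'))
        return m.hexdigest()

    print(premakefile_digest(cache_ir, ['MdePkg', 'MdeModulePkg'], '0f0f0f'))
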
>@@ -1808,20 +1839,23 @@ class Build():
>                     AutoGenStart = time.time()
>                     GlobalData.gGlobalDefines['ARCH'] = Arch
>                     Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
>+                    GlobalData.libConstPcd = Pa.DataPipe.Get("LibConstPcd")
>+                    GlobalData.Refes = Pa.DataPipe.Get("REFS")
>                     for Module in Pa.Platform.Modules:
>                         if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
>                             Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>                             if Ma is None:
>                                 continue
>                             MaList.append(Ma)
>-                            if Ma.CanSkipbyHash():
>-                                self.HashSkipModules.append(Ma)
>-                                if GlobalData.gBinCacheSource:
>-                                    EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path,
>Ma.Arch))
>-                                continue
>-                            else:
>-                                if GlobalData.gBinCacheSource:
>-                                    EdkLogger.quiet("cache miss: %s[%s]" %
>(Ma.MetaFile.Path, Ma.Arch))
>+
>+                            if GlobalData.gBinCacheSource and self.Target in [None, "",
>"all"]:
>+                                Ma.GenModuleFilesHash(GlobalData.gCacheIR)
>+                                Ma.GenPreMakefileHash(GlobalData.gCacheIR)
>+                                if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
>+                                   self.HashSkipModules.append(Ma)
>+                                   EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path,
>Ma.Arch))
>+                                   continue
>+
>                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
>                             if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
>                                 # for target which must generate AutoGen code and makefile
>@@ -1841,6 +1875,18 @@ class Build():
>                                     self.Progress.Stop("done!")
>                                 if self.Target == "genmake":
>                                     return True
>+
>+                                if GlobalData.gBinCacheSource and self.Target in [None, "",
>"all"]:
>+                                    Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
>+                                    Ma.GenMakeHash(GlobalData.gCacheIR)
>+                                    if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
>+                                        self.HashSkipModules.append(Ma)
>+                                        EdkLogger.quiet("cache hit: %s[%s]" %
>(Ma.MetaFile.Path, Ma.Arch))
>+                                        continue
>+                                    else:
>+                                        EdkLogger.quiet("cache miss: %s[%s]" %
>(Ma.MetaFile.Path, Ma.Arch))
>+                                        Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR)
>+
>                             self.BuildModules.append(Ma)
>                             # Initialize all modules in tracking to 'FAIL'
>                             if Ma.Arch not in GlobalData.gModuleBuildTracking:
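
On a makefile-checkpoint miss, the hunk above also prints the first dependency
whose hash diverged (see patch 2/5 in this series), which makes misses
debuggable instead of opaque. A small sketch of what first-divergence
detection over ordered (file, hash) pairs can look like; the names here are
hypothetical, and the real logic lives in PrintFirstMakeCacheMissFile:

    def first_cache_miss(cached_pairs, current_pairs):
        current = dict(current_pairs)
        for path, old_hash in cached_pairs:
            if current.get(path) != old_hash:
                return path          # first file whose content (or presence) changed
        return None                  # lists agree: nothing to explain

    cached  = [('Mod.c', 'a1'), ('Mod.h', 'b2'), ('Lib.h', 'c3')]
    current = [('Mod.c', 'a1'), ('Mod.h', 'XX'), ('Lib.h', 'c3')]
    print(first_cache_miss(cached, current))  # Mod.h
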
>@@ -1985,11 +2031,18 @@ class Build():
>                 if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
>                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
>
>+                # Add Platform and Package level hash in share_data for module hash calculation later
>+                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
>+                    GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
>+                    for PkgName in GlobalData.gPackageHash.keys():
>+                        GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
>+
>                 # multi-thread exit flag
>                 ExitFlag = threading.Event()
>                 ExitFlag.clear()
>                 self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
>                 self.BuildModules = []
>+                TotalModules = []
>                 for Arch in Wa.ArchList:
>                     PcdMaList    = []
>                     AutoGenStart = time.time()
>@@ -2009,6 +2062,7 @@ class Build():
>                             ModuleList.append(Inf)
>                     Pa.DataPipe.DataContainer = {"FfsCommand":CmdListDict}
>                     Pa.DataPipe.DataContainer = {"Workspace_timestamp":
>Wa._SrcTimeStamp}
>+                    Pa.DataPipe.DataContainer = {"CommandTarget": self.Target}
>                     for Module in ModuleList:
>                         # Get ModuleAutoGen object to generate C code file and makefile
>                         Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>@@ -2019,30 +2073,34 @@ class Build():
>                             Ma.PlatformInfo = Pa
>                             Ma.Workspace = Wa
>                             PcdMaList.append(Ma)
>-                        if Ma.CanSkipbyHash():
>-                            self.HashSkipModules.append(Ma)
>-                            if GlobalData.gBinCacheSource:
>-                                EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path,
>Ma.Arch))
>-                            continue
>-                        else:
>-                            if GlobalData.gBinCacheSource:
>-                                EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path,
>Ma.Arch))
>-
>-                        # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run',
>'fds'
>-                            # for target which must generate AutoGen code and makefile
>-
>-                        self.BuildModules.append(Ma)
>+                        TotalModules.append(Ma)
>                         # Initialize all modules in tracking to 'FAIL'
>                         if Ma.Arch not in GlobalData.gModuleBuildTracking:
>                             GlobalData.gModuleBuildTracking[Ma.Arch] = dict()
>                         if Ma not in GlobalData.gModuleBuildTracking[Ma.Arch]:
>                             GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
>+
>                     mqueue = mp.Queue()
>                     for m in Pa.GetAllModuleInfo:
>                         mqueue.put(m)
>                     data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" %
>(str(Pa.Guid),Pa.Arch))
>                     Pa.DataPipe.dump(data_pipe_file)
>-                    autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
>+                    autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR)
>+
>+                    # Skip cache hit modules
>+                    if GlobalData.gBinCacheSource:
>+                        for Ma in TotalModules:
>+                            if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
>+                                GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].PreMakeCacheHit:
>+                                    self.HashSkipModules.append(Ma)
>+                                    continue
>+                            if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
>+                                GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].MakeCacheHit:
>+                                    self.HashSkipModules.append(Ma)
>+                                    continue
>+                            self.BuildModules.append(Ma)
>+                    else:
>+                        self.BuildModules.extend(TotalModules)
>
>                     if not autogen_rt:
>                         self.AutoGenMgr.TerminateWorkers()
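
After the parallel AutoGen pass returns, the main process partitions
TotalModules using the hit flags the workers recorded in the shared dict, as
the hunk above shows. The same partition in isolation, with a namedtuple
standing in for the CacheIR record:

    from collections import namedtuple

    IR = namedtuple('IR', ['PreMakeCacheHit', 'MakeCacheHit'])

    def partition(total, cache_ir):
        skipped, to_build = [], []
        for key in total:
            ir = cache_ir.get(key)
            if ir and (ir.PreMakeCacheHit or ir.MakeCacheHit):
                skipped.append(key)      # either checkpoint hit: no build needed
            else:
                to_build.append(key)     # miss (or binary/.inc module): build it
        return skipped, to_build

    cache_ir = {('A.inf', 'X64'): IR(True, False), ('B.inf', 'X64'): IR(False, False)}
    print(partition([('A.inf', 'X64'), ('B.inf', 'X64')], cache_ir))
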
>@@ -2050,9 +2108,24 @@ class Build():
>                         raise FatalError(errorcode)
>                 self.AutoGenTime += int(round((time.time() - AutoGenStart)))
>                 self.Progress.Stop("done!")
>+
>+                if GlobalData.gBinCacheSource:
>+                    EdkLogger.quiet("Total cache hit driver num: %s, cache miss driver
>num: %s" % (len(set(self.HashSkipModules)), len(set(self.BuildModules))))
>+                    CacheHitMa = set()
>+                    CacheNotHitMa = set()
>+                    for IR in GlobalData.gCacheIR.keys():
>+                        if 'PlatformHash' in IR or 'PackageHash' in IR:
>+                            continue
>+                        if GlobalData.gCacheIR[IR].PreMakeCacheHit or GlobalData.gCacheIR[IR].MakeCacheHit:
>+                            CacheHitMa.add(IR)
>+                        else:
>+                            # There might be binary modules or modules with .inc files here; do not count them as cache misses
>+                            CacheNotHitMa.add(IR)
>+                    EdkLogger.quiet("Total module num: %s, cache hit module num:
>%s" % (len(CacheHitMa)+len(CacheNotHitMa), len(CacheHitMa)))
>+
>                 for Arch in Wa.ArchList:
>                     MakeStart = time.time()
>-                    for Ma in self.BuildModules:
>+                    for Ma in set(self.BuildModules):
>                         # Generate build task for the module
>                         if not Ma.IsBinaryModule:
>                             Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
>--
>2.17.1


Thread overview: 7+ messages
2019-08-14 18:11 [PATCH v4 0/5] Build cache enhancement Steven Shi
2019-08-14 18:11 ` [PATCH v4 1/5] BaseTools: Improve the cache hit in the edk2 build cache Steven Shi
2019-08-14 18:33   ` Christian Rodriguez [this message]
2019-08-14 18:11 ` [PATCH v4 2/5] BaseTools: Print first cache missing file for build cache Steven Shi
2019-08-14 18:11 ` [PATCH v4 3/5] BaseTools: Change the [Arch][Name] module key in Build cache Steven Shi
2019-08-14 18:11 ` [PATCH v4 4/5] BaseTools: Add GenFds multi-thread support in build cache Steven Shi
2019-08-14 18:11 ` [PATCH v4 5/5] BaseTools: Improve the file saving and copying reliability Steven Shi
