From: "Christian Rodriguez" <christian.rodriguez@intel.com>
To: "Shi, Steven", devel@edk2.groups.io
Cc: "Gao, Liming", "Feng, Bob C", "Johnson, Michael"
Subject: Re: [PATCH v4 1/5] BaseTools: Improve the cache hit in the edk2 build cache
Date: Wed, 14 Aug 2019 18:33:47 +0000
Message-ID: <3A7DCC9A944C6149BF832E1C9B718ABC01F9FDCC@ORSMSX114.amr.corp.intel.com>
References: <20190814181130.8020-1-steven.shi@intel.com> <20190814181130.8020-2-steven.shi@intel.com>
In-Reply-To: <20190814181130.8020-2-steven.shi@intel.com>
For all 5 patches in the patch set:

Acked-by: Christian Rodriguez

>-----Original Message-----
>From: Shi, Steven
>Sent: Wednesday, August 14, 2019 11:11 AM
>To: devel@edk2.groups.io
>Cc: Gao, Liming; Feng, Bob C; Rodriguez, Christian; Johnson, Michael; Shi, Steven
>Subject: [PATCH v4 1/5] BaseTools: Improve the cache hit in the edk2 build cache
>
>From: "Shi, Steven"
>
>BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1927
>
>The current cache hash algorithm does not parse and generate
>the makefile to get the accurate dependency files for a
>module. Instead, it uses the platform and package meta files
>to get the module dependency in a quick but over-approximate
>way. These meta files are monolithic and pull in many redundant
>dependencies for the module, which makes the module build
>cache miss easily.
>This patch introduces one more cache checkpoint and a new
>hash algorithm besides the current quick one. The new hash
>algorithm leverages the module makefile to achieve more
>accurate and precise dependency info for a module. When
>the build cache misses with the first quick hash, the
>BaseTool will calculate the new hash after the makefile is
>generated and then check again.
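[Reviewer's note: the two-checkpoint probe the commit message describes can be sketched as below. This is a minimal illustration, not BaseTools code; `lookup_with_two_checkpoints`, the `meta_files`/`make_deps` inputs, and the flat `cache` dict are all assumed names for the sake of the example.]

```python
import hashlib

def lookup_with_two_checkpoints(meta_files, make_deps, cache):
    """Sketch of the two-checkpoint cache probe.

    Checkpoint 1: a quick hash over platform/package meta files
    (cheap, but over-approximate, so it misses easily).
    Checkpoint 2: a hash over the exact dependency files taken from
    the generated makefile (computed only after the first miss).
    """
    pre_make_hash = hashlib.md5("".join(sorted(meta_files)).encode()).hexdigest()
    if pre_make_hash in cache:
        return cache[pre_make_hash]   # first (quick) checkpoint hit
    make_hash = hashlib.md5("".join(sorted(make_deps)).encode()).hexdigest()
    if make_hash in cache:
        return cache[make_hash]       # second (accurate) checkpoint hit
    return None                       # both checkpoints missed: full build
```

A module rebuilt after an unrelated meta-file change would miss the first checkpoint but can still hit the second, which is the cache-hit improvement this series is after.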
>
>Cc: Liming Gao
>Cc: Bob Feng
>Signed-off-by: Steven Shi
>---
> BaseTools/Source/Python/AutoGen/AutoGenWorker.py |  21 ++
> BaseTools/Source/Python/AutoGen/CacheIR.py       |  28 ++
> BaseTools/Source/Python/AutoGen/DataPipe.py      |   8 +
> BaseTools/Source/Python/AutoGen/GenMake.py       | 223 ++++++------
> BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 639 ++++++++++++-----
> BaseTools/Source/Python/Common/GlobalData.py     |   9 +
> BaseTools/Source/Python/build/build.py           | 129 ++++---
> 7 files changed, 863 insertions(+), 194 deletions(-)
>
>diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>old mode 100644
>new mode 100755
>index e583828741..a84ed46f2e
>--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
>@@ -182,6 +182,12 @@ class AutoGenWorkerInProcess(mp.Process):
>             GlobalData.gDisableIncludePathCheck = False
>             GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
>             GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
>+            GlobalData.gBinCacheSource = self.data_pipe.Get("BinCacheSource")
>+            GlobalData.gBinCacheDest = self.data_pipe.Get("BinCacheDest")
>+            GlobalData.gCacheIR = self.data_pipe.Get("CacheIR")
>+            GlobalData.gEnableGenfdsMultiThread = self.data_pipe.Get("EnableGenfdsMultiThread")
>+            GlobalData.file_lock = self.file_lock
>+            CommandTarget = self.data_pipe.Get("CommandTarget")
>             pcd_from_build_option = []
>             for pcd_tuple in self.data_pipe.Get("BuildOptPcd"):
>                 pcd_id = ".".join((pcd_tuple[0],pcd_tuple[1]))
>@@ -193,10 +199,13 @@ class AutoGenWorkerInProcess(mp.Process):
>             FfsCmd = self.data_pipe.Get("FfsCommand")
>             if FfsCmd is None:
>                 FfsCmd = {}
>+            GlobalData.FfsCmd = FfsCmd
>             PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
>                                                         self.data_pipe.Get("P_Info").get("WorkspaceDir"))
>             libConstPcd = self.data_pipe.Get("LibConstPcd")
>             Refes = self.data_pipe.Get("REFS")
>+            GlobalData.libConstPcd = libConstPcd
>+            GlobalData.Refes = Refes
>             while True:
>                 if self.module_queue.empty():
>                     break
>@@ -223,8 +232,20 @@ class AutoGenWorkerInProcess(mp.Process):
>                     Ma.ConstPcd = libConstPcd[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
>                     if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path) in Refes:
>                         Ma.ReferenceModules = Refes[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
>+                if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
>+                    Ma.GenModuleFilesHash(GlobalData.gCacheIR)
>+                    Ma.GenPreMakefileHash(GlobalData.gCacheIR)
>+                    if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
>+                        continue
>+
>                 Ma.CreateCodeFile(False)
>                 Ma.CreateMakeFile(False,GenFfsList=FfsCmd.get((Ma.MetaFile.File, Ma.Arch),[]))
>+
>+                if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
>+                    Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
>+                    Ma.GenMakeHash(GlobalData.gCacheIR)
>+                    if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
>+                        continue
>             except Empty:
>                 pass
>             except:
>diff --git a/BaseTools/Source/Python/AutoGen/CacheIR.py b/BaseTools/Source/Python/AutoGen/CacheIR.py
>new file mode 100755
>index 0000000000..2d9ffe3f0b
>--- /dev/null
>+++ b/BaseTools/Source/Python/AutoGen/CacheIR.py
>@@ -0,0 +1,28 @@
>+## @file
>+# Build cache intermediate result and state
>+#
>+# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
>+# SPDX-License-Identifier: BSD-2-Clause-Patent
>+#
>+
>+class ModuleBuildCacheIR():
>+    def __init__(self, Path, Arch):
>+        self.ModulePath = Path
>+        self.ModuleArch = Arch
>+        self.ModuleFilesHashDigest = None
>+        self.ModuleFilesHashHexDigest = None
>+        self.ModuleFilesChain = []
>+        self.PreMakefileHashHexDigest = None
>+        self.CreateCodeFileDone = False
>+        self.CreateMakeFileDone = False
>+        self.MakefilePath = None
>+        self.AutoGenFileList = None
>+        self.DependencyHeaderFileSet = None
>+        self.MakeHeaderFilesHashChain = None
>+        self.MakeHeaderFilesHashDigest = None
>+        self.MakeHeaderFilesHashChain = []
>+        self.MakeHashDigest = None
>+        self.MakeHashHexDigest = None
>+        self.MakeHashChain = []
>+        self.PreMakeCacheHit = False
>+        self.MakeCacheHit = False
>diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
>old mode 100644
>new mode 100755
>index 2052084bdb..84e77c301a
>--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
>+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
>@@ -158,3 +158,11 @@ class MemoryDataPipe(DataPipe):
>         self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
>
>         self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
>+
>+        self.DataContainer = {"BinCacheSource":GlobalData.gBinCacheSource}
>+
>+        self.DataContainer = {"BinCacheDest":GlobalData.gBinCacheDest}
>+
>+        self.DataContainer = {"CacheIR":GlobalData.gCacheIR}
>+
>+        self.DataContainer = {"EnableGenfdsMultiThread":GlobalData.gEnableGenfdsMultiThread}
>\ No newline at end of file
>diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
>old mode 100644
>new mode 100755
>index 499ef82aea..ce047e7f64
>--- a/BaseTools/Source/Python/AutoGen/GenMake.py
>+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
>@@ -906,6 +906,11 @@ cleanlib:
>                             self._AutoGenObject.IncludePathList + self._AutoGenObject.BuildOptionIncPathList
>                             )
>
>+        self.DependencyHeaderFileSet = set()
>+        if FileDependencyDict:
>+            for Dependency in FileDependencyDict.values():
>+                self.DependencyHeaderFileSet.update(set(Dependency))
>+
>         # Get a set of unique package includes from MetaFile
>         parentMetaFileIncludes = set()
>         for aInclude in self._AutoGenObject.PackageIncludePathList:
>@@ -1115,7 +1120,7 @@ cleanlib:
>     ## For creating makefile targets for dependent libraries
>     def ProcessDependentLibrary(self):
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 self.LibraryBuildDirectoryList.append(self.PlaceMacro(LibraryAutoGen.BuildDir, self.Macros))
>
>     ## Return a list containing source file's dependencies
>@@ -1129,114 +1134,9 @@ cleanlib:
>     def GetFileDependency(self, FileList, ForceInculeList, SearchPathList):
>         Dependency = {}
>         for F in FileList:
>-            Dependency[F] = self.GetDependencyList(F, ForceInculeList, SearchPathList)
>+            Dependency[F] = GetDependencyList(self._AutoGenObject, self.FileCache, F, ForceInculeList, SearchPathList)
>         return Dependency
>
>-    ## Find dependencies for one source file
>-    #
>-    #   By searching recursively "#include" directive in file, find out all the
>-    #   files needed by given source file. The dependencies will be only searched
>-    #   in given search path list.
>-    #
>-    #   @param      File            The source file
>-    #   @param      ForceInculeList The list of files which will be included forcely
>-    #   @param      SearchPathList  The list of search path
>-    #
>-    #   @retval     list            The list of files the given source file depends on
>-    #
>-    def GetDependencyList(self, File, ForceList, SearchPathList):
>-        EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
>-        FileStack = [File] + ForceList
>-        DependencySet = set()
>-
>-        if self._AutoGenObject.Arch not in gDependencyDatabase:
>-            gDependencyDatabase[self._AutoGenObject.Arch] = {}
>-        DepDb = gDependencyDatabase[self._AutoGenObject.Arch]
>-
>-        while len(FileStack) > 0:
>-            F = FileStack.pop()
>-
>-            FullPathDependList = []
>-            if F in self.FileCache:
>-                for CacheFile in self.FileCache[F]:
>-                    FullPathDependList.append(CacheFile)
>-                    if CacheFile not in DependencySet:
>-                        FileStack.append(CacheFile)
>-                DependencySet.update(FullPathDependList)
>-                continue
>-
>-            CurrentFileDependencyList = []
>-            if F in DepDb:
>-                CurrentFileDependencyList = DepDb[F]
>-            else:
>-                try:
>-                    Fd = open(F.Path, 'rb')
>-                    FileContent = Fd.read()
>-                    Fd.close()
>-                except BaseException as X:
>-                    EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
>-                if len(FileContent) == 0:
>-                    continue
>-                try:
>-                    if FileContent[0] == 0xff or FileContent[0] == 0xfe:
>-                        FileContent = FileContent.decode('utf-16')
>-                    else:
>-                        FileContent = FileContent.decode()
>-                except:
>-                    # The file is not txt file. for example .mcb file
>-                    continue
>-                IncludedFileList = gIncludePattern.findall(FileContent)
>-
>-                for Inc in IncludedFileList:
>-                    Inc = Inc.strip()
>-                    # if there's macro used to reference header file, expand it
>-                    HeaderList = gMacroPattern.findall(Inc)
>-                    if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
>-                        HeaderType = HeaderList[0][0]
>-                        HeaderKey = HeaderList[0][1]
>-                        if HeaderType in gIncludeMacroConversion:
>-                            Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
>-                        else:
>-                            # not known macro used in #include, always build the file by
>-                            # returning a empty dependency
>-                            self.FileCache[File] = []
>-                            return []
>-                    Inc = os.path.normpath(Inc)
>-                    CurrentFileDependencyList.append(Inc)
>-                DepDb[F] = CurrentFileDependencyList
>-
>-            CurrentFilePath = F.Dir
>-            PathList = [CurrentFilePath] + SearchPathList
>-            for Inc in CurrentFileDependencyList:
>-                for SearchPath in PathList:
>-                    FilePath = os.path.join(SearchPath, Inc)
>-                    if FilePath in gIsFileMap:
>-                        if not gIsFileMap[FilePath]:
>-                            continue
>-                    # If isfile is called too many times, the performance is slow down.
>-                    elif not os.path.isfile(FilePath):
>-                        gIsFileMap[FilePath] = False
>-                        continue
>-                    else:
>-                        gIsFileMap[FilePath] = True
>-                    FilePath = PathClass(FilePath)
>-                    FullPathDependList.append(FilePath)
>-                    if FilePath not in DependencySet:
>-                        FileStack.append(FilePath)
>-                    break
>-                else:
>-                    EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
>-                                    "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
>-
>-            self.FileCache[F] = FullPathDependList
>-            DependencySet.update(FullPathDependList)
>-
>-        DependencySet.update(ForceList)
>-        if File in DependencySet:
>-            DependencySet.remove(File)
>-        DependencyList = list(DependencySet)  # remove duplicate ones
>-
>-        return DependencyList
>
> ## CustomMakefile class
> #
>@@ -1618,7 +1518,7 @@ cleanlib:
>     def GetLibraryBuildDirectoryList(self):
>         DirList = []
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
>         return DirList
>
>@@ -1754,7 +1654,7 @@ class TopLevelMakefile(BuildFile):
>     def GetLibraryBuildDirectoryList(self):
>         DirList = []
>         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
>-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
>+            if not LibraryAutoGen.IsBinaryModule:
>                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
>         return DirList
>
>@@ -1762,3 +1662,108 @@
> if __name__ == '__main__':
>     pass
>
>+## Find dependencies for one source file
>+#
>+#   By searching recursively "#include" directive in file, find out all the
>+#   files needed by given source file. The dependencies will be only searched
>+#   in given search path list.
>+#
>+#   @param      File            The source file
>+#   @param      ForceInculeList The list of files which will be included forcely
>+#   @param      SearchPathList  The list of search path
>+#
>+#   @retval     list            The list of files the given source file depends on
>+#
>+def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList):
>+    EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
>+    FileStack = [File] + ForceList
>+    DependencySet = set()
>+
>+    if AutoGenObject.Arch not in gDependencyDatabase:
>+        gDependencyDatabase[AutoGenObject.Arch] = {}
>+    DepDb = gDependencyDatabase[AutoGenObject.Arch]
>+
>+    while len(FileStack) > 0:
>+        F = FileStack.pop()
>+
>+        FullPathDependList = []
>+        if F in FileCache:
>+            for CacheFile in FileCache[F]:
>+                FullPathDependList.append(CacheFile)
>+                if CacheFile not in DependencySet:
>+                    FileStack.append(CacheFile)
>+            DependencySet.update(FullPathDependList)
>+            continue
>+
>+        CurrentFileDependencyList = []
>+        if F in DepDb:
>+            CurrentFileDependencyList = DepDb[F]
>+        else:
>+            try:
>+                Fd = open(F.Path, 'rb')
>+                FileContent = Fd.read()
>+                Fd.close()
>+            except BaseException as X:
>+                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
>+            if len(FileContent) == 0:
>+                continue
>+            try:
>+                if FileContent[0] == 0xff or FileContent[0] == 0xfe:
>+                    FileContent = FileContent.decode('utf-16')
>+                else:
>+                    FileContent = FileContent.decode()
>+            except:
>+                # The file is not txt file. for example .mcb file
>+                continue
>+            IncludedFileList = gIncludePattern.findall(FileContent)
>+
>+            for Inc in IncludedFileList:
>+                Inc = Inc.strip()
>+                # if there's macro used to reference header file, expand it
>+                HeaderList = gMacroPattern.findall(Inc)
>+                if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
>+                    HeaderType = HeaderList[0][0]
>+                    HeaderKey = HeaderList[0][1]
>+                    if HeaderType in gIncludeMacroConversion:
>+                        Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
>+                    else:
>+                        # not known macro used in #include, always build the file by
>+                        # returning a empty dependency
>+                        FileCache[File] = []
>+                        return []
>+                Inc = os.path.normpath(Inc)
>+                CurrentFileDependencyList.append(Inc)
>+            DepDb[F] = CurrentFileDependencyList
>+
>+        CurrentFilePath = F.Dir
>+        PathList = [CurrentFilePath] + SearchPathList
>+        for Inc in CurrentFileDependencyList:
>+            for SearchPath in PathList:
>+                FilePath = os.path.join(SearchPath, Inc)
>+                if FilePath in gIsFileMap:
>+                    if not gIsFileMap[FilePath]:
>+                        continue
>+                # If isfile is called too many times, the performance is slow down.
>+                elif not os.path.isfile(FilePath):
>+                    gIsFileMap[FilePath] = False
>+                    continue
>+                else:
>+                    gIsFileMap[FilePath] = True
>+                FilePath = PathClass(FilePath)
>+                FullPathDependList.append(FilePath)
>+                if FilePath not in DependencySet:
>+                    FileStack.append(FilePath)
>+                break
>+            else:
>+                EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
>+                                "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
>+
>+        FileCache[F] = FullPathDependList
>+        DependencySet.update(FullPathDependList)
>+
>+    DependencySet.update(ForceList)
>+    if File in DependencySet:
>+        DependencySet.remove(File)
>+    DependencyList = list(DependencySet)  # remove duplicate ones
>+
>+    return DependencyList
>\ No newline at end of file
>diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>old mode 100644
>new mode 100755
>index 9ecf5c2dbe..613b0d2fb8
>--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
>@@ -26,6 +26,8 @@ from Workspace.MetaFileCommentParser import UsageList
> from .GenPcdDb import CreatePcdDatabaseCode
> from Common.caching import cached_class_function
> from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo
>+from AutoGen.CacheIR import ModuleBuildCacheIR
>+import json
>
> ## Mapping Makefile type
> gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
>@@ -252,6 +254,8 @@ class ModuleAutoGen(AutoGen):
>         self.AutoGenDepSet = set()
>         self.ReferenceModules = []
>         self.ConstPcd = {}
>+        self.Makefile = None
>+        self.FileDependCache = {}
>
>     def __init_platform_info__(self):
>         pinfo = self.DataPipe.Get("P_Info")
>@@ -1608,12 +1612,37 @@ class ModuleAutoGen(AutoGen):
>
>         self.IsAsBuiltInfCreated = True
>
>+    def CacheCopyFile(self, OriginDir, CopyDir, File):
>+        sub_dir = os.path.relpath(File, CopyDir)
>+        destination_file = os.path.join(OriginDir, sub_dir)
>+        destination_dir = os.path.dirname(destination_file)
>+        CreateDirectory(destination_dir)
>+        try:
>+            CopyFileOnChange(File, destination_dir)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to copy file:%s to folder:%s" % (File, destination_dir))
>+            return
>+
>     def CopyModuleToCache(self):
>-        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        self.GenPreMakefileHash(GlobalData.gCacheIR)
>+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        self.GenMakeHash(GlobalData.gCacheIR)
>+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
>+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
>+        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName, MakeHashStr)
>+        FfsDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name, MakeHashStr)
>+
>         CreateDirectory (FileDir)
>-        HashFile = path.join(self.BuildDir, self.Name + '.hash')
>-        if os.path.exists(HashFile):
>-            CopyFileOnChange(HashFile, FileDir)
>+        self.SaveHashChainFileToCache(GlobalData.gCacheIR)
>         ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
>         if os.path.exists(ModuleFile):
>             CopyFileOnChange(ModuleFile, FileDir)
>@@ -1631,38 +1660,73 @@ class ModuleAutoGen(AutoGen):
>                     CreateDirectory(destination_dir)
>                     CopyFileOnChange(File, destination_dir)
>
>-    def AttemptModuleCacheCopy(self):
>-        # If library or Module is binary do not skip by hash
>-        if self.IsBinaryModule:
>+    def SaveHashChainFileToCache(self, gDict):
>+        if not GlobalData.gBinCacheDest:
>+            return False
>+
>+        self.GenPreMakefileHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        self.GenMakeHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
>             return False
>-        # .inc is contains binary information so do not skip by hash as well
>-        for f_ext in self.SourceFileList:
>-            if '.inc' in str(f_ext):
>-                return False
>-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>-        HashFile = path.join(FileDir, self.Name + '.hash')
>-        if os.path.exists(HashFile):
>-            f = open(HashFile, 'r')
>-            CacheHash = f.read()
>-            f.close()
>-            self.GenModuleHash()
>-            if GlobalData.gModuleHash[self.Arch][self.Name]:
>-                if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
>-                    for root, dir, files in os.walk(FileDir):
>-                        for f in files:
>-                            if self.Name + '.hash' in f:
>-                                CopyFileOnChange(HashFile, self.BuildDir)
>-                            else:
>-                                File = path.join(root, f)
>-                                sub_dir = os.path.relpath(File, FileDir)
>-                                destination_file = os.path.join(self.OutputDir, sub_dir)
>-                                destination_dir = os.path.dirname(destination_file)
>-                                CreateDirectory(destination_dir)
>-                                CopyFileOnChange(File, destination_dir)
>-                    if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>-                        CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>-                    return True
>-        return False
>+
>+        # save the hash chain list as cache file
>+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
>+        CacheDestDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        CacheHashDestDir = path.join(CacheDestDir, MakeHashStr)
>+        ModuleHashPair = path.join(CacheDestDir, self.Name + ".ModuleHashPair")
>+        MakeHashChain = path.join(CacheHashDestDir, self.Name + ".MakeHashChain")
>+        ModuleFilesChain = path.join(CacheHashDestDir, self.Name + ".ModuleFilesChain")
>+
>+        # save the HashChainDict as json file
>+        CreateDirectory (CacheDestDir)
>+        CreateDirectory (CacheHashDestDir)
>+        try:
>+            ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+            if os.path.exists(ModuleHashPair):
>+                f = open(ModuleHashPair, 'r')
>+                ModuleHashPairList = json.load(f)
>+                f.close()
>+            PreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
>+            MakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
>+            ModuleHashPairList.append((PreMakeHash, MakeHash))
>+            ModuleHashPairList = list(set(map(tuple, ModuleHashPairList)))
>+            with open(ModuleHashPair, 'w') as f:
>+                json.dump(ModuleHashPairList, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save ModuleHashPair file in cache: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            with open(MakeHashChain, 'w') as f:
>+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save MakeHashChain file in cache: %s" % MakeHashChain)
>+            return False
>+
>+        try:
>+            with open(ModuleFilesChain, 'w') as f:
>+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain, f, indent=2)
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to save ModuleFilesChain file in cache: %s" % ModuleFilesChain)
>+            return False
>+
>+        # save the autogenfile and makefile for debug usage
>+        CacheDebugDir = path.join(CacheHashDestDir, "CacheDebug")
>+        CreateDirectory (CacheDebugDir)
>+        CopyFileOnChange(gDict[(self.MetaFile.Path, self.Arch)].MakefilePath, CacheDebugDir)
>+        if gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
>+            for File in gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
>+                CopyFileOnChange(str(File), CacheDebugDir)
>+
>+        return True
>
>     ## Create makefile for the module and its dependent libraries
>     #
>@@ -1671,6 +1735,11 @@ class ModuleAutoGen(AutoGen):
>     #
>     @cached_class_function
>     def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
>+        gDict = GlobalData.gCacheIR
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+           gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+            return
>+
>         # nest this function inside it's only caller.
>         def CreateTimeStamp():
>             FileSet = {self.MetaFile.Path}
>@@ -1701,8 +1770,8 @@ class ModuleAutoGen(AutoGen):
>             for LibraryAutoGen in self.LibraryAutoGenList:
>                 LibraryAutoGen.CreateMakeFile()
>
>-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
>-        if not GlobalData.gUseHashCache and self.CanSkip():
>+        # CanSkip uses timestamps to determine build skipping
>+        if self.CanSkip():
>             return
>
>         if len(self.CustomMakefile) == 0:
>@@ -1718,6 +1787,24 @@ class ModuleAutoGen(AutoGen):
>
>         CreateTimeStamp()
>
>+        MakefileType = Makefile._FileType
>+        MakefileName = Makefile._FILE_NAME_[MakefileType]
>+        MakefilePath = os.path.join(self.MakeFileDir, MakefileName)
>+
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.MakefilePath = MakefilePath
>+        MewIR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
>+        MewIR.CreateMakeFileDone = True
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.MakefilePath = MakefilePath
>+                IR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
>+                IR.CreateMakeFileDone = True
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>     def CopyBinaryFiles(self):
>         for File in self.Module.Binaries:
>             SrcPath = File.Path
>@@ -1729,6 +1816,11 @@ class ModuleAutoGen(AutoGen):
>     #                                       dependent libraries will be created
>     #
>     def CreateCodeFile(self, CreateLibraryCodeFile=True):
>+        gDict = GlobalData.gCacheIR
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+           gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
>+            return
>+
>         if self.IsCodeFileCreated:
>             return
>
>@@ -1744,8 +1836,9 @@ class ModuleAutoGen(AutoGen):
>         if not self.IsLibrary and CreateLibraryCodeFile:
>             for LibraryAutoGen in self.LibraryAutoGenList:
>                 LibraryAutoGen.CreateCodeFile()
>-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
>-        if not GlobalData.gUseHashCache and self.CanSkip():
>+
>+        # CanSkip uses timestamps to determine build skipping
>+        if self.CanSkip():
>             return
>
>         AutoGenList = []
>@@ -1785,6 +1878,16 @@ class ModuleAutoGen(AutoGen):
>                             (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
>
>         self.IsCodeFileCreated = True
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.CreateCodeFileDone = True
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.CreateCodeFileDone = True
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>         return AutoGenList
>
>     ## Summarize the ModuleAutoGen objects of all libraries used by this module
>@@ -1854,46 +1957,468 @@ class ModuleAutoGen(AutoGen):
>
>         return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
>
>+    def GenModuleFilesHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict:
>+            if gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain:
>+                return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        DependencyFileSet = set()
>+        # Add Module Meta file
>+        DependencyFileSet.add(self.MetaFile)
>+
>+        # Add Module's source files
>+        if self.SourceFileList:
>+            for File in set(self.SourceFileList):
>+                DependencyFileSet.add(File)
>+
>+        # Add modules's include header files
>+        # Search dependency file list for each source file
>+        SourceFileList = []
>+        OutPutFileList = []
>+        for Target in self.IntroTargetList:
>+            SourceFileList.extend(Target.Inputs)
>+            OutPutFileList.extend(Target.Outputs)
>+        if OutPutFileList:
>+            for Item in OutPutFileList:
>+                if Item in SourceFileList:
>+                    SourceFileList.remove(Item)
>+        SearchList = []
>+        for file_path in self.IncludePathList + self.BuildOptionIncPathList:
>+            # skip the folders in platform BuildDir which are not been generated yet
>+            if file_path.startswith(os.path.abspath(self.PlatformInfo.BuildDir)+os.sep):
>+                continue
>+            SearchList.append(file_path)
>+        FileDependencyDict = {}
>+        ForceIncludedFile = []
>+        for F in SourceFileList:
>+            # skip the files which are not been generated yet, because
>+            # the SourceFileList usually contains intermediate build files, e.g. AutoGen.c
>+            if not os.path.exists(F.Path):
>+                continue
>+            FileDependencyDict[F] = GenMake.GetDependencyList(self, self.FileDependCache, F, ForceIncludedFile, SearchList)
>+
>+        if FileDependencyDict:
>+            for Dependency in FileDependencyDict.values():
>+                DependencyFileSet.update(set(Dependency))
>+
>+        # Caculate all above dependency files hash
>+        # Initialze hash object
>+        FileList = []
>+        m = hashlib.md5()
>+        for File in sorted(DependencyFileSet, key=lambda x: str(x)):
>+            if not os.path.exists(str(File)):
>+                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
>+                continue
>+            f = open(str(File), 'rb')
>+            Content = f.read()
>+            f.close()
>+            m.update(Content)
>+            FileList.append((str(File), hashlib.md5(Content).hexdigest()))
>+
>+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
>+        MewIR.ModuleFilesHashDigest = m.digest()
>+        MewIR.ModuleFilesHashHexDigest = m.hexdigest()
>+        MewIR.ModuleFilesChain = FileList
>+        with GlobalData.file_lock:
>+            try:
>+                IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                IR.ModuleFilesHashDigest = m.digest()
>+                IR.ModuleFilesHashHexDigest = m.hexdigest()
>+                IR.ModuleFilesChain = FileList
>+                gDict[(self.MetaFile.Path, self.Arch)] = IR
>+            except:
>+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenPreMakefileHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+           gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+            self.GenModuleFilesHash(gDict)
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+            EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHashDigest for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return
>+
>+        # Initialze hash object
>+        m = hashlib.md5()
>+
>+        # Add Platform level hash
>+        if ('PlatformHash') in gDict:
>+            m.update(gDict[('PlatformHash')].encode('utf-8'))
>+        else:
>+            EdkLogger.quiet("[cache warning]: PlatformHash is missing")
>+
>+        # Add Package level hash
>+        if self.DependentPackageList:
>+            for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
>+                if (Pkg.PackageName, 'PackageHash') in gDict:
>+                    m.update(gDict[(Pkg.PackageName, 'PackageHash')].encode('utf-8'))
>+                else:
>+                    EdkLogger.quiet("[cache warning]: %s PackageHash needed by %s[%s] is missing" %(Pkg.PackageName, self.MetaFile.Name, self.Arch))
>+
>+        # Add Library hash
>+        if self.LibraryAutoGenList:
>+            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
>+                if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \
>+                   not gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest:
>+                    Lib.GenPreMakefileHash(gDict)
>+                m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest)
>+
>+        # Add Module self
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.PreMakefileHashHexDigest = m.hexdigest()
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenMakeHeaderFilesHash(self, gDict):
>+        # Early exit if module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+           gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
>+            if self.IsLibrary:
>+                if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path) in GlobalData.libConstPcd:
>+                    self.ConstPcd = GlobalData.libConstPcd[(self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)]
>+                if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path) in GlobalData.Refes:
>+                    self.ReferenceModules = GlobalData.Refes[(self.MetaFile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)]
>+            self.CreateCodeFile()
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+            self.CreateMakeFile(GenFfsList=GlobalData.FfsCmd.get((self.MetaFile.File, self.Arch),[]))
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
>+            EdkLogger.quiet("[cache warning]: Cannot create CodeFile or Makefile for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return
>+
>+        DependencyFileSet = set()
>+        # Add Makefile
>+        if gDict[(self.MetaFile.Path, self.Arch)].MakefilePath:
>+            DependencyFileSet.add(gDict[(self.MetaFile.Path, self.Arch)].MakefilePath)
>+        else:
>+            EdkLogger.quiet("[cache warning]: makefile is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+
>+        # Add header files
>+        if gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet:
>+            for File in gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet:
>+                DependencyFileSet.add(File)
>+        else:
>+            EdkLogger.quiet("[cache warning]: No dependency header found for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+
>+        # Add AutoGen files
>+        if self.AutoGenFileList:
>+            for File in set(self.AutoGenFileList):
>+                DependencyFileSet.add(File)
>+
>+        # Calculate all above dependency files hash
>+        # Initialize hash object
>+        FileList = []
>+        m = hashlib.md5()
>+        for File in sorted(DependencyFileSet, key=lambda x: str(x)):
>+            if not os.path.exists(str(File)):
>+                EdkLogger.quiet("[cache warning]: header file: %s doesn't exist for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
>+                continue
>+            f = open(str(File), 'rb')
>+            Content = f.read()
>+            f.close()
>+            m.update(Content)
>+            FileList.append((str(File), hashlib.md5(Content).hexdigest()))
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.AutoGenFileList = self.AutoGenFileList.keys()
>+            IR.MakeHeaderFilesHashChain = FileList
>+            IR.MakeHeaderFilesHashDigest = m.digest()
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    def GenMakeHash(self, gDict):
>+        # Early exit if the module or library has been hashed and is in memory
>+        if (self.MetaFile.Path, self.Arch) in gDict and \
>+           gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain:
>+            return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+        # skip binary module
>+        if self.IsBinaryModule:
>+            return
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
>+            self.GenModuleFilesHash(gDict)
>+        if not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest:
>+            self.GenMakeHeaderFilesHash(gDict)
>+
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashChain:
>+            EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHash or MakeHeaderFilesHash for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return
>+
>+        # Initialize hash object
>+        m = hashlib.md5()
>+        MakeHashChain = []
>+
>+        # Add hash of makefile and dependency header files
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest)
>+        New = list(set(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashChain) - set(MakeHashChain))
>+        New.sort(key=lambda x: str(x))
>+        MakeHashChain += New
>+
>+        # Add Library hash
>+        if self.LibraryAutoGenList:
>+            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
>+                if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \
>+                   not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain:
>+                    Lib.GenMakeHash(gDict)
>+                if not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest:
>+                    print("Cannot generate MakeHash for lib module:", Lib.MetaFile.Path, Lib.Arch)
>+                    continue
>+                m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest)
>+                New = list(set(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain) - set(MakeHashChain))
>+                New.sort(key=lambda x: str(x))
>+                MakeHashChain += New
>+
>+        # Add Module self
>+        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
>+        New = list(set(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain) - set(MakeHashChain))
>+        New.sort(key=lambda x: str(x))
>+        MakeHashChain += New
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.MakeHashDigest = m.digest()
>+            IR.MakeHashHexDigest = m.hexdigest()
>+            IR.MakeHashChain = MakeHashChain
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+
>+        return gDict[(self.MetaFile.Path, self.Arch)]
>+
>+    ## Decide whether we can skip the remaining autogen and make process
>+    def CanSkipbyPreMakefileCache(self, gDict):
>+        if not GlobalData.gBinCacheSource:
>+            return False
>+
>+        # If Module is binary, do not skip by cache
>+        if self.IsBinaryModule:
>+            return False
>+
>+        # .inc files contain binary information so do not skip by hash as well
>+        for f_ext in self.SourceFileList:
>+            if '.inc' in str(f_ext):
>+                return False
>+
>+        # Get the module hash values from stored cache and current build,
>+        # then check whether the cache hit based on the hash values;
>+        # if cache hit, restore all the files from cache
>+        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
>+
>+        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
>+        if not os.path.exists(ModuleHashPair):
>+            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            f = open(ModuleHashPair, 'r')
>+            ModuleHashPairList = json.load(f)
>+            f.close()
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        self.GenPreMakefileHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: PreMakefileHashHexDigest is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = None
>+        CurrentPreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
>+        for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
>+            if PreMakefileHash == CurrentPreMakeHash:
>+                MakeHashStr = str(MakeHash)
>+
>+        if not MakeHashStr:
>+            return False
>+
>+        TargetHashDir = path.join(FileDir, MakeHashStr)
>+        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
>+
>+        if not os.path.exists(TargetHashDir):
>+            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
>+            return False
>+
>+        for root, dir, files in os.walk(TargetHashDir):
>+            for f in files:
>+                File = path.join(root, f)
>+                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
>+        if os.path.exists(TargetFfsHashDir):
>+            for root, dir, files in os.walk(TargetFfsHashDir):
>+                for f in files:
>+                    File = path.join(root, f)
>+                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
>+
>+        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>+            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>+
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.PreMakeCacheHit = True
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+        print("[cache hit]: checkpoint_PreMakefile:", self.MetaFile.Path, self.Arch)
>+        #EdkLogger.quiet("cache hit: %s[%s]" % (self.MetaFile.Path, self.Arch))
>+        return True
>+
>+    ## Decide whether we can skip the make process
>+    def CanSkipbyMakeCache(self, gDict):
>+        if not GlobalData.gBinCacheSource:
>+            return False
>+
>+        # If Module is binary, do not skip by cache
>+        if self.IsBinaryModule:
>+            print("[cache miss]: checkpoint_Makefile: binary module:", self.MetaFile.Path, self.Arch)
>+            return False
>+
>+        # .inc files contain binary information so do not skip by hash as well
>+        for f_ext in self.SourceFileList:
>+            if '.inc' in str(f_ext):
>+                with GlobalData.file_lock:
>+                    IR = gDict[(self.MetaFile.Path, self.Arch)]
>+                    IR.MakeCacheHit = False
>+                    gDict[(self.MetaFile.Path, self.Arch)] = IR
>+                print("[cache miss]: checkpoint_Makefile: .inc module:", self.MetaFile.Path, self.Arch)
>+                return False
>+
>+        # Get the module hash values from stored cache and current build,
>+        # then check whether the cache hit based on the hash values;
>+        # if cache hit, restore all the files from cache
>+        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
>+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
>+
>+        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
>+        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
>+        if not os.path.exists(ModuleHashPair):
>+            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        try:
>+            f = open(ModuleHashPair, 'r')
>+            ModuleHashPairList = json.load(f)
>+            f.close()
>+        except:
>+            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
>+            return False
>+
>+        self.GenMakeHash(gDict)
>+        if not (self.MetaFile.Path, self.Arch) in gDict or \
>+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
>+            EdkLogger.quiet("[cache warning]: MakeHashHexDigest is missing for module %s[%s]" %(self.MetaFile.Path, self.Arch))
>+            return False
>+
>+        MakeHashStr = None
>+        CurrentMakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
>+        for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
>+            if MakeHash == CurrentMakeHash:
>+                MakeHashStr = str(MakeHash)
>+
>+        if not MakeHashStr:
>+            print("[cache miss]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
>+            return False
>+
>+        TargetHashDir = path.join(FileDir, MakeHashStr)
>+        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
>+        if not os.path.exists(TargetHashDir):
>+            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
>+            return False
>+
>+        for root, dir, files in os.walk(TargetHashDir):
>+            for f in files:
>+                File = path.join(root, f)
>+                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
>+
>+        if os.path.exists(TargetFfsHashDir):
>+            for root, dir, files in os.walk(TargetFfsHashDir):
>+                for f in files:
>+                    File = path.join(root, f)
>+                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
>+
>+        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
>+            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
>+        with GlobalData.file_lock:
>+            IR = gDict[(self.MetaFile.Path, self.Arch)]
>+            IR.MakeCacheHit = True
>+            gDict[(self.MetaFile.Path, self.Arch)] = IR
>+        print("[cache hit]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
>+        return True
>+
>     ## Decide whether we can skip the ModuleAutoGen process
>-    def CanSkipbyHash(self):
>+    def CanSkipbyCache(self, gDict):
>         # Hashing feature is off
>-        if not GlobalData.gUseHashCache:
>+        if not GlobalData.gBinCacheSource:
>             return False
>
>-        # Initialize a dictionary for each arch type
>-        if self.Arch not in GlobalData.gBuildHashSkipTracking:
>-            GlobalData.gBuildHashSkipTracking[self.Arch] = dict()
>+        if self in GlobalData.gBuildHashSkipTracking:
>+            return GlobalData.gBuildHashSkipTracking[self]
>
>         # If library or Module is binary do not skip by hash
>         if self.IsBinaryModule:
>+            GlobalData.gBuildHashSkipTracking[self] = False
>             return False
>
>         # .inc is contains binary information so do not skip by hash as well
>         for f_ext in self.SourceFileList:
>             if '.inc' in str(f_ext):
>+                GlobalData.gBuildHashSkipTracking[self] = False
>                 return False
>
>-        # Use Cache, if exists and if Module has a copy in cache
>-        if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy():
>+        if not (self.MetaFile.Path, self.Arch) in gDict:
>+            return False
>+
>+        if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit:
>+            GlobalData.gBuildHashSkipTracking[self] = True
>             return True
>
>-        # Early exit for libraries that haven't yet finished building
>-        HashFile = path.join(self.BuildDir, self.Name + ".hash")
>-        if self.IsLibrary and not os.path.exists(HashFile):
>-            return False
>+        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
>+            GlobalData.gBuildHashSkipTracking[self] = True
>+            return True
>
>-        # Return a Boolean based on if can skip by hash, either from memory or from IO.
>-        if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]:
>-            # If hashes are the same, SaveFileOnChange() will return False.
>-            GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] = not SaveFileOnChange(HashFile, self.GenModuleHash(), True)
>-            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
>-        else:
>-            return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name]
>+        return False
>
>     ## Decide whether we can skip the ModuleAutoGen process
>     # If any source file is newer than the module than we cannot skip
>     #
>     def CanSkip(self):
>+        # Don't skip if the cache feature is enabled
>+        if GlobalData.gUseHashCache or GlobalData.gBinCacheDest or GlobalData.gBinCacheSource:
>+            return False
>         if self.MakeFileDir in GlobalData.gSikpAutoGenCache:
>             return True
>         if not os.path.exists(self.TimeStampPath):
>diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Source/Python/Common/GlobalData.py
>old mode 100644
>new mode 100755
>index bd45a43728..452dca32f0
>--- a/BaseTools/Source/Python/Common/GlobalData.py
>+++ b/BaseTools/Source/Python/Common/GlobalData.py
>@@ -119,3 +119,12 @@ gModuleBuildTracking = dict()
> # Top Dict: Key: Arch Type Value: Dictionary
> # Second Dict: Key: Module\Library Name Value: True\False
> gBuildHashSkipTracking = dict()
>+
>+# Common dictionary to share module cache intermediate result and state
>+gCacheIR = None
>+# Common lock for the file access in multiple process AutoGens
>+file_lock = None
>+# Common dictionary to share platform libraries' constant Pcd
>+libConstPcd = None
>+# Common dictionary to share platform libraries' reference info
>+Refes = None
>\ No newline at end of file
>diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
>old mode 100644
>new mode 100755
>index 4bfa54666b..d7c817b95c
>--- a/BaseTools/Source/Python/build/build.py
>+++ b/BaseTools/Source/Python/build/build.py
>@@ -595,7 +595,7 @@ class BuildTask:
>     #
>     def AddDependency(self, Dependency):
>         for Dep in Dependency:
>-            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyHash():
>+            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyCache(GlobalData.gCacheIR):
>                 self.DependencyList.append(BuildTask.New(Dep))    # BuildTask list
>
>     ## The thread wrapper of LaunchCommand function
>@@ -811,7 +811,7 @@ class Build():
>         self.AutoGenMgr = None
>         EdkLogger.info("")
>         os.chdir(self.WorkspaceDir)
>-        self.share_data = Manager().dict()
>+        GlobalData.gCacheIR = Manager().dict()
>         self.log_q = log_q
>     def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_data):
>         try:
>@@ -820,6 +827,13 @@ class Build():
>             feedback_q = mp.Queue()
>             file_lock = mp.Lock()
>             error_event = mp.Event()
>+            GlobalData.file_lock = file_lock
>+            FfsCmd = DataPipe.Get("FfsCommand")
>+            if FfsCmd is None:
>+                FfsCmd = {}
>+            GlobalData.FfsCmd = FfsCmd
>+            GlobalData.libConstPcd = DataPipe.Get("LibConstPcd")
>+            GlobalData.Refes = DataPipe.Get("REFS")
>             auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,file_lock,share_data,self.log_q,error_event) for _ in range(self.ThreadNumber)]
>             self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q,error_event)
>             self.AutoGenMgr.start()
>@@ -827,14 +834,28 @@ class Build():
>                 w.start()
>             if PcdMaList is not None:
>                 for PcdMa in PcdMaList:
>+                    if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                        PcdMa.GenModuleFilesHash(share_data)
>+                        PcdMa.GenPreMakefileHash(share_data)
>+                        if PcdMa.CanSkipbyPreMakefileCache(share_data):
>+                            continue
>+
>                     PcdMa.CreateCodeFile(False)
>                     PcdMa.CreateMakeFile(False,GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[]))
>
>+                    if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                        PcdMa.GenMakeHeaderFilesHash(share_data)
>+                        PcdMa.GenMakeHash(share_data)
>+                        if PcdMa.CanSkipbyMakeCache(share_data):
>+                            continue
>+
>             self.AutoGenMgr.join()
>             rt = self.AutoGenMgr.Status
>             return rt, 0
>-        except Exception as e:
>-            return False,e.errcode
>+        except FatalError as e:
>+            return False, e.args[0]
>+        except:
>+            return False, UNKNOWN_ERROR
>
>     ## Load configuration
>     #
>@@ -1199,10 +1220,11 @@ class Build():
>                 mqueue.put(m)
>
>             AutoGenObject.DataPipe.DataContainer = {"FfsCommand":FfsCommand}
>+            AutoGenObject.DataPipe.DataContainer = {"CommandTarget": self.Target}
>             self.Progress.Start("Generating makefile and code")
>             data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
>             AutoGenObject.DataPipe.dump(data_pipe_file)
>-            autogen_rt, errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
>+            autogen_rt,errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR)
>             self.Progress.Stop("done!")
>             if not autogen_rt:
>                 self.AutoGenMgr.TerminateWorkers()
>@@ -1799,6 +1821,15 @@ class Build():
>                 CmdListDict = None
>                 if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
>                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
>+
>+                # Add Platform and Package level hash in share_data for module hash calculation later
>+                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
>+                    GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
>+                    for PkgName in GlobalData.gPackageHash.keys():
>+                        GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
>+                    GlobalData.file_lock = mp.Lock()
>+                    GlobalData.FfsCmd = CmdListDict
>+
>                 self.Progress.Stop("done!")
>                 MaList = []
>                 ExitFlag = threading.Event()
>@@ -1808,20 +1839,23 @@ class Build():
>                     AutoGenStart = time.time()
>                     GlobalData.gGlobalDefines['ARCH'] = Arch
>                     Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
>+                    GlobalData.libConstPcd = Pa.DataPipe.Get("LibConstPcd")
>+                    GlobalData.Refes = Pa.DataPipe.Get("REFS")
>                     for Module in Pa.Platform.Modules:
>                         if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
>                             Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>                             if Ma is None:
>                                 continue
>                             MaList.append(Ma)
>-                            if Ma.CanSkipbyHash():
>-                                self.HashSkipModules.append(Ma)
>-                                if GlobalData.gBinCacheSource:
>-                                    EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>-                                continue
>-                            else:
>-                                if GlobalData.gBinCacheSource:
>-                                    EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>+
>+                            if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                                Ma.GenModuleFilesHash(GlobalData.gCacheIR)
>+                                Ma.GenPreMakefileHash(GlobalData.gCacheIR)
>+                                if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
>+                                    self.HashSkipModules.append(Ma)
>+                                    EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>+                                    continue
>+
>                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
>                             if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
>                                 # for target which must generate AutoGen code and makefile
>@@ -1841,6 +1875,18 @@ class Build():
>                                     self.Progress.Stop("done!")
>                                 if self.Target == "genmake":
>                                     return True
>+
>+                                if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
>+                                    Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
>+                                    Ma.GenMakeHash(GlobalData.gCacheIR)
>+                                    if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
>+                                        self.HashSkipModules.append(Ma)
>+                                        EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>+                                        continue
>+                                    else:
>+                                        EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>+                                        Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR)
>+
>                             self.BuildModules.append(Ma)
>                             # Initialize all modules in tracking to 'FAIL'
>                             if Ma.Arch not in GlobalData.gModuleBuildTracking:
>@@ -1985,11 +2031,18 @@ class Build():
>                 if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
>                     CmdListDict = self._GenFfsCmd(Wa.ArchList)
>
>+                # Add Platform and Package level hash in share_data for module hash calculation later
>+                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
>+                    GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
>+                    for PkgName in GlobalData.gPackageHash.keys():
>+                        GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
>+
>                 # multi-thread exit flag
>                 ExitFlag = threading.Event()
>                 ExitFlag.clear()
>                 self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
>                 self.BuildModules = []
>+                TotalModules = []
>                 for Arch in Wa.ArchList:
>                     PcdMaList = []
>                     AutoGenStart = time.time()
>@@ -2009,6 +2062,7 @@ class Build():
>                             ModuleList.append(Inf)
>                     Pa.DataPipe.DataContainer = {"FfsCommand":CmdListDict}
>                     Pa.DataPipe.DataContainer = {"Workspace_timestamp": Wa._SrcTimeStamp}
>+                    Pa.DataPipe.DataContainer = {"CommandTarget": self.Target}
>                     for Module in ModuleList:
>                         # Get ModuleAutoGen object to generate C code file and makefile
>                         Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
>@@ -2019,30 +2073,34 @@ class Build():
>                         Ma.PlatformInfo = Pa
>                         Ma.Workspace = Wa
>                         PcdMaList.append(Ma)
>-                        if Ma.CanSkipbyHash():
>-                            self.HashSkipModules.append(Ma)
>-                            if GlobalData.gBinCacheSource:
>-                                EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>-                            continue
>-                        else:
>-                            if GlobalData.gBinCacheSource:
>-                                EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
>-
>-                        # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
>-                            # for target which must generate AutoGen code and makefile
>-
>-                        self.BuildModules.append(Ma)
>+                        TotalModules.append(Ma)
>                         # Initialize all modules in tracking to 'FAIL'
>                         if Ma.Arch not in GlobalData.gModuleBuildTracking:
>                             GlobalData.gModuleBuildTracking[Ma.Arch] = dict()
>                         if Ma not in GlobalData.gModuleBuildTracking[Ma.Arch]:
>                             GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
>+
>                     mqueue = mp.Queue()
>                     for m in Pa.GetAllModuleInfo:
>                         mqueue.put(m)
>                     data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
>                     Pa.DataPipe.dump(data_pipe_file)
>-                    autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data)
>+                    autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR)
>+
>+                    # Skip cache hit modules
>+                    if GlobalData.gBinCacheSource:
>+                        for Ma in TotalModules:
>+                            if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
>+                               GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].PreMakeCacheHit:
>+                                self.HashSkipModules.append(Ma)
>+                                continue
>+                            if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
>+                               GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].MakeCacheHit:
>+                                self.HashSkipModules.append(Ma)
>+                                continue
>+                            self.BuildModules.append(Ma)
>+                    else:
>+                        self.BuildModules.extend(TotalModules)
>
>                     if not autogen_rt:
>                         self.AutoGenMgr.TerminateWorkers()
>@@ -2050,9 +2108,24 @@ class Build():
>                         raise FatalError(errorcode)
>                     self.AutoGenTime += int(round((time.time() - AutoGenStart)))
>                     self.Progress.Stop("done!")
>+
>+                    if GlobalData.gBinCacheSource:
>+                        EdkLogger.quiet("Total cache hit driver num: %s, cache miss driver num: %s" % (len(set(self.HashSkipModules)), len(set(self.BuildModules))))
>+                        CacheHitMa = set()
>+                        CacheNotHitMa = set()
>+                        for IR in GlobalData.gCacheIR.keys():
>+                            if 'PlatformHash' in IR or 'PackageHash' in IR:
>+                                continue
>+                            if GlobalData.gCacheIR[IR].PreMakeCacheHit or GlobalData.gCacheIR[IR].MakeCacheHit:
>+                                CacheHitMa.add(IR)
>+                            else:
>+                                # There might be binary modules or modules which have .inc files; these are not counted as cache misses
>+                                CacheNotHitMa.add(IR)
>+                        EdkLogger.quiet("Total module num: %s, cache hit module num: %s" % (len(CacheHitMa)+len(CacheNotHitMa), len(CacheHitMa)))
>+
>             for Arch in Wa.ArchList:
>                 MakeStart = time.time()
>-                for Ma in self.BuildModules:
>+                for Ma in set(self.BuildModules):
>                     # Generate build task for the module
>                     if not Ma.IsBinaryModule:
>                         Bt = BuildTask.New(ModuleMakeUnit(Ma, Pa.BuildCommand,self.Target))
>--
>2.17.1
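
For reviewers who want the gist of the new flow: the cache stores a list of (PreMakefileHash, MakeHash) pairs per module in a `.ModuleHashPair` JSON file, tries the quick pre-makefile hash first, and falls back to the more accurate makefile-based hash once the makefile has been generated. Below is a minimal, illustrative sketch of that two-checkpoint lookup. The helper names (`file_hash`, `can_skip_by_cache`) are mine, not from the patch; the real implementation lives in `CanSkipbyPreMakefileCache` and `CanSkipbyMakeCache` above.

```python
import hashlib
import json
import os

def file_hash(paths):
    # Digest the concatenated contents of a sorted list of files, mirroring
    # in miniature the patch's md5 over the sorted DependencyFileSet.
    m = hashlib.md5()
    for p in sorted(paths):
        with open(p, 'rb') as f:
            m.update(f.read())
    return m.hexdigest()

def can_skip_by_cache(pair_file, pre_make_hash, make_hash=None):
    # pair_file holds a JSON list of [PreMakefileHash, MakeHash] pairs.
    # Checkpoint 1 compares the quick pre-makefile hash; checkpoint 2
    # compares the makefile-based hash computed after makefile generation.
    # (The real code also restores the cached build outputs on a hit.)
    if not os.path.exists(pair_file):
        return False
    with open(pair_file, 'r') as f:
        pairs = json.load(f)
    for pre, make in pairs:
        if pre == pre_make_hash:
            return True
        if make_hash is not None and make == make_hash:
            return True
    return False
```

Hashing over the sorted file list keeps the digest stable across build orders, which matches the patch's `sorted(DependencyFileSet, key=lambda x: str(x))` iteration.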